28 results for hard-to-reach populations

in Consorci de Serveis Universitaris de Catalunya (CSUC), Spain


Relevance: 100.00%

Publisher:

Abstract:

We present ACACIA, an agent-based program implemented in StarLogo 2.0 (a Java-based environment) that simulates a two-dimensional microworld populated by agents, obstacles and goals. Our program simulates how agents can reach long-term goals by following sensorial-motor couplings (SMCs) that control how the agents interact with their environment and with other agents through a process of local categorization. Thus, while acting in accordance with this set of SMCs, the agents reach their goals through the emergence of global behaviors. This agent-based simulation program allows us to understand psychological processes such as planning behavior from the standpoint that the complexity of these processes is the result of agent-environment interaction.
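
The coupling-driven behaviour described above can be sketched in a few lines. The following is a minimal Python re-creation of the idea; the original program runs in StarLogo 2.0, and the grid size, class names and the greedy coupling rule here are illustrative assumptions, not ACACIA's actual SMCs:

```python
GRID = 20  # hypothetical world size; the original microworld is in StarLogo 2.0

class Agent:
    """An agent driven only by local sensorial-motor couplings (SMCs):
    it senses adjacent cells and reacts, with no global plan."""
    def __init__(self, x, y):
        self.x, self.y = x, y

    def sense(self, obstacles):
        # Local categorization: label each neighbouring cell free or blocked.
        moves = [(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                 if (dx, dy) != (0, 0)]
        return [(dx, dy) for dx, dy in moves
                if (self.x + dx, self.y + dy) not in obstacles
                and 0 <= self.x + dx < GRID and 0 <= self.y + dy < GRID]

    def act(self, goal, obstacles):
        # One illustrative SMC: among locally free cells, step toward the goal.
        free = self.sense(obstacles)
        if not free:
            return
        dx, dy = min(free, key=lambda m: abs(self.x + m[0] - goal[0])
                                        + abs(self.y + m[1] - goal[1]))
        self.x += dx
        self.y += dy

def run(agent, goal, obstacles, steps=200):
    # The long-term goal is reached (or not) purely through local rules.
    for t in range(steps):
        if (agent.x, agent.y) == goal:
            return t
        agent.act(goal, obstacles)
    return None
```

With a single obstacle between agent and goal, the purely local rule still produces globally goal-directed behaviour, which is the emergence the abstract refers to.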

Relevance: 100.00%

Publisher:

Abstract:

Report for the scientific sojourn carried out at the Model-based Systems and Qualitative Reasoning Group (Technical University of Munich) from September until December 2005. Constructed wetlands (CWs), or modified natural wetlands, are used all over the world as wastewater treatment systems for small communities because they can provide high treatment efficiency with low energy consumption and low construction, operation and maintenance costs. Their treatment process is very complex because it includes physical, chemical and biological mechanisms such as microorganism oxidation, microorganism reduction, filtration, sedimentation and chemical precipitation. Moreover, these processes can be influenced by different factors. In order to guarantee the performance of CWs, an operation and maintenance program must be defined for each Wastewater Treatment Plant (WWTP). The main objective of this project is to provide computer support for the definition of the most appropriate operation and maintenance protocols to guarantee the correct performance of CWs. To this end, the definition of models representing the knowledge about CWs has been proposed: the components involved in the sanitation process, the relations among these units, and the processes that remove pollutants. Horizontal Subsurface Flow CWs are chosen as a case study, and filtration is selected as the first process to be modelled. However, the goal is to represent the process knowledge in such a way that it can be reused for other types of WWTP.
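
As a rough illustration of the kind of knowledge model proposed (components, relations among units, and the removal processes each unit provides), here is a minimal Python sketch; the class names and the example units are hypothetical and not taken from the project:

```python
from dataclasses import dataclass, field

@dataclass
class Component:
    """A treatment-plant unit and the pollutant-removal processes it provides."""
    name: str
    processes: list

@dataclass
class Plant:
    """A WWTP as components plus directed links between them."""
    components: list = field(default_factory=list)
    links: list = field(default_factory=list)  # (upstream, downstream) pairs

    def connect(self, up, down):
        self.links.append((up.name, down.name))

    def processes(self):
        # Processes available in the whole plant; the representation is meant
        # to be reusable across WWTP types.
        return {p for c in self.components for p in c.processes}

# Hypothetical units of a Horizontal Subsurface Flow CW:
granular = Component("granular medium", ["filtration", "sedimentation"])
biofilm = Component("biofilm", ["microorganism oxidation", "microorganism reduction"])
hssf = Plant(components=[granular, biofilm])
hssf.connect(granular, biofilm)
```

The point of the design is that the process vocabulary (filtration, sedimentation, and so on) lives in the components, so the same model skeleton can describe other plant types.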

Relevance: 100.00%

Publisher:

Abstract:

The spread of milk consumption was a significant change in the diet of Europeans; however, it is one that has not been greatly studied with regard to the populations of Mediterranean Europe. In this article we shall analyse the main circumstances that conditioned that process in Catalonia between the middle of the 19th century and 1936. In our study we shall argue that the consumption of milk in this area was only relevant in the 19th century in situations of illness or old age, and that it subsequently increased and acquired a new significance as a result of various factors. In particular, we shall emphasise: (a) the scientific advances in microbiology and nutrition, (b) the activities carried out by doctors and various public institutions to promote the consumption of fresh milk, and (c) the technological innovations in the milk producing sector. In Appendix 1 we show two maps representing the main territorial references that we shall mention.

Relevance: 100.00%

Publisher:

Abstract:

Background: Searching for associations between genetic variants and complex diseases has been a very active area of research for over two decades. More than 51,000 potential associations have been studied and published, a figure that keeps increasing, especially with the recent explosion of array-based Genome-Wide Association Studies. Even if the number of true associations described so far is high, many of the putative risk variants detected so far have failed to be consistently replicated and are widely considered false positives. Here, we focus on the worldwide patterns of replicability of published association studies. Results: We report three main findings. First, contrary to previous results, genes associated with complex diseases present lower degrees of genetic differentiation among human populations than average genome-wide levels. Second, also contrary to previous results, the differences in replicability of disease-associated loci between Europeans and East Asians are highly correlated with genetic differentiation between these populations. Finally, highly replicated genes present increased levels of high-frequency derived alleles in European and Asian populations when compared to African populations. Conclusions: Our findings highlight the heterogeneous nature of the genetic etiology of complex disease, confirm the importance of the recent evolutionary history of our species in current patterns of disease susceptibility, and could cast doubt on the false-positive status of some associations that have failed to replicate across populations.

Relevance: 100.00%

Publisher:

Abstract:

This paper attempts to give an account of the syntax of quotation from an LFG perspective. I claim that quotes are inserted at N’ positions by making use of a special phrase structure rule that makes the quote’s f-structure the PRED value of the mother f-structure. However, in order to reach this conclusion, the concept of quotation has to be restricted to include only metalinguistic and direct reported speech quotes, by making use of the property of grammatical opacity, i.e. only subsegments whose ungrammaticality does not affect the grammaticality of the whole sentence are quotes. The main advantage of this is that it distinguishes the syntax of direct quotes from that of other citational, but not quotational, structures such as Davidson’s (1979) and Cappelen and Lepore’s (2007) mixed quotation.

Relevance: 100.00%

Publisher:

Abstract:

It is well accepted that people resist evidence that contradicts their beliefs. Moreover, despite their training, many scientists reject results that are inconsistent with their theories. This phenomenon is discussed in relation to the field of judgment and decision making by describing four case studies. These concern findings that clinical judgment is less predictive than actuarial models; simple methods have proven superior to more theoretically correct methods in time series forecasting; equal weighting of variables is often more accurate than using differential weights; and decisions can sometimes be improved by discarding relevant information. All findings relate to the apparently difficult-to-accept idea that simple models can predict complex phenomena better than complex ones. It is true that there is a scientific market place for ideas. However, like its economic counterpart, it is subject to inefficiencies (e.g., thinness, asymmetric information, and speculative bubbles). Unfortunately, the market is only correct in the long run. The road to enlightenment is bumpy.
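
The equal-weighting finding can be illustrated with a small simulation. This sketch (function names and all parameters are illustrative) fits differential weights by least squares on a tiny noisy training sample, then compares them out of sample with the unit-weight rule:

```python
import random

def ols2(xs, ys):
    # Least-squares weights for y ~ w1*x1 + w2*x2 (no intercept),
    # solving the 2x2 normal equations directly.
    s11 = sum(x1 * x1 for x1, _ in xs)
    s22 = sum(x2 * x2 for _, x2 in xs)
    s12 = sum(x1 * x2 for x1, x2 in xs)
    t1 = sum(x1 * y for (x1, _), y in zip(xs, ys))
    t2 = sum(x2 * y for (_, x2), y in zip(xs, ys))
    det = s11 * s22 - s12 * s12
    return ((s22 * t1 - s12 * t2) / det, (s11 * t2 - s12 * t1) / det)

def mse(w, xs, ys):
    return sum((y - (w[0] * x1 + w[1] * x2)) ** 2
               for (x1, x2), y in zip(xs, ys)) / len(ys)

rng = random.Random(1)
make = lambda n: [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(n)]
label = lambda xs, sd: [x1 + x2 + rng.gauss(0, sd) for x1, x2 in xs]

train, test = make(10), make(5000)           # tiny, noisy training set
y_tr, y_te = label(train, 5.0), label(test, 5.0)

w_fit = ols2(train, y_tr)                    # differentially weighted model
w_eq = (1.0, 1.0)                            # equal-weight rule
err_fit, err_eq = mse(w_fit, test, y_te), mse(w_eq, test, y_te)
```

With samples this small and noisy, the fitted weights chase noise, so the equal-weight rule typically wins out of sample even though it never saw the data.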

Relevance: 100.00%

Publisher:

Abstract:

Background: Research in epistasis or gene-gene interaction detection for human complex traits has grown over the last few years. It has been marked by promising methodological developments, improved translation efforts of statistical epistasis to biological epistasis and attempts to integrate different omics information sources into the epistasis screening to enhance power. The quest for gene-gene interactions poses severe multiple-testing problems. In this context, the maxT algorithm is one technique to control the false-positive rate. However, the memory needed by this algorithm rises linearly with the number of hypothesis tests. Gene-gene interaction studies will require memory proportional to the square of the number of SNPs. A genome-wide epistasis search would therefore require terabytes of memory. Hence, cache problems are likely to occur, increasing the computation time. In this work we present a new version of maxT, requiring an amount of memory independent of the number of genetic effects to be investigated. This algorithm was implemented in C++ in our epistasis screening software MBMDR-3.0.3. We evaluate the new implementation in terms of memory efficiency and speed using simulated data. The software is illustrated on real-life data for Crohn’s disease. Results: In the case of a binary (affected/unaffected) trait, the parallel workflow of MBMDR-3.0.3 analyzes all gene-gene interactions in a dataset of 100,000 SNPs typed on 1000 individuals within 4 days and 9 hours, using 999 permutations of the trait to assess statistical significance, on a cluster composed of 10 blades, each containing four Quad-Core AMD Opteron(tm) Processor 2352 2.1 GHz units. In the case of a continuous trait, a similar run takes 9 days. Our program found 14 SNP-SNP interactions with a multiple-testing corrected p-value of less than 0.05 on real-life Crohn’s disease (CD) data.
Conclusions: Our software is the first implementation of the MB-MDR methodology able to solve large-scale SNP-SNP interaction problems within a few days, without using much memory, while adequately controlling the type I error rates. A new implementation to reach genome-wide epistasis screening is under construction. In the context of Crohn’s disease, MBMDR-3.0.3 could identify epistasis involving regions that are well known in the field and could be explained from a biological point of view. This demonstrates the power of our software to find relevant phenotype-genotype higher-order associations.
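
A single-step maxT permutation scheme of the kind discussed, where each permutation contributes only its running maximum rather than a full permutations-by-tests matrix, can be sketched as follows. This is a generic Westfall-Young-style illustration, not the MB-MDR implementation itself, and the function names are invented:

```python
import random

def maxT_adjusted_pvalues(data, labels, stat, n_perm=999, seed=0):
    """Single-step maxT sketch: for each permutation of the trait we keep only
    the maximum statistic over all tests, so no B-by-m matrix is stored."""
    rng = random.Random(seed)
    observed = [stat(col, labels) for col in data]   # one statistic per test
    exceed = [0] * len(observed)
    for _ in range(n_perm):
        perm = labels[:]
        rng.shuffle(perm)                            # permute the trait labels
        m = max(stat(col, perm) for col in data)     # streaming maximum only
        for j, obs in enumerate(observed):
            if m >= obs:
                exceed[j] += 1
    # Adjusted p-value: share of permutation maxima at least as extreme.
    return [(c + 1) / (n_perm + 1) for c in exceed]

def mean_diff(col, labels):
    # A toy test statistic for a binary trait: absolute group-mean difference.
    a = [v for v, l in zip(col, labels) if l == 1]
    b = [v for v, l in zip(col, labels) if l == 0]
    return abs(sum(a) / len(a) - sum(b) / len(b))
```

The memory footprint here is one counter per test plus one scalar per permutation step, which is the property that makes permutation-based family-wise error control feasible at scale.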

Relevance: 100.00%

Publisher:

Abstract:

A reinforcement learning (RL) method was used to train a virtual character to move participants to a specified location. The virtual environment depicted an alleyway displayed through a wide field-of-view head-tracked stereo head-mounted display. Based on proxemics theory, we predicted that when the character approached within a personal or intimate distance to the participants, they would be inclined to move backwards out of the way. We carried out a between-groups experiment with 30 female participants, with 10 assigned arbitrarily to each of the following three groups: In the Intimate condition the character could approach within 0.38m and in the Social condition no nearer than 1.2m. In the Random condition the actions of the virtual character were chosen randomly from among the same set as in the RL method, and the virtual character could approach within 0.38m. The experiment continued in each case until the participant either reached the target or 7 minutes had elapsed. The distributions of the times taken to reach the target showed significant differences between the three groups, with 9 out of 10 in the Intimate condition reaching the target significantly faster than the 6 out of 10 who reached the target in the Social condition. Only 1 out of 10 in the Random condition reached the target. The experiment is an example of applied presence theory: we rely on the many findings that people tend to respond realistically in immersive virtual environments, and use this to get people to achieve a task of which they had been unaware. This method opens up the door for many such applications where the virtual environment adapts to the responses of the human participants with the aim of achieving particular goals.
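
As an illustration of how reinforcement learning can exploit proxemics to herd a participant, here is a toy tabular Q-learning sketch on a one-dimensional stand-in for the alleyway. The environment, rewards and all parameters are invented for illustration and do not reproduce the experiment's actual RL method:

```python
import random

# The participant starts at cell 8 and the character must herd them to cell 0.
# If the character stands within the intimate distance (1 cell), the simulated
# participant backs away from it, as proxemics theory predicts.

ACTIONS = (-1, +1)   # stand on the target side / the far side of the participant

def step(p, action):
    char = p + action             # character stands adjacent to the participant
    if abs(char - p) <= 1:        # within intimate distance: participant moves
        p = max(0, min(10, p + (1 if char < p else -1)))
    return p

def train(episodes=2000, alpha=0.2, gamma=0.9, eps=0.1, seed=0):
    rng = random.Random(seed)
    Q = {p: {a: 0.0 for a in ACTIONS} for p in range(11)}
    for _ in range(episodes):
        p = 8
        for _ in range(40):
            a = (rng.choice(ACTIONS) if rng.random() < eps
                 else max(Q[p], key=Q[p].get))
            p2 = step(p, a)
            r = p - p2                       # reward: progress toward cell 0
            Q[p][a] += alpha * (r + gamma * max(Q[p2].values()) - Q[p][a])
            p = p2
            if p == 0:
                break
    return Q
```

The learned policy is to approach from the side opposite the target, so the participant's backing-away response carries them toward it, which mirrors the adaptive use of presence responses described above.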

Relevance: 100.00%

Publisher:

Abstract:

Sudoku problems are among the best-known and most enjoyed pastimes, with a popularity that never diminishes; over the last few years, however, they have gone from an entertainment to a research area that is interesting in two respects. On the one hand, Sudoku problems, being a variant of Gerechte Designs and Latin Squares, are actively used for experimental design, as in [8, 44, 39, 9]. On the other hand, Sudoku problems, as simple as they seem, are really hard structured combinatorial search problems, and thanks to their characteristics and behavior they can be used as benchmark problems for refining and testing solving algorithms and approaches. Also, thanks to their highly structured nature, their study can contribute more than the study of random problems to our goal of solving real-world problems and applications and of understanding the problem characteristics that make them hard to solve. In this work we use two techniques for modeling and solving Sudoku problems, namely the Constraint Satisfaction Problem (CSP) and Satisfiability Problem (SAT) approaches. To this effect we define the Generalized Sudoku Problem (GSP), where regions can be of rectangular shape, problems can be of any order, and solution existence is not guaranteed. With respect to worst-case complexity, we prove that GSP with block regions of m rows and n columns with m = n is NP-complete. To study the empirical hardness of GSP, we define a series of instance generators that differ in the level of balance they guarantee among the constraints of the problem, by finely controlling how the holes are distributed over the cells of the GSP. Experimentally, we show that the more balanced the constraints, the higher the complexity of solving the GSP instances, and that GSP is harder than the Quasigroup Completion Problem (QCP), a problem that GSP generalizes.
Finally, we provide a study of the correlation between backbone variables (variables that take the same value in all the solutions of an instance) and the hardness of GSP.
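
A GSP instance of order N = m·n can be attacked directly as a CSP with plain backtracking search. The following sketch (illustrative only, not the solvers used in this work) enforces the row, column and rectangular-block constraints:

```python
def solve_gsp(grid, m, n):
    """Backtracking CSP solver for a Generalized Sudoku of order N = m*n with
    m-row by n-column block regions; returns a completed grid or None, since
    solution existence is not guaranteed for a GSP instance."""
    N = m * n

    def ok(r, c, v):
        if any(grid[r][j] == v for j in range(N)):   # row constraint
            return False
        if any(grid[i][c] == v for i in range(N)):   # column constraint
            return False
        br, bc = (r // m) * m, (c // n) * n          # rectangular-block constraint
        return all(grid[i][j] != v
                   for i in range(br, br + m) for j in range(bc, bc + n))

    for r in range(N):
        for c in range(N):
            if grid[r][c] == 0:                      # 0 marks a hole
                for v in range(1, N + 1):
                    if ok(r, c, v):
                        grid[r][c] = v
                        if solve_gsp(grid, m, n) is not None:
                            return grid
                        grid[r][c] = 0
                return None                          # dead end: backtrack
    return grid
```

How the holes (the `0` cells) are distributed across rows, columns and blocks is exactly the balancing knob the instance generators control, and it is what drives the empirical hardness results above.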

Relevance: 100.00%

Publisher:

Abstract:

To study whether inversions (or arrangements) by themselves or karyotypes are the main global-warming adaptive target of natural selection, two Drosophila subobscura Serbian populations (Apatin and Petnica) were re-analyzed using different statistical approaches. Both populations were sampled over a period of approximately 15 years: Apatin in 1994 and 2008 + 2009, and Petnica in 1995 and 2010. For all chromosomes, the four collections studied were in Hardy-Weinberg equilibrium. Thus, it seems that inversions (or arrangements) combined at random to constitute the populations' karyotypes. However, there were differences in karyotypic frequencies over the years, although they were significant only for the Apatin population. It is possible to conclude that inversions (or arrangements) are likely the target of natural selection, because they presented long-term changes but combine at random to generate the corresponding karyotypic combinations. As a consequence, the frequencies of karyotypes also change over time.
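
The Hardy-Weinberg check described above, treating chromosomal arrangements like alleles that pair into karyotypes, amounts to a chi-square goodness-of-fit computation. The sketch below is an illustrative reimplementation, not the study's actual statistical procedure:

```python
from itertools import combinations_with_replacement

def hw_chi_square(karyotype_counts):
    """Chi-square goodness-of-fit of karyotype counts to Hardy-Weinberg
    expectations. Keys are alphabetically sorted arrangement pairs, e.g.
    ('A', 'B'); values are observed counts of individuals."""
    n = sum(karyotype_counts.values())
    # Arrangement ("allele") frequencies recovered from the karyotypes.
    freq = {}
    for (a, b), count in karyotype_counts.items():
        freq[a] = freq.get(a, 0) + count
        freq[b] = freq.get(b, 0) + count
    total = 2 * n
    freq = {k: v / total for k, v in freq.items()}
    chi2 = 0.0
    for a, b in combinations_with_replacement(sorted(freq), 2):
        # Random pairing: p^2 for homokaryotypes, 2pq for heterokaryotypes.
        exp = n * (freq[a] ** 2 if a == b else 2 * freq[a] * freq[b])
        obs = karyotype_counts.get((a, b), 0)
        chi2 += (obs - exp) ** 2 / exp
    return chi2
```

A statistic near zero means the observed karyotypes are what random combination of the arrangements would produce, which is the pattern the four collections showed.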

Relevance: 100.00%

Publisher:

Abstract:

The article focuses on the relevance of practical knowledge for learning law. Law faculties in Spain suffer from a lack of specific practical training programmes in their curricula. A historical analysis shows that practical knowledge has strongly receded in favour of a theoretical approach, following the pattern set by Alexander von Humboldt. The Spanish system of filling public offices through memory-based competitive examinations, with extensive syllabuses containing hundreds of concepts, reinforces this process and becomes an obstacle to the renewal of legal studies and legal practice. As a result, students lack the skills needed for professional development and lifelong learning.

Relevance: 100.00%

Publisher:

Abstract:

Budget forecasts have become increasingly important as a tool of fiscal management used to influence the expectations of bond markets and the public at large. The inherent difficulty of projecting macroeconomic variables, together with political bias, thwarts the accuracy of budget forecasts. We improve accuracy by combining the forecasts of both private and public agencies for Italy over the period 1993-2012. A weighted combined forecast of the deficit/GDP ratio is superior to any single forecast. Deficits are hard to predict due to shifting economic conditions and political events. We test and compare predictive accuracy over time and, although a weighted combined forecast is robust to breaks, there is no significant improvement over a simple random walk (RW) model.
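
The forecast-combination idea can be illustrated with a minimal sketch that picks the convex weight minimising in-sample mean squared error; this is a generic illustration, not the weighting scheme estimated in the paper:

```python
def combine(forecasts_a, forecasts_b, actual):
    """Convex combination w*a + (1-w)*b with the weight chosen to minimise
    in-sample mean squared error over a grid; by construction it is at least
    as accurate in sample as either individual forecaster."""
    def mse(w):
        return sum((w * a + (1 - w) * b - y) ** 2
                   for a, b, y in zip(forecasts_a, forecasts_b, actual)) / len(actual)
    best_w = min((i / 100 for i in range(101)), key=mse)
    return best_w, mse(best_w)
```

Because the pure forecasts correspond to the endpoints w = 0 and w = 1, the fitted combination can never do worse in sample; whether that advantage survives out of sample, and whether it beats a random walk, is the empirical question the paper tests.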

Relevance: 100.00%

Publisher:

Abstract:

We analyze situations in which a group of agents (and possibly a designer) have to reach a decision that will affect all the agents. Examples of such scenarios are the location of a nuclear reactor or the siting of a major sporting event. To address the problem of reaching a decision, we propose a one-stage multi-bidding mechanism in which agents compete for the project by submitting bids. All Nash equilibria of this mechanism are efficient. Moreover, the payoffs attained in equilibrium by the agents satisfy intuitively appealing lower bounds.
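
A one-stage multi-bidding rule of the kind proposed can be sketched as follows, under the assumptions common in this literature: each agent's bid vector over the candidate projects sums to zero, the project with the highest aggregate bid is carried out, each agent pays its bid on that project, and the aggregate bid is rebated equally. The paper's exact mechanism may differ in its details:

```python
def multi_bidding(bids):
    """One-stage multi-bidding sketch. bids[i][k] is agent i's bid on
    project k; each agent's bids must sum to zero. Returns the chosen
    project and each agent's net transfer."""
    n = len(bids)
    n_proj = len(bids[0])
    assert all(abs(sum(b)) < 1e-9 for b in bids), "each bid vector sums to zero"
    # Aggregate bid per project; the highest-aggregate project is carried out.
    aggregate = [sum(bids[i][k] for i in range(n)) for k in range(n_proj)]
    winner = max(range(n_proj), key=lambda k: aggregate[k])
    # Each agent pays its bid on the winner; the pot is rebated equally.
    rebate = aggregate[winner] / n
    transfers = [-bids[i][winner] + rebate for i in range(n)]
    return winner, transfers
```

Note that the transfers sum to zero, so the mechanism is budget balanced: bidding only redistributes money among the agents while selecting the project they collectively value most.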