56 results for "problem of mediation"


Relevance: 90.00%

Publisher:

Abstract:

To study the different temporal components of cancer mortality (age, period and cohort), methods of graphical representation were applied to Swiss mortality data from 1950 to 1984. Maps using continuous slopes ("contour maps"), based on eight tones of grey according to the absolute distribution of rates, were used to represent the surfaces defined by the matrix of age-specific rates. Further, progressively more complex regression-surface equations were defined, on the basis of two independent variables (age/cohort) and a dependent one (each age-specific mortality rate). General patterns of trends in cancer mortality were thus identified, permitting the definition of important cohort effects (e.g., upwards for lung and other tobacco-related neoplasms, or downwards for stomach) and period effects (e.g., downwards for intestinal or thyroid cancers), besides the major underlying age component. For most cancer sites, even the lower-order (1st to 3rd) models utilised provided excellent fits, allowing immediate identification of the residuals (e.g., high or low mortality points) as well as estimates of first-order interactions between the three factors, although the parameters of the main effects remained undetermined. The method should therefore be used essentially as a summary guide to illustrate and understand the general patterns of age, period and cohort effects in (cancer) mortality, since it cannot conceptually solve the inherent problem of identifiability of the three components.
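The regression-surface idea can be sketched in a few lines. Everything below (the age/cohort grid, the rates and the 2nd-order polynomial) is invented for illustration and does not reproduce the paper's models or data.

```python
# Sketch of a low-order regression surface fit to an age x cohort matrix of
# rates, in the spirit of the approach above. Grid, rates and the 2nd-order
# polynomial are illustrative assumptions.
import numpy as np

def fit_surface(age, cohort, rate):
    """Least-squares fit of rate ~ 1 + age + cohort + age^2 + cohort^2 + age*cohort."""
    A = np.column_stack([np.ones_like(age), age, cohort,
                         age**2, cohort**2, age * cohort])
    coef, *_ = np.linalg.lstsq(A, rate, rcond=None)
    return coef, A @ coef

# Synthetic grid of age-specific rates with a known quadratic trend
age, cohort = np.meshgrid(np.arange(40.0, 85.0, 5.0),
                          np.arange(1900.0, 1940.0, 5.0))
age = age.ravel()
coh = cohort.ravel() - 1900.0        # centre the cohort axis for conditioning
rate = 0.001 * age**2 + 0.05 * coh + 2.0
coef, fitted = fit_surface(age, coh, rate)
residuals = rate - fitted            # large residuals flag high/low mortality points
```

Since the synthetic trend lies in the model's span, the residuals are essentially zero; on real data they would mark the unusually high or low mortality points mentioned above.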

Relevance: 90.00%

Publisher:

Abstract:

Social groups face a fundamental problem of overcoming selfish individuals capable of destroying cooperation. In the social amoeba Dictyostelium discoideum, there is evidence that some clones ('cheaters') contribute disproportionately to the viable spores in a fruiting body while avoiding the dead stalk-cell fate. It remains unclear, however, whether this cheating is actually the product of selection. Here, I report the results of an experimental evolution study designed to test whether clones of D. discoideum will evolve resistance to cheating in the laboratory, with genetic variation created only through spontaneous mutation. Two strains, one green fluorescent protein (GFP)-labelled and one wild-type, were allowed to grow and develop together before the wild-type strain was removed and replaced with a naïve strain evolving in parallel. Over the course of 10 social generations, the GFP-labelled strain reliably increased its representation in the spores relative to control populations that had never experienced the competitor. This competitive advantage extended to the non-social, vegetative-growth portion of the life cycle, but not to pairwise competition with two other strains. These results indicate strong antagonism between strains, mediated by ample mutational variation for cheating, and also suggest that arms races between strains in the wild may be common.

Relevance: 90.00%

Publisher:

Abstract:

This article reviews the literature dealing with the problem of legitimising regulatory governance, with special attention to the question of the accountability of independent regulatory agencies. The discussion begins with a presentation of the traditional arguments concerning the democratic deficit of the regulatory state. The positive evaluation of regulatory performance by citizens is presented as an alternative source of legitimacy. This is followed by a discussion of existing approaches to making agencies accountable, so as to ensure the procedural legitimacy of regulatory governance. Some insights concerning new forms of accountability are offered in the last section, namely with reference to the establishment and ongoing consolidation of formal and informal networks of regulators.

Relevance: 90.00%

Publisher:

Abstract:

The research considers the problem of spatial data classification using machine learning algorithms: probabilistic neural networks (PNN) and support vector machines (SVM). As a benchmark model, the simple k-nearest-neighbour algorithm is considered. The PNN is a neural-network reformulation of the well-known nonparametric principles of probability density modelling using kernel density estimators and Bayes-optimal or maximum a posteriori decision rules. PNNs are well suited to problems where not only predictions but also quantification of accuracy and integration of prior information are necessary. An important property of PNNs is that they can easily be used in decision support systems dealing with problems of automatic classification. The support vector machine is an implementation of the principles of statistical learning theory for classification tasks. Recently, SVMs have been successfully applied to a variety of environmental topics: classification of soil types and hydro-geological units, optimisation of monitoring networks, and susceptibility mapping of natural hazards. In the present paper, both simulated and real data case studies (low- and high-dimensional) are considered. The main attention is paid to the detection and learning of spatial patterns by the algorithms applied.
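A PNN can be sketched as a per-class kernel density estimate combined with a maximum a posteriori decision. This is a minimal illustration, not the paper's implementation; the clusters, the bandwidth `sigma` and the equal class priors are all assumptions.

```python
# Minimal PNN sketch: a per-class Gaussian kernel density estimate combined
# with a maximum a posteriori decision (equal priors assumed).
import numpy as np

def pnn_classify(X_train, y_train, X_test, sigma=0.5):
    """Assign each test point to the class with the largest kernel density."""
    classes = np.unique(y_train)
    preds = []
    for x in X_test:
        scores = []
        for c in classes:
            Xc = X_train[y_train == c]
            d2 = np.sum((Xc - x) ** 2, axis=1)           # squared distances
            scores.append(np.mean(np.exp(-d2 / (2 * sigma ** 2))))
        preds.append(classes[int(np.argmax(scores))])
    return np.array(preds)

# Two well-separated synthetic clusters
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
print(pnn_classify(X, y, np.array([[0.1, 0.1], [2.9, 3.0]])))  # → [0 1]
```

The per-class mean of kernel values is an unnormalised density estimate, so comparing them directly implements the Bayes decision under equal priors; unequal priors would simply weight the scores.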

Relevance: 90.00%

Publisher:

Abstract:

Fluvial deposits are a challenge for modelling flow in subsurface reservoirs: the connectivity and continuity of permeable bodies have a major impact on fluid flow in porous media, and contemporary object-based and multipoint-statistics methods face the problem of robustly representing connected structures. An alternative approach to modelling petrophysical properties is based on a machine learning algorithm, Support Vector Regression (SVR). Semi-supervised SVR is able to establish spatial connectivity by taking into account prior knowledge of natural similarities. As a learning algorithm, SVR is robust to noise and captures dependencies from all available data. Semi-supervised SVR applied to a synthetic fluvial reservoir produced robust results that are well matched to the flow performance.
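SVR proper requires a quadratic-programming solver; as a minimal numpy-only stand-in, the sketch below uses kernel ridge regression, which shares the RBF-kernel machinery SVR builds on. The 2-D "well" locations and property values are synthetic assumptions.

```python
# Kernel-based spatial regression sketch (kernel ridge regression as a
# numpy-only stand-in for SVR). All data are synthetic.
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

def fit(X, y, gamma=1.0, lam=1e-4):
    """Solve (K + lam*I) alpha = y for the dual weights alpha."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_train, alpha, X_new, gamma=1.0):
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Sparse observations of a smooth 2-D property trend
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, (30, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1]
alpha = fit(X, y, gamma=5.0)
pred = predict(X, alpha, X[:5], gamma=5.0)   # near-interpolation at the data
```

The regulariser `lam` plays the role of SVR's smoothness control: small values honour the observations closely, larger values smooth across them.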

Relevance: 90.00%

Publisher:

Abstract:

Introduction. The management of patients with large burns has improved significantly in recent decades. In particular, autologous cultured keratinocytes (cultured epithelial autografts, CEA) have overcome the problem of limited donor sites in severely burned patients. Studies testing CEAs in individual burn centres report mixed results on general patient outcomes. Methods. Publications from 1989 to 2011 with a minimum of 15 patients per study using CEA for the management of severe burn injury were retrieved from online databases, including Medline and PubMed, and from the archives of the medical library of the CHUV in Lausanne. Results. 18 studies with a total of 977 patients were included in this review. Most studies did not specify whether CEAs were grafted alone or in combination with split-thickness skin grafts (STSG), although most patients appear to have received both. The mean TBSA of patients grafted with CEAs ranged from 33% to 78% per study, and no common minimum TBSA qualifying a patient for CEA grafting could be identified. The definition of the "take rate" is not standardised, and reported rates varied widely, from 26% to 73%. A correlation between CEA use and mortality or length of hospitalisation could not be demonstrated across the studies. As a late complication, some authors described fragility of the CEA-regenerated skin. Conclusion. Since the healing of patients with large burns demands a variety of surgical and non-surgical treatment strategies, and the final outcome depends mainly on the burned surface area and the general health condition of the patient, no definitive conclusion on the use of CEAs could be drawn from the reviewed studies. From our own experience, we know that selected patients profit significantly from CEA grafts, although cost efficiency or a reduction in mortality cannot be demonstrated in these particular cases.

Relevance: 90.00%

Publisher:

Abstract:

The proportion of the population living in or around cities is larger than ever. Urban sprawl and car dependence have taken over from the pedestrian-friendly compact city. Environmental problems such as air pollution, land consumption and noise, as well as health problems, are the result of this ongoing process. Urban planners have to find solutions to these complex problems while at the same time ensuring the economic performance of the city and its surroundings. Meanwhile, an ever-increasing quantity of socio-economic and environmental data is being acquired. In order to gain a better understanding of the processes and phenomena taking place in the complex urban environment, these data need to be analysed. Numerous methods for modelling and simulating such a system exist, are still under development, and can be exploited by urban geographers to improve our understanding of the urban metabolism. Modern and innovative visualisation techniques help in communicating the results of such models and simulations. This thesis covers several methods for the analysis, modelling, simulation and visualisation of problems related to urban geography. The analysis of high-dimensional socio-economic data using artificial neural network techniques, especially self-organising maps, is demonstrated with two examples at different scales. The problem of spatio-temporal modelling and data representation is treated and some possible solutions are shown. The simulation of urban dynamics, and more specifically of commuter traffic, is illustrated using multi-agent micro-simulation techniques. A section on visualisation methods presents cartograms for transforming geographic space into a feature space, and the distance-circle map, a centre-based map representation particularly useful for urban agglomerations. Some issues concerning the importance of scale in urban analysis and the clustering of urban phenomena are discussed.
A new approach to defining urban areas at different scales is developed, and the link with percolation theory is established. Fractal statistics, especially the lacunarity measure, and scaling laws are used to characterise urban clusters. In a last section, population evolution is modelled using a model close to the well-established gravity model. The work covers a wide range of methods useful in urban geography. These methods should be developed further and, at the same time, find their way into the daily work and decision processes of urban planners.

Relevance: 90.00%

Publisher:

Abstract:

The paper deals with the development and application of a generic methodology for the automatic processing (mapping and classification) of environmental data. The General Regression Neural Network (GRNN) is considered in detail and is proposed as an efficient tool to solve the problem of spatial data mapping (regression). The Probabilistic Neural Network (PNN) is considered as an automatic tool for spatial classification. The automatic tuning of isotropic and anisotropic GRNN/PNN models using a cross-validation procedure is presented. Results are compared with the k-nearest-neighbours (k-NN) interpolation algorithm using an independent validation data set. The real case studies are based on decision-oriented mapping and classification of radioactively contaminated territories.
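The GRNN is equivalent to Nadaraya-Watson kernel regression, which can be sketched briefly. The toy data and kernel width are assumptions; the isotropic width `sigma` below would become one width per coordinate in the anisotropic case mentioned above.

```python
# GRNN as Nadaraya-Watson kernel regression: the prediction is a Gaussian-
# kernel-weighted average of the training targets. Data are illustrative.
import numpy as np

def grnn_predict(X_train, y_train, X_new, sigma=1.0):
    preds = []
    for x in X_new:
        w = np.exp(-np.sum((X_train - x) ** 2, axis=1) / (2.0 * sigma ** 2))
        preds.append(float(np.dot(w, y_train) / np.sum(w)))
    return np.array(preds)

# With a small sigma the estimate honours the local datum
X = np.array([[0.0], [1.0], [2.0]])
y = np.array([0.0, 1.0, 2.0])
print(grnn_predict(X, y, np.array([[1.0]]), sigma=0.1))  # → [1.]
```

Tuning `sigma` by cross-validation, as the paper describes for its models, trades locality (small widths) against smoothing (large widths).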

Relevance: 90.00%

Publisher:

Abstract:

Genital herpes is being recognised as a medical problem of increasing importance. Diagnosis and management are complex. The present recommendations have been established by a multidisciplinary panel of specialists and endorsed by all Swiss medical societies involved in the medical care of such patients. The aim is to improve the care of affected patients, to reduce horizontal and vertical transmission and to diminish the psychosocial burden.

Relevance: 90.00%

Publisher:

Abstract:

When dealing with multi-angular image sequences, problems of reflectance changes due either to illumination and acquisition geometry or to interactions with the atmosphere naturally arise. These phenomena interplay with the scene and lead to a modification of the measured radiance: for example, according to the angle of acquisition, tall objects may be seen from the top or from the side, and different light scatterings may affect the surfaces. This results in shifts in the acquired radiance which make the problem of multi-angular classification harder and might lead to catastrophic results, since surfaces with the same reflectance return significantly different signals. In this paper, rather than performing atmospheric or bidirectional reflectance distribution function (BRDF) correction, a non-linear manifold learning approach is used to align data structures. This method maximises the similarity between the different acquisitions by deforming their manifolds, thus enhancing the transferability of classification models among the images of the sequence.

Relevance: 90.00%

Publisher:

Abstract:

One major methodological problem in the analysis of sequence data is the determination of the costs from which distances between sequences are derived. Although this problem is currently not optimally dealt with in the social sciences, it has some similarity with problems that have been solved in bioinformatics for three decades. In this article, the authors propose an optimization of substitution and deletion/insertion costs based on computational methods. The authors provide an empirical way of determining costs for cases, frequent in the social sciences, in which theory does not clearly favour one cost scheme over another. Using three distinct data sets, the authors tested the distances and cluster solutions produced by the new cost scheme against solutions based on cost schemes associated with other research strategies. The proposed method performs well compared with other cost-setting strategies, while alleviating the problem of justifying cost schemes.
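The distance computation underlying this kind of sequence analysis can be sketched as a dynamic-programming edit distance with user-supplied substitution costs and a flat insertion/deletion cost. The cost values below are placeholders, not the costs optimised in the article.

```python
# Dynamic-programming edit distance with state-pair substitution costs and a
# flat indel cost -- the distance underlying optimal matching of sequences.
def seq_distance(a, b, sub_cost, indel=1.0):
    """Distance between sequences a and b; sub_cost maps (state_a, state_b)
    pairs to substitution costs, defaulting to 2.0 (i.e. indel + indel)."""
    n, m = len(a), len(b)
    D = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        D[i][0] = i * indel
    for j in range(1, m + 1):
        D[0][j] = j * indel
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            s = 0.0 if a[i - 1] == b[j - 1] else sub_cost.get((a[i - 1], b[j - 1]), 2.0)
            D[i][j] = min(D[i - 1][j] + indel,      # deletion
                          D[i][j - 1] + indel,      # insertion
                          D[i - 1][j - 1] + s)      # match / substitution
    return D[n][m]

print(seq_distance("AAB", "AAC", {("B", "C"): 0.5}))  # → 0.5
```

Changing the entries of `sub_cost` is exactly what shifts the resulting distances and cluster solutions, which is why the choice of cost scheme needs justification.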

Relevance: 90.00%

Publisher:

Abstract:

The paper presents the Multiple Kernel Learning (MKL) approach as a modelling and data-exploration tool and applies it to the problem of wind speed mapping. Support Vector Regression (SVR) is used to predict spatial variations of the mean wind speed from terrain features (slopes, terrain curvature, directional derivatives) generated at different spatial scales. Multiple Kernel Learning is applied to learn kernels for individual features and for thematic feature subsets, both in the context of feature selection and in that of determining optimal parameters. An empirical study on real-life data confirms the usefulness of MKL as a tool that enhances the interpretability of data-driven models.
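The flavour of the approach can be sketched with two RBF kernels representing a coarse and a fine spatial scale. Full MKL learns the kernel weights jointly with the regression; the simplified version below merely selects the mixing weight by validation error, and all data and parameters are invented for illustration.

```python
# Toy multi-kernel sketch: a convex combination of RBF kernels at two scales,
# with the mixing weight picked by validation error (not true joint MKL).
import numpy as np

def rbf(A, B, gamma):
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

def val_error(wf, Xtr, ytr, Xv, yv, lam=1e-2):
    """Validation MSE of kernel ridge regression using the combined kernel
    (1 - wf) * K_coarse + wf * K_fine."""
    Ktr = (1 - wf) * rbf(Xtr, Xtr, 0.5) + wf * rbf(Xtr, Xtr, 20.0)
    Kv = (1 - wf) * rbf(Xv, Xtr, 0.5) + wf * rbf(Xv, Xtr, 20.0)
    alpha = np.linalg.solve(Ktr + lam * np.eye(len(ytr)), ytr)
    return float(np.mean((Kv @ alpha - yv) ** 2))

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, (60, 1))
y = np.sin(6 * X[:, 0])                     # fine-scale spatial structure
idx = rng.permutation(60)
tr, va = idx[:40], idx[40:]
errors = {wf: val_error(wf, X[tr], y[tr], X[va], y[va])
          for wf in (0.0, 0.25, 0.5, 0.75, 1.0)}
best_wf = min(errors, key=errors.get)       # weight on the fine-scale kernel
```

Because the target varies at a fine scale, the fine-scale kernel receives the larger weight; in the same spirit, the learned MKL weights indicate which terrain features and scales matter for wind speed.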

Relevance: 90.00%

Publisher:

Abstract:

Estimation of the spatial statistics of subsurface velocity heterogeneity from surface-based geophysical reflection survey data is a problem of significant interest in seismic and ground-penetrating radar (GPR) research. A method to address this problem effectively has recently been presented, but our knowledge regarding the resolution of the estimated parameters is still inadequate. Here we examine this issue using an analytical approach based on the realistic assumption that the subsurface velocity structure can be characterised as a band-limited, scale-invariant medium. Importantly, our work confirms recent numerical findings that the inversion of seismic or GPR reflection data for the geostatistical properties of the probed subsurface region is sensitive to the aspect ratio of the velocity heterogeneity and to the decay of its power spectrum, but not to the individual values of the horizontal and vertical correlation lengths.

Relevance: 90.00%

Publisher:

Abstract:

The problem of how cooperation can evolve between individuals or entities with conflicting interests is central to biology, as many of the major evolutionary transitions, from the first replicating molecules to human societies, have required solving it. There are many routes to cooperation, but humans seem distinct from other species in having more complex and diverse mechanisms, often due to their higher cognitive skills, allowing them to reap the benefits of living in groups. Among those mechanisms, the use of reputation, or past experience with others, as well as sanctioning mechanisms both seem to be of major importance. They have often been considered separately, but the interaction between the two might provide new insights into how punishment could have appeared as a means to enforce cooperation in early humans. In this thesis, I first use theoretical approaches from evolutionary game theory to investigate the evolution of punishment and cooperation through a reputation system based on punitive actions, and compare the efficacy of this system, in terms of the cooperation achieved, with one based on cooperative actions. I then use empirical approaches from economics to test the predictions of theoretical models in real-life settings, and to explore further conditions such as environmental variation, constrained memory, and the scale of competition between individuals. Together, these approaches contribute to our understanding of how these factors affect the use of reputation and punishment, and ultimately how cooperation is achieved.

Relevance: 90.00%

Publisher:

Abstract:

Due to the advances in sensor networks and remote sensing technologies, the acquisition and storage rates of meteorological and climatological data increase every day and call for novel and efficient processing algorithms. A fundamental problem of data analysis and modelling is the spatial prediction of meteorological variables in complex orography, which serves, among other purposes, extended climatological analyses, the assimilation of data into numerical weather prediction models, the preparation of inputs to hydrological models, and the real-time monitoring and short-term forecasting of weather.

In this thesis, a new framework for spatial estimation is proposed by taking advantage of a class of algorithms emerging from statistical learning theory. Nonparametric kernel-based methods for nonlinear data classification, regression and target detection, known as support vector machines (SVM), are adapted for the mapping of meteorological variables in complex orography.

With the advent of high-resolution digital elevation models, the field of spatial prediction met new horizons. In fact, by exploiting image-processing tools along with physical heuristics, a large number of terrain features which account for the topographic conditions at multiple spatial scales can be extracted. Such features are highly relevant for the mapping of meteorological variables because they control a considerable part of the spatial variability of meteorological fields in the complex Alpine orography. For instance, patterns of orographic rainfall, wind speed and cold-air pools are known to be correlated with particular terrain forms, e.g. convex/concave surfaces and upwind sides of mountain slopes.

Kernel-based methods are employed to learn the nonlinear statistical dependence which links the multidimensional space of geographical and topographic explanatory variables to the variable of interest, that is, the wind speed as measured at the weather stations or the occurrence of orographic rainfall patterns as extracted from sequences of radar images. Compared to low-dimensional models integrating only the geographical coordinates, the proposed framework opens a way to regionalise meteorological variables which are multidimensional in nature and rarely show spatial autocorrelation in the original space, making the use of classical geostatistics difficult.

The challenges explored during the thesis are manifold. First, the complexity of the models is optimised to impose appropriate smoothness properties and reduce the impact of noisy measurements. Secondly, a multiple-kernel extension of SVM is considered to select the multiscale features which explain most of the spatial variability of wind speed. Then, SVM target-detection methods are implemented to describe the orographic conditions which cause persistent and stationary rainfall patterns. Finally, the optimal splitting of the data is studied to estimate realistic performances and confidence intervals characterising the uncertainty of the predictions.

The resulting maps of average wind speed find applications in renewable-resource assessment and open a route to decreasing the temporal scale of analysis to meet hydrological requirements. Furthermore, the maps depicting the susceptibility to orographic rainfall enhancement can be used to improve current radar-based quantitative precipitation estimation and forecasting systems and to generate stochastic ensembles of precipitation fields conditioned upon the orography.