161 results for Nature inspired algorithms


Relevance:

20.00%

Publisher:

Abstract:

Thesis abstract: Close to Horkheimer, Adorno, and Benjamin, this philosopher of the Frankfurt School has lost none of his power to attract curiosity. Herbert Marcuse (1898-1979) was quickly consigned to oblivion while his thought remained little known, notwithstanding a very prominent media presence that went far beyond the classical academic sphere. This atypical German philosopher, whose name came to the fore notably with the publication of L'Homme unidimensionnel, was widely read but little systematized. Quoted out of context and instrumentalized during the student revolts of the 1960s, he nonetheless leaves a body of thought that deserves to reappear in a new and forward-looking form. This research aims first of all to extract that thought from the clamor that enveloped it and threatened to make it disappear. The essay insists on a return to scientific rigor through the exhumation of texts unknown to the general public and the necessary insertion of Marcuse into the history of philosophical thinkers. Gaining access to this philosopher requires a number of keys, among them the concept of nature. To get there, however, the search for the foundations of a philosophy whose sources are plural and irreconcilable stands out as an essential and promising first step. In truth, the limited systematization of Marcusean thought is largely due to the "proliferation" of references in which Marcuse indulged, leaving only a narrow path toward a global and coherent architecture of his thought. Admittedly, presentations of Marcuse have so far remained captive to the influence of Hegel, Marx, and Freud. The author of this thesis attempts to show that Marcusean thought takes its bearings from Kant. Attached to the German philosophical tradition of the Aufklärung, the work of the Frankfurt philosopher fights all kinds of irrationality that obstruct the path toward a real humanism. Nature remains a polemical concept because it cannot be reduced to mere beings. Neither inner nature nor outer nature is confined to that horizon devoid of subjectivity. A disciple of Heidegger, Marcuse defines nature on the basis of Dasein, a being-there thrown into history and carrying within itself the quality of historicity. Against the so-called one-dimensional, post-industrial society that heralds the acme of triumphant capitalism, Marcuse's work aims at a return to nature as much as it makes nature a future. Nature is not only what it has been; it is also what is yet to be. In rejecting the surrounding consumerism, he describes commercial advertising as an act of negation of true human values. Neither the superfluity secreted by the market nor the communist (Soviet Marxism and its henchmen) and capitalist systems are capable of bringing forth the human ideal. Marcusean discourse, nourished by "Critical Theory," invents the concept of the "Great Refusal," grounded in Hegelian dialectics and obliging consciousness to watch over the real so as to lead it toward emancipation rather than down a "path leading nowhere." Attached to the concreteness and historical transformation inherited from Marx, he reorients the Heideggerian Dasein by giving it more flesh. Nature and historicity: this dual reality is complicated because it embodies a difficult revolution that will demand ever more imagination.
The "Great Refusal" will need a subtler ally: aesthetics, in which the Kantian Third Critique comes into view. Beyond this refuge in art, Marcuse will always be needed if we are to better understand our rapidly changing societies and, accordingly, to inhabit our perfectible world.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents general problems and approaches for spatial data analysis using machine learning algorithms. Machine learning is a very powerful approach to adaptive data analysis, modelling, and visualisation. The key feature of machine learning algorithms is that they learn from empirical data and can be used in cases where the modelled environmental phenomena are hidden, nonlinear, noisy, and highly variable in space and time. Most machine learning algorithms are universal and adaptive modelling tools developed to solve the basic problems of learning from data: classification/pattern recognition, regression/mapping, and probability density modelling. In the present report, some widely used machine learning algorithms, namely artificial neural networks (ANN) of different architectures and Support Vector Machines (SVM), are adapted to the analysis and modelling of geo-spatial data. Machine learning algorithms have an important advantage over traditional models of spatial statistics when problems are considered in high-dimensional geo-feature spaces, i.e. when the dimension of the space exceeds 5. Such features are usually generated, for example, from digital elevation models, remote sensing images, etc. An important extension of the models concerns the incorporation of real-space constraints such as geomorphology, networks, and other natural structures. Recent developments in semi-supervised learning can improve the modelling of environmental phenomena by taking geo-manifolds into account. An important part of the study deals with the analysis of relevant variables and model inputs. This problem is approached using different nonlinear feature selection/feature extraction tools. To demonstrate the application of machine learning algorithms, several case studies are considered: digital soil mapping using SVM; automatic mapping of soil and water system pollution using ANN; natural hazards risk analysis (avalanches, landslides); and assessment of renewable resources (wind fields) with SVM and ANN models. The dimensionality of the spaces considered varies from 2 to more than 30. Figures 1, 2, and 3 demonstrate some results of the studies and their outputs. Finally, the results of environmental mapping are discussed and compared with traditional geostatistical models.
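To make the approach concrete, the following minimal sketch classifies synthetic geo-features with an RBF-kernel SVM. It assumes NumPy and scikit-learn; the coordinates, elevation surface, river-distance feature, and contamination target are invented stand-ins for the datasets and models actually used in the report.

    # Toy SVM classification of synthetic geo-spatial data (sketch only;
    # the report's own datasets and implementations are not reproduced).
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n = 500
    coords = rng.uniform(0, 100, size=(n, 2))     # hypothetical x, y coordinates
    elevation = np.sin(coords[:, 0] / 15) + np.cos(coords[:, 1] / 20)
    dist_to_river = np.abs(coords[:, 1] - 50)     # toy extra geo-feature
    X = np.column_stack([coords, elevation, dist_to_river])

    # Hypothetical binary target, e.g. contaminated vs. clean soil.
    y = (elevation + 0.3 * rng.normal(size=n) > 0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    scaler = StandardScaler().fit(X_tr)

    # RBF-kernel SVM: a standard nonlinear classifier for such problems.
    clf = SVC(kernel="rbf", C=1.0, gamma="scale")
    clf.fit(scaler.transform(X_tr), y_tr)
    print("test accuracy:", clf.score(scaler.transform(X_te), y_te))

Standardising the features matters here because raw geo-features (coordinates, elevations, distances) live on very different scales; the RBF kernel supplies the nonlinear decision boundary that the abstract attributes to machine learning models.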

Relevance:

20.00%

Publisher:

Abstract:

Methods like Event History Analysis can show the existence of diffusion and part of its nature, but they do not study the process itself. Nowadays, thanks to the increasing performance of computers, the process can be studied using computational modeling. This thesis presents an agent-based model of policy diffusion mainly inspired by the model developed by Braun and Gilardi (2006). I first develop a theoretical framework of policy diffusion that presents its main internal drivers, such as the preference for the policy, the effectiveness of the policy, institutional constraints, and ideology, and its main mechanisms, namely learning, competition, emulation, and coercion. Diffusion, expressed by these interdependencies, is thus a complex process that needs to be studied with computational agent-based modeling. In a second step, computational agent-based modeling is defined along with its most significant concepts: complexity and emergence. Using computational agent-based modeling implies developing an algorithm and programming it. Once the algorithm has been programmed, we let the different agents interact. A phenomenon of diffusion, derived from learning, then emerges, meaning that the choice made by an agent is conditional on the choices made by its neighbors. As a result, learning follows an inverted S-curve, which leads to partial convergence (global divergence combined with local convergence) that triggers the emergence of political clusters, i.e. the creation of regions with the same policy. Furthermore, the average effectiveness in this computational world tends to follow a J-shaped curve, meaning not only that time is needed for a policy to deploy its effects, but also that it takes time for a country to find the best-suited policy. To conclude, diffusion is an emergent phenomenon arising from complex interactions, and the outcomes produced by my model are in line with both the theoretical expectations and the empirical evidence.
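As a hedged illustration of the learning mechanism described above, the toy agent-based model below (not the thesis model itself) places agents on a grid and lets each adopt the policy with a probability that grows with the share of adopting neighbours. The grid topology, adoption rule, and parameters are illustrative assumptions.

    # Toy agent-based model of policy diffusion through learning.
    import numpy as np

    rng = np.random.default_rng(1)
    size = 30                                  # 30 x 30 grid of agents (countries)
    policy = np.zeros((size, size), dtype=int)
    policy[size // 2, size // 2] = 1           # a single initial adopter

    def neighbor_share(grid, i, j):
        """Fraction of the four von Neumann neighbours that have adopted."""
        nbrs = [grid[(i - 1) % size, j], grid[(i + 1) % size, j],
                grid[i, (j - 1) % size], grid[i, (j + 1) % size]]
        return sum(nbrs) / 4.0

    adoption_curve = []
    for step in range(200):
        nxt = policy.copy()
        for i in range(size):
            for j in range(size):
                if policy[i, j] == 0:
                    # Learning: adoption probability grows with neighbours' choices.
                    if rng.random() < 0.5 * neighbor_share(policy, i, j):
                        nxt[i, j] = 1
        policy = nxt
        adoption_curve.append(policy.mean())

    # Aggregate adoption typically traces the S-shaped curve described above.
    print([round(a, 2) for a in adoption_curve[::20]])

Because each agent's choice depends only on its neighbours, adoption spreads as contiguous regional clusters and the aggregate curve is S-shaped; reproducing the global divergence reported above would require multiple competing policies, which this sketch omits.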

Relevance:

20.00%

Publisher:

Abstract:

Tractography is a class of algorithms that aim to map in vivo the major neuronal pathways of the white matter from diffusion magnetic resonance imaging (MRI) data. These techniques offer a powerful tool for noninvasively investigating, at the macroscopic scale, the architecture of the neuronal connections of the brain. Unfortunately, however, the reconstructions recovered with existing tractography algorithms are not truly quantitative, even though diffusion MRI is by nature a quantitative modality. Meanwhile, several techniques have been proposed in recent years to estimate, at the voxel level, intrinsic microstructural features of the tissue, such as axonal density and diameter, using multicompartment models. In this paper, we present a novel framework to re-establish the link between tractography and tissue microstructure. Starting from an input set of candidate fiber tracts, estimated from the data using standard fiber-tracking techniques, we model the diffusion MRI signal in each voxel of the image as a linear combination of the restricted and hindered contributions generated in every location of the brain by these candidate tracts. We then seek the global weight of each tract, i.e., its effective contribution or volume, such that the weighted tracts together best fit the measured signal. We demonstrate that these weights can be recovered by solving a global convex optimization problem with efficient algorithms. The effectiveness of our approach has been evaluated both on a realistic phantom with known ground truth and on in vivo brain data. The results clearly demonstrate the benefits of the proposed formulation, opening new perspectives for a more quantitative and biologically plausible assessment of the structural connectivity of the brain.
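The global fitting step admits a compact illustration. The sketch below solves a toy instance of the convex problem described above, recovering non-negative tract weights by non-negative least squares; the random dictionary and SciPy's nnls solver are assumptions standing in for the actual restricted/hindered signal models.

    # Toy global fit: recover non-negative tract weights x such that
    # A @ x approximates the measured signal y. The random dictionary A
    # stands in for the restricted/hindered contributions of candidate tracts.
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(2)
    n_meas, n_tracts = 200, 50          # measurements x candidate fiber tracts

    A = rng.random((n_meas, n_tracts))  # contribution of tract j to measurement i
    x_true = np.zeros(n_tracts)
    x_true[rng.choice(n_tracts, 10, replace=False)] = rng.random(10)  # sparse weights

    y = A @ x_true + 0.01 * rng.normal(size=n_meas)   # noisy measured signal

    # Non-negative least squares: a convex problem with a global optimum.
    x_hat, residual = nnls(A, y)
    print("residual norm:", residual)
    print("max weight error:", np.abs(x_hat - x_true).max())

Non-negativity matches the interpretation of each weight as an effective tract volume, and convexity guarantees that the recovered weights are a global, not merely local, best fit.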

Relevance:

20.00%

Publisher:

Abstract:

To make a comprehensive evaluation of organ-specific out-of-field doses using Monte Carlo (MC) simulations for different breast cancer irradiation techniques, and to compare the results with those of a commercial treatment planning system (TPS). Three breast radiotherapy techniques using 6 MV tangential photon beams were compared: (a) 2DRT (open rectangular fields), (b) 3DCRT (conformal wedged fields), and (c) hybrid IMRT (open conformal + modulated fields). Over 35 organs were contoured in a whole-body CT scan, and organ-specific dose distributions were determined with MC and the TPS. Large differences in out-of-field doses were observed between MC and TPS calculations, even for organs close to the target volume such as the heart, the lungs, and the contralateral breast (up to 70% difference). MC simulations showed that a large fraction of the out-of-field dose comes from the out-of-field head-scatter fluence (>40%), which is not adequately modeled by the TPS. Based on the MC simulations, the 3DCRT technique using external wedges yielded significantly higher doses (up to a factor of 4-5 in the pelvis) than the 2DRT and hybrid IMRT techniques, which yielded similar out-of-field doses. In sharp contrast to popular belief, the IMRT technique investigated here does not increase the out-of-field dose compared with conventional techniques and may offer the best plan overall. The 3DCRT technique with external wedges yields the largest out-of-field doses. For accurate out-of-field dose assessment, a commercial TPS should not be used, even for organs near the target volume (contralateral breast, lungs, heart).

Relevance:

20.00%

Publisher:

Abstract:

With contributions from leading authors in the most important areas of current research, this book provides insight into the streams that are driving leadership theory and practice today. The Nature of Leadership, Second Edition provides students with an updated, complete, yet concise handbook that solidifies and integrates the vast and disparate leadership literature.

Key Features of the Second Edition:
· Provides contributions from twenty-three subject-matter experts, ranging from the eminent to the up-and-coming, giving students an unsurpassed breadth of knowledge and perspective
· Organizes the material into three key thematic areas: Leadership: Science, Nature, and Nurture; the Major Schools of Leadership; and Leadership and Special Domains
· Includes nine brand-new chapters that give students the state of the art of leadership theory and practice, such as evolutionary and biological perspectives, individual differences, and shared leadership
· Updates the content of seven retained chapters with reference to recent research and developments in the field
· Adds pedagogical features, including discussion questions, a list of practice-focused supplemental readings, and references to case studies

Relevance:

20.00%

Publisher:

Abstract:

There is enormous interest in designing training methods for reducing cognitive decline in healthy older adults. Because it is impaired with aging, multitasking has often been targeted and has been shown to be malleable with appropriate training. Investigating the effects of cognitive training on functional brain activation might provide critical indications regarding the mechanisms that underlie those positive effects, as well as models for selecting appropriate training methods. The few studies that have looked at the brain correlates of cognitive training indicate a variable pattern and location of brain changes, a result that might relate to differences in training formats. The goal of this study was to measure the neural substrates as a function of whether divided-attention training programs induced the use of alternative processes or relied on repeated practice. Forty-eight older adults were randomly allocated to one of three training programs. In the SINGLE REPEATED training, participants practiced an alphanumeric equation-verification task and a visual detection task, each under focused attention. In the DIVIDED FIXED training, participants practiced combining verification and detection under divided attention, with equal attention allocated to both tasks. In the DIVIDED VARIABLE training, participants completed the tasks under divided attention but were taught to vary the attentional priority allocated to each task. Brain activation was measured with fMRI pre- and post-training while participants completed each task individually and the two tasks combined. The three training programs resulted in markedly different brain changes. Practice on individual tasks in the SINGLE REPEATED training resulted in reduced brain activation, whereas the DIVIDED VARIABLE training resulted in a larger recruitment of the right superior and middle frontal gyrus, a region that has been implicated in multitasking. The type of training is thus a critical factor in determining the pattern of brain activation.

Relevance:

20.00%

Publisher:

Abstract:

A noticeable increase in mean temperature has already been observed in Switzerland, and summer temperatures up to 4.8 K warmer are expected by 2090. This article reviews the observed impacts of climate change on biodiversity and considers some perspectives for the future at the national level. The following impacts are already evident for all considered taxonomic groups: elevational shifts of distributions toward mountain summits, spread of thermophilous species, colonisation by new species from warmer areas, and phenological shifts. Additionally, in the driest areas, increasing droughts are affecting tree survival, and fish species are suffering from warm temperatures in lowland regions. These observations are consistent with model projections, and future changes will probably follow the current trends. These changes will likely cause extinctions of alpine species (competition, loss of habitat) and lowland species (temperature or drought stress). In the highly urbanised Swiss landscape, the strong fragmentation of natural ecosystems will hinder the dispersal of many species towards the mountains. Moreover, disruptions of species interactions caused by differing migration rates or phenological shifts are likely to have consequences for biodiversity. Conversely, the inertia of ecosystems (species longevity, restricted dispersal) and the local persistence of populations will probably result in lower extinction rates than some models predict, at least in the 21st century. It is thus very difficult to estimate the impact of climate change in terms of species extinctions. A greater recognition by society of the intrinsic value of biodiversity, and of its importance for our existence, will be essential to put effective mitigation measures in place and to safeguard the maximum number of native species.