994 results for dynamic configuration
Abstract:
In dynamic models of energy allocation, assimilated energy is allocated to reproduction, somatic growth, maintenance or storage, and the allocation pattern can change with age. The expected evolutionary outcome is an optimal allocation pattern, but this depends on the environment experienced during the evolutionary process and on the fitness costs and benefits incurred by allocating resources in different ways. Here we review existing treatments that cover some of the possibilities regarding constant or variable environments and their predictability or unpredictability, and the ways in which production and mortality rates depend on body size, composition and age, and on the pattern of energy allocation. The optimal policy is to allocate resources where selection pressures are highest, and simultaneous allocation to several body subsystems and to reproduction can be optimal if these pressures are equal. This may explain the balanced growth commonly observed during ontogeny. Growth ceases at maturity in many models; factors favouring growth after maturity include non-linear trade-offs, variable season length, and production and mortality rates that are both increasing (or decreasing) functions of body size. We cannot yet say whether these factors are sufficient to account for the many known cases of growth after maturity, and not all reasonable models have yet been explored. Factors favouring storage are also reviewed.
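To make the modelling approach concrete, the following is a minimal Python sketch of an optimal-allocation calculation by backward dynamic programming, assuming discrete seasons, a production rate linear in body size, constant survival, and fitness measured as expected lifetime reproduction; all parameter values and function names are hypothetical and are not taken from any of the reviewed models.

    # Minimal dynamic-programming sketch of optimal energy allocation
    # (hypothetical parameters; not any specific published model).
    import numpy as np

    T = 20                            # number of seasons in the life cycle
    sizes = np.arange(1, 51)          # discrete body sizes
    survival = 0.8                    # per-season survival probability (assumed constant)

    def production(size):
        """Assimilated energy available per season (assumed linear in size)."""
        return 0.5 * size

    # fitness[t, i] = expected future reproduction from season t at size index i
    fitness = np.zeros((T + 1, len(sizes)))

    for t in range(T - 1, -1, -1):
        for i, s in enumerate(sizes):
            e = production(s)
            best = 0.0
            # try allocating a fraction a of energy to growth, the rest to reproduction
            for a in np.linspace(0.0, 1.0, 11):
                new_size = min(s + a * e, sizes[-1])
                j = int(round(new_size)) - 1          # index of the nearest discrete size
                value = (1 - a) * e + survival * fitness[t + 1, j]
                best = max(best, value)
            fitness[t, i] = best

    print("Expected lifetime reproduction starting small:", fitness[0, 0])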
Abstract:
quantiNemo is an individual-based, genetically explicit stochastic simulation program. It was developed to investigate the effects of selection, mutation, recombination and drift on quantitative traits with varying architectures in structured populations connected by migration and located in a heterogeneous habitat. quantiNemo is highly flexible at various levels: population, selection, trait(s) architecture, genetic map for QTL and/or markers, environment, demography, mating system, etc. quantiNemo is coded in C++ using an object-oriented approach and runs on any computer platform. Availability: Executables for several platforms, user's manual, and source code are freely available under the GNU General Public License at http://www2.unil.ch/popgen/softwares/quantinemo.
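For illustration, here is a minimal sketch of the kind of individual-based quantitative-trait simulation that programs of this class perform, written in plain Python; it is not quantiNemo's input format, API, or algorithm, and the population size, number of QTL, mutation rate, and selection strength are arbitrary assumptions.

    # Toy individual-based simulation of a quantitative trait under selection,
    # mutation and drift (illustrative only; unrelated to quantiNemo's actual code).
    import numpy as np

    rng = np.random.default_rng(1)
    N, L = 200, 10          # population size, number of additive QTL
    mu = 1e-3               # per-locus mutation rate (assumed)
    optimum, vs = 1.0, 1.0  # phenotypic optimum and width of stabilizing selection

    genomes = rng.normal(0.0, 0.1, size=(N, L))   # additive allelic effects

    for generation in range(100):
        phenotypes = genomes.sum(axis=1)
        fitness = np.exp(-(phenotypes - optimum) ** 2 / (2 * vs))
        # sample parents proportionally to fitness (drift enters through sampling)
        parents = rng.choice(N, size=(N, 2), p=fitness / fitness.sum())
        # free recombination: each locus inherited from either parent at random
        which = rng.integers(0, 2, size=(N, L))
        genomes = np.where(which == 0, genomes[parents[:, 0]], genomes[parents[:, 1]])
        # mutation: small random perturbations at a few loci
        mutate = rng.random((N, L)) < mu
        genomes = genomes + mutate * rng.normal(0.0, 0.1, size=(N, L))

    print("mean phenotype after 100 generations:", genomes.sum(axis=1).mean())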
Abstract:
The assimilation model is a qualitative and integrative approach that enables the study of change processes occurring in psychotherapy. According to Stiles, this model conceives of the individual's personality as constituted of different voices; the concept of voice is used to describe traces left by past experiences. During psychotherapy, we can observe the progressive integration of the problematic voices into the patient's personality. We applied the assimilation model to a 34-session case of effective short-term dynamic psychotherapy. Eight sessions were chosen, transcribed and analyzed by establishing points of contact between the case and the theory. The results are presented and discussed in terms of the evolution of the main voices in the patient.
Abstract:
Large Dynamic Message Signs (DMSs) have been increasingly used on freeways, expressways and major arterials to better manage traffic flow by providing accurate and timely information to drivers. Overhead truss structures are typically employed to support these DMSs, allowing them to provide a wider display across more lanes. In recent years, there has been increasing evidence that the truss structures supporting these large and heavy signs are subjected to much more complex loadings than are typically accounted for in codified design procedures. Consequently, some of these structures have required frequent inspections, retrofitting, and even premature replacement. Two manufacturing processes are primarily utilized on truss structures: welding and bolting. Recently, cracks at weld toes were reported for structures in service in some states. Extremely large loads (e.g., due to high winds) can cause brittle fracture, and cyclic vibration (e.g., due to diurnal variation in temperature or to oscillations in the wind force induced by vortex shedding behind the DMS) may lead to fatigue damage; these are the two major failure modes for metallic materials. Wind and strain resulting from temperature changes are the main loads that affect the structures during their lifetime. The American Association of State Highway and Transportation Officials (AASHTO) Specification defines the limit loads for dead load, wind load, ice load, and fatigue design for natural wind gust and truck-induced gust. The objectives of this study are to investigate wind and thermal effects in bridge-type overhead DMS truss structures and to improve the current design specifications (e.g., for thermal design). To accomplish this objective, it is necessary to study the structural behavior and the detailed strain-stress state of the truss structures caused by wind load on the DMS cabinet and thermal load on the truss supporting the DMS cabinet. The study is divided into two parts. The Computational Fluid Dynamics (CFD) component and part of the structural analysis component were conducted at the University of Iowa, while the field study and related structural analysis computations were conducted at Iowa State University. The CFD simulations were used to determine the air-induced forces (wind loads) on the DMS cabinets, and finite element analysis was used to determine the response of the supporting trusses to these pressure forces. The field observation portion consisted of short-term monitoring of several DMS cabinet/truss assemblies and long-term monitoring of one DMS cabinet/truss. The short-term monitoring was a one- or two-day event in which several message sign panel/truss assemblies were tested. The long-term monitoring field study extended over several months. Analysis of the data focused on identifying important behaviors under both ambient and truck-induced winds and the effect of daily temperature changes. Results of the CFD investigation, field experiments and structural analysis of the wind-induced forces on the DMS cabinets and their effect on the supporting trusses showed that the passage of trucks cannot be responsible for the problems observed to develop at trusses supporting DMS cabinets. Rather, the data pointed toward the important effect of the thermal load induced by cyclic (diurnal) variations of the temperature. Thermal influence is not discussed in the specification, either for limit load or for fatigue design.
Although the frequency of the thermal load is low, results showed that when the temperature range is large, the resulting stress range can be significant for the structure, especially near welded areas where stress concentrations may occur. Moreover, stress amplitude and range are the primary parameters for brittle fracture and fatigue life estimation. Long-term field monitoring of one of the overhead truss structures in Iowa was used as the research baseline to estimate the effects of diurnal temperature changes on fatigue damage. The evaluation of the collected data is an important approach for understanding the structural behavior and for the advancement of future code provisions. A finite element model was developed to estimate the strain and stress magnitudes, which were compared with the field monitoring data. The fatigue life of the truss structures was also estimated based on AASHTO specifications and the numerical modeling. The main conclusion of the study is that thermally induced fatigue damage of the truss structures supporting DMS cabinets is likely a significant contributing cause of the cracks observed to develop at such structures. Other probable causes of fatigue damage not investigated in this study are the cyclic oscillations of the total wind load associated with vortex shedding behind the DMS cabinet under high wind conditions, and fabrication tolerances and stresses induced by the fitting of tube-to-tube connections.
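As a concrete illustration of the fatigue-life reasoning, here is a minimal Miner's-rule sketch for daily thermal stress cycles; the S-N constants and the assumed daily stress ranges below are placeholders, not the AASHTO detail-category values or the measured data used in the study.

    # Linear damage accumulation (Miner's rule) for daily thermal stress cycles.
    # Placeholder S-N constants; the study itself uses AASHTO detail categories.
    A = 4.4e8          # hypothetical fatigue constant, MPa^3 * cycles
    threshold = 31.0   # hypothetical constant-amplitude fatigue threshold, MPa

    def cycles_to_failure(stress_range_mpa):
        """Allowable cycles N = A / S^3 for stress range S (placeholder S-N curve)."""
        if stress_range_mpa <= threshold:
            return float("inf")   # below the threshold, no fatigue damage is assumed
        return A / stress_range_mpa ** 3

    # one thermal cycle per day; assume the daily stress range varies by season (hypothetical)
    daily_ranges_mpa = [20.0] * 120 + [45.0] * 180 + [60.0] * 65   # one year of cycles

    damage_per_year = sum(1.0 / cycles_to_failure(s) for s in daily_ranges_mpa)
    print("estimated fatigue life in years:", 1.0 / damage_per_year)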
Abstract:
Working memory, commonly defined as the ability to hold mental representations on-line transiently and to manipulate these representations, is known to be a core deficit in schizophrenia. The aim of the present study was to investigate the visuo-spatial component of working memory in schizophrenia, and more precisely to what extent dynamic visuo-spatial information processing is impaired in schizophrenia patients. For this purpose we used a computerized paradigm in which 29 patients with schizophrenia (DSM-IV, Diagnostic Interview for Genetic Studies) and 29 age- and sex-matched control subjects (DIGS) had to memorize the trajectory of a plane moving across the computer screen and then identify the observed trajectory among 9 trajectories presented together. Each trajectory could be viewed a maximum of 3 times if needed. The results showed no difference between schizophrenia patients and controls in the number of correct trajectories identified after the first presentation. However, when the mean number of correct trajectories was determined on the basis of 3 trials, schizophrenia patients performed significantly worse than controls (Mann-Whitney, p = 0.002). These findings suggest that, although schizophrenia patients are able to memorize some dynamic trajectories as well as controls, they do not benefit from repeated presentation of the trajectory. These findings are congruent with the hypothesis that schizophrenia could induce an imbalance between local and global information processing: patients may be able to focus on details of the trajectory, which could allow them to find the right target (bottom-up processes), but may have difficulty referring to previous experience in order to filter incoming information (top-down processes) and enhance their visuo-spatial working memory abilities.
Abstract:
Quantifying the spatial configuration of hydraulic conductivity (K) in heterogeneous geological environments is essential for accurate predictions of contaminant transport, but is difficult because of the inherent limitations in resolution and coverage associated with traditional hydrological measurements. To address this issue, we consider crosshole and surface-based electrical resistivity geophysical measurements, collected in time during a saline tracer experiment. We use a Bayesian Markov-chain-Monte-Carlo (McMC) methodology to jointly invert the dynamic resistivity data, together with borehole tracer concentration data, to generate multiple posterior realizations of K that are consistent with all available information. We do this within a coupled inversion framework, whereby the geophysical and hydrological forward models are linked through an uncertain relationship between electrical resistivity and concentration. To minimize computational expense, a facies-based subsurface parameterization is developed. The Bayesian-McMC methodology allows us to explore the potential benefits of including the geophysical data into the inverse problem by examining their effect on our ability to identify fast flowpaths in the subsurface, and their impact on hydrological prediction uncertainty. Using a complex, geostatistically generated, two-dimensional numerical example representative of a fluvial environment, we demonstrate that flow model calibration is improved and prediction error is decreased when the electrical resistivity data are included. The worth of the geophysical data is found to be greatest for long spatial correlation lengths of subsurface heterogeneity with respect to wellbore separation, where flow and transport are largely controlled by highly connected flowpaths.
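To illustrate the Metropolis step underlying this kind of coupled Bayesian inversion, here is a highly simplified one-parameter sketch with invented forward models and synthetic data; the actual study uses a facies-based parameterization, full geophysical and hydrological forward models, and an uncertain petrophysical relationship, none of which is reproduced here.

    # Toy Metropolis sampler illustrating coupled hydro-geophysical inversion:
    # one unknown (log-K of a facies) constrained by two synthetic data sets.
    import numpy as np

    rng = np.random.default_rng(0)

    def forward_tracer(logk):
        """Hypothetical tracer arrival time as a function of log-conductivity."""
        return 10.0 - 2.0 * logk

    def forward_resistivity(logk):
        """Hypothetical resistivity anomaly linked to the same parameter."""
        return 5.0 + 1.5 * logk

    obs_tracer, obs_res = 6.2, 8.1       # synthetic observations
    sigma_tracer, sigma_res = 0.5, 0.5   # assumed data errors

    def log_likelihood(logk):
        r1 = (forward_tracer(logk) - obs_tracer) / sigma_tracer
        r2 = (forward_resistivity(logk) - obs_res) / sigma_res
        return -0.5 * (r1 ** 2 + r2 ** 2)

    samples, logk = [], 0.0
    ll = log_likelihood(logk)
    for _ in range(5000):
        proposal = logk + rng.normal(0.0, 0.2)       # random-walk proposal
        ll_new = log_likelihood(proposal)
        if np.log(rng.random()) < ll_new - ll:       # Metropolis acceptance
            logk, ll = proposal, ll_new
        samples.append(logk)

    print("posterior mean log-K:", np.mean(samples[1000:]))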
Abstract:
Sustainable resource use is one of the most important environmental issues of our times. It is closely related to discussions on the 'peaking' of various natural resources serving as energy sources, agricultural nutrients, or metals indispensable in high-technology applications. Although the peaking theory remains controversial, it is commonly recognized that a more sustainable use of resources would alleviate the negative environmental impacts related to resource use. In this thesis, sustainable resource use is analysed from a practical standpoint, through several different case studies. Four of these case studies relate to resource metabolism in the Canton of Geneva in Switzerland: the aim was to model the evolution of chosen resource stocks and flows in the coming decades. The studied resources were copper (a bulk metal), phosphorus (a vital agricultural nutrient), and wood (a renewable resource). In addition, the case of lithium (a critical metal) was analysed briefly in a qualitative manner and from an electric mobility perspective. Beyond the Geneva case studies, this thesis includes a case study on the sustainability of space life support systems, i.e. systems whose aim is to provide the crew of a spacecraft with the necessary metabolic consumables over the course of a mission. Sustainability was again analysed from a resource use perspective: the functioning of two different types of life support systems, ARES and BIORAT, was evaluated and compared; these systems represent, respectively, physico-chemical and biological life support systems. Space life support systems could in fact be used as a kind of 'laboratory of sustainability', given that they are closed and relatively simple systems compared to complex and open terrestrial systems such as the Canton of Geneva. The analysis method used in the Geneva case studies was dynamic material flow analysis: dynamic material flow models were constructed for copper, phosphorus, and wood. Besides a baseline scenario, various alternative scenarios (notably involving increased recycling) were also examined. For the space life support systems, the methodology of material flow analysis was also employed, but as the data available on the dynamic behaviour of the systems were insufficient, only static simulations could be performed. The results of the Geneva case studies show the following: were resource use to follow population growth, resource consumption would be multiplied by nearly 1.2 by 2030 and by 1.5 by 2080. A complete transition to electric mobility would be expected to increase copper consumption per capita only slightly (+5%), while the lithium demand in cars would increase 350-fold. Phosphorus imports could be decreased by recycling sewage sludge or human urine; however, the health and environmental impacts of these options have yet to be studied. Increasing wood production in the Canton would not significantly decrease the dependence on wood imports, as the Canton's production represents only 5% of total consumption. In the comparison of the space life support systems ARES and BIORAT, BIORAT outperforms ARES in resource use but not in energy use; however, as the systems are dimensioned very differently, it remains questionable whether they can be compared outright.
In conclusion, the use of dynamic material flow analysis can provide useful information for policy makers and strategic decision-making; however, uncertainty in reference data greatly influences the precision of the results. Space life support systems constitute an extreme case of resource-using systems; nevertheless, it is not clear how their example could be of immediate use to terrestrial systems.
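As an illustration of the stock-and-flow bookkeeping behind a dynamic material flow model, here is a minimal sketch with invented per-capita demand, lifetime, and recycling-rate values; these are placeholders, not the Geneva copper, phosphorus, or wood data.

    # Toy dynamic material flow analysis for a single material stock
    # (placeholder parameters; not the Geneva case study data).
    years = range(2010, 2081)
    population = 500_000          # assumed constant for simplicity
    demand_per_capita = 10.0      # kg per person per year (hypothetical)
    lifetime_years = 30           # average residence time of the material in use
    recycling_rate = 0.4          # fraction of the outflow returned as secondary supply

    stock = 0.0
    for year in years:
        inflow = population * demand_per_capita            # kg entering use this year
        outflow = stock / lifetime_years                    # simple leaching-type retirement
        primary_demand = inflow - recycling_rate * outflow  # what must come from primary sources
        stock += inflow - outflow
        if year % 10 == 0:
            print(year, round(stock / 1e6, 1), "kt in stock,",
                  round(primary_demand / 1e6, 1), "kt primary demand")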
Abstract:
Efforts to improve safety and traffic flow through merge areas on high-volume, high-speed roadways have included early merge and late merge concepts and several studies of the effectiveness of these concepts, many using Intelligent Transportation Systems for implementation. The Iowa Department of Transportation (Iowa DOT) planned to employ a system of dynamic message signs (DMS) to enhance standard temporary traffic control for lane closures and traffic merges at two bridge construction projects on I-80 in western Iowa (Adair and Cass counties) during the 2008 construction season. To evaluate the DMS system’s effectiveness in influencing driver merging actions, the Iowa DOT contracted with Iowa State University’s Center for Transportation Research and Education to perform the evaluation and make recommendations for future use of this system based on the results. Data were collected over four weekends, beginning August 1–4 and ending October 16–20, 2008. Two weekends yielded sufficient data for evaluation, one with transitional traffic flow and the other with a period of congestion. For both of these periods, a statistical review of the collected data did not indicate a significant impact on driver merging actions when the DMS messaging was activated, as compared to free-flow conditions with no messaging. Collection of relevant project data proved to be problematic for several reasons. In addition to the personnel safety issues associated with the placement and retrieval of counting devices on a high-speed roadway, unsatisfactory equipment performance and insufficient congestion to activate the DMS messaging hampered efforts. A review of the collected data revealed differences between the results from the tube counters and those from the older-model plate counters. Although the variations were not significant from a practical standpoint, a statistical evaluation showed that the data, including volumes, speeds, and classifications, from the two sources were not comparable at a 95% level of confidence. Comparison with data from the Iowa DOT’s automated traffic recorders (ATRs) in the area also suggested variations in results among these data collection systems. Additional comparison studies were recommended.
Abstract:
Emotion regulation is crucial for successfully engaging in social interactions. Yet, little is known about the neural mechanisms controlling behavioral responses to emotional expressions perceived in the faces of other people, which constitute a key element of interpersonal communication. Here, we investigated brain systems involved in social emotion perception and regulation, using functional magnetic resonance imaging (fMRI) in 20 healthy participants. Participants saw dynamic facial expressions of either happiness or sadness and were asked either to imitate the expression or to suppress any expression on their own face (in addition to a gender judgment control task). fMRI results revealed higher activity in regions associated with emotion (e.g., the insula), motor function (e.g., motor cortex), and theory of mind (e.g., [pre]cuneus) during imitation. Activity in the dorsal cingulate cortex was also increased during imitation, possibly reflecting greater action monitoring or conflict with the participants' own feeling states. In addition, premotor regions were more strongly activated during both imitation and suppression, suggesting a recruitment of motor control for both the production and the inhibition of emotion expressions. Expressive suppression (eSUP) produced increases in dorsolateral and lateral prefrontal cortex activity typically related to cognitive control. These results suggest that voluntary imitation and eSUP modulate brain responses to emotional signals perceived from faces by up- and down-regulating activity in distributed subcortical and cortical networks particularly involved in emotion, action monitoring, and cognitive control.
Abstract:
Disasters are often perceived as fast and random events. While the triggers may be sudden, disasters are the result of an accumulation of actions, of the consequences of inappropriate decisions, and of global change. To modify this perception of risk, advocacy tools are needed. Quantitative methods have been developed to identify the distribution and the underlying factors of risk.
Disaster risk results from the intersection of hazards, exposure and vulnerability. The frequency and intensity of hazards can be influenced by climate change or by the decline of ecosystems. Population growth increases exposure, while changes in the level of development affect vulnerability. Given that each of these components may change, risk is dynamic and should be reviewed periodically by governments, insurance companies or development agencies. At the global level, these analyses are often performed using databases of reported losses. Our results show that such databases are likely to be biased, in particular by improvements in access to information. International loss databases are not exhaustive and do not give information on exposure, intensity or vulnerability. A new approach, independent of reported losses, is therefore necessary.
The research presented here was mandated by the United Nations and by agencies working in development and the environment (UNDP, UNISDR, GTZ, UNEP and IUCN). These organizations needed a quantitative assessment of the underlying factors of risk in order to raise awareness among policymakers and to prioritize disaster risk reduction projects.
The method is based on geographic information systems, remote sensing, databases and statistical analysis. It required a large amount of data (1.7 Tb covering both the physical environment and socio-economic parameters) and several thousand hours of processing. A comprehensive global risk model was developed to reveal the distribution of hazards, exposure and risk, and to identify the underlying risk factors for several hazards (floods, tropical cyclones, earthquakes and landslides). Two different multiple risk indexes were generated to compare countries. The results include an evaluation of the role of hazard intensity, exposure, poverty and governance in the pattern and trends of risk. It appears that the vulnerability factors change depending on the type of hazard and that, contrary to exposure, their weight decreases as intensity increases.
At the local level, the method was tested to highlight the influence of climate change and ecosystem decline on the hazard. In northern Pakistan, deforestation increases landslide susceptibility. Research in Peru (based on satellite imagery and ground data collection) revealed a rapid glacier retreat and provides an assessment of the remaining ice volume as well as scenarios of possible evolution.
These results were presented to different audiences, including in front of 160 governments. The results and the data generated are available online through an open-source SDI (http://preview.grid.unep.ch). The method is flexible and easily transferable to different scales and issues, offering good prospects for adaptation to other research areas. Risk characterization at the global level and the identification of the role of ecosystems in disaster risk are rapidly developing fields. This research revealed many challenges; some were resolved, while others remain as limitations. However, it is clear that the level of development, and more specifically unsustainable development, shapes a large part of disaster risk, and that the dynamics of risk are governed primarily by global change.
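The risk model described above can be caricatured as a multiplicative combination of hazard, exposure and vulnerability; the sketch below uses invented index values and country names, not the indicators, weights, or data of the actual study.

    # Simplified multiplicative risk index: risk ~ hazard frequency x exposure x vulnerability.
    # Index values and country names are invented for illustration only.
    countries = {
        # name: (hazard frequency index, exposed population, vulnerability index)
        "country_a": (0.8, 2_000_000, 0.6),
        "country_b": (0.5, 10_000_000, 0.3),
        "country_c": (0.9, 500_000, 0.9),
    }

    for name, (hazard, exposure, vulnerability) in countries.items():
        risk = hazard * exposure * vulnerability    # expected relative losses
        print(f"{name}: relative risk index = {risk:,.0f}")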
Abstract:
The silicon photomultiplier (SiPM) is a novel detector technology that has undergone fast development in the last few years, owing to its single-photon resolution and ultra-fast response time. However, the typically high dark count rate of the sensor may prevent the detection of low-intensity radiation fluxes. In this article, time-gated operation with short active periods in the nanosecond range is proposed as a solution to reduce the number of cells fired due to noise and thus increase the dynamic range. The technique is aimed at applications that operate under a trigger command, such as gated fluorescence lifetime imaging microscopy.
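To illustrate why nanosecond gating suppresses dark noise, here is a short sketch of the expected number of dark counts inside a gate, assuming Poisson-distributed dark counts; the dark count rate and gate widths are hypothetical, not the characteristics of a specific SiPM.

    # Expected dark counts within a time gate, assuming Poisson-distributed dark noise.
    # Dark count rate and gate widths are hypothetical, not from a specific SiPM.
    import math

    dark_count_rate_hz = 1e6            # assumed dark count rate for illustration

    for gate_ns in (5, 20, 100, 1000):
        mean_counts = dark_count_rate_hz * gate_ns * 1e-9
        p_zero_noise = math.exp(-mean_counts)   # probability the gate contains no dark count
        print(f"gate {gate_ns:5d} ns: mean dark counts = {mean_counts:.4f}, "
              f"P(no dark count) = {p_zero_noise:.4f}")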