976 results for Future Plans
Abstract:
Research on future episodic thought has produced compelling theories and results in cognitive psychology, cognitive neuroscience, and clinical psychology. In experiments aimed at integrating these with basic concepts and methods from autobiographical memory research, 76 undergraduates remembered past and imagined future positive and negative events that had or would have a major impact on them. Correlations of the online ratings of visual and auditory imagery, emotion, and other measures demonstrated that individuals used the same processes to the same extent to remember past and construct future events. These measures predicted the theoretically important metacognitive judgment of past reliving and future "preliving" in similar ways. On standardized tests of reactions to traumatic events, scores for future negative events were much higher than scores for past negative events. The scores for future negative events were in the range that would qualify for a diagnosis of posttraumatic stress disorder (PTSD); the test was replicated (n = 52) to check for order effects. Consistent with earlier work, future events had less sensory vividness; thus, the imagined symptoms of future events were unlikely to be caused by sensory vividness. In a second experiment, to confirm this, 63 undergraduates produced numerous added details between 2 constructions of the same negative future events; deficits in rated vividness were removed with no increase in the standardized tests of reactions to traumatic events. Neuroticism predicted individuals' reactions to negative past events but did not predict imagined reactions to future events. This set of novel methods and findings is interpreted in the contexts of the literatures of episodic future thought, autobiographical memory, PTSD, and classic schema theory.
Abstract:
© 2014, Springer-Verlag Berlin Heidelberg. The frequency and severity of extreme events are tightly associated with the variance of precipitation. As the climate warms, the acceleration of the hydrological cycle is likely to enhance the variance of precipitation across the globe. However, owing to the lack of an effective analysis method, the mechanisms responsible for changes in precipitation variance are poorly understood, especially on regional scales. Our study fills this gap by formulating a variance partition algorithm, which explicitly quantifies the contributions of atmospheric thermodynamics (specific humidity) and dynamics (wind) to changes in regional-scale precipitation variance. Taking Southeastern (SE) United States (US) summer precipitation as an example, the algorithm is applied to simulations of current and future climate by phase 5 of the Coupled Model Intercomparison Project (CMIP5) models. The analysis suggests that, compared to observations, most CMIP5 models (~60%) tend to underestimate the summer precipitation variance over the SE US during 1950–1999, primarily due to errors in the modeled dynamic processes (i.e., large-scale circulation). Among the 18 CMIP5 models analyzed in this study, six reasonably simulate SE US summer precipitation variance in the twentieth century and the underlying physical processes; these models are therefore used for a mechanistic study of future changes in SE US summer precipitation variance. In the future, the six models collectively project an intensification of SE US summer precipitation variance, resulting from the combined effects of atmospheric thermodynamics and dynamics, with the latter playing the more important role. Specifically, thermodynamics results in more frequent and intensified wet summers, but does not contribute to the projected increase in the frequency and intensity of dry summers.
In contrast, atmospheric dynamics explains the projected enhancement of both wet and dry summers, indicating its importance for understanding future climate change over the SE US. The results suggest that the intensified SE US summer precipitation variance is not a purely thermodynamic response to greenhouse-gas forcing and cannot be explained without the contribution of atmospheric dynamics. Our analysis provides important insights into the mechanisms of SE US summer precipitation variance change. The algorithm formulated in this study can easily be applied to other regions and seasons to systematically explore the mechanisms responsible for changes in precipitation extremes in a warming climate.
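The abstract does not spell out the variance partition algorithm, but the general idea of separating thermodynamic and dynamic contributions can be sketched in a few lines. The function below is an illustrative toy, not the authors' formulation: it treats precipitation as the product of a humidity series `q` and a circulation series `w` (both names are assumptions) and asks how much of the change in variance between two periods comes from updating each factor alone.

```python
import numpy as np

def partition_variance_change(q1, w1, q2, w2):
    """Toy partition of the change in Var(q*w) between two periods.

    q1, w1: humidity and circulation series for the reference period
    q2, w2: the same quantities for the future period
    """
    base = np.var(q1 * w1)                 # reference-period variance
    thermo = np.var(q2 * w1) - base        # update humidity only (thermodynamic)
    dynamic = np.var(q1 * w2) - base       # update circulation only (dynamic)
    total = np.var(q2 * w2) - base         # full change
    residual = total - thermo - dynamic    # nonlinear/covariance remainder
    return {"total": total, "thermodynamic": thermo,
            "dynamic": dynamic, "residual": residual}
```

By construction the four pieces sum up exactly, so the residual isolates whatever cannot be attributed to either factor alone; a small residual is one sanity check that a linear thermodynamic/dynamic decomposition is meaningful for the data at hand.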
Abstract:
Carbon markets are substantial and they are expanding. There are many lessons from experiences over the past eight years: fewer free allowances, better management of market-sensitive information, and a recognition that trading systems require adjustments that have consequences for market participants and market confidence. Moreover, the emerging international architecture features separate emissions trading systems serving distinct jurisdictions. These programs are complemented by a variety of other types of policies alongside the carbon markets. This sits in sharp contrast to the integrated global trading architecture envisioned 15 years ago by the designers of the Kyoto Protocol and raises a suite of new questions. In this new architecture, jurisdictions with emissions trading have to decide how, whether, and when to link with one another, and policymakers overseeing carbon markets must confront how to measure the comparability of efforts among markets and relative to a variety of other policy approaches.
Abstract:
© 2014 by Annual Reviews. Carbon markets are substantial and expanding. There are many lessons from experience over the past 9 years: fewer free allowances, careful moderation of low and high prices, and a recognition that trading systems require adjustments that have consequences for market participants and market confidence. Moreover, the emerging international architecture features separate emissions trading systems serving distinct jurisdictions. These programs are complemented by a variety of other types of policies alongside the carbon markets. This architecture sits in sharp contrast to the integrated global trading architecture envisioned 15 years ago by the designers of the Kyoto Protocol and raises a suite of new questions. In this new architecture, jurisdictions with emissions trading have to decide how, whether, and when to link with one another, and policy makers must confront how to measure both the comparability of efforts among markets and the comparability between markets and a variety of other policy approaches.
Abstract:
© 2015, Jon C. Giullian and Ernest A. Zitser. The proliferation of research guides created using the LibGuides platform has triggered extensive discussion touting their benefits for everything from assessment, engagement, and marketing, to outreach and pedagogy. However, there is at present a relative paucity of critical reflection about the product’s place in the broader informational landscape. This article is an attempt to redress this lacuna. Relying primarily on examples from the field of Slavic, East European, and Eurasian studies, the authors briefly describe the evolution of online research guides; identify reasons for the proliferation of Springshare’s product in academic libraries; question whether LibGuides improve learning or reinforce information inequality in higher education; and propose a way to move beyond LibGuides.
Abstract:
BACKGROUND: Singapore's population, like those of many other countries, is aging; this is likely to lead to an increase in eye diseases and the demand for eye care. Since ophthalmologist training is long and expensive, early planning is essential. This paper forecasts workforce and training requirements for Singapore up to the year 2040 under several plausible future scenarios. METHODS: The Singapore Eye Care Workforce Model was created as a continuous-time compartment model with explicit workforce stocks using system dynamics. The model has three modules: prevalence of eye disease, demand, and workforce requirements. The model is used to simulate the prevalence of eye diseases, patient visits, and workforce requirements for the public sector under different scenarios in order to determine training requirements. RESULTS: Four scenarios were constructed. Under the baseline business-as-usual scenario, the required number of ophthalmologists is projected to increase by 117% from 2015 to 2040. Under the current policy scenario (assuming an increase of service uptake due to increased awareness, availability, and accessibility of eye care services), the increase will be 175%, while under the new model of care scenario (considering the additional effect of providing some services by non-ophthalmologists) the increase will only be 150%. The moderated workload scenario (assuming in addition a reduction of the clinical workload) projects an increase in the required number of ophthalmologists of 192% by 2040. Considering the uncertainties in the projected demand for eye care services, a residency intake of 8-22 residents per year is required under the business-as-usual scenario, 17-21 under the current policy scenario, 14-18 under the new model of care scenario, and 18-23 under the moderated workload scenario.
CONCLUSIONS: The results show that under all scenarios considered, Singapore's aging and growing population will result in an almost doubling of the number of Singaporeans with eye conditions, a significant increase in public sector eye care demand and, consequently, a greater requirement for ophthalmologists.
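The Singapore Eye Care Workforce Model itself is a multi-module system dynamics model and is not reproduced in the abstract. As a minimal sketch of the stock-and-flow idea behind such workforce projections, the toy below tracks a single specialist stock with a fixed annual intake and a proportional attrition outflow; all parameter values in the usage are hypothetical, not taken from the paper.

```python
def project_workforce(initial_stock, annual_intake, attrition_rate, years):
    """Discrete-time stock-flow projection of a specialist workforce.

    Each year the stock gains `annual_intake` new specialists and loses
    the fraction `attrition_rate` to retirement and other exits.
    """
    stock = float(initial_stock)
    trajectory = [stock]
    for _ in range(years):
        stock += annual_intake - attrition_rate * stock
        trajectory.append(stock)
    return trajectory
```

One property such a model makes visible is the long-run equilibrium intake/attrition: with a hypothetical intake of 15 per year and 3% attrition, the stock drifts toward 500 regardless of where it starts, which is why intake targets dominate long-horizon planning.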
Abstract:
The model: groups of Lie-Chevalley type and buildings

This paper is not the presentation of a completed theory but rather a report on a search, progressing as in the natural sciences, in order to better understand the relationship between groups and incidence geometry in some future sought-after theory Τ. The search is based on assumptions and on wishes, some of which are time-dependent, variations being forced, in particular, by the search itself. A major historical reference for this subject is, needless to say, Klein's Erlangen Programme. Klein's views were raised to a powerful theory thanks to the geometric interpretation of the simple Lie groups due to Tits (see for instance), particularly his theory of buildings and of groups with a BN-pair (or Tits systems). Let us briefly recall some striking features of this. Let G be a group of Lie-Chevalley type of rank r, defined over GF(q), q = p^n, p prime. Let Xr denote the Dynkin diagram of G. To these data corresponds a unique thick building B(G) of rank r over the Coxeter diagram Xr (assuming we forget the arrows provided by the Dynkin diagram). It turns out that B(G) can be constructed in a uniform way for all G, from a fixed p-Sylow subgroup U of G, its normalizer NG(U), and the r maximal subgroups of G containing NG(U).
Abstract:
This paper presents a generic framework that can be used to describe study plans using meta-data. The context of this research and associated technologies and standards is presented. The approach adopted here has been developed within the mENU project, which aims to provide a model for a European Networked University. The methodology for the design of the generic framework is discussed and the main design requirements are presented. The approach adopted was based on a set of templates containing the meta-data required for the description of programs of study and consisting of generic building elements annotated appropriately. The process followed to develop the templates is presented, together with a set of evaluation criteria to test the suitability of the approach. The template structure is presented and example templates are shown. A first evaluation of the approach has shown that the proposed framework can provide a flexible and competent means for the generic description of study plans for the purposes of a networked university.
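The abstract does not give the mENU template schema, but the template-plus-validation idea can be sketched concretely. In the toy below, every field name is hypothetical (invented for illustration, not taken from the mENU specification): a nested dictionary stands in for an annotated template, and a small checker reports which required fields an actual study-plan description fails to provide.

```python
# Hypothetical study-plan template; all field names are illustrative only.
STUDY_PLAN_TEMPLATE = {
    "programme": {"title": None, "awarding_institution": None, "ects_total": None},
    "module": {"code": None, "title": None, "ects": None, "language": None},
}

def missing_fields(instance, template, path=""):
    """Return the paths of template fields absent from an instance."""
    missing = []
    for key, sub in template.items():
        here = f"{path}/{key}" if path else key
        if key not in instance:
            missing.append(here)
        elif isinstance(sub, dict):
            missing.extend(missing_fields(instance[key], sub, here))
    return missing
```

A real implementation would more likely express the templates in an interoperable metadata format (e.g. XML against a standard learning-object schema) rather than native dictionaries, but the check-against-template workflow is the same.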
Abstract:
The fabrication, assembly and testing of electronic packaging can involve complex interactions between physical phenomena such as temperature, fluid flow, electromagnetics, and stress. Numerical modelling and optimisation tools are key computer-aided-engineering technologies that aid design engineers. This paper discusses these technologies and their future developments.
Abstract:
At present the vast majority of Computer-Aided Engineering (CAE) analysis calculations for microelectronic and microsystems technologies are undertaken using software tools that focus on single aspects of the physics taking place. For example, the design engineer may use one code to predict the airflow and thermal behavior of an electronic package, then another code to predict the stress in solder joints, and then yet another code to predict electromagnetic radiation throughout the system. The reason for this focus of mesh-based codes on separate parts of the governing physics is essentially due to the numerical technologies used to solve the partial differential equations, combined with the subsequent heritage structure in the software codes. Using different software tools, each of which requires model building and meshing, leads to a large investment in time, and hence cost, to undertake each of the simulations. During the last ten years there have been significant developments in the modelling community around multi-physics analysis. These developments are being followed by many of the code vendors, who are now providing multi-physics capabilities in their software tools. This paper illustrates current capabilities of multi-physics technology and highlights some of the future challenges.
Abstract:
Future analysis tools that predict the behavior of electronic components, both during qualification testing and in-service lifetime assessment, will be very important in predicting product reliability and identifying when to undertake maintenance. This paper discusses some of these techniques, illustrates them with examples, and also discusses future challenges for these techniques.