883 results for Forecasting.
Abstract:
This study assesses the skill of advanced regional climate models (RCMs) in simulating southeastern United States (SE US) summer precipitation and explores the physical mechanisms responsible for the simulation skill at a process level. Analysis of the RCM output for the North American Regional Climate Change Assessment Program indicates that the RCM simulations of summer precipitation show the largest biases and a remarkable spread over the SE US compared to other regions in the contiguous US. The causes of such a spread are investigated by performing simulations using the Weather Research and Forecasting (WRF) model, a next-generation RCM developed by the US National Center for Atmospheric Research. The results show that the simulated biases in SE US summer precipitation are due mainly to the misrepresentation of the modeled North Atlantic subtropical high (NASH) western ridge. In the WRF simulations, the NASH western ridge shifts 7° northwestward when compared to that in the reanalysis ensemble, leading to a dry bias in the simulated summer precipitation according to the relationship between the NASH western ridge and summer precipitation over the southeast. Experiments utilizing the four-dimensional data assimilation technique further suggest that the improved representation of the circulation patterns (i.e., wind fields) associated with the NASH western ridge substantially reduces the bias in the simulated SE US summer precipitation. Our analysis of circulation dynamics indicates that the NASH western ridge in the WRF simulations is significantly influenced by the simulated planetary boundary layer (PBL) processes over the Gulf of Mexico. Specifically, a decrease (increase) in the simulated PBL height tends to stabilize (destabilize) the lower troposphere over the Gulf of Mexico, and thus inhibits (favors) the onset and/or development of convection. Such changes in tropical convection induce a tropical–extratropical teleconnection pattern, which modulates the circulation along the NASH western ridge in the WRF simulations and contributes to the modeled precipitation biases over the SE US. In conclusion, our study demonstrates that the NASH western ridge is an important factor responsible for the RCM skill in simulating SE US summer precipitation. Furthermore, improvements in the PBL parameterizations for the Gulf of Mexico might help advance RCM skill in representing the NASH western ridge circulation and summer precipitation over the SE US.
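As a hedged illustration of the diagnostic at the heart of this abstract: the NASH western ridge is commonly located as the westernmost point of a fixed 850-hPa geopotential-height contour (1560 gpm in much of the SE US literature). The sketch below assumes an xarray dataset with illustrative file/variable names (`z850.nc`, `hgt`) and ascending lat/lon coordinates in degrees west; none of these specifics come from the paper.

```python
import xarray as xr

def nash_western_ridge(hgt850, contour=1560.0, lat_range=(15, 45), lon_range=(-110, -20)):
    """Westernmost point of the `contour`-gpm isoline of 850-hPa height."""
    box = hgt850.sel(lat=slice(*lat_range), lon=slice(*lon_range))  # assumes ascending coords
    mask = box >= contour                          # grid points inside the subtropical high
    if not bool(mask.any()):
        return None                                # ridge lies outside the search box
    west_lon = box.lon.where(mask).min(dim="lon")  # westernmost high-pressure lon per latitude
    ridge_lat = west_lon.idxmin(dim="lat")         # latitude of the overall westernmost point
    return float(ridge_lat), float(west_lon.min())

# Usage on a summer-mean climatology (file and variable names are assumptions):
# ds = xr.open_dataset("z850.nc")
# jja = ds["hgt"].where(ds.time.dt.season == "JJA").mean("time")
# print(nash_western_ridge(jja))
```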
Abstract:
We all experience a host of common life stressors such as the death of a family member, medical illness, and financial uncertainty. While most of us are resilient to such stressors and continue to function normally, for a subset of individuals these stressors increase the likelihood of developing treatment-resistant, chronic psychological problems, including depression and anxiety. It is thus paramount to identify predictive markers of risk, particularly those reflecting fundamental biological processes that can be targets for intervention and prevention. Using data from a longitudinal study of 340 healthy young adults, we demonstrate that individual differences in threat-related amygdala reactivity predict psychological vulnerability to life stress occurring as much as 1 to 4 years later. These results highlight a readily assayed biomarker, threat-related amygdala reactivity, which predicts psychological vulnerability to commonly experienced stressors and represents a discrete target for intervention and prevention.
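In statistical terms, the prediction claim here is a prospective moderation: baseline amygdala reactivity interacting with subsequent life stress to predict later symptoms. Below is a minimal sketch of that analysis shape on synthetic data; the variable names and effect sizes are hypothetical and nothing in the code comes from the study itself.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 340
df = pd.DataFrame({
    "amygdala": rng.normal(size=n),   # baseline threat-related reactivity (z-scored)
    "stress": rng.normal(size=n),     # life stress reported 1-4 years later
})
# Simulate later symptoms with a reactivity-by-stress interaction, for illustration only.
df["symptoms"] = 0.3 * df["stress"] + 0.25 * df["amygdala"] * df["stress"] + rng.normal(size=n)

# The amygdala:stress coefficient is what would carry a "vulnerability marker" claim.
model = smf.ols("symptoms ~ amygdala * stress", data=df).fit()
print(model.summary().tables[1])
```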
Abstract:
Research on future episodic thought has produced compelling theories and results in cognitive psychology, cognitive neuroscience, and clinical psychology. In experiments aimed at integrating these with basic concepts and methods from autobiographical memory research, 76 undergraduates remembered past and imagined future positive and negative events that had or would have a major impact on them. Correlations of the online ratings of visual and auditory imagery, emotion, and other measures demonstrated that individuals used the same processes to the same extent to remember past and construct future events. These measures predicted the theoretically important metacognitive judgment of past reliving and future "preliving" in similar ways. On standardized tests of reactions to traumatic events, scores for future negative events were much higher than scores for past negative events. The scores for future negative events were in the range that would qualify for a diagnosis of posttraumatic stress disorder (PTSD); the test was replicated (n = 52) to check for order effects. Consistent with earlier work, future events had less sensory vividness; the imagined symptoms of future events were therefore unlikely to be caused by sensory vividness. To confirm this, in a second experiment 63 undergraduates produced numerous added details between 2 constructions of the same negative future events; the deficits in rated vividness were removed with no increase in scores on the standardized tests of reactions to traumatic events. Neuroticism predicted individuals' reactions to negative past events but did not predict imagined reactions to future events. This set of novel methods and findings is interpreted in the contexts of the literatures of episodic future thought, autobiographical memory, PTSD, and classic schema theory.
Abstract:
BACKGROUND: Singapore's population, like those of many other countries, is aging; this is likely to lead to an increase in eye diseases and the demand for eye care. Since ophthalmologist training is long and expensive, early planning is essential. This paper forecasts workforce and training requirements for Singapore up to the year 2040 under several plausible future scenarios. METHODS: The Singapore Eye Care Workforce Model was created as a continuous time compartment model with explicit workforce stocks using system dynamics. The model has three modules: prevalence of eye disease, demand, and workforce requirements. The model is used to simulate the prevalence of eye diseases, patient visits, and workforce requirements for the public sector under different scenarios in order to determine training requirements. RESULTS: Four scenarios were constructed. Under the baseline business-as-usual scenario, the required number of ophthalmologists is projected to increase by 117% from 2015 to 2040. Under the current policy scenario (assuming an increase of service uptake due to increased awareness, availability, and accessibility of eye care services), the increase will be 175%, while under the new model of care scenario (considering the additional effect of providing some services by non-ophthalmologists) the increase will be only 150%. The moderated workload scenario (assuming in addition a reduction of the clinical workload) projects an increase in the required number of ophthalmologists of 192% by 2040. Considering the uncertainties in the projected demand for eye care services, the required residency intake is 8-22 residents per year under the business-as-usual scenario, 17-21 under the current policy scenario, 14-18 under the new model of care scenario, and 18-23 under the moderated workload scenario. CONCLUSIONS: The results show that under all scenarios considered, Singapore's aging and growing population will result in an almost doubling of the number of Singaporeans with eye conditions, a significant increase in public sector eye care demand, and, consequently, a greater requirement for ophthalmologists.
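A minimal sketch of the stock-flow logic such a system dynamics model rests on: residents enter a training pipeline, graduate into the ophthalmologist stock after a fixed lag, and exit by attrition; the required residency intake is then the smallest intake whose projected stock meets demand. All rates and numbers below are illustrative assumptions, not the model's Singapore calibration.

```python
TRAINING_YEARS = 5
ATTRITION = 0.03            # assumed annual exit rate from the workforce

def simulate(intake_per_year, initial_workforce, years=25):
    """Project the ophthalmologist stock for a constant residency intake."""
    workforce = initial_workforce
    pipeline = [intake_per_year] * TRAINING_YEARS   # residents currently in training
    history = []
    for _ in range(years):
        graduates = pipeline.pop(0)                 # cohort finishing this year
        pipeline.append(intake_per_year)            # new cohort entering
        workforce = workforce * (1 - ATTRITION) + graduates
        history.append(workforce)
    return history

# Example: smallest constant intake that more than doubles a stock of 100
# ophthalmologists over 25 years (cf. the 117% baseline increase).
for intake in range(5, 25):
    if simulate(intake, 100)[-1] >= 217:
        print("required intake:", intake)
        break
```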
Abstract:
BACKGROUND: The development of a microcomputer-based device permits quick, simple, and noninvasive quantification of the respiratory sinus arrhythmia (RSA) during quiet breathing. METHODS AND RESULTS: We prospectively and serially measured the radionuclide left ventricular ejection fraction and the RSA amplitude in 34 cancer patients receiving up to nine monthly bolus treatments with doxorubicin hydrochloride (60 mg/m2). Of the eight patients who ultimately developed symptomatic doxorubicin-induced congestive heart failure, seven (87.5%) demonstrated a significant decline in RSA amplitude; five of 26 subjects without clinical symptoms of cardiotoxicity (19.2%) showed a similar RSA amplitude decline. On average, significant RSA amplitude decline occurred 3 months before the last planned doxorubicin dose in patients destined to develop clinical congestive heart failure. CONCLUSION: Overall, RSA amplitude abnormality proved to be a more specific predictor of clinically significant congestive heart failure than did serial resting radionuclide ejection fractions.
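As a hedged sketch of what quantifying RSA amplitude typically involves (the study's microcomputer-based device may well use a different definition): resample the beat-to-beat RR-interval series onto an even time grid, band-pass it in the respiratory frequency band, and report the peak-to-trough swing. All parameters below are illustrative.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def rsa_amplitude(rr_ms, fs=4.0, band=(0.15, 0.40)):
    """Approximate RSA amplitude (ms) from RR intervals in milliseconds."""
    rr = np.asarray(rr_ms, dtype=float)
    t = np.cumsum(rr) / 1000.0                       # beat times in seconds
    grid = np.arange(t[0], t[-1], 1.0 / fs)          # even resampling grid
    rr_even = np.interp(grid, t, rr)
    b, a = butter(2, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    hf = filtfilt(b, a, rr_even - rr_even.mean())    # respiratory-band component
    return 2.0 * np.sqrt(2.0) * hf.std()             # peak-to-trough of a sinusoid

# Synthetic check: a 0.25-Hz modulation with 80-ms peak-to-trough swing.
t = np.arange(0, 300, 0.8)
rr = 800 + 40 * np.sin(2 * np.pi * 0.25 * t)
print(round(rsa_amplitude(rr), 1), "ms")             # ~80 ms, up to filter edge effects
```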
Abstract:
Modeling of global climate change is moving from global circulation model (GCM)-type projections with coupled biogeochemical models to projections of ecological responses, including food webs and upper trophic levels. Marine and coastal ecosystems are highly susceptible to the impacts of global climate change and also produce significant ecosystem services. The effects of global climate change on coastal and marine ecosystems involve a much wider array of impacts than the usual temperature, sea level rise, and precipitation. This paper is an overview of the 12 papers that make up this special issue, each examining some aspect of global climate change effects on marine ecosystems. We summarized the major features of the models and analyses in the papers to determine general patterns. A wide range of ecosystems was simulated using a diverse set of modeling approaches. Models were either 3-dimensional or used a few spatial boxes, and responses to global climate change were mostly expressed as changes from a baseline condition. Three issues were identified from the across-model comparison: (a) lack of standardization of climate change scenarios, (b) the prevalence of site-specific and even unique models for upper trophic levels, and (c) emphasis on hypothesis evaluation versus forecasting. We discuss why these issues are important as global climate change assessment continues to progress up the food chain and, when possible, offer some initial steps for going forward.
Abstract:
A single tidal cycle survey in a Lagrangian reference frame was conducted in autumn 2010 to evaluate the impact of short-term, episodic, enhanced turbulent mixing on large chain-forming phytoplankton. Observations of turbulence using a free-falling microstructure profiler were undertaken, along with near-simultaneous profiles with an in-line digital holographic camera, at station L4 (50° 15′ N 4° 13′ W, depth 50 m) in the Western English Channel. Profiles from each instrument were collected hourly whilst following a drogued drifter. Results from an ADCP attached to the drifter showed pronounced vertical shear, indicating that the water column consisted of two layers and restricting interpretation of the Lagrangian experiment to the upper ~25 m. Atmospheric conditions deteriorated during the mid-point of the survey, resulting in values of turbulent dissipation reaching a maximum of 10⁻⁴ W kg⁻¹ toward the surface in the upper 10 m. Chain-forming phytoplankton > 200 μm were counted using the data from the holographic camera for the two periods, before and after the enhanced mixing event. As mixing increased, phytoplankton underwent chain breakage, were dispersed by advection from the upper to the lower layer, and were subjected to aggregation with other suspended material. Depth-averaged counts of phytoplankton were reduced from a maximum of around 2050 L⁻¹ before the increased turbulence to 1070 L⁻¹ after, with each of these mechanisms contributing to the reduction. These results demonstrate the sensitivity of phytoplankton populations to moderate increases in turbulent activity, with consequences for accurate forecasting of the role played by phytoplankton in climate studies and for the ecosystem in general, given their role as primary producers.
Abstract:
One habitat management requirement forced by 21st century relative sea-level rise (RSLR) will be the need to re-comprehend the dimensions of the long-term transgressive behaviour of coastal systems forced by such RSLR. Fresh approaches to the conceptual modelling and subsequent implementation of new coastal and peri-marine habitats will be required. There is concern that existing approaches to forecasting coastal system development (and by implication that of their associated scarce coastal habitats) over the next century depend on a premise of orderly spatial succession of habitats. This assumption is shown to be questionable given the possible future rates of RSLR, the magnitude of shoreline retreat, and the lack of coastal sediment to maintain the morphologies that protect low-energy coastal habitats. Of these issues, sediment deficiency is regarded as one of the major problems for future habitat development. Examples of contemporary behaviour of UK coasts show evidence of coastal sediment starvation resulting from relatively stable RSLR, anthropogenic sealing of coastal sources, and intercepted coastal sediment pathways, which together force segmentation of coastal systems. From these examples, key principles are deduced which may prejudice the existence of future habitats: accelerated future sediment demand due to RSLR may not be met by supply and, if short- to medium-term hold-the-line policies predominate, long-term strategies for managed realignment and habitat enhancement may prove impossible goals. Methods of contemporary sediment husbandry may help sustain some habitats in place, but otherwise, instead of integrated coastal organization, managers may need to consider coastal breakdown, segmentation, and habitat reduction as the basis of 21st century coastal evolution and planning.
Abstract:
We propose two simple evaluation methods for time-varying density forecasts of continuous higher-dimensional random variables. Both methods are based on the probability integral transformation for unidimensional forecasts. The first method tests multinormal densities and relies on a rotation of the coordinate system. The advantage of the second method is not only its applicability to any continuous distribution but also its ability to evaluate forecast accuracy in specific regions of the domain, as defined by the user's interest. We show that the latter property is particularly useful for evaluating a multidimensional generalization of the Value at Risk. In simulations and in an empirical study, we examine the performance of both tests.
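A minimal sketch of the building block both proposed tests extend: under a correct density forecast, the probability integral transforms (PITs) of the realizations are i.i.d. U(0,1), and in higher dimensions each coordinate is transformed conditionally on the previous ones (the Rosenblatt transform). The Gaussian forecast and the Kolmogorov-Smirnov check below are illustrative choices, not the paper's exact procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Bivariate normal "forecast" with matching realizations (the null case).
mean = np.array([0.0, 0.0])
cov = np.array([[1.0, 0.5], [0.5, 1.0]])
y = rng.multivariate_normal(mean, cov, size=500)

# Rosenblatt transform for a bivariate normal: marginal CDF of the first
# coordinate, then the conditional CDF of the second given the first.
u1 = stats.norm.cdf(y[:, 0], loc=mean[0], scale=np.sqrt(cov[0, 0]))
cond_mean = mean[1] + cov[0, 1] / cov[0, 0] * (y[:, 0] - mean[0])
cond_var = cov[1, 1] - cov[0, 1] ** 2 / cov[0, 0]
u2 = stats.norm.cdf(y[:, 1], loc=cond_mean, scale=np.sqrt(cond_var))

# Each PIT series should look uniform when the forecast density is correct.
for name, u in (("u1", u1), ("u2", u2)):
    print(name, round(stats.kstest(u, "uniform").pvalue, 3))
```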
Abstract:
We propose a simple and flexible framework for forecasting the joint density of asset returns. The multinormal distribution is augmented with a polynomial in (time-varying) non-central co-moments of assets. We estimate the coefficients of the polynomial via the Method of Moments for a carefully selected set of co-moments. In an extensive empirical study, we compare the proposed model with a range of other models widely used in the literature. Employing both recently proposed and standard techniques to evaluate multivariate forecasts, we conclude that the augmented joint density provides highly accurate forecasts of the "negative tail" of the joint distribution.
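A sketch of the general shape of such an augmented density: a multivariate normal baseline multiplied by a polynomial correction in standardized moments, here a third-order Gram-Charlier-type term per margin. In the paper the coefficients come from the Method of Moments over selected co-moments; below they are simply set to sample skewness, and positivity of the correction is handled crudely by clipping, so this is only a toy illustration.

```python
import numpy as np
from scipy import stats

def augmented_logpdf(x, mean, cov, skew):
    """log of N(mean, cov) x (1 + sum_i skew_i/6 * He3(z_i)), clipped above 0."""
    z = (x - mean) / np.sqrt(np.diag(cov))
    he3 = z**3 - 3 * z                                  # probabilists' Hermite H3
    poly = 1.0 + (np.asarray(skew) / 6.0 * he3).sum(axis=-1)
    base = stats.multivariate_normal(mean, cov).logpdf(x)
    return base + np.log(np.clip(poly, 1e-12, None))

# On left-skewed data the correction should improve the average log score,
# i.e. fit the "negative tail" better than the plain normal.
rng = np.random.default_rng(2)
y = -rng.gamma(4.0, 1.0, size=(2000, 2)) + 4.0          # mean ~0, var ~4, skew ~ -1
mean, cov = y.mean(axis=0), np.cov(y.T)
plain = stats.multivariate_normal(mean, cov).logpdf(y).mean()
aug = augmented_logpdf(y, mean, cov, skew=stats.skew(y, axis=0)).mean()
print(f"plain normal: {plain:.4f}  augmented: {aug:.4f}")
```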
Abstract:
A significant part of the literature on input-output (IO) analysis is dedicated to the development and application of methodologies for forecasting and updating technology coefficients and multipliers. Prominent among such techniques is the RAS method, while more information-demanding econometric methods, as well as other less promising ones, have also been proposed. However, little interest has been expressed in the use of more modern and often more innovative methods, such as neural networks, in IO analysis in general. This study constructs, proposes and applies a Backpropagation Neural Network (BPN) with the purpose of forecasting IO technology coefficients and subsequently multipliers. The RAS method is also applied to the same set of UK IO tables, and the discussion of the results of both methods is accompanied by a comparative analysis. The results show that the BPN offers a valid alternative way of IO technology forecasting, and many forecasts were more accurate using this method. Overall, however, the RAS method outperformed the BPN, but the difference is too small to be systematic, and there are further ways to improve the performance of the BPN.
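For reference, the RAS benchmark is a biproportional fitting procedure: starting from a base-year coefficient (or flow) matrix, rows and columns are alternately rescaled until they match target totals. A minimal sketch with an illustrative matrix (the UK tables of the study are not reproduced here):

```python
import numpy as np

def ras(base, row_targets, col_targets, tol=1e-10, max_iter=1000):
    """Biproportional fit of `base` to the given row and column sums."""
    x = base.astype(float).copy()
    for _ in range(max_iter):
        x *= (row_targets / x.sum(axis=1))[:, None]   # row scaling ("R")
        x *= col_targets / x.sum(axis=0)              # column scaling ("S")
        if np.allclose(x.sum(axis=1), row_targets, atol=tol):
            return x
    raise RuntimeError("RAS did not converge")

base = np.array([[10.0, 5.0], [4.0, 11.0]])           # toy base-year flows
updated = ras(base, row_targets=np.array([18.0, 17.0]),
              col_targets=np.array([16.0, 19.0]))     # targets must share a total
print(updated.round(3))
```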
Abstract:
Stochastic modeling of mortality rates focuses on fitting linear models to logarithmically adjusted mortality data from the middle or late ages. Whilst this modeling enables insurers to project mortality rates and hence price mortality products, it does not provide a good fit for younger-age mortality. Mortality rates below the early 20s are important to model as they give an insight into estimates of the cohort effect for more recent years of birth. Given the cumulative nature of life expectancy, it is also important to be able to forecast mortality improvements at all ages. When we attempt to fit existing models to a wider age range, 5-89, rather than 20-89 or 50-89, their weaknesses are revealed as the results are not satisfactory. The linear innovations in existing models are not flexible enough to capture the non-linear profile of mortality rates that we see at the lower ages. In this paper we modify an existing four-factor model of mortality to enable better fitting to a wider age range, and using data from seven developed countries our empirical results show that the proposed model has a better fit to the actual data, is robust, and has good forecasting ability.
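As context, factor models of this kind extend the Lee-Carter backbone, log m(x,t) = a_x + b_x k_t, usually fitted by a rank-1 SVD of centred log rates; the paper's four-factor variant adds further age and cohort terms not sketched here. A toy fit on synthetic data over the 5-89 age range:

```python
import numpy as np

rng = np.random.default_rng(3)
ages, years = np.arange(5, 90), np.arange(1970, 2011)
true_k = -0.05 * (years - years[0])                      # improving mortality trend
log_m = (-9 + 0.08 * ages[:, None]) + 0.01 * ages[:, None] * true_k + \
        0.02 * rng.normal(size=(ages.size, years.size))  # synthetic log rates

a = log_m.mean(axis=1)                                   # a_x: average age profile
U, s, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
b, k = U[:, 0], s[0] * Vt[0]                             # rank-1 age/period factors
b, k = b / b.sum(), k * b.sum()                          # usual identifiability constraints
print("variance explained by the first factor:", round(s[0]**2 / (s**2).sum(), 3))
```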
Abstract:
Background: Rift Valley fever (RVF) is an emerging vector-borne zoonotic disease that represents a threat to human health, animal health, and livestock production, particularly in Africa. The epidemiology of RVF is not well understood, so forecasting RVF outbreaks and carrying out efficient and timely control measures remain a challenge. Various epidemiological modeling tools have been used to increase knowledge of RVF epidemiology and to inform disease management policies.
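A hedged sketch of the simplest class of such tools: a compartmental host SIR model coupled to vector SI dynamics (Ross-Macdonald style), with a constant vector population. All parameter values are illustrative, not RVF estimates.

```python
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, beta_vh=0.3, beta_hv=0.3, gamma=0.1, mu_v=0.1):
    Sh, Ih, Rh, Sv, Iv = y
    new_h = beta_vh * Sh * Iv              # vector-to-host transmission
    new_v = beta_hv * Sv * Ih              # host-to-vector transmission
    return [-new_h,                        # susceptible hosts
            new_h - gamma * Ih,            # infectious hosts
            gamma * Ih,                    # recovered hosts
            mu_v * Iv - new_v,             # vector births balance vector deaths
            new_v - mu_v * Iv]             # infectious vectors die at rate mu_v

sol = solve_ivp(rhs, (0, 200), [0.99, 0.01, 0.0, 1.0, 0.0], dense_output=True)
print(np.round(sol.sol(np.linspace(0, 200, 5))[1], 4))  # infectious host fraction
```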
Abstract:
EU Directive 2009/28/EC on Renewable Energy requires each Member State to ensure that 10% of transport energy (excluding aviation and marine transport) comes from renewable sources by 2020 (the 10% RES-T target). In addition to the anticipated growth in biofuels, this target is expected to be met by the increased electrification of transport coupled with a growing contribution from renewable energy to electricity generation. Energy use in transport accounted for nearly half of Ireland's total final energy demand and about a third of energy-related carbon dioxide emissions in 2007, and it grew by 6.3% per annum on average over the period 1990–2007. This high share and fast growth relative to other countries highlight the challenges Ireland faces in meeting ambitious renewable energy targets. The Irish Government has set a specific target for Electric Vehicles (EVs) as part of its strategy to deliver the 10% RES-T target: by 2020, 10% of all vehicles in its transport fleet are to be powered by electricity. This paper quantifies the impacts of this 10% EV target on energy and carbon dioxide emissions by 2020. To do this, an ‘EV Car Stock’ model was developed to analyse the historical and future make-up of the passenger car portion of the fleet to 2025. Three scenarios for possible take-up of EVs were examined and the associated energy and emissions impacts quantified. These impacts were then compared to Ireland's 10% RES-T target and greenhouse gas (GHG) emissions reduction targets for 2020. Two key findings of the study are that the 10% EV target contributes 1.7% to the 10% RES-T target by 2020 and 1.4% to the 20% reduction in Non-ETS emissions by 2020 relative to 2005.
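A hedged sketch of the car-stock logic behind such scenario analyses: evolve the fleet with scrappage and new sales, ramp the EV share of sales, and convert the resulting EV stock into electricity demand and avoided tailpipe CO2. All numbers are illustrative assumptions, not the paper's Irish calibration; the toy run also shows why a stock target is demanding, since a sales-share ramp to 10% leaves the stock share far below 10%.

```python
SCRAPPAGE = 0.06            # assumed annual fleet retirement rate
FLEET_GROWTH = 0.01         # assumed annual fleet growth
KM_PER_CAR = 15000          # assumed annual mileage per car
ICE_CO2_G_PER_KM = 160      # assumed emissions of the displaced ICE car
EV_KWH_PER_KM = 0.20        # assumed EV electricity use

def project(fleet0=2.0e6, ev_sales_share_final=0.10, years=10):
    """Fleet and EV stock after `years`, with a linear ramp in EV sales share."""
    fleet, evs = fleet0, 0.0
    for i in range(years):
        sales = fleet * (SCRAPPAGE + FLEET_GROWTH)       # replacement plus growth
        share = ev_sales_share_final * (i + 1) / years   # EV share of new sales
        evs = evs * (1 - SCRAPPAGE) + sales * share
        fleet = fleet * (1 - SCRAPPAGE) + sales
    return fleet, evs

fleet, evs = project()
print(f"EV share of stock after 10 years: {evs / fleet:.1%}")   # well below 10%
print(f"EV electricity demand: {evs * KM_PER_CAR * EV_KWH_PER_KM / 1e9:.2f} TWh/yr")
print(f"avoided tailpipe CO2: {evs * KM_PER_CAR * ICE_CO2_G_PER_KM / 1e12:.2f} Mt/yr")
```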
Abstract:
We propose an exchange rate model that is a hybrid of the conventional specification with monetary fundamentals and the Evans–Lyons microstructure approach. We estimate a model augmented with order flow variables, using a unique data set: almost 100 monthly observations on interdealer order flow in dollar/euro and dollar/yen. The augmented macroeconomic, or “hybrid,” model exhibits greater in-sample stability and out-of-sample forecasting improvement vis-à-vis the basic macroeconomic and random walk specifications.
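A sketch of the hybrid specification's general form: the monthly change in the (log) exchange rate regressed on a monetary-fundamentals term plus interdealer order flow, evaluated out of sample against the random-walk (no-change) benchmark. The data below are synthetic and the coefficients are assumptions; the paper's exact variables differ.

```python
import numpy as np

rng = np.random.default_rng(4)
T = 100                                     # roughly the monthly sample size cited
fundamentals = rng.normal(size=T)           # e.g., money/output differentials
order_flow = rng.normal(size=T)             # net interdealer buying pressure
ds = 0.1 * fundamentals + 0.4 * order_flow + rng.normal(scale=0.5, size=T)

X = np.column_stack([np.ones(T), fundamentals, order_flow])
errors_model, errors_rw = [], []
for t in range(60, T):                      # expanding-window 1-step-ahead forecasts
    beta, *_ = np.linalg.lstsq(X[:t], ds[:t], rcond=None)
    errors_model.append(ds[t] - X[t] @ beta)
    errors_rw.append(ds[t] - 0.0)           # random walk predicts no change

rmse = lambda e: np.sqrt(np.mean(np.square(e)))
print(f"hybrid RMSE: {rmse(errors_model):.3f}  random walk RMSE: {rmse(errors_rw):.3f}")
```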