929 results for Event study methodology
Abstract:
The Holocene vegetation history of the Arabian Peninsula is poorly understood, with few palaeobotanical studies to date. At Awafi, Ras al-Khaimah, UAE, a 3.3 m lake sediment sequence records the vegetation development for the period 8500 cal. yr BP to ~3000 cal. yr BP. δ13C isotope, pollen and phytolith analyses indicate that C3 Pooid grassland with a strong woody element existed during the early Holocene (between 8500 and 6000 cal. yr BP) and became replaced by mixed C3 and C4 grasses with a strong C4 Panicoid tall grass element between 5900 and 5400 cal. yr BP. An intense arid event occurred at 4100 cal. yr BP, when the lake desiccated and was infilled by aeolian sand. From 4100 cal. yr BP the vegetation was dominated by C4 Chloridoid types and Cyperaceae, suggesting an incomplete vegetation cover and aeolian dune reactivation owing to increased regional aridity. These data outline the ecosystem dynamics and carbon cycling in response to palaeomonsoon and north-westerly variability during the Holocene. Copyright © 2004 John Wiley & Sons, Ltd.
Abstract:
The conceptual and parameter uncertainty of the semi-distributed INCA-N (Integrated Nutrients in Catchments-Nitrogen) model was studied using the GLUE (Generalized Likelihood Uncertainty Estimation) methodology combined with quantitative experimental knowledge, a concept known as 'soft data'. Cumulative inorganic N leaching, annual plant N uptake and annual mineralization proved to be useful soft data to constrain the parameter space. The INCA-N model was able to simulate the seasonal and inter-annual variations in the stream-water nitrate concentrations, although the lowest concentrations during the growing season were not reproduced. This suggested that there were some retention processes or losses, either in peatland/wetland areas or in the river, which were not included in the INCA-N model. The results of the study suggested that soft data offer a way to reduce parameter equifinality, and that the calibration and testing of distributed hydrological and nutrient leaching models should be based both on runoff and/or nutrient concentration data and on the qualitative knowledge of experimentalists. (c) 2006 Elsevier B.V. All rights reserved.
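The GLUE procedure described above (Monte Carlo sampling, an informal likelihood, a behavioural threshold) can be sketched on a toy stand-in model; the exponential model, priors and threshold below are all invented for illustration, not taken from the INCA-N study. In the study itself, additional soft-data criteria (e.g. plausible annual N uptake) further restrict the behavioural set; here a single likelihood criterion is used.

```python
import numpy as np

rng = np.random.default_rng(42)

def model(theta, t):
    """Toy leaching model standing in for INCA-N: exponential
    decay with rate theta[0] and scale theta[1] (both invented)."""
    return theta[1] * np.exp(-theta[0] * t)

# Synthetic "observations" from known parameters plus noise.
t = np.linspace(0, 10, 50)
obs = model((0.3, 5.0), t) + rng.normal(0, 0.1, t.size)

# 1. Monte Carlo sampling of the parameter space from uniform priors.
n = 2000
thetas = np.column_stack([rng.uniform(0.05, 1.0, n),   # decay rate
                          rng.uniform(1.0, 10.0, n)])  # scale

# 2. Informal likelihood: Nash-Sutcliffe efficiency of each run.
def nse(sim, obs):
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

scores = np.array([nse(model(th, t), obs) for th in thetas])

# 3. Runs above a behavioural threshold are retained; all others
#    are rejected. Equifinality shows up as many distinct parameter
#    sets surviving this step.
behavioural = scores > 0.5
sims = np.array([model(th, t) for th in thetas[behavioural]])

# 4. GLUE uncertainty bounds: 5th/95th percentiles of the
#    behavioural simulations at each time step.
lower, upper = np.percentile(sims, [5, 95], axis=0)
```

Constraining step 3 with soft data amounts to adding further accept/reject criteria before computing the bounds.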
Abstract:
Name agreement is the extent to which different people agree on a name for a particular picture. Previous studies have found that it takes longer to name low name agreement pictures than high name agreement pictures. To examine the effect of name agreement in the online process of picture naming, we compared event-related potentials (ERPs) recorded whilst 19 healthy, native English speakers silently named pictures which had either high or low name agreement. A series of ERP components was examined: P1 approximately 120 ms from picture onset, N1 around 170 ms, P2 around 220 ms, N2 around 290 ms, and P3 around 400 ms. Additionally, a late time window from 800 to 900 ms was considered. Name agreement had an early effect, starting at P1 and possibly resulting from uncertainty of picture identity, and continuing into N2, possibly resulting from alternative names for pictures. These results support the idea that name agreement affects two consecutive processes: first, object recognition, and second, lexical selection and/or phonological encoding.
Case study of the use of remotely sensed data for modeling flood inundation on the river Severn, UK.
Abstract:
A methodology for using remotely sensed data to both generate and evaluate a hydraulic model of floodplain inundation is presented for a rural case study in the United Kingdom: Upton-upon-Severn. Remotely sensed data have been processed and assembled to provide an excellent test data set for both model construction and validation. In order to assess the usefulness of the data and the issues encountered in their use, two models for floodplain inundation were constructed: one based on an industry standard one-dimensional approach and the other based on a simple two-dimensional approach. The results and their implications for the future use of remotely sensed data for predicting flood inundation are discussed. Key conclusions for the use of remotely sensed data are that care must be taken to integrate different data sources for both model construction and validation and that improvements in ground height data shift the focus in terms of model uncertainties to other sources such as boundary conditions. The differences between the two models are found to be of minor significance.
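The abstract does not state which evaluation measure was used, but a fit statistic widely applied when comparing modelled inundation extents against remotely sensed (e.g. SAR-derived) flood maps is F = |A∩B| / |A∪B| over boolean wet/dry rasters. The sketch below is a minimal, hypothetical version of that measure, not the study's own code.

```python
import numpy as np

def flood_fit(pred, obs):
    """Fit statistic F = wet-in-both / wet-in-either for boolean
    inundation rasters: F = 1 is a perfect match, F = 0 no overlap."""
    pred, obs = np.asarray(pred, bool), np.asarray(obs, bool)
    either = np.logical_or(pred, obs).sum()
    if either == 0:
        return 1.0  # both maps entirely dry
    return np.logical_and(pred, obs).sum() / either

# Toy 3x3 rasters: the model floods one cell the observation does
# not, and misses one observed wet cell (4 wet cells in each map,
# 3 in common, 5 in the union, so F = 0.6).
obs  = [[1, 1, 0],
        [1, 1, 0],
        [0, 0, 0]]
pred = [[1, 1, 1],
        [1, 0, 0],
        [0, 0, 0]]
```

Because F penalises both over- and under-prediction, it gives a single score for comparing, say, a 1-D and a 2-D model against the same observed flood extent.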
Abstract:
In this study, the processes affecting sea surface temperature variability over the 1992–98 period, encompassing the very strong 1997–98 El Niño event, are analyzed. A tropical Pacific Ocean general circulation model, forced by a combination of weekly ERS1–2 and TAO wind stresses, and climatological heat and freshwater fluxes, is first validated against observations. The model reproduces the main features of the tropical Pacific mean state, despite a weaker than observed thermal stratification, a 0.1 m s−1 too strong (weak) South Equatorial Current (North Equatorial Countercurrent), and a slight underestimate of the Equatorial Undercurrent. Good agreement is found between the model dynamic height and TOPEX/Poseidon sea level variability, with correlation/rms differences of 0.80/4.7 cm on average in the 10°N–10°S band. The model sea surface temperature variability is a bit weak, but reproduces the main features of interannual variability during the 1992–98 period. The model compares well with the TAO current variability at the equator, with correlation/rms differences of 0.81/0.23 m s−1 for surface currents. The model therefore reproduces well the observed interannual variability, with wind stress as the only interannually varying forcing. This good agreement with observations provides confidence in the comprehensive three-dimensional circulation and thermal structure of the model. A close examination of mixed layer heat balance is thus undertaken, contrasting the mean seasonal cycle of the 1993–96 period and the 1997–98 El Niño. In the eastern Pacific, cooling by exchanges with the subsurface (vertical advection, mixing, and entrainment), the atmospheric forcing, and the eddies (mainly the tropical instability waves) are the three main contributors to the heat budget. In the central–western Pacific, the zonal advection by low-frequency currents becomes the main contributor. 
Westerly wind bursts (in December 1996 and March and June 1997) were found to play a decisive role in the onset of the 1997–98 El Niño. They contributed to the early warming in the eastern Pacific because the downwelling Kelvin waves that they excited diminished subsurface cooling there. But it is mainly through eastward advection of the warm pool that they generated temperature anomalies in the central Pacific. The end of El Niño can be linked to the large-scale easterly anomalies that developed in the western Pacific and spread eastward, from the end of 1997 onward. In the far-western Pacific, because of the shallower than normal thermocline, these easterlies cooled the SST by vertical processes. In the central Pacific, easterlies pushed the warm pool back to the west. In the east, they led to a shallower thermocline, which ultimately allowed subsurface cooling to resume and to quickly cool the surface layer.
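The mixed layer heat balance examined above is conventionally decomposed as follows (a standard form, not quoted from the study):

```latex
\underbrace{\frac{\partial T}{\partial t}}_{\text{SST tendency}}
= \underbrace{-\,\mathbf{u}_h \cdot \nabla_h T}_{\text{horizontal advection}}
\;\underbrace{-\,\frac{w_e\,\bigl(T - T_{-h}\bigr)}{h} + D_z(T)}_{\text{entrainment and vertical mixing}}
\;+\; \underbrace{\frac{Q_{\mathrm{net}}}{\rho_0\, c_p\, h}}_{\text{atmospheric forcing}}
```

Here $T$ is the mixed layer temperature, $h$ the mixed layer depth, $w_e$ the entrainment velocity at its base, $T_{-h}$ the temperature just below the mixed layer, $D_z$ vertical mixing, and $Q_{\mathrm{net}}$ the net surface heat flux. Splitting $\mathbf{u}_h$ into low-frequency and eddy parts separates mean-current advection from the tropical instability wave contribution discussed above.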
Abstract:
We use proper orthogonal decomposition (POD) to study a transient teleconnection event at the onset of the 2001 planet-encircling dust storm on Mars, in terms of empirical orthogonal functions (EOFs). There are several differences between this and previous studies of atmospheric events using EOFs. First, instead of using a single variable such as surface pressure or geopotential height on a given pressure surface, we use a dataset describing the evolution in time of global and fully three-dimensional atmospheric fields such as horizontal velocity and temperature. These fields are produced by assimilating Thermal Emission Spectrometer observations from NASA's Mars Global Surveyor spacecraft into a Mars general circulation model. We use total atmospheric energy (TE) as a physically meaningful quantity which weights the state variables. Second, instead of adopting the EOFs to define teleconnection patterns as planetary-scale correlations that explain a large portion of long time-scale variability, we use EOFs to understand transient processes due to localised heating perturbations that have implications for the atmospheric circulation over distant regions. The localised perturbation is given by anomalous heating due to the enhanced presence of dust around the northern edge of the Hellas Planitia basin on Mars. We show that the localised disturbance is seemingly restricted to a small number (a few tens) of EOFs. These can be classified as low-order, transitional, or high-order EOFs according to the TE amount they explain throughout the event. Despite the global character of the EOFs, they show the capability of accounting for the localised effects of the perturbation via the presence of specific centres of action. We finally discuss possible applications for the study of terrestrial phenomena with similar characteristics.
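The POD/EOF machinery referred to above reduces, for a mean-removed data matrix (time × state), to a singular value decomposition: the right singular vectors are the EOFs and the squared singular values give the energy each one explains. The sketch below applies it to a small synthetic field rather than the assimilated Mars fields, and uses plain variance rather than the study's total-energy weighting.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "state vectors": 200 time snapshots of a 50-point field
# built from two spatial patterns plus noise (a stand-in for the
# assimilated 3-D temperature/velocity fields).
x = np.linspace(0, 2 * np.pi, 50)
t = np.arange(200)
field = (np.outer(np.sin(0.1 * t), np.sin(x))
         + 0.5 * np.outer(np.cos(0.07 * t), np.cos(2 * x))
         + 0.05 * rng.standard_normal((200, 50)))

# POD: remove the time mean, then SVD. Rows of vt are the EOFs
# (spatial patterns); s**2 measures the energy in each mode.
anom = field - field.mean(axis=0)
u, s, vt = np.linalg.svd(anom, full_matrices=False)
energy = s**2 / np.sum(s**2)   # fraction of variance per EOF
pcs = u * s                    # principal components (time series)

print("EOF1 explains %.0f%% of the variance" % (100 * energy[0]))
```

A localised perturbation projecting onto only a few tens of EOFs, as in the dust-storm case, would show up here as a sharp drop-off in `energy` after those modes.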
Abstract:
We perform a numerical study of the evolution of a Coronal Mass Ejection (CME) and its interaction with the coronal magnetic field based on the 12 May 1997 CME event, using a global MagnetoHydroDynamic (MHD) model for the solar corona. The ambient solar wind steady-state solution is driven by photospheric magnetic field data, while the solar eruption is obtained by superimposing an unstable flux rope onto the steady-state solution. During the initial stage of CME expansion, the core flux rope reconnects with the neighboring field, which facilitates lateral expansion of the CME footprint in the low corona. The flux rope field also reconnects with the oppositely orientated overlying magnetic field in the manner of the breakout model. During this stage of the eruption, the simulated CME rotates counter-clockwise to achieve an orientation that is in agreement with the interplanetary flux rope observed at 1 AU. A significant component of the CME that expands into interplanetary space comprises one of the side lobes created mainly as a result of reconnection with the overlying field. Within 3 hours, reconnection effectively modifies the CME connectivity from the initial condition where both footpoints are rooted in the active region to a situation where one footpoint is displaced into the quiet Sun, at a significant distance (≈1 R⊙) from the original source region. The expansion and rotation due to interaction with the overlying magnetic field stops when the CME reaches the outer edge of the helmet streamer belt, where the field is organized on a global scale. The simulation thus offers a new view of the role reconnection plays in rotating a CME flux rope and transporting its footpoints while preserving its core structure.
Abstract:
This paper presents a case study to illustrate the range of decisions involved in designing a sampling strategy for a complex, longitudinal research study. It is based on experience from the Young Lives project and identifies the approaches used to sample children for longitudinal follow-up in four less developed countries (LDCs). The rationale for decisions made and the resulting benefits, and limitations, of the approaches adopted are discussed. Of particular importance is the choice of sampling approach to yield useful analysis; specific examples are presented of how this informed the design of the Young Lives sampling strategy.
Abstract:
Bayesian decision procedures have recently been developed for dose escalation in phase I clinical trials concerning pharmacokinetic responses observed in healthy volunteers. This article describes how that general methodology was extended and evaluated for implementation in a specific phase I trial of a novel compound. At the time of writing, the study is ongoing, and it will be some time before the sponsor will wish to put the results into the public domain. This article is an account of how the study was designed in a way that should prove to be safe, accurate, and efficient whatever the true nature of the compound. The study involves the observation of two pharmacokinetic endpoints relating to the plasma concentration of the compound itself and of a metabolite as well as a safety endpoint relating to the occurrence of adverse events. Construction of the design and its evaluation via simulation are presented.
Abstract:
We report an inelastic neutron scattering (INS) study of the rotational–vibrational spectrum of dihydrogen sorbed by zeolite CaX. In the low energy (<200 cm−1) INS spectrum of adsorbed H2 we observe the rotational–vibrational spectrum of H2, where the vibration is that of the H2 molecule against the binding site (i.e. H2–X, not H–H). We have observed for the first time the vibrational overtones of the hydrogen molecule against the adsorption surface up to sixth order. These vibrations are usually forbidden in INS spectroscopy because of the selection rules imposed by the spin flip event required. In our case we are able to observe such a vibration because the rotational transition J(1 ← 0) convolutes the vibrational spectrum. This paper reports the effect for the first time.
Abstract:
Purpose – To evaluate the control strategy for a hybrid system of natural ventilation wind catchers and air-conditioning, and to assess the contribution of the wind catchers to indoor air environments and to energy savings, if any. Design/methodology/approach – Most of the modeling techniques for assessing wind catcher performance are theoretical. Post-occupancy evaluation (POE) studies of buildings provide an insight into the operation of these building components and help to inform facilities managers. A case study for POE is presented in this paper. Findings – Monitoring of the summer and winter month operations showed that the indoor air quality parameters were kept within the design target range. The design control strategy failed to record data regarding the operation, opening time and position of the wind catcher system. Although the implemented control strategy worked effectively in monitoring the operation of the mechanical ventilation system (i.e. the air handling unit, AHU), it did not integrate the wind catchers with the mechanical ventilation system. Research limitations/implications – Owing to short-falls in the control strategy implemented in this project, it was found difficult to quantify and verify the contribution of the wind catchers to the internal conditions and, hence, to energy savings. Practical implications – Controlling the operation of the wind catchers via the AHU will lead to isolation of the wind catchers in the event of malfunctioning of the AHU. Wind catchers will contribute to the ventilation of space, particularly in the summer months. Originality/value – This paper demonstrates the value of POE as an indispensable tool for FM professionals. It further provides insight into the application of natural ventilation systems in buildings for healthier indoor environments at lower energy cost. The design of the control strategy for natural ventilation and air-conditioning should be considered at the design stage, involving the FM personnel.
Abstract:
Purpose – The purpose of this paper is to focus on the Fédération Internationale des Ingénieurs-Conseils (FIDIC) White Book standard form of building contract. It tracks the changes to this contract over its four editions, and seeks to identify their underlying causes. Design/methodology/approach – The changes made to the White Book are quantified using a specific type of quantitative content analysis. The amended clauses are then examined to understand the nature of the changes made. Findings – The length of the contract increased by 34 per cent between 1990 and 2006. A large proportion of the overall increase can be attributed to the clauses dealing with “conflict of interest/corruption” and “dispute resolution”. In both instances, the FIDIC drafting committees have responded to international developments to discourage corruption, and to encourage the use of alternative dispute resolution. Between 1998 and 2006, the average length of the sentences increased slightly, raising the question of whether long sentences are easily understood by users of contracts. Research limitations/implications – Quantification of text appears to be particularly useful for the analysis of documents which are regularly updated because changes can be clearly identified and the length of sentences can be determined, leading to conclusions about the readability of the text. However, caution is needed because changes of great relevance can be made to contract clauses without actually affecting their length. Practical implications – The paper will be instructive for contract drafters and informative for users of FIDIC's White Book. Originality/value – Quantifying text has been rarely used regarding standard-form contracts in the field of construction.
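The sentence-length measure used in the content analysis above can be reduced to a few lines of code; the sketch below is a minimal, assumed version, and the two clause strings are invented placeholders, not actual White Book wording.

```python
import re

def avg_sentence_length(text):
    """Mean words per sentence: a crude readability indicator of
    the kind used to compare successive contract editions."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return sum(len(s.split()) for s in sentences) / len(sentences)

# Invented placeholder clauses (not actual FIDIC wording).
clause_1998 = "The Consultant shall exercise reasonable care."
clause_2006 = ("The Consultant shall exercise all reasonable skill, care "
               "and diligence in the performance of the Services under "
               "the Agreement.")
```

Running the measure over whole editions, rather than single clauses, is what supports the abstract's observation that average sentence length crept upward between 1998 and 2006.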
Abstract:
Purpose – At the heart of knowledge management (KM) are the people – an organisation's important knowledge asset. Although this is widely acknowledged, businesses seldom understand this axiom in terms of the communities through which individuals develop and share the capacity to create and use knowledge. It is the collective learning that takes place within the social systems, i.e. communities of practice (CoP) that are of particular significance to an organisation from a KM perspective. This paper aims to review, critique, and raise some pertinent questions on the role of CoPs; and with the help of case studies shed light on the “goings-on” in construction practices. Design/methodology/approach – After critically reviewing the literature on CoPs and querying some underlying assertions, this research investigates how these issues are addressed in practice. A case study approach is adopted. Three organisations operating in the construction sector are interviewed for the purpose of this paper. Findings – Case study findings highlight the potential challenges and benefits of CoPs to a construction organisation, the role they play in generating and delivering value to the organisation and their contribution towards the collective organisational intelligence. From the findings, it is clear that the question is not whether communities exist within organisations, but how they deliver value to the organisation. From an organisational perspective, the key challenge is to provide an environment that is conducive to developing and nurturing such communities as opposed to merely creating them. Practical implications – Challenges and benefits demonstrated through the case studies should be taken in context. The findings are not intended to be prescriptive in nature, but are intentionally descriptive to provide contextual data that allow readers to draw their own inferences in the context of their organisations. 
They should be applied taking into account an organisation's unique characteristics and differentiators, the dynamics of the environment in which it operates and the culture it harbours within. Originality/value – Investigating the role of CoPs in the context of case study construction organisations forms the prime focus of this paper.
Abstract:
Objective: To describe the calculations and approaches used to design experimental diets of differing saturated fatty acid (SFA) and monounsaturated fatty acid (MUFA) compositions for use in a long-term dietary intervention study, and to evaluate the degree to which the dietary targets were met. Design, setting and subjects: Fifty-one students living in a university hall of residence consumed a reference (SFA) diet for 8 weeks followed by either a moderate MUFA (MM) diet or a high MUFA (HM) diet for 16 weeks. The three diets were designed to differ only in their proportions of SFA and MUFA, while keeping total fat, polyunsaturated fatty acids (PUFA), trans-fatty acids, and the ratio of palmitic to stearic acid, and n-6 to n-3 PUFA, unchanged. Results: Using habitual diet records and a standardised database for food fatty acid compositions, a sequential process of theoretical fat substitutions enabled suitable fat sources for use in the three diets to be identified, and experimental margarines for baking, spreading and the manufacture of snack foods to be designed. The dietary intervention was largely successful in achieving the fatty acid targets of the three diets, although unintended differences between the original target and the analysed fatty acid composition of the experimental margarines resulted in a lower than anticipated MUFA intake on the HM diet, and a lower ratio of palmitic to stearic acid compared with the reference or MM diet. Conclusions: This study has revealed important theoretical considerations that should be taken into account when designing diets of specific fatty acid composition, as well as practical issues of implementation.
Abstract:
Event-related functional magnetic resonance imaging (efMRI) has emerged as a powerful technique for detecting the brain's responses to presented stimuli. A primary goal in efMRI data analysis is to estimate the Hemodynamic Response Function (HRF) and to locate activated regions in the human brain when specific tasks are performed. This paper develops new methodologies that are important improvements not only to parametric but also to nonparametric estimation and hypothesis testing of the HRF. First, an effective and computationally fast scheme for estimating the error covariance matrix for efMRI is proposed. Second, methodologies for estimation and hypothesis testing of the HRF are developed. Simulations support the effectiveness of our proposed methods. When applied to an efMRI dataset from an emotional control study, our method reveals more meaningful findings than the popular methods offered by AFNI and FSL. (C) 2008 Elsevier B.V. All rights reserved.
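The estimation problem described above can be illustrated in its simplest parametric form: a GLM whose regressor is the stimulus train convolved with a canonical double-gamma HRF, fitted by ordinary least squares on white-noise data. This is a baseline sketch only; the paper's contribution concerns correlated errors and nonparametric HRF estimation, which this example deliberately does not implement.

```python
import numpy as np
from math import gamma as gamma_fn

def hrf(t, a1=6.0, a2=16.0, ratio=1.0 / 6.0):
    """Canonical double-gamma HRF (peak near 5 s, undershoot near
    15 s); a common parametric choice, not the paper's estimator."""
    return (t ** (a1 - 1) * np.exp(-t) / gamma_fn(a1)
            - ratio * t ** (a2 - 1) * np.exp(-t) / gamma_fn(a2))

tr, n = 2.0, 120                       # repetition time (s), scans

# Stimulus onsets every 30 s, convolved with the HRF sampled at the
# scan times, give the expected BOLD regressor.
stim = np.zeros(n)
stim[::15] = 1.0
kernel = hrf(np.arange(0.0, 32.0, tr))
reg = np.convolve(stim, kernel)[:n]

# Simulate one voxel: true amplitude 2 plus white noise. (Real efMRI
# noise is temporally correlated, which is exactly what the paper's
# covariance estimation addresses; white noise is the simplification.)
rng = np.random.default_rng(1)
y = 2.0 * reg + rng.normal(0.0, 0.05, n)

# GLM: design matrix [regressor, intercept]; OLS estimate of beta.
X = np.column_stack([reg, np.ones(n)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

Replacing OLS with generalized least squares using an estimated error covariance matrix, or replacing the fixed `hrf` with a flexible basis, moves this baseline toward the approaches the paper evaluates.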