78 results for design of experiment methodology
in CentAUR: Central Archive University of Reading - UK
Abstract:
The use of economic incentives for biodiversity (mostly Compensation and Reward for Environmental Services, including Payment for ES) has been widely supported in recent decades and has become the main innovative policy tool for biodiversity conservation worldwide. These policy tools are often based on the insight that rational actors perfectly weigh the costs and benefits of adopting certain behaviors, and that well-crafted economic incentives and disincentives will lead to socially desirable development scenarios. This rationalist mode of thought has provided interesting insights and results, but it also misjudges the context in which ‘real individuals’ come to decisions and the multitude of factors influencing development sequences. In this study, our goal is to examine how these policies can take advantage of unintended behavioral reactions that might in turn affect, either positively or negatively, overall policy performance. Using a natural field experiment in rural Madagascar, we test the effect of the origin of income (‘low effort’ money vs. ‘high effort’ money) on spending decisions (necessity vs. superior goods) and on subsequent pro-social preferences (future pro-environmental behavior). Our results show that money obtained under low effort leads to different consumption patterns than money obtained under high effort: superior goods are more salient in the case of low-effort money. In parallel, money obtained under low effort leads to higher subsequent pro-social behavior. Compensation and reward policies for ecosystem services may mobilize knowledge of behavioral biases to improve their design and foster positive spillovers on their development goals.
Abstract:
The uptake of metals by earthworms occurs predominantly via the soil pore water, or via an uptake route that is related to the soil pore water metal concentration. However, it has been suggested that the speciation of the metal is also important. A novel technique is described which exposes Eisenia andrei Bouche to contaminant-bearing solutions in which the chemical factors affecting its speciation may be individually and systematically manipulated. In a preliminary experiment, the LC50 for copper nitrate was 0.046 mg l⁻¹ (95% confidence interval: 0.03–0.07 mg l⁻¹). There was a significant positive correlation between earthworm mortality and bulk copper concentration in solution (R² = 0.88, P ≤ 0.001), and a significant increase in earthworm tissue copper concentration with increasing copper concentration in solution (R² = 0.97, P ≤ 0.001). It is anticipated that quantifying the effect of soil solution chemical speciation on copper bioavailability will provide an excellent aid to understanding the importance of chemical composition and metal speciation in the calculation of toxicological parameters.
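A minimal sketch of how an LC50 of this kind is typically estimated, by fitting a two-parameter log-logistic dose-response model to mortality counts; the concentrations, counts, and starting values below are hypothetical and are not the study's data.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical exposure data: copper concentrations (mg l^-1) and mortality counts.
conc = np.array([0.01, 0.02, 0.04, 0.08, 0.16])
n_exposed = np.array([10, 10, 10, 10, 10])
n_dead = np.array([0, 2, 5, 8, 10])
mortality = n_dead / n_exposed

def log_logistic(c, lc50, slope):
    """Proportion of earthworms dying at concentration c (two-parameter log-logistic)."""
    return 1.0 / (1.0 + (lc50 / c) ** slope)

# Fit the dose-response curve and report the LC50 with a rough 95% interval.
(lc50_hat, slope_hat), cov = curve_fit(log_logistic, conc, mortality, p0=[0.05, 2.0])
se_lc50 = np.sqrt(cov[0, 0])
print(f"LC50 ≈ {lc50_hat:.3f} mg l^-1 (approx. 95% CI ±{1.96 * se_lc50:.3f})")
```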
Abstract:
Purpose: This paper aims to design an evaluation method that enables an organization to assess its current IT landscape and provides a readiness assessment prior to Software as a Service (SaaS) adoption. Design/methodology/approach: The research employs a mix of quantitative and qualitative approaches for conducting an IT application assessment. Quantitative data, such as end users' feedback on the IT applications, contribute to the technical impact on efficiency and productivity. Qualitative data, such as business domain, business services and IT application cost drivers, are used to determine the business value of the IT applications in an organization. Findings: The assessment of IT applications leads to decisions on the suitability of each IT application for migration to a cloud environment. Research limitations/implications: The evaluation of how a particular IT application impacts a business service is based on logical interpretation. A data mining method is suggested in order to derive patterns of IT application capabilities. Practical implications: This method has been applied in a local council in the UK, helping the council to decide the future status of its IT applications for cost-saving purposes.
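As a loose illustration of how quantitative end-user feedback and qualitative business-value inputs might be combined into a migration decision, the sketch below assigns each application a weighted suitability score; the application names, weights, and threshold are hypothetical assumptions and do not reproduce the paper's method.

```python
# Hypothetical sketch: combine a technical score (end-user feedback on efficiency
# and productivity) with business-value and cost-driver scores into a simple
# SaaS-migration suitability flag. All data, weights, and the threshold are
# illustrative assumptions.
applications = [
    {"name": "HR portal", "technical": 0.8, "business_value": 0.6, "cost_driver": 0.7},
    {"name": "Planning system", "technical": 0.4, "business_value": 0.9, "cost_driver": 0.5},
    {"name": "Legacy GIS", "technical": 0.3, "business_value": 0.4, "cost_driver": 0.9},
]

WEIGHTS = {"technical": 0.4, "business_value": 0.4, "cost_driver": 0.2}
THRESHOLD = 0.6  # minimum weighted score to shortlist an application for migration

for app in applications:
    score = sum(WEIGHTS[k] * app[k] for k in WEIGHTS)
    decision = "candidate for cloud migration" if score >= THRESHOLD else "retain on-premise"
    print(f"{app['name']}: score {score:.2f} -> {decision}")
```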
Abstract:
This paper aims to assess the necessity of updating the intensity-duration-frequency (IDF) curves used in Portugal to design building storm-water drainage systems. A comparative analysis of the design was performed for the three predefined rainfall regions in Portugal using the IDF curves currently in use and those estimated for future decades. Data for recent and future climate conditions simulated by a global and regional climate model chain are used to estimate possible changes in rainfall extremes and their implications for the drainage systems. The methodology includes the disaggregation of precipitation down to subhourly scales, the robust development of IDF curves, and the correction of model bias. The results indicate that projected changes are larger for the plains in southern Portugal (5–33%) than for the mountainous regions (3–9%) and that these trends are consistent with projected changes in the long-term 95th percentile of daily precipitation throughout the 21st century. The authors conclude that there is a need to review the current precipitation regime classification and to design new drainage systems with larger dimensions in order to mitigate the effects of projected changes in extreme precipitation.
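IDF curves of the kind referred to above are commonly expressed as a power law of storm duration, i = a·t^b. The sketch below compares a design intensity from a current curve with one whose parameters have been inflated to represent a projected future climate; the parameter values and the assumed increase are hypothetical, not the curves fitted in the paper.

```python
def idf_intensity(duration_min: float, a: float, b: float) -> float:
    """Rainfall intensity (mm/h) from the power-law IDF form i = a * t**b."""
    return a * duration_min ** b

# Hypothetical parameters for a single return period (not the paper's fitted values).
current_params = {"a": 300.0, "b": -0.55}
future_params = {"a": 345.0, "b": -0.55}   # assumes a ~15% increase in 'a'

t = 10.0  # design storm duration (minutes)
i_now = idf_intensity(t, **current_params)
i_future = idf_intensity(t, **future_params)
print(f"Current design intensity:   {i_now:.1f} mm/h")
print(f"Projected design intensity: {i_future:.1f} mm/h "
      f"(+{100 * (i_future / i_now - 1):.0f}%)")
```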
Abstract:
This paper is an initial work towards developing a user-centric e-Government benchmarking model. To achieve this goal, public service delivery is discussed first, including the transition to online public service delivery and the need to provide public services using electronic media. Two major e-Government benchmarking methods are critically discussed, and the need to develop a standardized, user-centric benchmarking model is presented. To properly articulate user requirements in service provision, an organizational semiotics method is suggested.
Abstract:
Active queue management (AQM) policies are router queue management policies that allow for the detection of network congestion, the notification of such occurrences to the hosts on the network borders, and the adoption of a suitable control policy. This paper proposes the adoption of a fuzzy proportional integral (FPI) controller as an active queue manager for Internet routers. The analytical design of the proposed FPI controller is carried out in analogy with a proportional integral (PI) controller, which has recently been proposed for AQM. A genetic algorithm is proposed for tuning the FPI controller parameters with respect to optimal disturbance rejection. The paper describes the FPI controller design methodology and presents the results of a comparison with random early detection (RED), tail drop, and the PI controller.
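For readers unfamiliar with PI-based AQM, the sketch below shows the linear proportional-integral drop-probability update that the paper's fuzzy PI controller generalizes; the gains, target queue length, and sampled queue values are illustrative assumptions, and neither the fuzzy rule base nor the genetic-algorithm tuning is shown.

```python
# Simplified PI-style AQM: the packet drop probability is updated from the
# deviation of the sampled queue length from a target value. The paper's FPI
# controller replaces this linear law with a fuzzy inference rule base whose
# parameters are tuned by a genetic algorithm.
class PIAQMController:
    def __init__(self, q_ref: float, kp: float = 1.8e-5, ki: float = 1.6e-6):
        self.q_ref = q_ref        # target queue length in packets
        self.kp = kp              # proportional gain (illustrative value)
        self.ki = ki              # integral gain (illustrative value)
        self.integral = 0.0
        self.p_drop = 0.0

    def update(self, q_len: float) -> float:
        """Return the packet drop probability for the current queue sample."""
        error = q_len - self.q_ref
        self.integral += error
        self.p_drop = self.kp * error + self.ki * self.integral
        self.p_drop = min(max(self.p_drop, 0.0), 1.0)   # keep within [0, 1]
        return self.p_drop

ctrl = PIAQMController(q_ref=200)
for q in (180, 220, 260, 240, 210):   # sampled queue lengths in packets
    print(f"queue={q}: drop probability {ctrl.update(q):.6f}")
```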
Abstract:
Background and Purpose: Clinical research into the treatment of acute stroke is complicated, is costly, and has often been unsuccessful. Developments in imaging technology based on computed tomography and magnetic resonance imaging scans offer opportunities for screening experimental therapies during phase II testing so as to deliver only the most promising interventions to phase III. We discuss the design and the appropriate sample size for phase II studies in stroke based on lesion volume. Methods: Determination of the relation between analyses of lesion volumes and of neurologic outcomes is illustrated using data from placebo trial patients from the Virtual International Stroke Trials Archive. The size of an effect on lesion volume that would lead to a clinically relevant treatment effect in terms of a measure such as the modified Rankin score (mRS) is found. The sample size to detect that magnitude of effect on lesion volume is then calculated. Simulation is used to evaluate different criteria for proceeding from phase II to phase III. Results: The odds ratios for mRS correspond roughly to the square root of the odds ratios for lesion volume, implying that for equivalent power specifications, sample sizes based on lesion volumes should be about one fourth of those based on mRS. Relaxation of power requirements, appropriate for phase II, leads to further sample size reductions. For example, a phase III trial comparing a novel treatment with placebo with a total sample size of 1518 patients might be motivated from a phase II trial of 126 patients comparing the same 2 treatment arms. Discussion: Definitive phase III trials in stroke should aim to demonstrate significant effects of treatment on clinical outcomes. However, more direct outcomes such as lesion volume can be useful in phase II for determining whether such phase III trials should be undertaken in the first place. (Stroke. 2009;40:1347-1352.)
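The quarter-size argument follows because, at fixed power, the required sample size scales roughly with 1/(log odds ratio)^2: if the odds ratio on lesion volume is the square of the odds ratio on mRS, the log odds ratio doubles and the sample size drops by a factor of about four. The sketch below works through that arithmetic with an arbitrary illustrative odds ratio.

```python
import math

or_mrs = 1.35               # hypothetical treatment odds ratio on the mRS scale
or_lesion = or_mrs ** 2     # abstract: OR for mRS ~ square root of OR for lesion volume

n_phase3_mrs = 1518         # phase III sample size quoted in the abstract
scaling = (math.log(or_mrs) / math.log(or_lesion)) ** 2   # = 0.25
n_lesion_equal_power = n_phase3_mrs * scaling

print(f"Equivalent-power lesion-volume trial: ~{n_lesion_equal_power:.0f} patients")
# Relaxing the power requirement, as is appropriate at phase II, reduces this
# further, toward the 126-patient phase II design cited in the abstract.
```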
Abstract:
This paper explores the theoretical developments and subsequent uptake of sequential methodology in clinical studies in the 25 years since Statistics in Medicine was launched. The review examines the contributions that have been made to all four phases into which clinical trials are traditionally classified and highlights major statistical advances, together with an assessment of how the techniques have been applied. The vast majority of work has been in the setting of phase III clinical trials, and so emphasis is placed there. Finally, comments are given indicating how the subject area may develop in the future.
Abstract:
Single crystal X-ray diffraction studies reveal that the incorporation of meta-amino benzoic acid in the middle of a helix-forming hexapeptide sequence, such as in peptide I Boc-Ile(1)-Aib(2)-Val(3)-m-ABA(4)-Ile(5)-Aib(6)-Leu(7)-OMe (Aib: alpha-amino isobutyric acid; m-ABA: meta-amino benzoic acid), breaks the helix propagation to produce a turn-linker-turn (T-L-T) foldamer in the solid state. In the crystalline state, two conformational isomers of peptide I self-assemble in an antiparallel fashion through intermolecular hydrogen bonds and aromatic pi-pi interactions to form a molecular duplex. The duplexes are further interconnected through intermolecular hydrogen bonds to form a layer of peptides. The layers are stacked one on top of the other through van der Waals interactions to form hydrophilic channels filled with solvent methanol. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
Single crystal X-ray diffraction studies and solvent-dependent NMR titration reveal that the designed peptides I and II, Boc-Xx(1)-Aib(2)-Yy(3)-NH(CH2)(2)NH-Yy(3)-Aib(2)-Xx(1)-Boc, where Xx and Yy are Ile and Leu in peptide I and Leu and Val in peptide II, respectively, fold into a turn-linker-turn (T-L-T) conformation both in the solid state and in solution. In the crystalline state, the T-L-T foldamers of peptides I and II self-assemble to form a three-dimensional framework of channels. The insides of the channels are hydrophilic and are found to contain solvent CHCl3 hydrogen-bonded to the exposed C=O of Aib located at the turn regions. (C) 2008 Elsevier B.V. All rights reserved.