944 results for "Design Experiment"


Relevance:

30.00%

Publisher:

Abstract:

Purpose: Acquiring detailed knowledge of enzyme kinetic parameters is crucial to biochemical understanding, drug development, and clinical diagnosis of ocular disease. The correct design of an experiment is critical to collecting data suitable for analysis, modelling, and deriving the correct information. As classical design methods are not targeted at the more complex kinetics now frequently studied, care is needed to estimate the parameters of such models with low variance. Methods: We have developed Bayesian utility functions to minimise kinetic parameter variance, involving differentiation of model expressions and matrix inversion. These have been applied to the simple kinetics of the enzymes of the glyoxalase pathway (of importance in the post-translational modification of proteins in cataract) and to the complex kinetics of lens aldehyde dehydrogenase (also of relevance to cataract). Results: Our successful application of Bayesian statistics has allowed us to identify a set of rules for designing optimal kinetic experiments iteratively. Most importantly, the distribution of points across the substrate range is critical; it is not simply a matter of even spacing or constant multiples. At least 60% of the points must lie below the KM (or below each dissociation constant, if there is more than one) and 40% above. This choice halves the variance obtained with a simple even spread across the range. With both the glyoxalase system and lens aldehyde dehydrogenase we have significantly reduced the variance of the kinetic parameter estimates while reducing the number and cost of experiments. Conclusions: We have developed an optimal and iterative method for selecting design features such as the substrate range, the number of measurements, and the choice of intermediate points. Our novel approach minimises parameter error and cost, and maximises experimental efficiency. It is applicable to many areas of ocular drug design, including receptor-ligand binding and immunoglobulin binding, and should be an important tool in ocular drug discovery.
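
To make the design rule concrete, here is a minimal sketch comparing the approximate parameter variances of an even-spread design against a 60/40 split around KM, assuming simple Michaelis-Menten kinetics, Gaussian measurement noise, and the standard Fisher-information approximation. The parameter values, noise level and point counts are invented for illustration; this is not the authors' actual utility function.

```python
import numpy as np

# Michaelis-Menten rate v = Vmax * S / (Km + S) and its parameter gradient.
def grad(S, Vmax, Km):
    dV = S / (Km + S)                  # d(v)/d(Vmax)
    dK = -Vmax * S / (Km + S) ** 2     # d(v)/d(Km)
    return np.stack([dV, dK], axis=-1)

def param_variances(S, Vmax=1.0, Km=2.0, sigma=0.05):
    """Approximate Var(Vmax), Var(Km) from the inverse Fisher information."""
    G = grad(np.asarray(S, dtype=float), Vmax, Km)
    J = G.T @ G / sigma ** 2           # Fisher information for Gaussian noise
    return np.diag(np.linalg.inv(J))

Km = 2.0
even = np.linspace(0.2, 10.0, 10)                         # even spread of 10 points
split = np.concatenate([np.linspace(0.2, Km, 6),          # 60% at or below Km
                        np.linspace(Km + 1.0, 10.0, 4)])  # 40% above Km
print("even spread:", param_variances(even, Km=Km))
print("60/40 split:", param_variances(split, Km=Km))
```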

Relevance:

30.00%

Publisher:

Abstract:

In areas such as drug development, clinical diagnosis and biotechnology research, acquiring details about the kinetic parameters of enzymes is crucial. The correct design of an experiment is critical to collecting data suitable for analysis, modelling and deriving the correct information. As classical design methods are not targeted at the more complex kinetics now frequently studied, care is needed to estimate the parameters of such models with low variance. We demonstrate that a Bayesian approach (the use of prior knowledge) can produce major gains quantifiable in terms of the information, productivity and accuracy of each experiment. Developing the use of Bayesian utility functions, we have applied a systematic method to identify the optimal experimental designs for a number of kinetic model data sets. This has enabled the identification of trends between kinetic model types, sets of design rules, and the key conclusion that such designs should be based on some prior knowledge of KM and/or the kinetic model. We suggest an optimal and iterative method for selecting features of the design such as the substrate range, number of measurements and choice of intermediate points. The final design collects data suitable for accurate modelling and analysis and minimises the error in the estimated parameters. (C) 2003 Elsevier Science B.V. All rights reserved.
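
One way to picture the iterative use of prior knowledge is a greedy D-optimal search: given a prior point estimate of KM, repeatedly add the candidate substrate concentration that most increases the determinant of the Fisher information. This sketch assumes Michaelis-Menten kinetics and a D-optimality criterion as the utility; it is an illustrative stand-in, not the paper's specific utility function.

```python
import numpy as np

def grad(S, Vmax=1.0, Km=2.0):
    # Parameter gradient of the Michaelis-Menten rate at substrate level S.
    return np.array([S / (Km + S), -Vmax * S / (Km + S) ** 2])

def log_det_info(points):
    # Log-determinant of the Fisher information for a candidate design.
    G = np.array([grad(s) for s in points])
    sign, val = np.linalg.slogdet(G.T @ G)
    return val if sign > 0 else -np.inf

design = [0.5, 4.0]                          # two seed measurements
candidates = np.linspace(0.1, 10.0, 100)
for _ in range(8):                           # add eight points greedily
    design.append(max(candidates, key=lambda s: log_det_info(design + [s])))
print(sorted(round(float(s), 2) for s in design))
```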

Relevance:

30.00%

Publisher:

Abstract:

Presented herein is an experimental design that allows the effects of several radiative forcing factors on climate to be estimated as precisely as possible from a limited suite of atmosphere-only general circulation model (GCM) integrations. The forcings include the combined effect of observed changes in sea surface temperatures, sea ice extent, stratospheric (volcanic) aerosols, and solar output, plus the individual effects of several anthropogenic forcings. A single linear statistical model is used to estimate the forcing effects, each of which is represented by its global mean radiative forcing. The strong colinearity in time between the various anthropogenic forcings poses a technical problem that is overcome through the design of the experiment. This design uses every combination of anthropogenic forcing rather than a few highly replicated ensembles, as is more common in climate studies. Not only is this design highly efficient for a given number of integrations, but it also allows the estimation of (nonadditive) interactions between pairs of anthropogenic forcings. The simulated land surface air temperature changes since 1871 have been analyzed. The changes in natural forcing and in oceanic forcing, which itself contains some forcing from anthropogenic and natural influences, have the most influence. For the global mean, increasing greenhouse gases and the indirect aerosol effect had the largest anthropogenic effects. An interaction between these two anthropogenic effects was also found to exist in the atmosphere-only GCM. This interaction is similar in magnitude to the individual effects of changing tropospheric and stratospheric ozone concentrations or to the direct (sulfate) aerosol effect. Various diagnostics are used to evaluate the fit of the statistical model. For the global mean, these show that the land temperature response is proportional to the global mean radiative forcing, reinforcing the use of radiative forcing as a measure of climate change. The diagnostic tests also show that the linear model is suitable for analyses of land surface air temperature at each GCM grid point. Therefore, the linear model provides precise estimates of the space-time signals for all forcing factors under consideration. For simulated 50-hPa temperatures, the results show that tropospheric ozone increases have contributed to stratospheric cooling over the twentieth century almost as much as changes in well-mixed greenhouse gases.
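
The estimation step can be sketched as ordinary least squares on a full-factorial design matrix with pairwise interaction columns. The three factors, effect sizes and noise below are invented stand-ins chosen only to show why running every forcing combination keeps main effects and interactions estimable; the actual study regressed temperature responses on global mean radiative forcings.

```python
import numpy as np
from itertools import product

# Full 2^3 factorial over three hypothetical anthropogenic forcings
# (e.g. GHG, sulphate aerosol, ozone): every on/off combination is run
# once, so all main effects and pairwise interactions can be separated
# even though the forcings are colinear in time within any single run.
F = np.array(list(product([0.0, 1.0], repeat=3)))

X = np.column_stack([
    np.ones(len(F)), F,                # intercept and three main effects
    F[:, 0] * F[:, 1],                 # GHG x aerosol interaction
    F[:, 0] * F[:, 2],                 # GHG x ozone interaction
    F[:, 1] * F[:, 2],                 # aerosol x ozone interaction
])

rng = np.random.default_rng(0)
true_effects = np.array([0.0, 0.9, -0.5, 0.2, -0.3, 0.0, 0.0])  # invented
y = X @ true_effects + rng.normal(0.0, 0.05, len(X))            # simulated runs

estimates, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(estimates, 2))
```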

Relevance:

30.00%

Publisher:

Abstract:

The UK construction industry has been criticised for being slow to adopt process innovations. Research shows that the idiosyncrasies of participants, their roles in the system, and the contextual differences between sections of the industry make this a highly complex problem. There is considerable evidence that informal social networks play a key role in the diffusion of innovations. The aim is to identify the informal communication networks of project participants and the role these play in the diffusion of construction innovations. The characteristics of these networks will be analysed in order to understand how they can be used to accelerate innovation diffusion within and between projects. Social Network Analysis is used to determine informal communication routes. Control and experiment case study projects are used within two different organizations. This allows informal communication routes concerning innovations to be mapped, whilst testing whether the informal routes can facilitate diffusion. Analysis will focus upon understanding the combination of informal strong and weak ties, and how these impede or facilitate the diffusion of the innovation. Initial work suggests the presence of an informal communication network. Actors within this informal network, and the organization's management, are unaware of its existence and of their informal roles within it. Thus, the network remains an untapped medium for innovation diffusion. It is proposed that successful innovation diffusion is dependent upon understanding informal strong and weak ties at project, organization and industry level.
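
As a toy illustration of this kind of network analysis, the sketch below builds a small weighted graph of project participants and flags bridging ties, the kind of weak tie that Granovetter's account predicts will carry novel information between groups. The actors, weights and the use of the networkx library are all assumptions for illustration, not the study's data or toolset.

```python
import networkx as nx

# Toy informal communication network across two projects; edge weights
# count observed interactions (invented data). Bridges between the two
# clusters approximate Granovetter-style weak ties.
G = nx.Graph()
G.add_weighted_edges_from([
    ("architect_A", "engineer_A", 5), ("engineer_A", "manager_A", 4),
    ("architect_A", "manager_A", 3),                  # project A clique
    ("architect_B", "engineer_B", 5), ("engineer_B", "manager_B", 4),
    ("architect_B", "manager_B", 3),                  # project B clique
    ("manager_A", "manager_B", 1),                    # weak bridging tie
])

print("bridging ties:", list(nx.bridges(G)))
print("edge betweenness:", nx.edge_betweenness_centrality(G))
```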

Relevance:

30.00%

Publisher:

Abstract:

Background. People with intellectual disabilities (ID) experience rates of mental health problems similar to or even higher than those of the general population, and there is a need to develop appropriate treatments. Cognitive behaviour therapy (CBT) is effective for a wide range of disorders in the general population. However, there is some evidence that people with ID may lack the cognitive skills needed to take part in CBT. Aims. To test whether people with ID can learn the skills required for CBT, specifically the ability to distinguish between thoughts, feelings, and behaviours and to link thoughts and feelings (cognitive mediation). Method. A randomized independent groups design was used to examine the effect of training in CBT on two tasks measuring CBT skills. Thirty-four adults with ID were randomly allocated to the experimental condition (N = 18) or to the control condition (N = 16). CBT skills were assessed blind at baseline and after the intervention. Results. The training led to significant improvements in participants' ability to link thoughts and feelings, and this skill generalized to new material. There was no effect of training on participants' ability to distinguish amongst thoughts, feelings, and behaviours. People with ID can, therefore, learn some of the skills required for CBT. This implies that preparatory training for CBT might be useful for people with ID. The results might be applicable to other groups who find aspects of CBT difficult.

Relevance:

30.00%

Publisher:

Abstract:

This paper considers ways that experimental design can affect judgments about informally presented context-shifting experiments. Reasons are given to think that judgments about informal context-shifting experiments are affected by an exclusive reliance on binary truth-value judgments and by experimenter bias. Exclusive reliance on binary truth-value judgments may produce experimental artifacts by obscuring important differences of degree between the phenomena being investigated. Experimenter bias is an effect generated when, for example, experimenters disclose (even unconsciously) their own beliefs about the outcome of an experiment. Eliminating experimenter bias from context-shifting experiments makes it far less obvious what the “intuitive” responses to those experiments are. After it is shown how those different kinds of bias can affect judgments about informal context-shifting experiments, those experiments are revised to control for those forms of bias. The upshot of these investigations is that participants in the contextualist debate who employ informal experiments should pay just as much attention to the design of their experiments as those who employ more formal experimental techniques, if they want to avoid obscuring the phenomena they aim to uncover.

Relevance:

30.00%

Publisher:

Abstract:

Purpose – Investors are now able to analyse more noise-free news with which to inform their trading decisions than ever before. Their expectation that more information means better performance is not supported by previous psychological experiments, which argue that too much information actually impairs performance. The purpose of this paper is to examine whether the degree of information explicitness improves stock market performance. Design/methodology/approach – An experiment is conducted in a computer laboratory using a trading simulation adapted from a real market shock. Participants' performance efficiency and effectiveness are measured separately. Findings – The results indicate that the explicitness of information neither improves nor impairs participants' performance effectiveness from the perspectives of returns, share and cash positions, and trading volumes. However, participants' performance efficiency is significantly affected by information explicitness. Originality/value – The novel approach and findings of this research add to knowledge of the impact of information explicitness on the quality of decision making in a financial market environment.

Relevance:

30.00%

Publisher:

Abstract:

The REgents PARk and Tower Environmental Experiment (REPARTEE) comprised two campaigns in London in October 2006 and October/November 2007. The experiment design involved measurements at a heavily trafficked roadside site, two urban background sites and an elevated site at 160–190 m above ground on the BT Tower, supplemented in the second campaign by Doppler lidar measurements of atmospheric vertical structure. A wide range of measurements of airborne particle physical metrics and chemical composition were made, as well as measurements of a considerable range of gas phase species and the fluxes of both particulate and gas phase substances. Significant findings include (a) demonstration of the evaporation of traffic-generated nanoparticles during both horizontal and vertical atmospheric transport; (b) generation of a large base of information on the fluxes of nanoparticles, accumulation mode particles and specific chemical components of the aerosol and a range of gas phase species, as well as the elucidation of key processes and comparison with emissions inventories; (c) quantification of vertical gradients in selected aerosol and trace gas species, which has demonstrated the important role of regional transport in influencing concentrations of sulphate, nitrate and secondary organic compounds within the atmosphere of London; (d) generation of new data on the atmospheric structure and turbulence above London, including the estimation of mixed layer depths; (e) provision of new data on trace gas dispersion in the urban atmosphere through the release of purposeful tracers; (f) the determination of spatial differences in aerosol particle size distributions and their interpretation in terms of sources and physico-chemical transformations; (g) studies of the nocturnal oxidation of nitrogen oxides and of the diurnal behaviour of nitrate aerosol in the urban atmosphere; and (h) new information on the chemical composition and source apportionment of particulate matter size fractions in the atmosphere of London, derived both from bulk chemical analysis and aerosol mass spectrometry with two instrument types.

Relevance:

30.00%

Publisher:

Abstract:

The typographical naivety of much scientific legibility research has caused designers to question the value of the research and the results. Examining the reasons underlying this questioning, the paper discusses the importance of designers being more accepting of scientific findings, and why legibility investigations have value. To demonstrate how typographic knowledge can be incorporated into the design of studies to increase their validity, the paper reports on a new investigation into the role of serifs when viewed at a distance. The experiment looks into the identification of the lowercase letters ‘j’, ‘i’, ‘l’, ‘b’, ‘h’, ‘n’, ‘u’, and ‘a’ in isolation. All of the letters originate in the same typeface and are presented in one version with serifs and one version without serifs. Although the experiment found no overall legibility difference between the sans serif and the serif versions, the study showed that letters with serifs placed on the vertical extremes were more legible at a distance than the same letters in a sans serif. These findings can therefore provide specific guidance on the design of individual letters and demonstrate the product of collaboration between designer and scientist on the planning, implementation, and analysis of the study.

Relevance:

30.00%

Publisher:

Abstract:

Aims: The aim was to examine whether specific skills required for cognitive behavioural therapy (CBT) could be taught using a computerised training paradigm with people who have intellectual disabilities (IDs). Training aimed to improve: a) the ability to link pairs of situations and mediating beliefs to emotions, and b) the ability to link pairs of situations and emotions to mediating beliefs. Method: Using a single-blind mixed experimental design, sixty-five participants with IDs were randomised to receive either computerised training or an attention-control condition. Cognitive mediation skills were assessed before and after training. Results: Participants who received training were significantly better at selecting appropriate emotions for situation-belief pairs, controlling for baseline scores and IQ. Despite significant improvements in the ability of those who received training to correctly select mediating beliefs for situation-feeling pairings, no between-group differences were observed at post-test. Conclusions: The findings indicated that computerised training led to a significant improvement in some aspects of cognitive mediation for people with IDs, but whether this has a positive effect upon the outcome of therapy is yet to be established. (C) 2015 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Aims: Training has been shown to improve the ability of people with intellectual disabilities (IDs) to perform some cognitive behavioural therapy (CBT) tasks. This study used a computerised training paradigm with the aim of improving the ability of people with IDs to: a) discriminate between behaviours, thoughts and feelings, and b) link situations, thoughts and feelings. Methods: Fifty-five people with mild-to-moderate IDs were randomly assigned to a training or attention-control condition in a single-blind mixed experimental design. Computerised tasks assessed the participants' skills in: (a) discriminating between behaviours, thoughts and feelings (separately and pooled together), and (b) cognitive mediation, by selecting appropriate emotions as consequences of given thoughts, and appropriate thoughts as mediators of given emotions. Results: Training significantly improved the ability to discriminate between behaviours, thoughts and feelings pooled together, compared to the attention-control condition, even when controlling for baseline scores and IQ. Large within-group improvements in the ability to identify behaviours and feelings were observed for the training condition, but not for the attention-control group. There were no significant between-group differences in the ability to identify thoughts, or in cognitive mediation skills. Conclusions: A single session of computerised training can improve the ability of people with IDs to understand and practise CBT tasks relating to behaviours and feelings. There is potential for computerised training to be used as a "primer" for CBT with people with IDs to improve engagement and outcomes, but further development of a specific computerised cognitive mediation task is needed.

Relevance:

30.00%

Publisher:

Abstract:

The Main Injector Neutrino Oscillation Search (MINOS) experiment uses an accelerator-produced neutrino beam to perform precision measurements of the neutrino oscillation parameters in the "atmospheric neutrino" sector associated with muon neutrino disappearance. This long-baseline experiment measures neutrino interactions in Fermilab's NuMI neutrino beam with a near detector at Fermilab and again 735 km downstream with a far detector in the Soudan Underground Laboratory in northern Minnesota. The two detectors are magnetized steel-scintillator tracking calorimeters. They are designed to be as similar as possible in order to ensure that differences in detector response have minimal impact on the comparisons of event rates, energy spectra and topologies that are essential to MINOS measurements of oscillation parameters. The design, construction, calibration and performance of the far and near detectors are described in this paper. (C) 2008 Elsevier B.V. All rights reserved.
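
The disappearance measurement rests on the standard two-flavour survival probability, P(nu_mu -> nu_mu) = 1 - sin^2(2*theta) * sin^2(1.267 * dm^2 * L / E), with dm^2 in eV^2, L in km and E in GeV. The sketch below evaluates it at the 735 km baseline; the oscillation parameter values are illustrative placeholders, not MINOS results.

```python
import numpy as np

# Two-flavour muon-neutrino survival probability at baseline L_km for
# neutrino energy E_GeV, with dm2 in eV^2. Values are illustrative only.
def survival(E_GeV, dm2=2.4e-3, sin2_2theta=1.0, L_km=735.0):
    return 1.0 - sin2_2theta * np.sin(1.267 * dm2 * L_km / E_GeV) ** 2

for E in [1.0, 1.8, 3.0, 5.0, 10.0]:
    print(f"E = {E:4.1f} GeV -> P(survival) = {survival(E):.3f}")
```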

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

In this project, two broad facets of the design of a methodology for performance optimization of indexable carbide inserts were examined: physical destructive testing and software simulation. For the physical testing, statistical research techniques were used for the design of the methodology. A five-step method, beginning with problem definition and proceeding through system identification, statistical model formulation, data collection, and statistical analysis of results, was elaborated in depth. The set-up and execution of an experiment with a compression machine were examined, together with roadblocks to quality data collection and possible solutions for overcoming them. A 2^k factorial design was illustrated and recommended for process improvement. Instances of first-order and second-order response surface analyses were encountered. Where curvature was present, a test for curvature significance using center-point analysis was recommended. Process optimization with the method of steepest ascent and central composite designs, or process robustness studies based on response surface analysis, was also recommended. For the simulation test, the AdvantEdge program was identified as the most widely used software for tool development. Challenges to the efficient application of this software were identified and possible solutions proposed. In conclusion, software simulation and physical testing were both recommended to meet the objective of the project.
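
As a rough sketch of the recommended workflow, the following fits a first-order response surface to a 2^2 factorial with center points, checks curvature via the factorial-versus-center-point contrast, and steps along the path of steepest ascent. The factor names, response values and noise are invented for illustration and are not the project's data.

```python
import numpy as np

# 2^2 factorial in coded units (e.g. cutting speed, feed) plus two center
# points; the response values are invented.
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1], [0, 0], [0, 0]], float)
y = np.array([54.3, 60.1, 57.2, 63.5, 58.9, 59.3])

# First-order model y ~ b0 + b1*x1 + b2*x2 fitted by least squares.
A = np.column_stack([np.ones(len(X)), X])
(b0, b1, b2), *_ = np.linalg.lstsq(A, y, rcond=None)

# Curvature check: factorial mean vs center-point mean. A large contrast
# suggests moving to a second-order (e.g. central composite) design.
print("curvature contrast:", round(y[:4].mean() - y[4:].mean(), 2))

# Path of steepest ascent: unit steps along the fitted gradient.
step = np.array([b1, b2]) / np.hypot(b1, b2)
print("steepest ascent:", [tuple(np.round(step * k, 2)) for k in (1, 2, 3)])
```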

Relevance:

30.00%

Publisher:

Abstract:

Throughout the industrial processes of sheet metal manufacturing and refining, shear cutting is widely used for its speed and cost advantages over competing cutting methods. Industrial shears may include some capability for force measurement, but the measured force is most likely influenced by friction losses between the shear tool and the point of measurement, and in general does not reflect the actual force applied to the sheet. Well-defined shears and accurate measurements of force and shear tool position are important for understanding the influence of shear parameters. Accurate experimental data are also necessary for the calibration of numerical shear models. Here, a dedicated laboratory set-up with well-defined geometry and movement in the shear, and high measurability in terms of force and geometry, is designed, built and verified. Parameters important to the shear process are studied with perturbation analysis techniques, and requirements on input parameter accuracy are formulated to meet experimental output demands. Input parameters in shearing are mostly geometric, but also include material properties and contact conditions. Based on the accuracy requirements, a symmetric experiment with internal balancing of forces is constructed to avoid guides and the corresponding friction losses. Finally, the experimental procedure is validated through shearing of a medium-grade steel. With the obtained experimental set-up performance, force changes resulting from changes in the studied input parameters are distinguishable down to a level of 1%.
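
The perturbation analysis can be pictured with finite-difference sensitivities: perturb each input of a force model by 1% and record the relative change in output, which indicates how tightly each input must be controlled to meet a 1% output requirement. The shear-force model, its constants and parameter names below are invented stand-ins, not the paper's model.

```python
import numpy as np

# Toy shear-force model: F = k * t * UTS * (1 - a * c / t), where c is the
# clearance, t the sheet thickness and UTS the tensile strength. The form
# and constants are invented for illustration.
def shear_force(clearance_mm=0.05, thickness_mm=2.0, uts_mpa=500.0):
    return 0.7 * thickness_mm * uts_mpa * (1.0 - 0.3 * clearance_mm / thickness_mm)

nominal = dict(clearance_mm=0.05, thickness_mm=2.0, uts_mpa=500.0)
F0 = shear_force(**nominal)

# Finite-difference sensitivity: relative force change per +1% input change.
for name, value in nominal.items():
    bumped = dict(nominal, **{name: value * 1.01})
    dF = (shear_force(**bumped) - F0) / F0 * 100.0
    print(f"{name:13s} +1% -> force change {dF:+.3f}%")
```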