972 results for Experimental Problems


Relevance: 20.00%

Abstract:

Depression in childhood or adolescence is associated with increased rates of depression in adulthood. Does this justify efforts to detect (and treat) those with symptoms of depression in early childhood or adolescence? The aim of this study was to determine how well symptoms of anxiety/depression (A-D) in early childhood and adolescence predict adult mental health. The study sample is taken from a population-based prospective birth cohort study. Of the 8556 mothers initially approached to participate, 8458 agreed, of whom 7223 gave birth to a live singleton baby. Children were screened using modified Child Behaviour Checklist (CBCL) scales for internalizing and total problems (T-P) at age 5, and the CBCL and Youth Self Report (YSR) A-D subscale and T-P scale at age 14. At age 21, a sub-sample of 2563 young adults in this cohort were administered the CIDI-Auto. Results indicated that screening at age 5 would detect few later cases of significant mental ill-health. Using a cut-point of 20% for internalizing at age 5, the CBCL had sensitivities of only 25% and 18% for major depression and anxiety disorders at 21 years, respectively. At age 14, the YSR generally performed a little better than the CBCL as a screening instrument, but neither performed at a satisfactory level. Of the children categorised as having YSR A-D at 14 years, 30% and 37% met DSM-IV criteria for major depression and anxiety disorders, respectively, at age 21. Our findings challenge an existing movement encouraging the detection and treatment of those with symptoms of mental illness in early childhood.
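
The screening arithmetic behind these figures is straightforward to reproduce. Below is a minimal sketch of how sensitivity and positive predictive value fall out of a 2x2 screening table; the counts used are hypothetical illustrations, not the study's data.

```python
# Minimal sketch of screening-test arithmetic. All counts are hypothetical.

def screening_metrics(tp, fp, fn, tn):
    """Return sensitivity and positive predictive value for a 2x2 screen."""
    sensitivity = tp / (tp + fn)   # fraction of true cases the screen catches
    ppv = tp / (tp + fp)           # fraction of screen-positives that are true cases
    return sensitivity, ppv

# Hypothetical example: 100 adult cases, of which the childhood screen flags 25
# (a 25% sensitivity, as reported for major depression), plus 60 false positives.
sens, ppv = screening_metrics(tp=25, fp=60, fn=75, tn=840)
print(f"sensitivity = {sens:.0%}, PPV = {ppv:.0%}")
```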

Relevance: 20.00%

Abstract:

In the real world, many problems in networks of networks (NoNs) can be abstracted to a so-called minimum interconnection cut problem, which is fundamentally different from the classical minimum cut problems of graph theory. It is therefore desirable to have an efficient and effective algorithm for the minimum interconnection cut problem. In this paper we formulate the problem in graph-theoretic terms, transform it into a multi-objective, multi-constraint combinatorial optimization problem, and propose a hybrid genetic algorithm (HGA) to solve it. The HGA is a penalty-based genetic algorithm (GA) that incorporates an effective heuristic procedure to locally optimize the individuals in the GA's population. The HGA has been implemented and evaluated experimentally, and the results show that it is both effective and efficient.
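
As a rough illustration of this class of method, here is a minimal sketch of a penalty-based GA with a local-improvement heuristic, applied to a toy constrained graph-partitioning instance; the encoding, constraint, and parameter choices are our assumptions, not the paper's.

```python
# Sketch of a penalty-based hybrid GA: penalised fitness + greedy local search.
import random

def cut_weight(partition, edges):
    """Total weight of edges crossing the two sides of a 0/1 node partition."""
    return sum(w for u, v, w in edges if partition[u] != partition[v])

def fitness(partition, edges, min_side, penalty=1000.0):
    """Cut weight plus a penalty for violating a minimum side-size constraint."""
    ones = sum(partition)
    violation = max(0, min_side - ones) + max(0, min_side - (len(partition) - ones))
    return cut_weight(partition, edges) + penalty * violation

def local_improve(partition, edges, min_side):
    """Heuristic step: greedily flip any single node that improves fitness."""
    best = fitness(partition, edges, min_side)
    for i in range(len(partition)):
        partition[i] ^= 1
        f = fitness(partition, edges, min_side)
        if f < best:
            best = f
        else:
            partition[i] ^= 1          # undo a non-improving flip
    return partition

def hybrid_ga(n, edges, min_side, pop=30, gens=200):
    population = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda p: fitness(p, edges, min_side))
        parents = population[:pop // 2]
        children = []
        while len(children) < pop - len(parents):
            a, b = random.sample(parents, 2)
            cx = random.randrange(1, n)            # one-point crossover
            child = a[:cx] + b[cx:]
            if random.random() < 0.2:              # mutation
                child[random.randrange(n)] ^= 1
            children.append(local_improve(child, edges, min_side))
        population = parents + children
    return min(population, key=lambda p: fitness(p, edges, min_side))

# Toy instance: 6 nodes, weighted edges, each side must keep at least 2 nodes.
edges = [(0, 1, 3), (1, 2, 1), (2, 3, 4), (3, 4, 1), (4, 5, 2), (0, 5, 1)]
print(hybrid_ga(6, edges, min_side=2))
```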

Relevance: 20.00%

Abstract:

In this article, we investigate experimentally whether people search optimally and how price promotions influence search behaviour. We implement a sequential search task with exogenous price dispersion in a baseline treatment and introduce discounts in two experimental treatments. We find that search behaviour is roughly consistent with optimal search but also observe some discount biases. If subjects do not know in advance where discounts are offered, the purchase probability is increased by 19 percentage points in shops with discounts, even after controlling for the benefit of the discount and for risk preferences. If consumers know in advance where discounts are given, then the bias is only weakly significant and much smaller (7 percentage points).
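
The optimal-search benchmark in such tasks is typically a reservation-price rule: keep searching while the expected saving from one more price quote exceeds the search cost. A minimal sketch, assuming uniform prices and a fixed per-shop cost (both our assumptions, not the experiment's parameters):

```python
# Reservation price r solves  integral_lo^r F(x) dx = cost:
# buy at the first price <= r, otherwise keep searching.
from scipy.integrate import quad
from scipy.optimize import brentq

def reservation_price(cdf, cost, lo, hi):
    """Solve the indifference condition for sequential search with recall."""
    gain = lambda r: quad(cdf, lo, r)[0] - cost
    return brentq(gain, lo, hi)

# Prices uniform on [0, 1], search cost 0.02 per shop.
uniform_cdf = lambda x: x
r = reservation_price(uniform_cdf, cost=0.02, lo=0.0, hi=1.0)
print(f"reservation price = {r:.3f}")   # analytic answer: sqrt(2 * 0.02) = 0.2
```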

Relevance: 20.00%

Abstract:

This thesis provides an experimental and computational platform for investigating the performance and behaviour of water-filled plastic portable road safety barriers in an isolated impact scenario. A schedule of experimental impact tests was conducted to assess the impact response of an existing road safety barrier design, using a novel horizontal impact testing system. A coupled finite element and smoothed particle hydrodynamics model of the barrier system was developed and validated against the results of the experimental tests. The validated model was then used to assess the effect of certain composite materials on the impact performance of the water-filled portable road safety barrier system.

Relevance: 20.00%

Abstract:

In this paper, problems related to the ergonomic assessment of vehicle package design in vehicle systems engineering are described. The traditional approach, using questionnaire techniques for a subjective assessment of comfort related to package design, is compared to a biomechanical approach, with ingress design as an example. The biomechanical approach is based on objective postural data. The experimental setup for the study is described and the methods used for the biomechanical analysis are explained. Because the biomechanical assessment requires not only a complex experimental setup but also time-consuming data processing, a systematic reduction and preparation of the biomechanical data for classification with an Artificial Neural Network significantly improves the economy of the biomechanical method.
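
A minimal sketch of the reduce-then-classify idea follows: each trial is reduced to a small feature vector, and a small neural network classifies comfort. The features, labels, and data below are synthetic stand-ins, not the study's measurements.

```python
# Sketch: classify ingress comfort from a reduced postural feature vector.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical reduced features per trial: peak torso flexion, peak knee angle,
# head clearance (all normalised); label 1 = "uncomfortable" ingress.
X = rng.normal(size=(200, 3))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.3, size=200) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```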

Relevance: 20.00%

Abstract:

Due to economic and demographic changes, highly educated women play an important role in the Chinese labour market. Gender has been shown to be an important characteristic influencing behaviour in economic experiments, as have, to a lesser degree, academic major, age and income. We provide a study of trust and reciprocity and their determinants in a labour market laboratory experiment. Our experimental data are based on two games: the Gift Exchange Game (GEG) and a variant of it, the Wage Promising Game (WPG), in which the employer's wage offer is non-binding and the employer can choose the wage freely after observing the worker's effort. We find that women are less trusting and reciprocal than men in the GEG, while this difference is not found in the WPG. Letting participants play both the GEG and the WPG allows us to disentangle reciprocal and risk attitudes. In the employer role, risk attitude appears to be the main factor; this is not confirmed when analysing decisions in the worker role.

Relevance: 20.00%

Abstract:

Credence goods markets suffer from inefficiencies caused by sellers' superior information about the surplus-maximizing quality. While standard theory predicts that equal mark-up prices solve the credence goods problem if customers can verify the quality received, experimental evidence indicates the opposite. We identify a lack of robustness of institutional design with respect to heterogeneity in distributional preferences as a possible cause, and design new experiments that allow for parsimonious identification of sellers' distributional types. Our results indicate that fewer than a fourth of the subjects behave according to standard theory's assumption, the rest behaving either in line with non-standard selfish preferences or in accordance with non-trivial other-regarding preferences. We discuss the consequences of our findings for institutional design and agent selection.

Relevance: 20.00%

Abstract:

Cloud computing is an emerging computing paradigm in which IT resources are provided over the Internet as a service to users. One such service offered through the Cloud is Software as a Service, or SaaS. SaaS can be delivered in a composite form, consisting of a set of application and data components that work together to deliver higher-level functional software. SaaS is receiving substantial attention today from both software providers and users, and analyst firms predict a positive future market for it. This raises new challenges for SaaS providers managing SaaS, especially in large-scale data centres such as the Cloud. One of these challenges is managing Cloud resources for SaaS in a way that maintains SaaS performance while optimising resource use. Extensive research on the resource optimisation of Cloud services has not yet addressed the challenges of managing resources for composite SaaS. This research addresses that gap by focusing on three new problems of composite SaaS: placement, clustering and scalability. The overall aim is to develop efficient and scalable mechanisms that facilitate the delivery of high-performance composite SaaS for users while optimising the resources used. All three problems are highly constrained, large-scale and complex combinatorial optimisation problems; therefore, evolutionary algorithms are adopted as the main technique for solving them.

The first research problem concerns how a composite SaaS is placed onto Cloud servers to optimise its performance while satisfying the SaaS resource and response time constraints. Existing research on this problem often ignores the dependencies between components and considers the placement of only a homogeneous type of component. A precise formulation of the composite SaaS placement problem is presented. A classical genetic algorithm and two versions of cooperative co-evolutionary algorithms are designed to manage the placement of heterogeneous types of SaaS components together with their dependencies, requirements and constraints. Experimental results demonstrate the efficiency and scalability of these new algorithms.

In the second problem, SaaS components are assumed to be already running on Cloud virtual machines (VMs); however, due to the dynamic environment of a Cloud, the current placement may need to be modified. Existing techniques have focused mostly on the infrastructure level rather than the application level. This research addresses the problem at the application level by clustering suitable components onto VMs to optimise the resources used while maintaining SaaS performance. Two versions of grouping genetic algorithms (GGAs) are designed to cater for the grouping structure of a composite SaaS: the first GGA uses a repair-based method, while the second uses a penalty-based method to handle the problem constraints. The experimental results confirm that the GGAs consistently produce a better reconfiguration placement plan than a common heuristic for clustering problems.

The third research problem deals with the replication or deletion of SaaS instances to cope with the SaaS workload. Determining a scaling plan that minimises the resources used while maintaining SaaS performance is a critical task, and the problem's constraints and the interdependencies between components make solutions even more difficult to find. A hybrid genetic algorithm (HGA) was developed to solve this problem by exploring the search space through its genetic operators and fitness function to determine the SaaS scaling plan. The HGA also uses the problem's domain knowledge to ensure that solutions meet the constraints and achieve the objectives. The experimental results demonstrate that the HGA consistently outperforms a heuristic algorithm by achieving a lower-cost scaling and placement plan.

This research has identified three significant new problems for composite SaaS in the Cloud, and the various evolutionary algorithms developed to address them also contribute to the evolutionary computation field. The algorithms provide solutions for efficient resource management of composite SaaS in the Cloud, resulting in a low total cost of ownership for users while guaranteeing SaaS performance.
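
As a rough illustration of the repair-based versus penalty-based constraint handling described for the grouping GAs, here is a minimal sketch on a toy component-to-VM clustering instance; the component demands, VM capacity, and cost function are illustrative assumptions, not the thesis's formulation.

```python
# Sketch contrasting two constraint-handling strategies for a grouping GA:
# penalise an infeasible component->VM grouping versus repair it.
import random

DEMAND = [2, 3, 1, 4, 2, 3]   # hypothetical resource demand of each component
CAPACITY = 6                  # hypothetical capacity of every VM

def load(groups):
    """Map each VM index to its total demand under a component->VM assignment."""
    vm_load = {}
    for comp, vm in enumerate(groups):
        vm_load[vm] = vm_load.get(vm, 0) + DEMAND[comp]
    return vm_load

def penalty_cost(groups, weight=10.0):
    """Penalty method: number of VMs used plus a weighted charge for overflow."""
    vm_load = load(groups)
    overflow = sum(max(0, l - CAPACITY) for l in vm_load.values())
    return len(vm_load) + weight * overflow

def repair(groups):
    """Repair method: move components off overloaded VMs onto fresh VMs."""
    groups = list(groups)
    next_vm = max(groups) + 1
    for comp in range(len(groups)):
        if load(groups)[groups[comp]] > CAPACITY:
            groups[comp] = next_vm
            next_vm += 1
    return groups

assignment = [random.randrange(3) for _ in DEMAND]   # random initial grouping
print("penalised cost:", penalty_cost(assignment))
repaired = repair(assignment)
print("repaired grouping:", repaired, "cost:", penalty_cost(repaired))
```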

Relevance: 20.00%

Abstract:

There is increasing interest in the use of information technology as a participatory planning tool, particularly the use of geographical information technologies to support collaborative activities such as community mapping. However, despite their promise, the introduction of such technologies does not necessarily promote better participation or improve collaboration. In part this can be attributed to a tendency for planners to focus on the technical considerations associated with these technologies at the expense of broader participation considerations. In this paper we draw on the experiences of a community mapping project with disadvantaged communities in suburban Australia to highlight the importance of selecting tools and techniques that support and enhance participatory planning. This community mapping project, designed to identify and document community-generated transport issues and solutions, had originally intended to use cadastral maps extracted from the government's digital cadastral database as the foundation for its community mapping approach. It was quickly discovered that local residents found the cadastral maps confusing, as the maps lacked sufficient detail to orient them to their suburb (the study area). In response to these concerns, and consistent with the project's participatory framework, a conceptual base map based on residents' views of landmarks of local importance was developed to support the community mapping process. Based on this community mapping experience we outline four key lessons learned regarding the process of community mapping and the place of geographical information technologies within it.

Relevance: 20.00%

Abstract:

Portable water-filled road barriers (PWFB) are roadside structures placed in temporary construction zones to separate the work site from moving traffic. Recent changes in governing standards require PWFB to meet strict compliance criteria in terms of the lateral displacement of the road barriers and vehicle redirection. Actual road safety barrier tests can be very costly, so researchers resort to Finite Element Analysis (FEA) in the initial design phase prior to real vehicle tests. Much FEA research has been conducted on concrete barriers and flexible steel barriers, but little pertains to PWFB. This research probes a new method of modelling the joint mechanism in PWFB. Two methods of modelling the joining mechanism are presented and discussed in relation to their practicality and accuracy for real-world applications. Moreover, the effects of the physical gap and the mass of the barrier were investigated. The outcomes of this research will benefit PWFB research and give road barrier designers better knowledge for developing the next generation of road safety structures.

Relevance: 20.00%

Abstract:

The use of immobilised TiO2 for the purification of polluted water streams introduces the need to evaluate the effect of mechanisms such as the transport of pollutants from the bulk of the liquid to the catalyst surface and the transport phenomena inside the porous film. Experimental results on the effect of film thickness on the observed reaction rate, for both liquid-side and support-side illumination, are compared here with the predictions of a one-dimensional mathematical model of the porous photocatalytic slab. Good agreement was observed between the experimentally obtained photodegradation of phenol and its by-products and the corresponding model predictions. The results confirm that an optimal catalyst thickness exists and, for the films employed here, is 5 μm. Furthermore, the modelling results highlight that porosity and the intrinsic reaction kinetics are the parameters controlling the photocatalytic activity of the film: the former influences the transport phenomena and light absorption characteristics, while the latter naturally dictates the rate of reaction.
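
To make the interplay of these mechanisms concrete, here is a minimal sketch of a 1D reaction-diffusion model of a porous film with Beer-Lambert light attenuation under support-side illumination, showing how an interior optimum in thickness can arise; all parameter values are illustrative toys, not the paper's fitted values.

```python
# Sketch: steady-state 1D model of a porous photocatalytic slab.
import numpy as np

def observed_rate(L, D=1e-9, k=40.0, alpha=3e5, n=200):
    """Degradation rate for a film of thickness L (metres).

    Solves D c'' = k I(x) c with c(0) = 1 (liquid side, normalised) and zero
    flux at the support (x = L), illuminated from the support side, so that
    I(x) = exp(-alpha * (L - x)) by Beer-Lambert attenuation.
    """
    x = np.linspace(0.0, L, n)
    h = x[1] - x[0]
    I = np.exp(-alpha * (L - x))                # light intensity profile
    A = np.zeros((n, n))
    b = np.zeros(n)
    A[0, 0], b[0] = 1.0, 1.0                    # fixed bulk concentration at x = 0
    for i in range(1, n - 1):                   # interior finite-difference rows
        A[i, i - 1] = A[i, i + 1] = D / h**2
        A[i, i] = -2.0 * D / h**2 - k * I[i]
    A[-1, -1], A[-1, -2] = 1.0, -1.0            # zero-gradient (no flux) at support
    c = np.linalg.solve(A, b)
    return float(np.sum(k * I * c) * h)         # total reaction rate per unit area

thicknesses = np.linspace(1e-6, 20e-6, 20)
rates = [observed_rate(L) for L in thicknesses]
best = thicknesses[int(np.argmax(rates))]
print(f"optimal thickness ~ {best * 1e6:.1f} um for these toy parameters")
```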

Relevance: 20.00%

Abstract:

While the communicative turn in policy-making has encouraged the public deliberation of policy decisions, it has arguably had a more limited impact on the ability of public processes to deal with wicked problems. Wicked policy problems are characterised by high levels of complexity, uncertainty and divergence of values. Some wicked problems, however, present the additional challenge of high levels of psychosocial sensitivity and verbal proscription. Because these unspeakable policy problems frequently involve a significant moral dimension, the regulation of intimate processes or bodies, and strong elements of abjection and symbolic pollution, they are quite literally problems that we don't like to think or talk about. Nevertheless, their potential environmental and social impacts require that they be addressed. In this paper I present the preliminary findings of a research project focused on the idea of the unspeakable policy problem and how its unspeakable nature can affect public participation, policy and environmental outcomes.

Relevance: 20.00%

Abstract:

Electrostatic discharges have been identified as the most likely cause in a number of fire and explosion incidents with otherwise unexplained ignitions. The lack of data and suitable models for this ignition mechanism leaves a void in any analysis seeking to quantify the importance of static electricity as a credible ignition mechanism; a quantifiable hazard analysis of the risk of ignition by static discharge therefore cannot be fully carried out with our current understanding of the phenomenon. Although the study of electrostatics has a long history, it was not until the widespread use of electronics that research was developed for the protection of electronics from electrostatic discharges, and the experimental models for electrostatic discharge developed for the intrinsic safety of electronics are inadequate for ignition analysis and typically are not supported by theoretical analysis.

A preliminary low-voltage simulation and experiment were designed to investigate the characteristics of energy dissipation and provided a basis for a high-voltage investigation. At low voltage, the discharge energy represents about 10% of the initial capacitive energy available, and the energy dissipation occurs within 10 ns of the initial discharge. The potential difference is greatest at the initial breakdown, when the largest amount of the energy is dissipated; once the discharge pathway is established, minimal further energy is dissipated, and the dissipation becomes strongly influenced by other components and stray resistance in the discharge circuit. From this initial low-voltage simulation work, the importance of the energy dissipation and the characteristics of the discharge were determined.

After the preliminary low-voltage work was completed, a high-voltage discharge experiment was designed and fabricated. Voltage and current measurements were recorded on the discharge circuit, allowing the discharge characteristic to be recorded and the energy dissipation in the discharge circuit to be calculated. The discharge energy calculations are consistent with the low-voltage work, with about 30-40% of the total initial capacitive energy being discharged in the resulting high-voltage arc. After the system was characterised and its operation validated, high-voltage ignition energy measurements were conducted on n-Pentane evaporating in a 250 cm3 chamber. A series of ignition experiments was conducted to determine the minimum ignition energy of n-Pentane. The data from the ignition work were analysed with standard statistical regression methods for tests that return binary (yes/no) data and found to be in agreement with recent publications.

The research demonstrates that energy dissipation depends heavily on the circuit configuration, most especially the discharge circuit's capacitance and resistance. The analysis established a discharge profile for the discharges studied and validates the application of this methodology to further research into different materials and atmospheres, by systematically examining the discharge profiles of test materials under various parameters (e.g., capacitance, inductance and resistance). Systematic experiments examining the discharge characteristics of the spark will also help clarify how energy is dissipated in an electrostatic discharge, enabling a better understanding of the ignition characteristics of materials in terms of energy and the dissipation of that energy.
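
A minimal sketch of the kind of binary-response regression such ignition data call for: fit a logistic curve to yes/no ignition outcomes versus discharge energy and read off the energy at 50% ignition probability. The data below are synthetic, not the thesis's measurements.

```python
# Sketch: logistic regression on binary ignition outcomes vs spark energy.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
energy_mj = np.linspace(0.1, 2.0, 40)                  # spark energies tested (mJ)
p_true = 1 / (1 + np.exp(-6 * (energy_mj - 0.8)))      # hidden true dose-response
ignited = rng.random(40) < p_true                      # yes/no ignition outcomes

model = LogisticRegression(C=1e6)                      # essentially unpenalised fit
model.fit(energy_mj.reshape(-1, 1), ignited)

# Energy at 50% ignition probability, a common summary in MIE-style analyses.
e50 = -model.intercept_[0] / model.coef_[0, 0]
print(f"estimated E50 ~ {e50:.2f} mJ (true value used to simulate: 0.80 mJ)")
```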

Relevance: 20.00%

Abstract:

An experimental dataset representing a typical flow field in a stormwater gross pollutant trap (GPT) was visualised. A technique was developed to apply the image-based flow visualisation (IBFV) algorithm to the raw dataset. Particle image velocimetry (PIV) software had previously been used to capture the flow field data by tracking neutrally buoyant particles with a high-speed camera. The dataset consisted of scattered 2D point velocity vectors, and the IBFV visualisation facilitates flow feature characterisation within the GPT. These flow features play a pivotal role in understanding stormwater pollutant capture and retention behaviour within the GPT. The IBFV animations revealed otherwise unnoticed flow features and experimental artefacts; for example, a circular tracer marker in the IBFV program visually highlighted streamlines, allowing investigation of the possible flow paths of pollutants entering the GPT. The investigated flow paths were compared with the behaviour of pollutants monitored during experiments.
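
A minimal sketch of the pre-processing such a dataset requires: interpolate the scattered PIV point vectors onto a regular grid and render streamlines. matplotlib's streamplot is used here as a much simpler stand-in for IBFV, and the vortex-like field is synthetic, not the GPT dataset.

```python
# Sketch: scattered 2D velocity vectors -> regular grid -> streamline plot.
import numpy as np
import matplotlib.pyplot as plt
from scipy.interpolate import griddata

rng = np.random.default_rng(2)
pts = rng.uniform(-1, 1, size=(500, 2))            # scattered PIV sample points
u = -pts[:, 1] + 0.2 * rng.normal(size=500)        # noisy swirling velocity field
v = pts[:, 0] + 0.2 * rng.normal(size=500)

gx, gy = np.meshgrid(np.linspace(-1, 1, 50), np.linspace(-1, 1, 50))
gu = griddata(pts, u, (gx, gy), method="nearest")  # scattered -> regular grid
gv = griddata(pts, v, (gx, gy), method="nearest")  # ("nearest" avoids NaN gaps)

plt.streamplot(gx, gy, gu, gv, density=1.2)
plt.title("Streamlines from scattered 2D velocity vectors")
plt.savefig("gpt_flow_sketch.png")
```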

Relevance: 20.00%

Abstract:

Utility functions in Bayesian experimental design are usually based on the posterior distribution. When the posterior is found by simulation, it must be sampled from for each future data set drawn from the prior predictive distribution, so many thousands of posterior distributions are often required. A popular technique in the Bayesian experimental design literature for rapidly obtaining samples from the posterior is importance sampling, using the prior as the importance distribution. However, importance sampling tends to break down if there is a reasonable number of experimental observations and/or the model parameter is high-dimensional. In this paper we explore the use of Laplace approximations in the design setting to overcome this drawback. Furthermore, we consider using the Laplace approximation to form the importance distribution, obtaining a more efficient importance distribution than the prior. The methodology is motivated by a pharmacokinetic study which investigates the effect of extracorporeal membrane oxygenation on the pharmacokinetics of antibiotics in sheep. The design problem is to find 10 near-optimal plasma sampling times that produce precise estimates of the pharmacokinetic model parameters/measures of interest. We consider several different utility functions of interest in these studies, which involve the posterior distribution of parameter functions.
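
A minimal sketch of the core idea, using a one-parameter normal model as a stand-in for the pharmacokinetic model: build a Laplace approximation at the posterior mode and use it as the importance distribution instead of the prior.

```python
# Sketch: Laplace approximation as the importance distribution for one
# simulated data set. The model and all numbers are illustrative stand-ins.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(3)
y = rng.normal(1.5, 1.0, size=20)                  # one simulated data set

def neg_log_post(theta):
    """Unnormalised negative log posterior: N(0,1) prior, N(theta,1) likelihood."""
    return 0.5 * theta**2 + 0.5 * np.sum((y - theta) ** 2)

# Laplace approximation: Gaussian at the mode with curvature-matched variance.
mode = optimize.minimize_scalar(neg_log_post).x
h = 1e-4                                           # numerical second derivative
curv = (neg_log_post(mode + h) - 2 * neg_log_post(mode)
        + neg_log_post(mode - h)) / h**2
laplace = stats.norm(mode, np.sqrt(1.0 / curv))

# Importance sampling with the Laplace approximation as the proposal.
draws = laplace.rvs(size=5000, random_state=rng)
log_w = -np.array([neg_log_post(t) for t in draws]) - laplace.logpdf(draws)
w = np.exp(log_w - log_w.max())
w /= w.sum()                                       # self-normalised weights
post_mean = np.sum(w * draws)                      # a simple posterior summary
print(f"IS posterior mean {post_mean:.3f} vs exact {np.sum(y) / (len(y) + 1):.3f}")
```

In a design loop, this whole step would be repeated for each data set drawn from the prior predictive distribution, with the resulting posterior samples feeding the utility calculation.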