31 results for Multi-objective optimisation


Relevance: 30.00%

Abstract:

Background: SPARCLE is a cross-sectional survey in nine European regions, examining the relationship of the environment of children with cerebral palsy to their participation and quality of life. The objective of this report is to assess data quality, in particular heterogeneity between regions, family and item non-response, and potential for bias. Methods: 1,174 children aged 8–12 years were selected from eight population-based registers of children with cerebral palsy; one further centre recruited 75 children from multiple sources. Families were visited by trained researchers who administered psychometric questionnaires. Logistic regression was used to assess factors related to family non-response and self-completion of questionnaires by children. Results: 431/1,174 (37%) families identified from registers did not respond: 146 (12%) were not traced; of the 1,028 traced families, 250 (24%) declined to participate and 35 (3%) were not approached. Families whose disabled children could walk unaided were more likely to decline to participate. 818 children entered the study, of whom 500 (61%) self-reported their quality of life; children with low IQ, seizures or inability to walk were less likely to self-report. There was substantial heterogeneity between regions in response rates and socio-demographic characteristics of families, but not in age or gender of children. Item non-response was 2% for children and ranged from 0.4% to 5% for questionnaires completed by parents. Conclusion: While the proportion of untraced families was higher than in similar surveys, the refusal rate was comparable. To reduce bias, all analyses should allow for region, walking ability, age and socio-demographic characteristics. The 75 children in the region without a population-based register are unlikely to introduce bias.

Relevance: 30.00%

Abstract:

A new search-space-updating technique for genetic algorithms is proposed for continuous optimisation problems. Rather than gradually reducing the search space during the evolution process at a fixed reduction rate set 'a priori', the upper and lower boundaries for each variable in the objective function are dynamically adjusted based on its distribution statistics. To test its effectiveness, the technique is applied to a number of benchmark optimisation problems and compared with three other techniques, namely the genetic algorithm with parameter space size adjustment (GAPSSA) technique [A.B. Djurišić, Elite genetic algorithms with adaptive mutations for solving continuous optimization problems – application to modeling of the optical constants of solids, Optics Communications 151 (1998) 147–159], the successive zooming genetic algorithm (SZGA) [Y. Kwon, S. Kwon, S. Jin, J. Kim, Convergence enhanced genetic algorithm with successive zooming method for solving continuous optimization problems, Computers and Structures 81 (2003) 1715–1725] and a simple GA. The tests show that for well-posed problems, existing search-space-updating techniques perform well in terms of convergence speed and solution precision; however, for some ill-posed problems these techniques are statistically inferior to a simple GA. All the tests show that the proposed new search-space-updating technique is statistically superior to its counterparts.
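The core idea, re-deriving each variable's bounds from the population's distribution statistics rather than shrinking the space at a fixed a-priori rate, can be sketched as follows (a minimal illustration, not the paper's exact update rule; the choice of k is an assumption):

```python
import numpy as np

def update_bounds(population, lower, upper, k=3.0):
    """Re-derive per-variable bounds from the population's statistics:
    centre them on the mean and span k standard deviations, clipped to
    the original bounds (illustrative sketch; k is an assumed setting)."""
    mean = population.mean(axis=0)
    std = population.std(axis=0)
    new_lower = np.maximum(lower, mean - k * std)
    new_upper = np.minimum(upper, mean + k * std)
    return new_lower, new_upper

# A population that has started to converge around (1, -2):
rng = np.random.default_rng(0)
pop = rng.normal(loc=[1.0, -2.0], scale=0.1, size=(50, 2))
lo, hi = update_bounds(pop, np.array([-5.0, -5.0]), np.array([5.0, 5.0]))
# lo and hi now bracket the promising region instead of the full [-5, 5]^2
```

Because the bounds follow the population rather than a preset schedule, the space can stay wide while the population is dispersed and tighten only as it converges.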

Relevance: 30.00%

Abstract:

A problem with use of the geostatistical Kriging error for optimal sampling design is that the design does not adapt locally to the character of spatial variation. This is because a stationary variogram or covariance function is a parameter of the geostatistical model. The objective of this paper was to investigate the utility of non-stationary geostatistics for optimal sampling design. First, a contour data set of Wiltshire was split into 25 equal sub-regions and a local variogram was predicted for each. These variograms were fitted with models and the coefficients used in Kriging to select optimal sample spacings for each sub-region. Large differences existed between the designs for the whole region (based on the global variogram) and for the sub-regions (based on the local variograms). Second, a segmentation approach was used to divide a digital terrain model into separate segments. Segment-based variograms were predicted and fitted with models. Optimal sample spacings were then determined for the whole region and for the sub-regions. It was demonstrated that the global design was inadequate, grossly over-sampling some segments while under-sampling others.
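The kriging-variance criterion behind such designs can be sketched numerically: for each candidate grid spacing, compute the ordinary-kriging variance at the worst point in a cell (its centre) under a fitted variogram, and accept the largest spacing that meets a tolerance. The spherical model and its parameters below are illustrative assumptions, not the Wiltshire fits:

```python
import numpy as np

def spherical(h, nugget=0.0, sill=1.0, rng_a=100.0):
    """Spherical variogram model (parameter values here are assumed)."""
    h = np.asarray(h, dtype=float)
    g = np.where(h < rng_a,
                 nugget + sill * (1.5 * h / rng_a - 0.5 * (h / rng_a) ** 3),
                 nugget + sill)
    return np.where(h == 0.0, 0.0, g)

def kriging_variance(samples, x0, **vg):
    """Ordinary-kriging variance at x0 given the sample locations."""
    n = len(samples)
    d = np.linalg.norm(samples[:, None, :] - samples[None, :, :], axis=-1)
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = spherical(d, **vg)
    A[:n, n] = A[n, :n] = 1.0            # unbiasedness constraint
    b = np.append(spherical(np.linalg.norm(samples - x0, axis=1), **vg), 1.0)
    lam = np.linalg.solve(A, b)
    return float(lam @ b)                # sum(lam_i * gamma_i0) + mu

def max_spacing(tol, spacings, **vg):
    """Largest square-grid spacing whose cell-centre kriging variance
    (the worst point in a cell) stays below tol."""
    best = None
    for s in sorted(spacings):
        cell = np.array([[0, 0], [s, 0], [0, s], [s, s]], float)
        if kriging_variance(cell, np.array([s / 2.0, s / 2.0]), **vg) <= tol:
            best = s
    return best

# A sub-region with a shorter variogram range (rougher variation)
# tolerates only a tighter optimal spacing than a smoother one:
s_long = max_spacing(0.5, [10, 20, 40, 80], sill=1.0, rng_a=100.0)
s_short = max_spacing(0.5, [10, 20, 40, 80], sill=1.0, rng_a=30.0)
```

Running the same tolerance against two local variograms yields different optimal spacings, which is exactly why a single global design over- or under-samples individual segments.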

Relevance: 30.00%

Abstract:

Dapivirine mucoadhesive gels and freeze-dried tablets were prepared using a 3 × 3 × 2 factorial design. An artificial neural network (ANN) with multi-layer perceptron architecture was used to investigate the effect of hydroxypropyl-methylcellulose (HPMC):polyvinylpyrrolidone (PVP) ratio (X1), mucoadhesive concentration (X2) and delivery system (gel or freeze-dried mucoadhesive tablet, X3) on the response variables: cumulative release of dapivirine at 24 h (Q24), mucoadhesive force (Fmax) and zero-rate viscosity. Optimisation was performed by minimising the error between the experimental and ANN-predicted values of the responses. The method was validated using check-point analysis by preparing six formulations of gels and their corresponding freeze-dried tablets randomly selected from within the design space of contour plots. Experimental and predicted values of the response variables were not significantly different (p > 0.05, two-sided paired t-test). For gels, Q24 values were higher than for their corresponding freeze-dried tablets. Fmax values for freeze-dried tablets were significantly greater (2–4 times, p < 0.05, two-sided paired t-test) than for the equivalent gels. Freeze-dried tablets having lower values for X1 and higher values for X2 offered the best compromise between effective dapivirine release, mucoadhesion and viscosity, such that increased vaginal residence time was likely to be achieved.

Relevance: 30.00%

Abstract:

The objective of this research was to optimise the rheological parameters, hardened properties and setting times of cement grouts containing metakaolin (MTK), a viscosity-modifying agent (VMA) and a superplasticiser (SP). All mixes were made with a water-to-binder ratio (W/B) of 0.40. The replacement of cement by MTK was varied from 6% to 20% (by mass), and the dosages of SP and VMA were varied from 0.3% to 1.4% and from 0.01% to 0.06% (by mass of binder), respectively. Increased SP led to an increase in fluidity, a reduction in flow time, plate cohesion and rheological parameters, and an increase in setting times. Increased VMA led to a reduction in fluidity and an increase in Marsh cone flow time, plate cohesion, yield stress and plastic viscosity. The results indicate that the use of MTK increased yield stress, plastic viscosity, plate cohesion and flow time, owing to its higher surface area and the associated increase in water demand. MTK reduced mini-slump and setting times, and improved compressive strength.
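The two rheological parameters reported, yield stress and plastic viscosity, are conventionally obtained by fitting the Bingham model (shear stress = yield stress + plastic viscosity × shear rate) to flow-curve data, which reduces to a straight-line fit. A minimal sketch with made-up measurements, not the study's data:

```python
import numpy as np

# Synthetic flow-curve data for a grout (illustrative values only):
shear_rate   = np.array([10, 20, 40, 60, 80, 100], dtype=float)   # 1/s
shear_stress = np.array([14.8, 19.7, 30.1, 40.2, 49.9, 60.3])     # Pa

# Bingham model: tau = tau_0 + mu_p * gamma_dot, so a linear fit gives
# plastic viscosity (slope) and yield stress (intercept).
mu_p, tau_0 = np.polyfit(shear_rate, shear_stress, 1)
# For this synthetic data: mu_p near 0.5 Pa.s, tau_0 near 10 Pa.
```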

Relevance: 30.00%

Abstract:

Chlorination of wheat flour in the EU countries has in recent years been replaced, to some extent, by heat-treated flour, which is used to produce high-ratio cakes. Heat-treated flour allows high-ratio recipes to be developed which generate products with longer shelf life, finer texture, moister crumb and sweeter taste. The mechanism by which heat treatment improves the flour is not fully understood, but it is known that during the heat-treatment process protein denaturation and partial gelatinisation of the starch granules occur, as well as an increase in batter viscosity. It is therefore important to optimise the flour heat-treatment process in order to enhance baking quality. Laboratory preparation of heat-treated base wheat flour (culinary, soft, low protein) was carried out in a fluidised-bed drier using a range of temperatures and times. The gluten was extracted from the final product and its quality was tested, to obtain objective and comparative information on the extent of protein denaturation. The results indicated that heat treatment of flour decreases gluten extensibility and that partial gelatinisation of the starch granules occurred. After heat treatment the gluten appeared to retain moisture. The optimum time/temperature for the heat treatment of base flour was 120–130°C for 30 min at a moisture content of approximately 12.5%.

Relevance: 30.00%

Abstract:

A multi-band antenna consisting of a microstrip patch with two U-slots is designed and tested for use in aircraft cabin wireless access points. The objective of this paper is to evaluate this antenna, which covers most of the current wireless bands from 1.7 GHz to 5.85 GHz. A specially designed wideband probe antenna is used for characterisation of the field radiated from this antenna. This measurement setup also allows future investigations, such as the effect of human presence in the cabin, fading effects, and the path loss between transmitter and receiver.
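As a baseline for the path-loss characterisation such a setup enables, the free-space (Friis) path loss over the covered band can be computed directly (illustrative only; real cabin measurements would include multipath and fading, and the 5 m distance is an assumed example):

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space (Friis) path loss in dB."""
    c = 299_792_458.0  # speed of light, m/s
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / c)

# Loss over an assumed 5 m cabin path at the band edges covered:
loss_low  = fspl_db(5.0, 1.7e9)    # ~51 dB at 1.7 GHz
loss_high = fspl_db(5.0, 5.85e9)   # ~62 dB at 5.85 GHz
```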

Relevance: 30.00%

Abstract:

This paper investigates the construction of linear-in-the-parameters (LITP) models for multi-output regression problems. Most existing stepwise forward algorithms choose the regressor terms one by one, each time maximizing the model error reduction ratio. The drawback is that such procedures cannot guarantee a sparse model, especially under highly noisy learning conditions. The main objective of this paper is to improve the sparsity and generalization capability of a model for multi-output regression problems, while reducing the computational complexity. This is achieved by proposing a novel multi-output two-stage locally regularized model construction (MTLRMC) method using the extreme learning machine (ELM). In this new algorithm, the nonlinear parameters in each term, such as the width of the Gaussian function and the power of a polynomial term, are firstly determined by the ELM. An initial multi-output LITP model is then generated according to the termination criteria in the first stage. The significance of each selected regressor is checked and the insignificant ones are replaced at the second stage. The proposed method can produce an optimized compact model by using the regularized parameters. Further, to reduce the computational complexity, a proper regression context is used to allow fast implementation of the proposed method. Simulation results confirm the effectiveness of the proposed technique.
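The ELM step of the first stage, randomly assigning the nonlinear parameters of each term and solving the output weights by least squares, can be sketched for a multi-output problem as follows (an illustration of the ELM idea only, not the full MTLRMC algorithm; the network size and targets are made up):

```python
import numpy as np

rng = np.random.default_rng(1)

def elm_fit(X, Y, n_hidden=50):
    """ELM sketch: random hidden-layer parameters, output weights by
    least squares (no iterative tuning of the nonlinear parameters)."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
    b = rng.normal(size=n_hidden)                 # random biases
    H = np.tanh(X @ W + b)                        # hidden activations
    beta, *_ = np.linalg.lstsq(H, Y, rcond=None)  # output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Two-output toy regression problem:
X = rng.uniform(-1, 1, size=(200, 2))
Y = np.column_stack([np.sin(X[:, 0]), X[:, 1] ** 2])
W, b, beta = elm_fit(X, Y)
err = np.mean((elm_predict(X, W, b, beta) - Y) ** 2)
```

The second stage of the paper's method would then check each selected regressor's significance and replace the insignificant ones, which this sketch omits.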

Relevance: 30.00%

Abstract:

The Bi-directional Evolutionary Structural Optimisation (BESO) method is a numerical topology optimisation method developed for use in finite element analysis. This paper presents a particular application of the BESO method to optimise the energy absorbing capability of metallic structures. The optimisation objective is to evolve a structural geometry of minimum mass while ensuring that the kinetic energy of an impacting projectile is reduced to a level which prevents perforation. Individual elements in a finite element mesh are deleted when a prescribed damage criterion is exceeded. An energy absorbing structure subjected to projectile impact will fail once the level of damage results in a critical perforation size. It is therefore necessary to constrain an optimisation algorithm from producing such candidate solutions. An algorithm to detect perforation was implemented within a BESO framework which incorporated a ductile material damage model.
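The element-deletion and perforation-constraint logic can be illustrated on a simple 2D grid of elements (a schematic stand-in for the paper's finite-element implementation; the damage threshold and 4-connectivity rule are assumptions):

```python
import numpy as np
from collections import deque

def delete_damaged(damage, threshold=0.9):
    """Element deletion: mark elements whose damage exceeds the
    prescribed criterion as removed (True = removed)."""
    return damage > threshold

def perforated(removed):
    """Detect perforation on a 2D element grid: a projectile path exists
    if removed cells connect the top row to the bottom row (flood fill).
    Simplified stand-in for the paper's perforation-detection algorithm."""
    rows, cols = removed.shape
    seen = np.zeros_like(removed, dtype=bool)
    q = deque((0, c) for c in range(cols) if removed[0, c])
    while q:
        r, c = q.popleft()
        if seen[r, c]:
            continue
        seen[r, c] = True
        if r == rows - 1:
            return True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols and removed[rr, cc]:
                q.append((rr, cc))
    return False

dmg = np.zeros((4, 5))
dmg[:, 2] = 0.95   # damage right through the thickness
# perforated(delete_damaged(dmg)) is True, so a BESO iteration would
# reject this candidate geometry rather than continue removing mass.
```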

Relevance: 30.00%

Abstract:

The cycle of the academic year impacts on efforts to refine and improve major group design-build-test (DBT) projects, since the time to run and evaluate a project is generally a full calendar year. By definition these major projects have a high degree of complexity, since they act as the vehicle for the application of a range of technical knowledge and skills. There is also often an extensive list of desired learning outcomes, extending to professional skills and attributes such as communication and team working. It is contended that student project definition and operation, like any other designed product, require a number of iterations to achieve optimisation. The problem, however, is that if this cycle takes four or more years, then by the time a project's operational structure is fine-tuned it is quite possible that the project theme is no longer relevant. The majority of students will also inevitably experience a sub-optimal project over the five-year development period. It would be much better if the ratio were flipped, so that in one year an optimised project definition could be achieved with sufficient longevity to run in the same efficient manner for four further years. An increased number of parallel investigators would also enable more varied and adventurous project concepts to be examined than a single institution could undertake alone in the same time frame.
This work-in-progress paper describes a parallel processing methodology for the accelerated definition of new student DBT project concepts. This methodology has been devised and implemented by a number of CDIO partner institutions in the UK & Ireland region. An agreed project theme was operated in parallel in one academic year with the objective of replacing a multi-year iterative cycle. Additionally the close collaboration and peer learning derived from the interaction between the coordinating academics facilitated the development of faculty teaching skills in line with CDIO standard 10.

Relevance: 30.00%

Abstract:

Various scientific studies have explored the causes of violent behaviour from different perspectives, with psychological tests, in particular, applied to the analysis of crime factors. Relationships between pairs of factors have also been extensively studied, including the link between age and crime. In reality, many factors interact to contribute to criminal behaviour, and as such there is a need for greater insight into its complex nature. In this article we analyse violent crime information systems containing data on psychological, environmental and genetic factors. Our approach combines elements of rough set theory with fuzzy logic and particle swarm optimisation to yield an algorithm and methodology that can effectively extract multi-knowledge from information systems. The experimental results show that our approach outperforms alternative genetic-algorithm and dynamic reduct-based techniques for reduct identification, and has the added advantage of identifying multiple reducts and hence multi-knowledge (rules). The identified rules are consistent with classical statistical analysis of violent crime data and also reveal new insights into the interaction between several factors. As such, the results are helpful in improving our understanding of the factors contributing to violent crime and in highlighting the existence of hidden and intangible relationships between crime factors.
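The notion of multiple reducts, and hence multi-knowledge, can be illustrated with a brute-force enumeration on a toy decision table (the paper's contribution is searching this space efficiently with fuzzy rough sets and particle swarm optimisation; the factors and values below are invented):

```python
from itertools import combinations

def partitions(rows, attrs):
    """Indiscernibility classes: group row indices by values on attrs."""
    groups = {}
    for i, row in enumerate(rows):
        groups.setdefault(tuple(row[a] for a in attrs), set()).add(i)
    return groups.values()

def preserves_decision(rows, decisions, attrs):
    """True if every indiscernibility class maps to a single decision."""
    return all(len({decisions[i] for i in g}) == 1
               for g in partitions(rows, attrs))

def all_reducts(rows, decisions):
    """Enumerate all minimal attribute subsets preserving the decision.
    Brute force for illustration; the paper uses PSO to search this space."""
    n, reducts = len(rows[0]), []
    for k in range(1, n + 1):
        for attrs in combinations(range(n), k):
            if (preserves_decision(rows, decisions, attrs)
                    and not any(set(r) <= set(attrs) for r in reducts)):
                reducts.append(attrs)
    return reducts

# Toy decision table: three invented binary factors, binary decision.
rows = [(0, 0, 1), (0, 1, 0), (1, 0, 1), (1, 1, 0)]
decisions = [0, 1, 0, 1]
reducts = all_reducts(rows, decisions)   # two reducts -> multi-knowledge
```

Here attributes 1 and 2 each preserve the decision on their own, so the table yields two independent rule sets rather than one.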

Relevance: 30.00%

Abstract:

BACKGROUND AND OBJECTIVE: Human research ethics committees provide essential review of research projects to ensure the ethical conduct of human research. Several recent reports have highlighted a complex process for successful application for human research ethics committee approval, particularly for multi-centre studies. Limited resources are available for the execution of human clinical research in Australia and around the world.

METHODS: This report overviews the process of ethics approval for a National Health and Medical Research Council-funded multi-centre study in Australia, focussing on the time and resource implications of such applications in 2007 and 2008.

RESULTS: Applications were submitted to 16 hospital and two university human research ethics committees. The total time to gain final approval from each committee ranged between 13 and 77 days (median = 46 days); the entire process took 16 months to complete and the research officer's time was estimated to cost $A34 143.

CONCLUSIONS: Obstacles to timely human research ethics committee approval are reviewed, including recent, planned and potential initiatives that could improve the ethics approval of multi-centre research.

Relevance: 30.00%

Abstract:

The proposed multi-table lookup architecture provides SDN-based, high-performance packet classification in an OpenFlow v1.1+ SDN switch. The objective of the demonstration is to show the functionality of the architecture deployed on the NetFPGA SUME Platform.
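A multi-table lookup pipeline in the OpenFlow v1.1+ style can be sketched in a few lines: each table matches header fields and either emits actions or forwards the packet to a later table. The field names, table contents and actions here are illustrative, not the architecture's actual tables:

```python
def classify(packet, tables):
    """Walk the table pipeline: on a match, accumulate actions and follow
    goto_table (None means stop); on a table miss, drop the packet."""
    table_id, actions = 0, []
    while table_id is not None:
        for match, acts, goto in tables[table_id]:
            if all(packet.get(k) == v for k, v in match.items()):
                actions += acts
                table_id = goto
                break
        else:
            return ["drop"]   # table miss
    return actions

# Two-table example: table 0 classifies by EtherType, table 1 by IP dst.
tables = {
    0: [({"eth_type": 0x0800}, [], 1)],                  # IPv4 -> table 1
    1: [({"ip_dst": "10.0.0.1"}, ["output:1"], None)],   # forward on port 1
}
pkt = {"eth_type": 0x0800, "ip_dst": "10.0.0.1"}
# classify(pkt, tables) -> ["output:1"]
```

The hardware architecture pipelines these lookups rather than iterating, but the classification semantics are the same.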

Relevance: 30.00%

Abstract:

OBJECTIVE: To demonstrate the benefit of complexity metrics such as the modulation complexity score (MCS) and monitor units (MUs) in multi-institutional audits of volumetric-modulated arc therapy (VMAT) delivery.

METHODS: 39 VMAT treatment plans were analysed using MCS and MU. A virtual phantom planning exercise was planned and independently measured using the PTW Octavius® phantom and seven29® 2D array (PTW-Freiburg GmbH, Freiburg, Germany). MCS and MU were compared with the median gamma index pass rates (2%/2 mm and 3%/3 mm) and plan quality. The treatment planning systems (TPS) were grouped by whether the VMAT modelling was specifically designed for the linear accelerator manufacturer's own treatment delivery system (Type 1) or independent of vendor for VMAT delivery (Type 2). Differences in plan complexity (MCS and MU) between TPS types were compared.

RESULTS: For Varian® linear accelerators (Varian Medical Systems, Inc., Palo Alto, CA), MCS and MU were significantly correlated with gamma pass rates. Type 2 TPS created poorer quality, more complex plans with significantly higher MUs and MCS than Type 1 TPS. Plan quality was significantly correlated with MU for Type 2 plans. A statistically significant correlation was observed between MU and MCS for all plans (R = -0.84, p < 0.01).
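The reported MU–MCS relationship is a plain Pearson correlation; computing one is straightforward (the plan data below are invented to mimic the reported inverse relationship, not the audit's measurements):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two samples."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xm, ym = x - x.mean(), y - y.mean()
    return float((xm @ ym) / np.sqrt((xm @ xm) * (ym @ ym)))

# Invented plan data: higher monitor units, lower (more complex) MCS.
mu  = np.array([400, 520, 610, 700, 820, 950], dtype=float)  # monitor units
mcs = np.array([0.45, 0.40, 0.34, 0.30, 0.24, 0.18])         # complexity score
r = pearson_r(mu, mcs)   # strongly negative, as in the audit
```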

CONCLUSION: MU and MCS have a role in assessing plan complexity in audits along with plan quality metrics. Plan complexity metrics give some indication of plan deliverability but should be analysed with plan quality.

ADVANCES IN KNOWLEDGE: Complexity metrics were investigated for a national rotational audit involving 34 institutions and they showed value. The metrics found that more complex plans were created for planning systems which were independent of vendor for VMAT delivery.

Relevance: 30.00%

Abstract:

Field programmable gate array (FPGA) technology is a powerful platform for implementing computationally complex digital signal processing (DSP) systems. Applications that are multi-modal, however, are typically designed for worst-case conditions. In this paper, genetic sequencing techniques are applied to give a more sophisticated decomposition of the algorithmic variations, thus allowing a unified hardware architecture which gives a 10-25% area saving and a 15% power saving for a digital radar receiver.