247 results for D-optimal design


Relevance: 30.00%

Publisher:

Abstract:

Objective: To compare the location and accessibility of current Australian chronic heart failure (CHF) management programs and general practice services with the probable distribution of the population with CHF. Design and setting: Data on the prevalence and distribution of the CHF population throughout Australia, and the locations of CHF management programs and general practice services from 1 January 2004 to 31 December 2005 were analysed using geographic information systems (GIS) technology. Outcome measures: Distance of populations with CHF to CHF management programs and general practice services. Results: The highest prevalence of CHF (20.3–79.8 per 1000 population) occurred in areas with high concentrations of people over 65 years of age and in areas with higher proportions of Indigenous people. Five thousand CHF patients (8%) discharged from hospital in 2004–2005 were managed in one of the 62 identified CHF management programs. There were no CHF management programs in the Northern Territory or Tasmania. Only four CHF management programs were located outside major cities, with a total case load of 80 patients (0.7%). The mean distance from any Australian population centre to the nearest CHF management program was 332 km (median, 163 km; range, 0.15–3246 km). In rural areas, where the burden of CHF management falls upon general practitioners, the mean distance to general practice services was 37 km (median, 20 km; range, 0–656 km). Conclusion: There is an inequity in the provision of CHF management programs to rural Australians.
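
The distances reported above (a mean of 332 km to the nearest CHF management program) are nearest-facility distances computed with GIS tools. A minimal sketch of that kind of calculation using the haversine great-circle formula is given below; the coordinates are illustrative placeholders, not the study's data, and the study's own GIS workflow is not reproduced here.

```python
# Sketch: nearest-facility distance by the great-circle (haversine) formula.
# Coordinates below are illustrative placeholders, not the study's data.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_program_km(centre, programs):
    """Distance from one population centre to its closest CHF program."""
    return min(haversine_km(*centre, *p) for p in programs)

population_centres = [(-27.47, 153.03), (-23.70, 133.88)]   # placeholder coordinates
chf_programs = [(-27.50, 153.00), (-33.87, 151.21)]         # placeholder coordinates

distances = [nearest_program_km(c, chf_programs) for c in population_centres]
print(sorted(distances))  # summarise with mean/median as in the study
```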

Relevance: 30.00%

Publisher:

Abstract:

This paper investigates the field programmable gate array (FPGA) approach for multi-objective and multi-disciplinary design optimisation (MDO) problems. One class of optimisation methods that has been well studied and established for large and complex problems, such as those inherent in MDO, is multi-objective evolutionary algorithms (MOEAs). The MOEA nondominated sorting genetic algorithm II (NSGA-II) is implemented in hardware on an FPGA chip. Applying the FPGA-based NSGA-II to multi-objective test problem suites verified the effectiveness of the implementation. Results show that NSGA-II on FPGA is three orders of magnitude better than the PC-based counterpart.
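
The ranking step at the core of NSGA-II is fast non-dominated sorting, which partitions the population into successive Pareto fronts before selection. A minimal software sketch of that step is shown below for orientation (minimisation assumed, illustrative objective vectors); the paper's contribution is a hardware implementation of the full algorithm on an FPGA, which is not reproduced here.

```python
# Sketch: fast non-dominated sorting, the ranking step at the heart of NSGA-II.
# Objective vectors are illustrative; minimisation is assumed for both objectives.

def dominates(a, b):
    """True if solution a is at least as good as b everywhere and better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(objs):
    """Return a list of fronts (lists of indices), front 0 being the Pareto front."""
    n = len(objs)
    dominated_by = [[] for _ in range(n)]   # solutions that i dominates
    counts = [0] * n                        # how many solutions dominate i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if dominates(objs[i], objs[j]):
                dominated_by[i].append(j)
            elif dominates(objs[j], objs[i]):
                counts[i] += 1
        if counts[i] == 0:
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in dominated_by[i]:
                counts[j] -= 1
                if counts[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
        k += 1
    return fronts[:-1]

points = [(1.0, 4.0), (2.0, 2.0), (3.0, 1.0), (4.0, 3.0)]
print(non_dominated_sort(points))  # [[0, 1, 2], [3]]
```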

Relevance: 30.00%

Publisher:

Abstract:

This paper investigates a complex aerodynamic design problem, a High Lift System (HLS), using Particle Swarm Optimisation (PSO) coupled to game strategies. Two optimisation methods are used: the first is a standard PSO based on Pareto dominance, and the second hybridises PSO with the well-known Nash game strategy (Hybrid-PSO). These optimisation techniques are coupled to the pre/post-processor GiD, which provides unstructured meshes during the optimisation procedure, and to the transonic analysis software PUMI. The computational efficiency and design quality obtained by PSO and Hybrid-PSO are compared. The numerical results for the multi-objective HLS design optimisation clearly show the benefits of hybridising PSO with the Nash game and make the methodology promising for other, more complex multi-physics optimisation problems in aeronautics.
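
For orientation, the canonical particle swarm update that underlies both optimisers compared in the paper is sketched below in a plain single-objective form. The coefficient values are typical textbook choices, not the paper's settings, and the Pareto-dominance handling, Nash game coupling, GiD meshing and PUMI flow analysis are not reproduced.

```python
# Sketch: canonical PSO velocity/position update on a toy objective.
# Inertia/cognitive/social coefficients are typical textbook values, not the paper's.
import random

def pso(f, dim=2, n_particles=20, iters=200, bounds=(-5.0, 5.0)):
    lo, hi = bounds
    x = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in x]
    gbest = min(pbest, key=f)
    w, c1, c2 = 0.7, 1.5, 1.5          # inertia, cognitive, social weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                v[i][d] = (w * v[i][d]
                           + c1 * r1 * (pbest[i][d] - x[i][d])
                           + c2 * r2 * (gbest[d] - x[i][d]))
                x[i][d] = min(hi, max(lo, x[i][d] + v[i][d]))
            if f(x[i]) < f(pbest[i]):
                pbest[i] = x[i][:]
        gbest = min(pbest, key=f)
    return gbest

sphere = lambda p: sum(c * c for c in p)
print(pso(sphere))  # converges towards the origin
```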

Relevance: 30.00%

Publisher:

Abstract:

A number of game strategies have been developed in past decades and used in the fields of economics, engineering, computer science, and biology due to their efficiency in solving design optimisation problems. In addition, research in multiobjective and multidisciplinary design optimisation has focused on developing a robust and efficient optimisation method that can produce a set of high-quality solutions with less computational time. In this paper, two optimisation techniques are considered: the first uses multifidelity hierarchical Pareto-optimality; the second combines the game strategies of Nash-equilibrium and Pareto-optimality. This paper shows how game strategies can be coupled to multiobjective evolutionary algorithms and robust design techniques to produce a set of high-quality solutions. Numerical results obtained from both optimisation methods are compared in terms of computational expense and model quality. The benefits of using Hybrid-Game and non-Hybrid-Game strategies are demonstrated.
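
One way to picture the Nash-game component: each "player" optimises its own subset of design variables while the other players' variables are frozen, and the players iterate until no one can improve unilaterally. The toy sketch below illustrates that fixed-point iteration on two invented quadratic objectives; it is only a schematic of the game mechanism, not the paper's hybrid coupling with evolutionary search.

```python
# Sketch: two-player Nash iteration. Player 1 controls x, player 2 controls y;
# each minimises its own cost with the other's variable held fixed.
# Objectives are illustrative quadratics, not the paper's design problem.

def best_response_scan(cost, lo=-2.0, hi=2.0, steps=401):
    """Crude 1-D best response by grid scan (stands in for each player's optimiser)."""
    grid = [lo + (hi - lo) * k / (steps - 1) for k in range(steps)]
    return min(grid, key=cost)

def nash_iterate(iters=50):
    x, y = 1.5, -1.5                       # arbitrary starting strategies
    for _ in range(iters):
        x = best_response_scan(lambda x_: (x_ - 1.0) ** 2 + 0.5 * x_ * y)   # player 1's cost
        y = best_response_scan(lambda y_: (y_ + 1.0) ** 2 + 0.5 * x * y_)   # player 2's cost
        # At equilibrium neither update changes x or y: no unilateral improvement remains.
    return x, y

print(nash_iterate())  # converges to roughly (1.33, -1.33) for these toy costs
```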

Relevance: 30.00%

Publisher:

Abstract:

There are many applications in aeronautical/aerospace engineering where some values of the design parameters or states cannot be provided or determined accurately. These values can be related to the geometry (wingspan, length, angles) and/or to operational flight conditions that vary due to the presence of uncertain parameters (Mach number, angle of attack, air density and temperature, etc.). These uncertain design parameters cannot be ignored in engineering design and must be taken into account in the optimisation task to produce more realistic and reliable solutions. In this paper, a robust/uncertainty design method with statistical constraints is introduced to produce a set of reliable solutions which have high performance and low sensitivity. The robust design concept, coupled with Multi-Objective Evolutionary Algorithms (MOEAs), is defined by applying two statistical sampling formulas, the mean and the variance/standard deviation, to the optimisation fitness/objective functions. The methodology is based on a canonical evolution strategy and incorporates the concepts of hierarchical topology, parallel computing and asynchronous evaluation. It is implemented for two practical Unmanned Aerial System (UAS) design problems: the first case considers robust multi-objective (single-disciplinary: aerodynamics) design optimisation and the second considers robust multidisciplinary (aero-structural) design optimisation. Numerical results show that the solutions obtained by the robust design method with statistical constraints have more reliable performance and sensitivity in both aerodynamics and structures when compared to the baseline design.
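
The statistical-sampling idea can be sketched as follows: each candidate design is evaluated over samples of the uncertain operating conditions, and the mean and standard deviation of the objective become the robust fitness values (or statistical constraints). In the sketch below the performance function is a simple placeholder standing in for the aerodynamic or aero-structural analysis, and the uncertainty distributions are illustrative assumptions.

```python
# Sketch: robust evaluation of one candidate design under uncertain operating conditions.
# `performance` is a placeholder model; in the paper it would be an aerodynamic or
# aero-structural analysis of a UAS configuration.
import random
import statistics

def performance(design, mach, alpha):
    """Placeholder objective (to minimise), e.g. a drag-like quantity."""
    return (design - 0.5) ** 2 + 0.2 * (mach - 0.7) ** 2 + 0.05 * abs(alpha)

def robust_fitness(design, n_samples=200, seed=0):
    """Return (mean, std) of the objective over sampled uncertain Mach / angle of attack."""
    rng = random.Random(seed)
    samples = [performance(design,
                           rng.gauss(0.7, 0.02),      # uncertain Mach number
                           rng.gauss(2.0, 0.5))       # uncertain angle of attack (deg)
               for _ in range(n_samples)]
    return statistics.mean(samples), statistics.stdev(samples)

# A MOEA would treat the two statistics as objectives (or constrain the std),
# trading nominal performance against sensitivity.
print(robust_fitness(0.45))
```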

Relevance: 30.00%

Publisher:

Abstract:

The use of adaptive wing/aerofoil designs is considered a promising technique in aeronautics/aerospace, since such designs can reduce aircraft emissions and improve the aerodynamic performance of manned or unmanned aircraft. This paper investigates robust design and optimisation for one type of adaptive technique: an Active Flow Control (AFC) bump at transonic flow conditions on a Natural Laminar Flow (NLF) aerofoil designed to increase aerodynamic efficiency (especially a high lift-to-drag ratio). The concept of using a Shock Control Bump (SCB) is to control the supersonic flow on the suction/pressure side of the NLF aerofoil (RAE 5243), delaying shock occurrence or weakening its strength. Such an AFC technique reduces total drag at transonic speeds through a reduction in wave drag. The location of Boundary Layer Transition (BLT) can influence the position of the supersonic shock. The BLT position is an uncertainty in aerodynamic design due to many factors, such as surface contamination or erosion. This paper studies SCB shape design optimisation using robust Evolutionary Algorithms (EAs) with uncertainty in the BLT position. The optimisation method is based on a canonical evolution strategy and incorporates the concepts of hierarchical topology, parallel computing and asynchronous evaluation. Two test cases are conducted: the first assumes the BLT is at 45% of chord from the leading edge, and the second considers robust design optimisation of the SCB under variability in BLT position and lift coefficient. Numerical results show that the optimisation method coupled to uncertainty design techniques produces Pareto-optimal SCB shapes which have low sensitivity and high aerodynamic performance, with a significant reduction in total drag.
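
The bump itself is typically described by a handful of shape variables (height, crest location, width) added to the baseline aerofoil surface. The sketch below uses a Hicks-Henne-type sine bump as one common parameterisation an EA could search over; the parameter values are illustrative and are not taken from the paper.

```python
# Sketch: a Hicks-Henne-type sine bump, a common way to parameterise an SCB shape
# added to the aerofoil surface. Parameter values here are illustrative only.
import math

def hicks_henne_bump(x, height, x_peak, width):
    """Bump height at chordwise station x in [0, 1]; maximum of `height` at x_peak."""
    if x <= 0.0 or x >= 1.0:
        return 0.0
    m = math.log(0.5) / math.log(x_peak)     # places the sine maximum at x_peak
    return height * math.sin(math.pi * x ** m) ** width

# Three design variables an EA could search over, evaluated along the chord:
h, xp, w = 0.003, 0.55, 3.0   # height (fraction of chord), peak location, width exponent
profile = [(x / 100, hicks_henne_bump(x / 100, h, xp, w)) for x in range(101)]
print(max(profile, key=lambda p: p[1]))      # peak sits at roughly x = 0.55
```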

Relevance: 30.00%

Publisher:

Abstract:

This paper describes a lead project currently underway through Australia’s Sustainable Built Environment National Research Centre evaluating diffusion mechanisms and impacts of R&D investment in the Australian built environment. Through a retrospective analysis of R&D investment trends and industry outcomes, and a prospective assessment of industry futures using strategic foresighting, a future-focussed industry R&D roadmap and pursuant policy guidelines will be developed. This research aims to build new understandings and knowledge relevant to R&D funding strategies, research team formation and management, dissemination of outcomes and industry uptake. Each of these issues is critical due to: the disaggregated nature of the built environment industry; intense competition; limited R&D investment; and new challenges (e.g. IT, increased environmental expectations). This paper details the context within which this project is being undertaken and the research design. Findings of the retrospective analysis of past R&D investment in Australia will be presented at this conference.

Relevance: 30.00%

Publisher:

Abstract:

In this paper we consider the implementation of time- and energy-efficient trajectories on a test-bed autonomous underwater vehicle. The trajectories are loosely connected to the results of applying the maximum principle to the controlled mechanical system. We use a numerical algorithm to compute efficient trajectories designed using geometric control theory to optimize a given cost function. Experimental results are shown for the time minimization problem.
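
The flavour of the time-minimisation result can be seen on the simplest controlled mechanical system: a double integrator with bounded thrust, for which the maximum principle yields a bang-bang control and a rest-to-rest minimum time of 2*sqrt(d/u_max) over a distance d. The sketch below works through that toy case; it is not the vehicle model or algorithm used on the test-bed AUV.

```python
# Sketch: bang-bang time-optimal rest-to-rest motion of a double integrator,
# x'' = u with |u| <= u_max. The maximum principle gives full thrust, then full
# braking, with one switch at mid-distance; minimum time is 2*sqrt(d / u_max).
import math

def min_time(d, u_max):
    """Analytic minimum time for a rest-to-rest translation of distance d."""
    return 2.0 * math.sqrt(d / u_max)

def simulate_bang_bang(d, u_max, dt=1e-4):
    """Integrate the switched control and confirm it reaches d near the analytic time."""
    t_switch = min_time(d, u_max) / 2.0
    x, v, t = 0.0, 0.0, 0.0
    while v > 0.0 or t == 0.0:
        u = u_max if t < t_switch else -u_max
        v += u * dt
        x += v * dt
        t += dt
    return t, x

print(min_time(10.0, 0.5))            # about 8.944 seconds
print(simulate_bang_bang(10.0, 0.5))  # roughly (8.944, 10.0)
```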

Relevance: 30.00%

Publisher:

Abstract:

Emerging from the challenge to reduce energy consumption in buildings is a need for research and development into the more effective use of simulation as a decision-support tool. Despite significant research, persistent limitations in process and software inhibit the integration of energy simulation in early architectural design. This paper presents a Green Star case study to highlight the obstacles commonly encountered with current integration strategies. It then examines simulation-based design in the aerospace industry, which has overcome similar limitations. Finally, it proposes a design system based on this contrasting approach, coupling parametric modelling and energy simulation software for rapid and iterative performance assessment of early design options.
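
The proposed design system amounts to a tight loop: a parametric model generates design variants, each variant is pushed through an energy simulation, and the ranked results feed the next iteration. The sketch below illustrates that loop with a placeholder simulate_energy function; the actual parametric modelling and simulation tools are not named here, so all function names and parameter ranges are illustrative assumptions.

```python
# Sketch: rapid iterative assessment of early design options.
# `simulate_energy` is a placeholder; in practice it would call an energy
# simulation engine on a model exported from the parametric modeller.
from itertools import product

def simulate_energy(window_ratio, orientation_deg, shade_depth_m):
    """Placeholder annual-energy model (arbitrary units), NOT a real engine."""
    solar_gain = window_ratio * (1.0 + 0.3 * abs(orientation_deg - 180) / 180)
    return 100.0 + 40.0 * solar_gain - 15.0 * shade_depth_m * window_ratio

# Parameter ranges a designer might sweep at concept stage:
window_ratios = [0.2, 0.4, 0.6]
orientations = [0, 90, 180, 270]
shade_depths = [0.0, 0.3, 0.6]

results = [(simulate_energy(w, o, s), (w, o, s))
           for w, o, s in product(window_ratios, orientations, shade_depths)]
best_energy, best_option = min(results)
print(best_option, best_energy)   # feed the ranking back into the next design iteration
```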

Relevance: 30.00%

Publisher:

Abstract:

Proteases regulate a spectrum of diverse physiological processes, and dysregulation of proteolytic activity drives a plethora of pathological conditions. Understanding protease function is essential to appreciating many aspects of normal physiology and progression of disease. Consequently, development of potent and specific inhibitors of proteolytic enzymes is vital to provide tools for the dissection of protease function in biological systems and for the treatment of diseases linked to aberrant proteolytic activity. The studies in this thesis describe the rational design of potent inhibitors of three proteases that are implicated in disease development. Additionally, key features of the interaction of proteases and their cognate inhibitors or substrates are analysed and a series of rational inhibitor design principles are expounded and tested. Rational design of protease inhibitors relies on a comprehensive understanding of protease structure and biochemistry. Analysis of known protease cleavage sites in proteins and peptides is a commonly used source of such information. However, model peptide substrate and protein sequences have widely differing levels of backbone constraint and hence can adopt highly divergent structures when binding to a protease’s active site. This may result in identical sequences in peptides and proteins having different conformations and diverse spatial distribution of amino acid functionalities. Regardless of this, protein and peptide cleavage sites are often regarded as being equivalent. One of the key findings in the following studies is a definitive demonstration of the lack of equivalence between these two classes of substrate and invalidation of the common practice of using the sequences of model peptide substrates to predict cleavage of proteins in vivo. Another important feature for protease substrate recognition is subsite cooperativity. This type of cooperativity is commonly referred to as protease or substrate binding subsite cooperativity and is distinct from allosteric cooperativity, where binding of a molecule distant from the protease active site affects the binding affinity of a substrate. Subsite cooperativity may be intramolecular where neighbouring residues in substrates are interacting, affecting the scissile bond’s susceptibility to protease cleavage. Subsite cooperativity can also be intermolecular where a particular residue’s contribution to binding affinity changes depending on the identity of neighbouring amino acids. Although numerous studies have identified subsite cooperativity effects, these findings are frequently ignored in investigations probing subsite selectivity by screening against diverse combinatorial libraries of peptides (positional scanning synthetic combinatorial library; PS-SCL). This strategy for determining cleavage specificity relies on the averaged rates of hydrolysis for an uncharacterised ensemble of peptide sequences, as opposed to the defined rate of hydrolysis of a known specific substrate. Further, since PS-SCL screens probe the preference of the various protease subsites independently, this method is inherently unable to detect subsite cooperativity. However, mean hydrolysis rates from PS-SCL screens are often interpreted as being comparable to those produced by single peptide cleavages. 
Before this study, no large systematic evaluation had been made to determine the level of correlation between protease selectivity as predicted by screening against a library of combinatorial peptides and cleavage of individual peptides. This subject is specifically explored in the studies described here. In order to establish whether PS-SCL screens could accurately determine the substrate preferences of proteases, a systematic comparison of data from PS-SCLs with libraries containing individually synthesised peptides (sparse matrix library; SML) was carried out. These SML libraries were designed to include all possible sequence combinations of the residues that were suggested to be preferred by a protease using the PS-SCL method. SML screening against the three serine proteases kallikrein 4 (KLK4), kallikrein 14 (KLK14) and plasmin revealed highly preferred peptide substrates that could not have been deduced by PS-SCL screening alone. Comparing protease subsite preference profiles from screens of the two types of peptide libraries showed that the most preferred substrates were not detected by PS-SCL screening as a consequence of intermolecular cooperativity being negated by the very nature of PS-SCL screening. Sequences that are highly favoured as a result of intermolecular cooperativity achieve optimal protease subsite occupancy, and thereby interact with very specific determinants of the protease. Identifying these substrate sequences is important since they may be used to produce potent and selective inhibitors of proteolytic enzymes. This study found that highly favoured substrate sequences that relied on intermolecular cooperativity allowed for the production of potent inhibitors of KLK4, KLK14 and plasmin. Peptide aldehydes based on preferred plasmin sequences produced high-affinity transition-state analogue inhibitors for this protease. The most potent of these maintained specificity over plasma kallikrein (known to have a very similar substrate preference to plasmin). Furthermore, the efficiency of this inhibitor in blocking fibrinolysis in vitro was comparable to aprotinin, which previously saw clinical use to reduce perioperative bleeding. One substrate sequence particularly favoured by KLK4 was substituted into the 14-amino-acid circular sunflower trypsin inhibitor (SFTI). This resulted in a highly potent and selective inhibitor (SFTI-FCQR) which attenuated protease-activated receptor signalling by KLK4 in vitro. Moreover, SFTI-FCQR and paclitaxel synergistically reduced growth of ovarian cancer cells in vitro, making this inhibitor a lead compound for further therapeutic development. Similar incorporation of a preferred KLK14 amino acid sequence into the SFTI scaffold produced a potent inhibitor for this protease. However, the conformationally constrained SFTI backbone enforced a different intramolecular cooperativity, which masked a KLK14-specific determinant. As a consequence, the level of selectivity achievable was lower than that found for the KLK4 inhibitor. Standard mechanism inhibitors such as SFTI rely on a stable acyl-enzyme intermediate for high-affinity binding. This is achieved by a conformationally constrained canonical binding loop that allows for reformation of the scissile peptide bond after cleavage. Amino acid substitutions within the inhibitor to target a particular protease may compromise structural determinants that support the rigidity of the binding loop and thereby prevent the engineered inhibitor from reaching its full potential.
An in silico analysis was carried out to examine the potential for further improvements to the potency and selectivity of the SFTI-based KLK4 and KLK14 inhibitors. Molecular dynamics simulations suggested that the substitutions within SFTI required to target KLK4 and KLK14 had compromised the intramolecular hydrogen bond network of the inhibitor and caused a concomitant loss of binding loop stability. Furthermore, in silico amino acid substitution revealed that SFTI variants which formed internal hydrogen bonds more frequently and in greater numbers consistently had lower inhibition constants. These predictions allowed for the production of second-generation inhibitors with enhanced binding affinity toward both targets and highlight the importance of considering intramolecular cooperativity effects when engineering proteins or circular peptides to target proteases. The findings from this study show that although PS-SCLs are a useful tool for high-throughput screening of approximate protease preference, later refinement by SML screening is needed to reveal optimal subsite occupancy due to cooperativity in substrate recognition. This investigation has also demonstrated the importance of maintaining structural determinants of backbone constraint and conformation when engineering standard mechanism inhibitors for new targets. Combined, these results show that backbone conformation and amino acid cooperativity have more prominent roles than previously appreciated in determining substrate/inhibitor specificity and binding affinity. The three key inhibitors designed during this investigation are now being developed as lead compounds for cancer chemotherapy, control of fibrinolysis and cosmeceutical applications. These compounds form the basis of a portfolio of intellectual property which will be further developed in the coming years.
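
The central methodological point, that positional-scanning (PS-SCL) screens report subsite preferences averaged over an ensemble and therefore cannot detect intermolecular cooperativity, can be illustrated with a toy two-subsite example: when the preferred residues are coupled, every residue looks equally good once the other position is averaged out. The rates below are invented for illustration only.

```python
# Toy illustration (invented rates): two subsites, residues A/B at each.
# The combinations (A, B) and (B, A) are strongly preferred (cooperative), but
# positional averaging, as in a PS-SCL screen, cannot distinguish them.

# Hypothetical relative hydrolysis rates for each individually defined peptide:
rates = {("A", "A"): 1.0, ("A", "B"): 10.0, ("B", "A"): 10.0, ("B", "B"): 1.0}

# PS-SCL-style readout: fix one position, average over the other.
for pos in (0, 1):
    for res in ("A", "B"):
        avg = sum(r for seq, r in rates.items() if seq[pos] == res) / 2
        print(f"position {pos + 1}, residue {res}: averaged rate {avg}")
# Every residue averages to 5.5 at both positions, so the screen suggests no
# preference, while the individual (sparse-matrix-style) measurements show the
# cooperative pairs are tenfold better than the others.
print(max(rates, key=rates.get))
```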

Relevance: 30.00%

Publisher:

Abstract:

This paper summarises some of the recent studies on learning approaches that have utilised some form of Web 2.0 service in curriculum design to enhance learning. A generic implementation model of this integration is then presented to illustrate the overall learning implementation process. Recently, the integration of Web 2.0 technologies into the learning curriculum has begun to gain wide acceptance among teaching instructors across various higher learning institutions. This is evidenced by numerous studies reporting the implementation of a range of Web 2.0 technologies in learning design to improve learning delivery. Moreover, recent studies have also shown that current students embrace Web 2.0 technologies more readily than existing learning technologies. Despite various attempts by teachers at such integration, researchers have noted the lack of an integration standard to guide curriculum design. The absence of such a standard restricts the adaptation of Web 2.0 into learning and adds complexity to providing meaningful learning. Therefore, this paper attempts to draw a conceptual integration model that reflects how learning activities facilitated by Web 2.0 are currently being implemented. The design of this model is based on experiences shared by many scholars as well as feedback gathered from two separate surveys of teachers and a group of 180 students. Furthermore, this paper identifies some key components generally involved in the design of Web 2.0 teaching and learning that need to be addressed. The paper is organised as follows. The first part introduces the importance of Web 2.0 implementation in teaching and learning from the perspective of higher education institutions and the challenges surrounding this area. The second part summarises related work in this field and brings forward the concept of designing learning with the incorporation of Web 2.0 technology. The next part presents the results of analysis derived from the two surveys of students and teachers on using Web 2.0 during learning activities. The paper concludes by presenting a model that reflects several key entities that may be involved in learning design.

Relevance: 30.00%

Publisher:

Abstract:

Local climate is a critical element in the design of buildings. In this paper, ten years of historical weather data for all eight of Australia's capital cities are analyzed to characterize the variation profiles of climatic variables. The method of descriptive statistics is employed. The pattern of cumulative distribution and/or the profile of percentage distribution is used to graphically illustrate the similarities and differences between the study locations. It is found that although the weather variables vary with location, there is often a good, nearly linear relation between a weather variable and its cumulative percentage over the majority of the middle part of the distribution, with the exception of the extreme parts. The implications of these extreme parts and of the slopes of the middle parts for building design are also discussed.
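
The descriptive-statistics step described above, plotting each weather variable against its cumulative percentage and inspecting the near-linear middle band and the extreme tails, is straightforward to reproduce. The sketch below does this on synthetic hourly temperatures standing in for the ten-year capital-city records.

```python
# Sketch: cumulative-percentage profile of a weather variable.
# Synthetic hourly temperatures stand in for the ten-year capital-city records.
import random

random.seed(1)
temps = [random.gauss(18.0, 6.0) for _ in range(10 * 365 * 24)]  # synthetic data

temps.sort()
n = len(temps)
cumulative = [(t, 100.0 * (i + 1) / n) for i, t in enumerate(temps)]

# Report the extreme tails and a few points of the middle band, whose
# near-linear slope is what the paper relates to design conditions.
for pct in (1, 5, 25, 50, 75, 95, 99):
    t, _ = cumulative[int(pct / 100.0 * (n - 1))]
    print(f"{pct:>2d}% of hours are at or below {t:.1f} °C")
```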

Relevance: 30.00%

Publisher:

Abstract:

This report provides an overview of findings of qualitative research comprising three case studies undertaken as a part of the retrospective analysis component of Sustainable Built Environment National Research Centre (SBEnrc) Project 2.7 Leveraging R&D investment for the Australian Built Environment. These case studies (see Parts 2, 3 and 4 of this suite of reports) were undertaken to illustrate the nature of past R&D investments in Australia. This was done to complement: (i) the audit and analysis of past R&D investment undertaken by Thomas Barlow (2011); and (ii) the Construction 2030 roadmap being developed by Swinburne University of Technology and Professor Göran Roos from VTT Technical Research Centre of Finland. These documents will be the basis for the final phase of the present project - developing policy guidelines for future R&D investment in the Australian built environment. Refer also to Parts 1, 2 and 3 for detailed findings.

Relevance: 30.00%

Publisher:

Abstract:

This report discusses findings of a case study into "CADD, BIM and IPD" undertaken as a part of the retrospective analysis component of Sustainable Built Environment National Research Centre (SBEnrc) Project 2.7 Leveraging R&D investment for the Australian Built Environment. This case study investigated the evolution that has taken place in the Queensland Department of Public Works Division of Project Services during the last 20 years: from the initial implementation of computer-aided design and documentation (CADD); to experimentation with building information modelling (BIM) from the mid-2000s; to embedding integrated practice (IP); to current steps towards integrated project delivery (IPD) with the integration of contractors in the design/delivery process. This case study should be read in conjunction with Part 1 of this suite of reports.