259 results for Minimisation


Relevance:

10.00%

Publisher:

Abstract:

Hybrid simulation is a technique that combines experimental and numerical testing and has been used for the past few decades in the fields of aerospace, civil and mechanical engineering. During this time, most of the research has focused on developing algorithms and the necessary technology, including but not limited to error minimisation techniques, phase-lag compensation and faster hydraulic cylinders. However, one of the main shortcomings of hybrid simulation that has prevented its widespread use is the size of the numerical models and the effect that higher frequencies may have on the stability and accuracy of the simulation. The first chapter of this document provides an overview of the hybrid simulation method, the different hybrid simulation schemes, and the corresponding time integration algorithms most commonly used in this field. The scope of this thesis is presented in more detail in chapter 2: a substructure algorithm, the Substep Force Feedback (Subfeed), is adapted to fulfil the necessary requirements in terms of speed. The effects of more complex models on the Subfeed are also studied in detail, and the improvements made are validated experimentally. Chapters 3 and 4 detail the methodologies used to accomplish these objectives, listing the different case studies and the hardware and software used to validate them experimentally. The third chapter contains a brief introduction to a project, the DFG Subshake, whose data have been used as a starting point for the developments shown later in this thesis. The results obtained are presented in chapters 5 and 6, the first focusing on purely numerical simulations and the second on a more practical application, including experimental real-time hybrid simulation tests with large numerical models.
Chapter 7 lists the hardware and software requirements that must be met in order to apply the methods described in this document. The last chapter, chapter 8, focuses on the conclusions and achievements drawn from the results, namely: the adaptation of the hybrid simulation algorithm Subfeed for use with large numerical models, the study of the effect of high frequencies on the substructure algorithm, and experimental real-time hybrid simulation tests with vibrating subsystems using large numerical models and shake tables. A brief discussion of possible future research activities can also be found in the concluding chapter.
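The substructured stepping idea at the heart of real-time hybrid simulation can be illustrated in a few lines. The sketch below is not the Subfeed algorithm itself; it is a generic hybrid loop, assuming a single-degree-of-freedom numerical substructure integrated with the central-difference scheme, with the physical substructure emulated by a stiffness callback standing in for measured force feedback. All parameter values are illustrative.

```python
def hybrid_simulation(m, c, k_num, k_phys, dt, n_steps, load):
    """Central-difference integration with an emulated measured force."""
    u_prev, u = 0.0, 0.0           # displacements at steps i-1 and i
    history = [u]
    for i in range(1, n_steps):
        f_num = k_num * u          # numerical substructure restoring force
        f_meas = k_phys * u        # stand-in for the measured force feedback
        # central difference: m*(u+ - 2u + u-)/dt^2 + c*(u+ - u-)/(2dt) + f = p
        lhs = m / dt**2 + c / (2 * dt)
        rhs = (load(i * dt) - f_num - f_meas
               + (2 * m / dt**2) * u
               - (m / dt**2 - c / (2 * dt)) * u_prev)
        u_prev, u = u, rhs / lhs
        history.append(u)
    return history

# Illustrative parameters only: unit mass, light damping, a short pulse load.
resp = hybrid_simulation(m=1.0, c=0.1, k_num=50.0, k_phys=50.0,
                         dt=0.001, n_steps=2000,
                         load=lambda t: 1.0 if t < 0.01 else 0.0)
```

The central-difference step is only conditionally stable; with the illustrative stiffnesses above the critical step is about 0.2 s, so dt = 0.001 s is comfortably inside the stable range.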

Relevance:

10.00%

Publisher:

Abstract:

Anaphylaxis is a serious, rare condition increasing in prevalence. This study explored the psychological experience of adult-onset anaphylaxis from patient, family and staff perspectives. Semi-structured interviews were conducted with twelve participants. Two global themes emerged from thematic analysis: ‘controllability’ (‘an unknown and distressing experience’, ‘the importance of control over triggers’ and ‘responsibility but no control: the impact on others’) and ‘conflict’ (‘rejecting illness identity’, ‘minimisation of risk’, ‘accessing specialist care: running in slow motion’ and ‘patient-centred versus service-centred care’). Findings highlight the importance of perceived control and emphasise the presence of conflict in the experience of this complex, episodic condition.

Relevance:

10.00%

Publisher:

Abstract:

The battery energy storage system (BESS) offers formidable advantages in the fields of electrical energy generation, transmission, distribution and consumption. This technology is notably being considered by several operators around the world both as a new device for injecting large quantities of renewable energy and as an essential component of large power grids. Moreover, substantial benefits can be associated with deploying BESS technology, both in smart grids and for reducing greenhouse gas emissions, reducing marginal losses, supplying certain consumers with an emergency energy source, improving energy management, and increasing energy efficiency in networks. This thesis comprises three stages: Stage 1 concerns the use of the BESS to reduce electrical losses; Stage 2 uses the BESS as a spinning-reserve element to mitigate the vulnerability of the grid; and Stage 3 introduces a new method for mitigating frequency oscillations by modulating reactive power, together with the use of the BESS to provide the primary frequency reserve. The first stage, on the use of the BESS for loss reduction, is itself subdivided into two sub-stages, the first devoted to optimal allocation and the second to optimal use.
In the first sub-stage, the NSGA-II genetic algorithm (Non-dominated Sorting Genetic Algorithm II) was programmed on CASIR, IREQ's supercomputer, as a multi-objective evolutionary algorithm, to extract a set of solutions for the optimal sizing and suitable placement of multiple BESS units, with the minimisation of power losses and the total installed BESS power capacity considered simultaneously as objective functions. This first sub-stage gives a satisfactory answer to the allocation question and also resolves the scheduling question for the Québec interconnection. To achieve the objective of the second sub-stage, a number of solutions were retained and implemented over a one-year interval, taking into account the parameters (hour, capacity, efficiency, power factor) associated with the BESS charge and discharge cycles, with the reduction of marginal losses and energy efficiency as the main objectives. In the second stage, a new vulnerability index, well suited to modern grids equipped with BESS, was introduced, formalised and studied. The NSGA-II genetic algorithm was then run again, with the minimisation of the proposed vulnerability index and energy efficiency as the main objectives. The results obtained show that the use of the BESS can, in some cases, prevent major grid outages. The third stage presents a new concept for adding virtual inertia to power grids through reactive-power modulation, followed by the use of the BESS as a primary frequency reserve. Finally, a generic BESS model, associated with the Québec interconnection, was proposed in a MATLAB environment.
The simulation results confirm that both the active and the reactive power of the BESS can be used for frequency regulation.
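The NSGA-II runs described above rest on non-dominated sorting, which is standard and can be sketched compactly. The objective pairs below (power losses, installed BESS capacity, both to be minimised) are invented stand-ins, not data from the thesis.

```python
def dominates(a, b):
    """True if solution a dominates b (all objectives <=, at least one <)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(objectives):
    """Partition solutions into successive Pareto fronts (lists of indices)."""
    fronts, remaining = [], set(range(len(objectives)))
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(objectives[j], objectives[i])
                            for j in remaining if j != i)]
        fronts.append(sorted(front))
        remaining -= set(front)
    return fronts

# (losses_kW, installed_capacity_MWh) for five candidate BESS allocations
candidates = [(10.0, 5.0), (8.0, 6.0), (12.0, 4.0), (9.0, 7.0), (8.0, 6.0)]
print(non_dominated_sort(candidates))  # → [[0, 1, 2, 4], [3]]
```

The full algorithm additionally computes crowding distances within each front to preserve diversity; only the sorting step is shown here.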

Relevance:

10.00%

Publisher:

Abstract:

The central product of the DRAMA (Dynamic Re-Allocation of Meshes for parallel Finite Element Applications) project is a library comprising a variety of tools for dynamic re-partitioning of unstructured Finite Element (FE) applications. The input to the DRAMA library is the computational mesh, and corresponding costs, partitioned into sub-domains. The core library functions then perform a parallel computation of a mesh re-allocation that will re-balance the costs based on the DRAMA cost model. We discuss the basic features of this cost model, which allows a general approach to load identification, modelling and imbalance minimisation. Results from crash simulations are presented which show the necessity for multi-phase/multi-constraint partitioning components.
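The DRAMA cost model itself is not detailed in this abstract; as a sketch, the snippet below shows only a generic imbalance metric commonly used for partitioned FE meshes: the most loaded sub-domain's cost divided by the average cost. The element costs and partition are illustrative.

```python
def imbalance(costs, partition, n_parts):
    """Max part cost divided by average part cost (1.0 = perfect balance)."""
    totals = [0.0] * n_parts
    for cost, part in zip(costs, partition):
        totals[part] += cost
    avg = sum(totals) / n_parts
    return max(totals) / avg

element_costs = [1.0, 1.0, 2.0, 2.0, 3.0, 3.0]
partition     = [0,   0,   0,   1,   1,   1]   # element -> sub-domain
# part loads are 4 and 8, average 6, so the imbalance is 8/6 ≈ 1.33
print(imbalance(element_costs, partition, 2))
```

A re-partitioner such as the DRAMA library would migrate elements to drive this ratio towards 1.0, with multi-phase/multi-constraint variants balancing several such cost fields at once.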

Relevance:

10.00%

Publisher:

Abstract:

Virus and soil-borne pathogens negatively impact the production of potatoes in tropical highland and sub-tropical environments, limiting the supply of an increasingly popular and important vegetable in these regions. It is common for latently infected seed tubers or field-grown cuttings to be used as potato planting material. We utilised an International Potato Centre technique based on aeroponic technology to produce low-cost mini-tubers in tropical areas, and the system has been optimised for increased effectiveness in these areas. High numbers of seed tubers per m² were obtained in the first generation for the cultivars Sebago (630) and Nicola (>900), and the system is capable of producing five crops of standard cultivars every two years. Initial results indicate that quality seed could be produced by nurseries and farmers, thereby contributing to the minimisation of soil-borne diseases in an integrated management plan. This technology reduces seed production costs, benefiting seed and potato growers. © ISHS.

Relevance:

10.00%

Publisher:

Abstract:

People go through life making all kinds of decisions, and some of these decisions affect their demand for transportation: for example, their choices of where to live and where to work, how and when to travel, and which route to take. Transport-related choices are typically time dependent and characterized by a large number of alternatives that can be spatially correlated. This thesis deals with models that can be used to analyze and predict discrete choices in large-scale networks. The proposed models and methods are highly relevant for, but not limited to, transport applications. We model decisions as sequences of choices within the dynamic discrete choice framework, also known as parametric Markov decision processes. Such models are known to be difficult to estimate and to apply for prediction because dynamic programming problems need to be solved in order to compute choice probabilities. In this thesis we show that it is possible to exploit the network structure and the flexibility of dynamic programming so that the dynamic discrete choice modeling approach is not only useful for modeling time-dependent choices, but also makes it easier to model large-scale static choices. The thesis consists of seven articles containing a number of models and methods for estimating, applying and testing large-scale discrete choice models. In the following we group the contributions under three themes: route choice modeling, large-scale multivariate extreme value (MEV) model estimation, and nonlinear optimization algorithms. Five articles are related to route choice modeling. We propose different dynamic discrete choice models that allow paths to be correlated, based on the MEV and mixed logit models. The resulting route choice models become expensive to estimate, and we deal with this challenge by proposing innovative methods that reduce the estimation cost.
For example, we propose a decomposition method that not only opens up the possibility of mixing, but also speeds up the estimation of simple logit models, which also has implications for traffic simulation. Moreover, we compare the utility maximization and regret minimization decision rules, and we propose a misspecification test for logit-based route choice models. The second theme is related to the estimation of static discrete choice models with large choice sets. We establish that a class of MEV models can be reformulated as dynamic discrete choice models on the networks of correlation structures. These dynamic models can then be estimated quickly using dynamic programming techniques and an efficient nonlinear optimization algorithm. Finally, the third theme focuses on structured quasi-Newton techniques for estimating discrete choice models by maximum likelihood. We examine and adapt switching methods that can easily be integrated into standard optimization algorithms (line search and trust region) to accelerate the estimation process. The proposed dynamic discrete choice models and estimation methods can be used in various discrete choice applications. In the area of big data analytics, models that can deal with large choice sets and sequential choices are important. Our research can therefore be of interest in various demand analysis applications (predictive analytics) or can be integrated with optimization models (prescriptive analytics). Furthermore, our studies indicate the potential of dynamic programming techniques in this context, even for static models, which opens up a variety of future research directions.
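The dynamic-programming step behind dynamic discrete choice (recursive-logit-style) route choice models can be sketched on a toy network: the expected downstream utility ("value function") at each node satisfies a logsum recursion over outgoing links, and choice probabilities follow a logit form. The network and link utilities below are invented for illustration.

```python
import math

def value_functions(succ, util, dest, n_iter=100):
    """Fixed-point iteration for V(n) = log(sum_m exp(u(n, m) + V(m)))."""
    V = {n: 0.0 for n in succ}     # V(dest) = 0 by convention
    for _ in range(n_iter):
        for n in succ:
            if n == dest or not succ[n]:
                continue
            V[n] = math.log(sum(math.exp(util[(n, m)] + V[m])
                                for m in succ[n]))
    return V

def choice_probs(n, succ, util, V):
    """Logit probability of each outgoing link at node n."""
    w = {m: math.exp(util[(n, m)] + V[m]) for m in succ[n]}
    z = sum(w.values())
    return {m: w[m] / z for m in w}

succ = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
util = {("A", "B"): -1.0, ("A", "C"): -2.0, ("B", "D"): -1.0, ("C", "D"): -1.0}
V = value_functions(succ, util, dest="D")
print(choice_probs("A", succ, util, V))
```

On an acyclic network the recursion converges after a couple of sweeps; on cyclic networks the existence of this fixed point is one of the conditions such models must satisfy.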

Relevance:

10.00%

Publisher:

Abstract:

Nanotechnology has revolutionised humanity's capability to build microscopic systems by manipulating materials on a molecular and atomic scale. Nanosystems are becoming increasingly smaller and more chemically complex, which increases the demand for microscopic characterisation techniques. Among others, transmission electron microscopy (TEM) is an indispensable tool that is increasingly used to study the structures of nanosystems down to the molecular and atomic scale. However, despite the effectiveness of this tool, it can only provide 2-dimensional projection (shadow) images of the 3D structure, leaving the 3-dimensional information hidden, which can lead to incomplete or erroneous characterisation. One very promising inspection method is electron tomography (ET), which is rapidly becoming an important tool to explore the 3D nano-world. ET provides (sub-)nanometre resolution in all three dimensions of the sample under investigation. However, the fidelity of the ET tomogram achieved by current ET reconstruction procedures remains a major challenge. This thesis addresses the assessment and advancement of electron tomographic methods to enable high-fidelity three-dimensional investigations. A quality assessment investigation was conducted to provide a quantitative analysis of the main established ET reconstruction algorithms and to study the influence of the experimental conditions on the quality of the reconstructed ET tomogram. Regularly shaped nanoparticles were used as a ground truth for this study. It is concluded that the fidelity of the post-reconstruction quantitative analysis and segmentation is limited mainly by the fidelity of the reconstructed ET tomogram. This motivates the development of an improved tomographic reconstruction process. In this thesis, a novel ET method is proposed, named dictionary learning electron tomography (DLET).
DLET is based on the recent mathematical theory of compressed sensing (CS), which exploits the sparsity of ET tomograms to enable accurate reconstruction from undersampled (S)TEM tilt series. DLET learns the sparsifying transform (dictionary) in an adaptive way and reconstructs the tomogram simultaneously from highly undersampled tilt series. In this method, sparsity is applied on overlapping image patches, favouring local structures. Furthermore, the dictionary is adapted to the specific tomogram instance, thereby favouring better sparsity and consequently higher-quality reconstructions. The reconstruction algorithm is based on an alternating procedure that learns the sparsifying dictionary and employs it to remove artifacts and noise in one step, and then restores the tomogram data in the other step. Simulated and real ET experiments on several morphologies are performed with a variety of setups. The reconstruction results validate its efficiency in both noiseless and noisy cases and show that it yields an improved reconstruction quality with fast convergence. The proposed method enables the recovery of high-fidelity information without the need to worry about which sparsifying transform to select or whether the images used strictly follow the pre-conditions of a certain transform (e.g. strictly piecewise constant for Total Variation minimisation). It can also avoid artifacts that specific sparsifying transforms can introduce (e.g. the staircase artifacts that may result when using Total Variation minimisation). Moreover, this thesis shows how reliable elementally sensitive tomography using EELS is possible with the aid of both the appropriate use of dual electron energy loss spectroscopy (DualEELS) and the DLET compressed sensing algorithm, making the best use of the limited data volume and signal-to-noise ratio inherent in core-loss electron energy loss spectroscopy (EELS) from nanoparticles of an industrially important material.
Taken together, the results presented in this thesis demonstrate how high-fidelity ET reconstructions can be achieved using a compressed sensing approach.
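DLET alternates dictionary learning with data restoration, which is too long to reproduce here. As a minimal stand-in, the sketch below runs ISTA (iterative soft-thresholding) on the l1-regularised inverse problem min ½‖Ax − b‖² + λ‖x‖₁ that compressed-sensing reconstructions solve once a sparsifying dictionary is fixed. The measurement matrix A and data b are toy values, not tomography data.

```python
def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def transpose(M):
    return [list(col) for col in zip(*M)]

def soft(v, s):
    """Soft-thresholding: shrink each entry towards zero by s."""
    return [max(abs(x) - s, 0.0) * (1 if x >= 0 else -1) for x in v]

def ista(A, b, lam, step, n_iter):
    At = transpose(A)
    x = [0.0] * len(A[0])
    for _ in range(n_iter):
        r = [ax - bi for ax, bi in zip(matvec(A, x), b)]   # residual Ax - b
        g = matvec(At, r)                                  # gradient A^T r
        x = soft([xi - step * gi for xi, gi in zip(x, g)], step * lam)
    return x

A = [[1.0, 1.0, 0.0],
     [0.0, 1.0, 1.0]]          # underdetermined: 2 measurements, 3 unknowns
b = [2.0, 2.0]                 # generated by the sparse signal [0, 2, 0]
x = ista(A, b, lam=0.01, step=1.0 / 3.0, n_iter=5000)
print([round(v, 3) for v in x])
```

With the toy data the iterate settles near the sparse generator [0, 2, 0] (slightly shrunk by the l1 penalty), illustrating why sparsity enables recovery from fewer measurements than unknowns; DLET's contribution is to learn the sparsifying dictionary rather than fix it in advance.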

Relevance:

10.00%

Publisher:

Abstract:

Pyrolysis is the thermo-chemical conversion of carbonaceous feedstock in the absence of oxygen to produce bio-fuel (bio-oil, bio-char and syn-gas). Bio-fuel production from municipal green waste (MGW) through the pyrolysis process has recently attracted considerable attention in the renewable energy sector because it can reduce greenhouse gas emissions and contribute to energy security. This study analyses the properties of MGW feedstock available in the city of Rockhampton in Central Queensland, Australia, and presents an experimental investigation of producing bio-fuel from that MGW through pyrolysis using a short sealed rotary furnace. The experiment found that about 19.97% bio-oil, 40.83% bio-char and 29.77% syn-gas can be produced from the MGW. A four-stage steady-state simulation model of the pyrolysis process is then developed using Aspen Plus software. In the first stage, the moisture content of the MGW feed is reduced. In the second stage, the MGW is decomposed according to its elemental constituents. In the third stage, condensate material is separated and, finally, the pyrolysis reactions are modelled using the Gibbs free energy minimisation approach. The MGW's ultimate and proximate analysis data were used as input parameters in the Aspen Plus simulation. The model is validated against experimentally measured data, and good agreement between simulation and experimental results was found. More specifically, the difference between modelled and experimental elemental compositions of the MGW was 7.3% for carbon, 15.82% for hydrogen, 7.04% for nitrogen and 5.56% for sulphur. The validated model is used to optimise bio-fuel production from the MGW as a function of operating variables such as temperature, moisture content, particle size and process heat air-fuel ratio. The modelling and optimisation results are presented, analysed and discussed.
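A staged simulation model of this kind is anchored by mass-balance bookkeeping. The sketch below computes the closure of the product yields quoted above (the yields are the experimental values from the abstract; the function itself is a generic illustration, not the Aspen Plus model).

```python
def mass_balance(yields):
    """Return (total accounted wt%, unaccounted wt%) for product yields."""
    total = sum(yields.values())
    return total, 100.0 - total

# Experimental pyrolysis yields, as wt% of the MGW feed
yields = {"bio-oil": 19.97, "bio-char": 40.83, "syn-gas": 29.77}
total, unaccounted = mass_balance(yields)
print(f"accounted: {total:.2f}%  unaccounted: {unaccounted:.2f}%")
```

The roughly 9% gap between the quoted yields and 100% is the kind of unaccounted fraction (moisture, handling losses) a staged steady-state model has to reconcile.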

Relevance:

10.00%

Publisher:

Abstract:

In this paper, fault-tolerant control of redundant planar serial manipulators is investigated experimentally via an offline method called the pseudo-inverse reconfiguration method. The main contribution of this reconfiguration approach is the minimisation of the end-effector's velocity jump via an optimal mapping of joint failures into the healthy joints' velocity space. The algorithm has been simulated and implemented on a four-link serial manipulator named TaArm. For the simulation and practical tests, the C++ programming language in the QtCreator environment was used, which provided high computational speed. Two scenarios were selected for the simulation and implementation studies, and the results show that the algorithm considerably reduces the velocity jump of the end-effector in both the simulation and the experimental studies.
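The idea behind a pseudo-inverse reconfiguration can be sketched generically: when joint f locks, the end-effector velocity it was producing, J[:, f]·q̇_f, is redistributed over the healthy joints through the pseudo-inverse of the reduced Jacobian, minimising the end-effector velocity jump in the least-squares sense. The planar 4-joint Jacobian and velocities below are illustrative, not TaArm data.

```python
def solve2(M, v):
    """Solve a 2x2 linear system M x = v by Cramer's rule."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(v[0] * M[1][1] - v[1] * M[0][1]) / det,
            (M[0][0] * v[1] - M[1][0] * v[0]) / det]

def reconfigure(J, qdot, failed):
    """Healthy-joint velocity increments compensating the locked joint."""
    healthy = [j for j in range(len(qdot)) if j != failed]
    lost = [J[r][failed] * qdot[failed] for r in range(2)]  # lost EE velocity
    Jh = [[J[r][j] for j in healthy] for r in range(2)]
    # minimum-norm least squares: dq = Jh^T (Jh Jh^T)^{-1} lost
    G = [[sum(Jh[r][k] * Jh[s][k] for k in range(len(healthy)))
          for s in range(2)] for r in range(2)]
    y = solve2(G, lost)
    return {j: sum(Jh[r][i] * y[r] for r in range(2))
            for i, j in enumerate(healthy)}

J = [[-1.0, -0.7, -0.4, -0.2],   # illustrative planar Jacobian (2 x 4)
     [ 1.5,  1.0,  0.5,  0.1]]
qdot = [0.2, 0.1, 0.1, 0.1]
dq = reconfigure(J, qdot, failed=1)
```

The compensation is exact here because the reduced Jacobian retains full row rank; near singular configurations a damped pseudo-inverse would be needed instead.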

Relevance:

10.00%

Publisher:

Abstract:

Objective: This study used matched samples from schools in the states of Victoria and Washington to compare sexual behaviour in early adolescence. It was hypothesised that the contrasting dominant policy objectives of harm minimisation in Australia and abstinence in the USA would result in state differences for markers of sexual risk, mirroring prior cross-national findings in substance use. Method: A two-stage cluster sampling approach was used to recruit students from the two states. Self-reported sexual behaviour was examined for 1,596 students in annual surveys from Grade 7 in 2002 to Grade 9 in 2004. Prevalence estimates were derived for each measure of sexual behaviour, and comparisons were made between gender groups in each state. Results: State differences were found for girls' first sex, with significantly more girls in Washington than Victoria having had sex by Grade 7. By Grade 9, significantly more girls in Victoria reported sex in the last year and more sexual partners than girls in Washington. A large proportion of Grade 9 students across both states reported inconsistent contraception use. Conclusions: Contradicting the abstinence policy objective, first sex by Grade 7 was more prevalent in Washington than in Victoria. While sexual behaviour was more prevalent in Grade 9 in Victoria, the sexually active showed no clear cross-national differences in markers of risk such as contraception use and pregnancy outcomes. Findings demonstrate few cross-national differences in adolescent sexual behaviour despite the different policy contexts of Victoria and Washington.

Relevance:

10.00%

Publisher:

Abstract:

A study of recreational drug use among workers in the Port Lincoln mariculture and seafood industries was conducted by self report questionnaire. High rates of cannabis and alcohol use were revealed during the shore based fish farming season. The occupational health and safety implications of these findings in one of Australia's most dangerous industries are significant. Further research could inform the development of industry specific harm minimisation policies.

Relevance:

10.00%

Publisher:

Abstract:

The construction industry has been found to be a major generator of waste, and there are many challenges associated with finding the most sustainable way to manage construction waste. As the construction industry is a project-based industry, it is essential to look at cultural issues related to waste management at the project level. Therefore, this research aims to identify the current status of waste management practices in construction projects by analysing project managers' views on waste management performance, their attitudes towards waste management, and their views on waste management culture in construction projects. A questionnaire survey was carried out, with project managers selected as the target group because of their vital involvement in promoting and maintaining project culture in the construction project environment. Data were analysed using descriptive statistics and the Kruskal-Wallis test. The findings reveal that project managers believe that even though the operational cost of waste minimisation is high in construction projects, overall waste management is profitable. Interestingly, even though project managers believe most project participants are satisfied with existing waste management systems, overall waste management efforts are not perceived to be at a satisfactory level in construction projects. Project managers consider waste an inevitable by-product, but they do not believe that waste management is beyond the control of project members or that waste has no value. At the same time, it was found that project managers infer that project participants are cost- and time-conscious in waste management, despite the roles, responsibilities and duties of each party in waste management not being well coordinated or fully understood.
Taken together, these findings highlight the misconceptions related to waste management in construction projects and emphasise the necessity of collective responsibility on the part of project participants to enhance the performance of waste management in construction projects.

Relevance:

10.00%

Publisher:

Abstract:

We adapt a variable neighbourhood search heuristic to the travelling salesman problem with time windows (TSPTW) when the objective is the minimisation of the arrival time at the destination depot. We use efficient methods to verify the feasibility and the profitability of a move, and we explore the neighbourhoods in orders that reduce the search space. The resulting method is competitive with the state of the art. We improve the best known solutions for two classes of instances, and we report results for several TSPTW instances for the first time.
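A basic building block of the efficient feasibility checks mentioned above is forward propagation of arrival times along a tour, respecting time windows [e_i, l_i] with waiting allowed. The travel times and windows below are illustrative.

```python
def tour_feasible(tour, travel, windows, start=0.0):
    """Return (feasible, completion_time) for visiting `tour` in order."""
    t = start
    for a, b in zip(tour, tour[1:]):
        t += travel[(a, b)]
        early, late = windows[b]
        if t > late:
            return False, t      # window already closed: infeasible
        t = max(t, early)        # wait for the window to open if needed
    return True, t

windows = {0: (0, 100), 1: (2, 6), 2: (5, 9), 3: (0, 100)}
travel = {(0, 1): 3, (1, 2): 2, (2, 3): 4,
          (0, 2): 4, (2, 1): 2, (1, 3): 5}
print(tour_feasible([0, 1, 2, 3], travel, windows))
print(tour_feasible([0, 2, 1, 3], travel, windows))
```

Incremental versions of this check, which re-evaluate only the part of the tour a move actually changes, are what make large neighbourhood explorations affordable.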

Relevance:

10.00%

Publisher:

Abstract:

This paper proposes and investigates a metaheuristic tabu search algorithm (TSA) that generates optimal or near-optimal solution sequences for the feedback length minimization problem (FLMP) associated with a design structure matrix (DSM). The FLMP is a non-linear combinatorial optimization problem belonging to the NP-hard class, and therefore finding an exact optimal solution is very hard and time consuming, especially for medium and large problem instances. First, we introduce the subject and provide a review of the related literature and problem definitions. Using the tabu search method (TSM) paradigm, this paper presents a new tabu search algorithm that generates optimal or sub-optimal solutions for the feedback length minimization problem, using two different neighborhoods based on swapping two activities and shifting an activity to a different position. Furthermore, the paper includes numerical results for analyzing the performance of the proposed TSA and for setting the proper values of its parameters. We then compare our results on benchmark problems with those already published in the literature. We conclude that the proposed tabu search algorithm is very promising because it outperforms the existing methods, and because no other tabu search method for the FLMP has been reported in the literature. Applied to the process layer of multidimensional design structure matrices, the proposed tabu search algorithm proves to be a key optimization method for optimal product development.
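The FLMP objective and the swap neighbourhood can be sketched on a tiny DSM. Here dsm[i][j] = 1 means activity i needs information from activity j, and the feedback length sums the distance spanned by every dependency whose provider is scheduled after its consumer. The DSM and the single steepest-descent pass below are illustrative; the paper's tabu search adds a tabu list and a shift neighbourhood on top of such moves.

```python
def feedback_length(dsm, order):
    """Sum of distances of feedback marks (providers scheduled too late)."""
    pos = {task: i for i, task in enumerate(order)}
    return sum(pos[j] - pos[i]
               for i in range(len(dsm)) for j in range(len(dsm))
               if dsm[i][j] and pos[j] > pos[i])

def best_swap(dsm, order):
    """One steepest-descent pass over the swap neighbourhood."""
    best, best_val = order, feedback_length(dsm, order)
    for a in range(len(order)):
        for b in range(a + 1, len(order)):
            cand = order[:]
            cand[a], cand[b] = cand[b], cand[a]
            v = feedback_length(dsm, cand)
            if v < best_val:
                best, best_val = cand, v
    return best, best_val

# activity 0 and activity 2 both need information from activity 1
dsm = [[0, 1, 0],
       [0, 0, 0],
       [0, 1, 0]]
print(feedback_length(dsm, [0, 1, 2]))
print(best_swap(dsm, [0, 1, 2]))
```

In the initial order, activity 0 consumes information from activity 1 before it is produced (one feedback mark of length 1); swapping them yields a feedback-free sequence.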
