882 results for multi-method study
Abstract:
The loss of habitat and biodiversity worldwide has led to considerable resources being spent on conservation interventions. Prioritising these actions is challenging due to the complexity of the problem and because multiple actors may be undertaking conservation actions, often with divergent or partially overlapping objectives. We explore this issue with a simulation study in which two agents sequentially purchase land for the conservation of multiple species, under three scenarios comprising either divergent or partially overlapping objectives between the agents. The first scenario investigates the situation where the agents target different sets of threatened species. The second and third scenarios represent a case where a government agency attempts to implement a complementary conservation network representing 200 species, while a non-government organisation focuses on achieving additional protection for the ten rarest species. Simulated input data were generated using distributions taken from real data to model parcel cost and the rarity and co-occurrence of species. We investigated three types of collaborative interaction between agents: acting in isolation, sharing information, and pooling resources, with the third option resulting in the agents combining their resources and effectively acting as a single entity. In each scenario we determine the cost savings when an agent moves from acting in isolation to either sharing information or pooling resources with the other agent. The model demonstrates how the value of collaboration can vary significantly across situations. In most cases collaborating has associated costs, and these need to be weighed against the potential benefits of collaboration. Our model demonstrates a method for determining the range of costs over which collaboration provides an efficient use of scarce conservation resources.
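The isolation-versus-pooling comparison can be sketched on a toy instance. Everything here is an illustrative assumption (the parcel costs, species sets, and greedy purchase rule), not the simulation model used in the study:

```python
# Illustrative toy only: parcel costs, species sets, and the greedy rule are
# assumptions, not the study's simulation model.
parcels = [
    {"cost": 5, "species": frozenset({"A", "C"})},   # spans both agents' targets
    {"cost": 4, "species": frozenset({"B"})},
    {"cost": 4, "species": frozenset({"D"})},
    {"cost": 8, "species": frozenset({"A", "B"})},
    {"cost": 8, "species": frozenset({"C", "D"})},
]

def greedy_cover(parcels, targets):
    """Buy parcels greedily (best new-species-per-cost ratio) until targets are covered."""
    covered, spent, pool = set(), 0, list(parcels)
    while targets - covered and pool:
        best = max(pool, key=lambda p: len((p["species"] & targets) - covered) / p["cost"])
        gain = (best["species"] & targets) - covered
        if not gain:
            break
        covered |= gain
        spent += best["cost"]
        pool.remove(best)
    return spent

# Acting in isolation: each agent covers only its own target species,
# and both independently end up buying the shared parcel {A, C}.
cost_isolated = greedy_cover(parcels, {"A", "B"}) + greedy_cover(parcels, {"C", "D"})

# Pooling resources: one combined purchase for the union of objectives.
cost_pooled = greedy_cover(parcels, {"A", "B", "C", "D"})

print(cost_isolated, cost_pooled)
```

On this instance the isolated agents spend 18 while the pooled purchase costs 13; the difference bounds how much collaboration overhead would still leave pooling worthwhile.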
Abstract:
We investigate an application of the method of fundamental solutions (MFS) to the backward heat conduction problem (BHCP). We extend the MFS proposed in Johansson and Lesnic (2008) [5] and Johansson et al. (in press) [6] for one- and two-dimensional direct heat conduction problems, respectively, with the sources placed outside the space domain of interest. Theoretical properties of the method, as well as numerical investigations, are included, showing that accurate and stable results can be obtained efficiently at small computational cost.
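The core idea can be stated generically (a sketch of the MFS for the heat equation, not the precise scheme of [5,6]): with $F$ the fundamental solution of the heat equation in $\mathbb{R}^d$,

```latex
F(x,t) = \frac{H(t)}{(4\pi t)^{d/2}} \exp\!\left(-\frac{|x|^2}{4t}\right),
```

the temperature is approximated by a linear combination of heat sources placed at points $y_j$ outside the space domain and at times $\tau_j$,

```latex
u(x,t) \approx \sum_{j=1}^{N} c_j\, F(x - y_j,\; t - \tau_j),
```

where $H$ is the Heaviside function and the coefficients $c_j$ are fitted to the measured final-time and boundary data, typically with regularization since the BHCP is ill-posed.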
Abstract:
Due to copyright restrictions, only available for consultation at Aston University Library and Information Services with prior arrangement.
Abstract:
Transportation service operators are witnessing a growing demand for bi-directional movement of goods. Given this, this thesis considers an extension to the vehicle routing problem (VRP) known as the delivery and pickup transportation problem (DPP), where delivery and pickup demands may occupy the same route. The problem is formulated here as the vehicle routing problem with simultaneous delivery and pickup (VRPSDP), which requires the concurrent service of both demands at the customer location. This formulation provides the greatest opportunity for cost savings for both the service provider and the recipient. The aims of this research are to propose a new theoretical design to solve the multi-objective VRPSDP, to provide software support for the suggested design, and to validate the method through a set of experiments. A new, real-life-based multi-objective VRPSDP is studied here, which requires the minimisation of three often conflicting objectives: operated vehicle fleet size, total routing distance, and the maximum variation between route distances (workload variation). The former two objectives are commonly encountered in the domain; the latter is introduced here because it is essential for real-life routing problems. The VRPSDP is a hard combinatorial optimisation problem; therefore an approximation method, the Simultaneous Delivery and Pickup method (SDPmethod), is proposed to solve it. The SDPmethod consists of three phases. The first phase constructs a set of diverse partial solutions, one of which is expected to form part of the near-optimal solution. The second phase determines assignment possibilities for each sub-problem. The third phase solves the sub-problems using a parallel genetic algorithm. The suggested genetic algorithm is improved by the introduction of a set of tools: a genetic operator switching mechanism via diversity thresholds, an accuracy analysis tool, and a new fitness evaluation mechanism.
This three-phase method is proposed to address a shortcoming in the domain, where an initial solution is built only to be completely dismantled and redesigned in the optimisation phase. In addition, a new routing heuristic, RouteAlg, is proposed to solve the VRPSDP sub-problem, the travelling salesman problem with simultaneous delivery and pickup (TSPSDP). The experimental studies are conducted using the well-known Salhi and Nagy (1999) benchmark problems, where the SDPmethod and RouteAlg solutions are compared with the prominent works in the VRPSDP domain. The SDPmethod has been shown to be an effective method for solving the multi-objective VRPSDP, and RouteAlg for the TSPSDP.
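The constraint that distinguishes simultaneous delivery and pickup can be sketched in a few lines: the vehicle's load changes at every stop, so capacity must hold along the whole route, not just at the depot. This is an illustrative check under assumed data, not the thesis's implementation:

```python
def route_load_feasible(route, capacity):
    """Capacity check for simultaneous delivery and pickup (VRPSDP/TSPSDP).

    route: list of (delivery, pickup) amounts in visiting order. The vehicle
    leaves the depot carrying every delivery; at each customer the load drops
    by the delivery amount and rises by the pickup amount, and it must never
    exceed capacity at any point along the route.
    """
    load = sum(delivery for delivery, _ in route)  # initial load = total deliveries
    if load > capacity:
        return False
    for delivery, pickup in route:
        load += pickup - delivery
        if load > capacity:
            return False
    return True

# Same three customers; feasibility depends on the visiting order.
print(route_load_feasible([(4, 1), (3, 5), (2, 2)], capacity=10))  # True
print(route_load_feasible([(2, 2), (3, 5), (4, 1)], capacity=10))  # False
```

The second ordering fails because the big pickup arrives while most deliveries are still on board, which is exactly why VRPSDP routing is harder than pure delivery routing.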
Abstract:
This study presents the first part of a CFD study on the performance of a downer reactor for biomass pyrolysis. The reactor was equipped with a novel gas-solid separation method, developed by the co-authors from ICFAR (Canada). The separator, which was designed to allow for fast separation of clean pyrolysis gas, consisted of a cone deflector and a gas exit pipe installed inside the downer reactor. A multi-fluid model (Eulerian-Eulerian) with constitutive relations adopted from the kinetic theory of granular flow was used to simulate the multiphase flow. The effects of the various parameters including operation conditions, separator geometry and particle properties on the overall hydrodynamics and separation efficiency were investigated. The model prediction of the separator efficiency was compared with experimental measurements. The results revealed distinct hydrodynamic features around the cone separator, allowing for up to 100% separation efficiency. The developed model provided a platform for the second part of the study, where the biomass pyrolysis is simulated and the product quality as a function of operating conditions is analyzed.
Abstract:
We develop an analytical methodology for optimizing phase regeneration based on phase-sensitive amplification. The results demonstrate the scalability of the scheme and show the significance of simultaneously optimizing the transfer function and the signal alphabet.
Abstract:
This paper proposes a model for investigating how selected groups of compression methods influence the information-security coefficient of selected groups of objects exposed to selected groups of attacks. Multi-criteria evaluation methods are then used to choose the groups of compression methods carrying the lowest information-security risk. Recommendations for future investigations are proposed.
Abstract:
This paper presents an effective decision-making system for leak detection based on multiple generalized linear models and clustering techniques. The training data for the proposed decision system were obtained from an experimental, fully operational pipeline distribution system. The system is equipped with data logging for three variables: inlet pressure, outlet pressure, and outlet flow. The experimental setup was designed so that multiple operational conditions of the distribution system, including multiple pressure and flow levels, could be obtained. We then statistically tested and showed that the pressure and flow variables can be used as a leak signature under the designed multi-operational conditions. It is then shown that detecting leakages by training and testing the proposed multi-model decision system with prior data clustering, under multi-operational conditions, produces better recognition rates than training based on a single-model approach. The decision system is then equipped with the estimation of confidence limits, and a method is proposed for using these limits to obtain more robust leakage recognition results.
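The cluster-then-model idea can be sketched minimally: partition the training data by operating condition, fit one linear model per cluster, and flag a leak when the observed flow departs from the matched cluster's prediction. The data, centroids, and threshold below are assumptions for illustration, not the paper's GLM/clustering pipeline:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b with one predictor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical operating-condition clusters: inlet pressure vs. outlet flow.
clusters = {
    "low":  {"centroid": 2.0, "xs": [1.5, 2.0, 2.5], "ys": [3.1, 4.0, 5.1]},
    "high": {"centroid": 8.0, "xs": [7.0, 8.0, 9.0], "ys": [20.9, 24.0, 27.1]},
}
models = {name: fit_line(c["xs"], c["ys"]) for name, c in clusters.items()}

def predict_flow(pressure):
    """Route the query to the nearest cluster, then apply that cluster's model."""
    name = min(clusters, key=lambda n: abs(clusters[n]["centroid"] - pressure))
    a, b = models[name]
    return a * pressure + b

def leak_suspected(pressure, observed_flow, threshold=1.0):
    """Flag a leak when the observed flow departs from the cluster model's prediction."""
    return abs(observed_flow - predict_flow(pressure)) > threshold

print(leak_suspected(8.0, 24.2))   # small residual: normal operation
print(leak_suspected(8.0, 30.0))   # large residual: possible leak
```

A single global line fitted across both clusters would blur the two regimes, which is the intuition behind the reported advantage of the multi-model approach.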
Abstract:
The article considers the preconditions and main principles for creating virtual laboratories for computer-aided design as tools for interdisciplinary research. The proposed virtual laboratory is best used at the requirements-specification (EFT) stage, because it allows fast estimation of the feasibility of a project, of certain characteristics and, as a result, of the expected benefit of its application. Such technologies already raise the level of automation in the design stages of new devices for various purposes. The proposed computer technology lets specialists from fields such as chemistry, biology, biochemistry and physics check whether a device can be created on the basis of the developed sensors. It reduces the time and cost of designing computer devices and systems in the early design stages, for example at the requirements-specification (EFT) stage. An important feature of the project is its use of an advanced multi-dimensional access method for organizing the information base of the virtual laboratory.
Abstract:
The popularity of online social media platforms provides an unprecedented opportunity to study real-world complex networks of interactions. However, releasing this data to researchers and the public comes at the cost of potentially exposing private and sensitive user information. It has been shown that a naive anonymization of a network by removing the identity of the nodes is not sufficient to preserve users’ privacy. In order to deal with malicious attacks, k-anonymity solutions have been proposed to partially obfuscate topological information that can be used to infer nodes’ identity. In this paper, we study the problem of ensuring k-anonymity in time-varying graphs, i.e., graphs with a structure that changes over time, and multi-layer graphs, i.e., graphs with multiple types of links. More specifically, we examine the case in which the attacker has access to the degree of the nodes. The goal is to generate a new graph where, given the degree of a node in each (temporal) layer of the graph, such a node remains indistinguishable from k-1 other nodes in the graph. In order to achieve this, we find the optimal partitioning of the graph nodes such that the cost of anonymizing the degree information within each group is minimized. We show that this reduces to a special case of a Generalized Assignment Problem, and we propose a simple yet effective algorithm to solve it. Finally, we introduce an iterated linear programming approach to enforce the realizability of the anonymized degree sequences. The efficacy of the method is assessed through an extensive set of experiments on synthetic and real-world graphs.
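The grouping problem can be illustrated on a single static degree sequence with a classic dynamic program: sort degrees, form contiguous groups of at least k nodes, and pay, per group, the edits needed to raise every degree to the group maximum. This is a simplified stand-in in the spirit of standard degree-anonymization DPs, not the paper's multi-layer Generalized Assignment formulation:

```python
def group_cost(degrees, i, j):
    """Edits needed to raise degrees[i..j] (sorted descending) to the group maximum."""
    return sum(degrees[i] - degrees[t] for t in range(i, j + 1))

def anonymize_cost(degrees, k):
    """Minimum total edits so every degree value is shared by >= k nodes.

    Groups are contiguous runs of the descending-sorted sequence; no optimal
    group needs more than 2k-1 members (a larger group could be split).
    """
    degrees = sorted(degrees, reverse=True)
    n = len(degrees)
    INF = float("inf")
    dp = [0] + [INF] * n                 # dp[j] = best cost for the first j nodes
    for j in range(k, n + 1):
        for i in range(max(0, j - 2 * k + 1), j - k + 1):   # last group is [i, j)
            if dp[i] < INF:
                dp[j] = min(dp[j], dp[i] + group_cost(degrees, i, j - 1))
    return dp[n]

# Grouping {5,5} and {3,3,2} costs a single edit (raise the 2 to a 3).
print(anonymize_cost([5, 5, 3, 3, 2], k=2))
```

In the paper's setting a node carries one degree per temporal or relational layer, so the per-group cost becomes a vector quantity, which is what leads to the Generalized Assignment view.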
Abstract:
Objectives: To develop a tool for the accurate reporting and aggregation of findings from each of the multiple methods used in a complex evaluation in an unbiased way. Study Design and Setting: We developed a Method for Aggregating The Reporting of Interventions in Complex Studies (MATRICS) within a gastroenterology study [Evaluating New Innovations in (the delivery and organisation of) Gastrointestinal (GI) endoscopy services by the NHS Modernisation Agency (ENIGMA)]. We subsequently tested it on a different gastroenterology trial [Multi-Institutional Nurse Endoscopy Trial (MINuET)]. We created three layers to define the effects, methods, and findings from ENIGMA. We assigned numbers to each effect in layer 1 and letters to each method in layer 2. We applied an alphanumeric code based on layers 1 and 2 to every finding in layer 3 to link the aims, methods, and findings. We illustrated analogous findings by assigning more than one alphanumeric code to a finding. We also showed that more than one effect or method could report the same finding. We presented contradictory findings by listing them in adjacent rows of the MATRICS. Results: MATRICS was useful for the effective synthesis and presentation of findings of the multiple methods from ENIGMA. We subsequently successfully tested it by applying it to the MINuET trial. Conclusion: MATRICS is effective for synthesizing the findings of complex, multiple-method studies.
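The three-layer alphanumeric linkage can be sketched as a tiny data structure. The effects, methods, and findings below are invented examples, not content from ENIGMA or MINuET:

```python
# Hypothetical mini-MATRICS: effects numbered (layer 1), methods lettered
# (layer 2), findings (layer 3) carrying alphanumeric codes linking both.
effects = {1: "waiting times", 2: "patient experience"}
methods = {"A": "routine data analysis", "B": "patient survey"}

findings = [
    {"codes": ["1A"], "text": "Median waiting time fell after the intervention."},
    {"codes": ["1B", "2B"], "text": "Patients reported shorter perceived waits."},
]

def findings_for(effect_no=None, method_letter=None):
    """Aggregate layer-3 findings by effect number and/or method letter."""
    out = []
    for f in findings:
        for code in f["codes"]:
            ok_effect = effect_no is None or int(code[:-1]) == effect_no
            ok_method = method_letter is None or code[-1] == method_letter
            if ok_effect and ok_method:
                out.append(f["text"])
                break
    return out

print(findings_for(effect_no=1))          # both findings touch effect 1
print(findings_for(method_letter="A"))    # only the routine-data finding
```

Listing two findings under the same effect but different method codes in adjacent rows is how MATRICS surfaces analogous or contradictory results across methods.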
Abstract:
Stylization is a common method of ornamental plant use that imitates nature and evokes scenery. This paper discusses a previously unexplored aspect of stylization: the method offers the possibility of preserving the physiognomy of habitats that are likely to vanish under future climate change. The novelty of the method also lies in the fact that the vulnerability of Hungarian habitats has so far been examined only from botanical and ecological points of view, not in terms of its landscape design value. In Hungary, acidofrequent mixed forests appear, according to ecological models, to be highly sensitive to climate change. We discuss the methodology of stylizing climate-sensitive habitats and briefly treat acidofrequent mixed forests as a case study. We propose substituting the water-demanding coniferous and deciduous tree species of the studied habitat with drought-tolerant species of similar character, and present an optionally expandable list of these taxa. On this basis, the authors suggest experimental investigation of those proposed taxa whose higher drought tolerance rests on observation alone.
Abstract:
The most fundamental and challenging function of government is the effective and efficient delivery of services to local taxpayers and businesses. Counties, once known as the “dark continent” of American government, have recently become a major player in the provision of services. Population growth and suburbanization have increased service demands while the counties' role as service provider to incorporated residents has also expanded due to additional federal and state mandates. County governments are under unprecedented pressure and scrutiny to meet citizens' and elected officials' demands for high-quality, equitable delivery of services at the lowest possible cost while contending with anti-tax sentiments, greatly decreased state and federal support, and exceptionally costly and complex health and public safety problems. This study tested the reform government theory proposition that reformed structures of county government positively correlate with efficient service delivery. A county government reform index was developed for this dissertation, comprising form of government, home-rule status, method of election, number of government jurisdictions, and number of elected officials. The county government reform index and a measure of relative structural fragmentation were used to assess their impact on two measures of service output: mean county road pavement condition and county road maintenance expenditures. The study's multi-level design triangulated results from different data sources and methods of analysis. Data were collected from semi-structured interviews of county officials, secondary archival sources, and a survey of 544 elected and appointed officials from Florida's 67 counties. The results of the three sources of data converged in finding that reformed Florida counties are more likely than unreformed counties to provide better road service and to spend less on road expenditures. The same results were found for unfragmented Florida counties.
Because both the county government reform index and the fragmentation variables were specified acknowledging the reform theory as well as elements from the public-choice model, the results help explain contradictory findings in the urban service research. Therefore, as suggested by the corroborated findings of this dissertation, reformed as well as unfragmented counties are better providers of road maintenance service and do so in a less costly manner. These findings hold although the variables were specified to capture theoretical arguments from both the consolidated and the public-choice theories, suggesting a way to advance the debate beyond the consolidated-fragmented dichotomy of urban governance.
Abstract:
Numerical optimization is a technique where a computer is used to explore design parameter combinations to find extremes in performance factors. In multi-objective optimization several performance factors can be optimized simultaneously. The solution to multi-objective optimization problems is not a single design, but a family of optimized designs referred to as the Pareto frontier. The Pareto frontier is a trade-off curve in the objective function space composed of solutions where performance in one objective function is traded for performance in others. A Multi-Objective Hybridized Optimizer (MOHO) was created for the purpose of solving multi-objective optimization problems by utilizing a set of constituent optimization algorithms. MOHO tracks the progress of the Pareto frontier approximation and automatically switches amongst its constituent evolutionary optimization algorithms to speed the formation of an accurate Pareto frontier approximation. Aerodynamic shape optimization is one of the oldest applications of numerical optimization. MOHO was used to perform shape optimization on a 0.5-inch ballistic penetrator traveling at Mach 2.5. Two objectives were simultaneously optimized: minimize aerodynamic drag and maximize penetrator volume. This problem was solved twice. The first time, Modified Newton Impact Theory (MNIT) was used to determine the pressure drag on the penetrator. In the second solution, a Parabolized Navier-Stokes (PNS) solver that includes viscosity was used to evaluate the drag on the penetrator. The studies show the difference in the optimized penetrator shapes when viscosity is absent and present in the optimization. In modern optimization problems, objective function evaluations of this kind may require many hours on a computer cluster. One solution is to create a response surface that models the behavior of the objective function.
Once enough data about the behavior of the objective function has been collected, a response surface can be used to represent the actual objective function in the optimization process. The Hybrid Self-Organizing Response Surface Method (HYBSORSM) algorithm was developed and used to make response surfaces of objective functions. HYBSORSM was evaluated using a suite of 295 non-linear functions involving from 2 to 100 variables, demonstrating the robustness and accuracy of HYBSORSM.
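The Pareto-frontier idea underlying the abstract can be shown in a few lines. The design points below are invented (drag, negated volume) pairs, and the brute-force filter is an illustration of dominance, not MOHO's algorithm:

```python
def dominates(a, b):
    """a dominates b: no worse in every objective, better in at least one
    (all objectives minimised, so 'maximise volume' is encoded as -volume)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated designs: the Pareto frontier approximation."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical (drag, -volume) design evaluations.
designs = [(1.0, -3.0), (2.0, -5.0), (2.5, -4.0), (1.5, -3.5)]
front = pareto_front(designs)
print(front)  # (2.5, -4.0) is dominated by (2.0, -5.0) and drops out
```

Every surviving point trades drag for volume against its neighbours, which is what makes the frontier a trade-off curve rather than a single optimum.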
Abstract:
Carbon nanotubes (CNT) could serve as potential reinforcement for metal matrix composites with improved mechanical properties. However, dispersion of CNTs in the matrix has been a longstanding problem, since they tend to form clusters to minimize their surface area. The aim of this study was to use plasma and cold spraying techniques to synthesize CNT-reinforced aluminum composites with improved dispersion and to quantify the degree of CNT dispersion as it influences the mechanical properties. A novel spray-drying method was used to disperse CNTs in Al-12 wt.% Si prealloyed powder, which was used as feedstock for plasma and cold spraying. A new method for quantifying CNT distribution was developed: two parameters, the Dispersion Parameter (DP) and the Clustering Parameter (CP), are proposed based on image analysis and the distances between CNT centers. Nanomechanical properties were correlated with the dispersion of CNTs in the microstructure. Coating microstructure evolution is discussed in terms of splat formation, deformation and damage of CNTs, and the CNT/matrix interface. The effect of Si and CNT content on the reaction at the CNT/matrix interface was studied thermodynamically and kinetically. A pseudo phase diagram was computed that predicts the interfacial carbide formed by reaction between CNTs and Al-Si alloy at the processing temperature. Kinetic analysis showed that Al4C3 forms with the Al-12 wt.% Si alloy while SiC forms with the Al-23 wt.% Si alloy. Mechanical properties at the nano, micro and macro scales were evaluated using nanoindentation and nanoscratch, microindentation, and bulk tensile testing, respectively. Nano- and micro-scale mechanical properties (elastic modulus, hardness and yield strength) showed improvement, whereas macro-scale mechanical properties were poor.
The inversion of the mechanical properties across scale lengths was attributed to porosity, CNT clustering, CNT-splat adhesion, and Al4C3 formation at the CNT/matrix interface. The Dispersion Parameter (DP) was more sensitive than the Clustering Parameter (CP) in measuring the degree of CNT distribution in the matrix.
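A distance-based dispersion measure of the kind the abstract describes can be sketched from CNT centre coordinates. The metric below (coefficient of variation of nearest-neighbour distances) is a hypothetical stand-in chosen for illustration, not the paper's DP or CP definition:

```python
import math

def dispersion_spread(centres):
    """Coefficient of variation of nearest-neighbour distances between CNT
    centres from image analysis: 0 for a perfectly uniform arrangement,
    large when the tubes clump together. Illustrative metric only."""
    nn = [min(math.dist(c, o) for o in centres if o != c) for c in centres]
    mean = sum(nn) / len(nn)
    var = sum((d - mean) ** 2 for d in nn) / len(nn)
    return math.sqrt(var) / mean

uniform = [(0, 0), (0, 1), (1, 0), (1, 1)]        # regular grid of centres
clumped = [(0, 0), (0.1, 0), (0.2, 0), (5, 5)]    # a cluster plus one outlier

print(dispersion_spread(uniform), dispersion_spread(clumped))
```

A metric of this family distinguishes uniform dispersion from clustering using only centre positions, which is the kind of sensitivity the DP/CP comparison is about.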