821 results for Multi-criteria analysis


Relevance:

90.00%

Publisher:

Abstract:

This paper presents a multi-criteria approach for the nondestructive diagnostic structural integrity assessment of a decommissioned flatbed rail wagon (FBRW) used in road bridge superstructure rehabilitation and replacement applications. First, full-scale vibration and static test data sets are employed in an FE model of the FBRW to obtain the best ‘initial’ estimate of the model parameters. Second, the ‘final’ model parameters are predicted using sensitivity-based perturbation analysis, with no significant difficulties encountered. The updated FBRW model is then validated against independent sets of full-scale laboratory static test data. Finally, the updated and validated FE model of the FBRW is used for the structural integrity assessment of a single-lane FBRW bridge subjected to the Australian bridge design traffic load.
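Sensitivity-based model updating of the kind described above can be sketched as a linearized least-squares step; the sensitivity matrix and response values below are hypothetical toy data, not the paper's FBRW measurements.

```python
import numpy as np

# One sensitivity-based update step: parameters p of an FE model are
# perturbed so that predicted responses y(p) approach measured responses.
# S is the sensitivity (Jacobian) matrix dy/dp evaluated at p0.

def sensitivity_update(p0, y_meas, y_model, S):
    """One Gauss-Newton step: p1 = p0 + (S^T S)^-1 S^T (y_meas - y_model)."""
    r = y_meas - y_model                    # residual between test and model
    dp = np.linalg.solve(S.T @ S, S.T @ r)  # least-squares parameter change
    return p0 + dp

# Toy linear "model" y = S @ p, so a single step recovers p exactly.
S = np.array([[2.0, 0.5],
              [0.3, 1.5],
              [1.0, 1.0]])
p_true = np.array([1.2, 0.8])
y_meas = S @ p_true                 # synthetic "measured" responses
p0 = np.array([1.0, 1.0])           # initial parameter estimate
p1 = sensitivity_update(p0, y_meas, S @ p0, S)
```

For a genuinely nonlinear FE model the step would be iterated, re-evaluating S at each new parameter estimate.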

Abstract:

Several genetic variants are thought to influence white matter (WM) integrity, measured with diffusion tensor imaging (DTI). Voxel-based methods can test genetic associations, but heavy multiple-comparisons corrections are required to adjust for searching the whole brain and for all genetic variants analyzed. Thus, genetic associations are hard to detect even in large studies. Using a recently developed multi-SNP analysis, we examined the joint predictive power of a group of 18 cholesterol-related single nucleotide polymorphisms (SNPs) on WM integrity, measured by fractional anisotropy. To boost power, we limited the analysis to brain voxels that showed significant associations with total serum cholesterol levels. From this reduced search space, we identified two genes whose effects replicated in individual voxel-wise analyses of the whole brain. Multivariate analyses of genetic variants on a reduced anatomical search space may help to identify the SNPs with the strongest effects on the brain from a broad panel of genes.
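The power gain from restricting the search space can be illustrated with simple Bonferroni arithmetic; the voxel counts below are assumptions for illustration, not figures from the study.

```python
# Why a reduced search space boosts power: the Bonferroni-corrected
# per-test threshold becomes far less stringent with fewer voxels.

alpha = 0.05
n_snps = 18                   # SNPs in the cholesterol panel (as in the study)
whole_brain_voxels = 500_000  # assumed whole-brain voxel count, illustrative
reduced_voxels = 5_000        # assumed cholesterol-associated subset

thr_whole = alpha / (whole_brain_voxels * n_snps)    # ~5.6e-09
thr_reduced = alpha / (reduced_voxels * n_snps)      # ~5.6e-07
# With these assumed counts the threshold is 100x easier to reach.
```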

Abstract:

Multi-species fisheries are complex to manage, and the ability to develop an appropriate governance structure is often seriously impeded because trading off sustainability objectives at the species level, economic objectives at the fleet level, and social objectives at the community scale is complex. Many of these fisheries also tend to have a mix of information, with stock assessments available for some species and almost none for others. The fleets themselves comprise fishers ranging from small family enterprises to large vertically integrated businesses. The Queensland trawl fishery in Australia is used as a case study for this kind of fishery. It has the added complexity that a large part of the fishery lies within a World Heritage Area, the Great Barrier Reef Marine Park, which is managed by an agency of the Australian Commonwealth Government, whereas the fishery itself is managed by the Queensland State Government. A stakeholder elicitation process was used to develop social, governance, economic and ecological objectives, and then to weight their relative importance. An expert group developed different governance strawmen (or management strategies), which a group of industry stakeholders and experts then assessed against the objectives using multi-criteria decision analysis techniques. One strawman clearly provided the best overall set of outcomes given the multiple objectives, but was not optimal in terms of every objective, demonstrating that even the "best" strawman may be less than perfect. © 2012.
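The assessment of strawmen against weighted objectives can be sketched as a weighted decision matrix; the strawman names, scores and weights below are hypothetical, not the elicited values from the case study.

```python
# Scoring management "strawmen" against weighted stakeholder objectives.
# All names, scores (0-10) and weights are illustrative only.

objectives = ["ecological", "economic", "social", "governance"]
weights = [0.35, 0.30, 0.20, 0.15]

strawmen = {
    "S1": [9, 5, 6, 7],
    "S2": [7, 8, 7, 7],
    "S3": [5, 9, 8, 6],
}

# Weighted overall score for each strawman.
overall = {name: sum(w * s for w, s in zip(weights, scores))
           for name, scores in strawmen.items()}
best = max(overall, key=overall.get)
# Note: the best overall strawman need not top any single objective,
# mirroring the paper's observation that the "best" may be less than perfect.
```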

Abstract:

Rationing healthcare in some form is inevitable, even in wealthy countries, because resources are scarce and demand for healthcare is always likely to exceed supply. This means that decision-makers must make choices about which health programs and initiatives should receive public funding and which should not. These choices are often difficult to make, particularly in Australia. To improve how they are made, the following recommendations are offered:

1. Make explicit rationing, based on a national decision-making tool (such as Multi-criteria Decision Analysis), a standard process in all jurisdictions.
2. Develop nationally consistent methods for conducting economic evaluation in health so that good-quality evidence on the relative efficiency of various programs and initiatives is generated.
3. Generate more economic evaluation evidence to inform rationing decisions.
4. Revise national health performance indicators so that they include true health system efficiency indicators, such as cost-effectiveness.
5. Apply the Comprehensive Management Framework used to evaluate items on the Medicare Benefits Schedule (MBS) to the Pharmaceutical Benefits Scheme (PBS) and the Prostheses List to accelerate disinvestment from low-value drugs and prostheses.
6. Seek agreement among Commonwealth, state and territory governments to work together on activities similar to those of the National Institute for Health and Care Excellence in the United Kingdom and the Canadian Agency for Drugs and Technologies in Health.

Abstract:

This study presents an overview of seismic microzonation and existing methodologies, together with a newly proposed methodology covering all aspects. Earlier seismic microzonation methods focused on parameters that affect structures or foundation-related problems, but seismic microzonation has generally come to be recognized as an important component of urban planning and disaster management. Seismic microzonation should therefore evaluate all possible earthquake-induced hazards and represent them through their spatial distribution. This paper presents a new methodology for seismic microzonation based on the location of the study area and the possible associated hazards. The new method consists of seven interlinked steps, each with a defined output; addressing a single step and its result, as is widely practiced, does not amount to seismic microzonation. The paper also discusses the importance of geotechnical aspects in seismic microzonation and how they affect the final map. For the case study, seismic hazard values at rock level are estimated from the seismotectonic parameters of the region using deterministic and probabilistic seismic hazard analysis. Surface-level hazard values are estimated through site-specific study and local site effects based on site classification/characterization. The liquefaction hazard is estimated using standard penetration test data. These hazard parameters are integrated in a Geographical Information System (GIS) using the Analytic Hierarchy Process (AHP) to estimate a hazard index. The hazard index is arrived at by following a multi-criteria evaluation technique, AHP, in which each theme and its features are assigned weights and ranked according to a consensus opinion about their relative significance to the seismic hazard.
The hazard values are integrated through spatial union to obtain the deterministic microzonation map and the probabilistic microzonation map for a specific return period. Seismological parameters are more widely used for microzonation than geotechnical parameters, but the study shows that the hazard index values depend on site-specific geotechnical parameters.
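The AHP weighting step can be sketched as the principal-eigenvector calculation on a pairwise comparison matrix; the three hazard themes and the comparison values below are illustrative, not the paper's elicited judgments.

```python
import numpy as np

# AHP weights for three hypothetical hazard themes (e.g. rock-level hazard,
# site amplification, liquefaction). Pairwise values are illustrative.
A = np.array([
    [1.0, 2.0, 4.0],   # theme 1 compared against themes 1..3
    [1/2, 1.0, 2.0],
    [1/4, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()                        # normalized AHP theme weights

# Consistency index: CI = (lambda_max - n) / (n - 1); 0 means perfectly
# consistent judgments (as this toy matrix is, since 4 = 2 * 2).
n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
```

The hazard index then follows as the weighted combination of the rated theme layers in the GIS.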

Abstract:

A supply chain ecosystem consists of the elements of the supply chain and the entities that influence its goods, information and financial flows. These influences come through government regulations; human, financial and natural resources; logistics infrastructure and management; and so on, and thus affect supply chain performance. Similarly, all the ecosystem elements also contribute to the risk. The aim of this paper is to identify both performance-based and risk-based decision criteria that are important and critical to the supply chain. A two-step approach using fuzzy AHP and the fuzzy Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) is proposed for multi-criteria decision-making and illustrated using a numerical example. The first step performs the selection without considering risks; in the second step, suppliers are ranked according to their risk profiles. The two rankings are then consolidated into one. The method is also extended to multi-tier supplier selection. In short, this paper presents a method for the design of a resilient supply chain.
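The ranking idea behind the approach can be sketched with the crisp version of TOPSIS (the paper itself uses fuzzy variants); the supplier scores and criterion weights below are illustrative.

```python
import numpy as np

# Crisp TOPSIS sketch. Rows: suppliers; columns: benefit-type criteria.
X = np.array([
    [7.0, 9.0, 9.0],
    [8.0, 7.0, 8.0],
    [9.0, 6.0, 8.0],
])
w = np.array([0.5, 0.3, 0.2])        # illustrative criterion weights

R = X / np.linalg.norm(X, axis=0)    # vector-normalize each criterion
V = R * w                            # weighted normalized matrix
ideal = V.max(axis=0)                # positive ideal solution
anti = V.min(axis=0)                 # negative ideal solution
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)  # higher = closer to the ideal
ranking = np.argsort(-closeness)     # supplier indices, best first
```

In the fuzzy variants, crisp scores are replaced by fuzzy numbers (e.g. triangular) and the distances are computed on their defuzzified or fuzzy forms.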

Abstract:

Compliant mechanisms are elastic continua used to transmit or transform force and motion mechanically. The topology optimization methods developed for compliant mechanisms also give the shape for a chosen parameterization of the design domain with a fixed mesh. However, in these methods, the shapes of the flexible segments in the resulting optimal solutions are restricted either by the type or by the resolution of the design parameterization. This limitation is overcome in this paper by focusing on optimizing the skeletal shape of the compliant segments in a given topology. It is accomplished by identifying such segments in the topology and representing them using Bezier curves. The vertices of the Bezier control polygon are used to parameterize the shape-design space. Uniform parameter steps of the Bezier curves naturally enable adaptive finite element discretization of the segments as their shapes change. Practical constraints, such as avoiding intersections with other segments, avoiding self-intersections, and restrictions on the available space and material, are incorporated into the formulation. A multi-criteria function from our prior work is used as the objective. Analytical sensitivity analysis for the objective and constraints is presented and used in the numerical optimization. Examples are included to illustrate the shape optimization method.
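Evaluating a Bezier segment from its control polygon, the representation used above, can be sketched with the de Casteljau algorithm; the control points below are illustrative.

```python
# De Casteljau evaluation of a Bezier curve from its control polygon.
# Repeated linear interpolation between neighbouring points converges
# to the curve point at parameter t; works for any polygon degree.

def de_casteljau(ctrl, t):
    """Evaluate a 2-D Bezier curve at parameter t in [0, 1]."""
    pts = list(ctrl)
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

# Illustrative cubic segment (4 control vertices).
ctrl = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
start = de_casteljau(ctrl, 0.0)  # curve interpolates the first vertex
end = de_casteljau(ctrl, 1.0)    # ...and the last vertex
mid = de_casteljau(ctrl, 0.5)
```

Sampling such a curve at uniform parameter steps gives the adaptive segment discretization mentioned above: the node spacing follows the curve as the control vertices move during optimization.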

Abstract:

Structural design is a decision-making process in which a wide spectrum of requirements, expectations, and concerns needs to be properly addressed. Engineering design criteria are considered together with societal and client preferences, and most of these design objectives are affected by the uncertainties surrounding a design. Therefore, realistic design frameworks must be able to handle multiple performance objectives and incorporate uncertainties from numerous sources into the process.

In this study, a multi-criteria based design framework for structural design under seismic risk is explored. The emphasis is on reliability-based performance objectives and their interaction with economic objectives. The framework has analysis, evaluation, and revision stages. In the probabilistic response analysis, seismic loading uncertainties as well as modeling uncertainties are incorporated. For evaluation, two approaches are suggested: one based on preference aggregation and the other based on socio-economics. Both implementations of the general framework are illustrated with simple but informative design examples to explore the basic features of the framework.

The first approach uses concepts similar to those found in multi-criteria decision theory, and directly combines reliability-based objectives with others. This approach is implemented in a single-stage design procedure. In the socio-economics based approach, a two-stage design procedure is recommended in which societal preferences are treated through reliability-based engineering performance measures, but emphasis is also given to economic objectives because these are especially important to the structural designer's client. A rational net asset value formulation including losses from uncertain future earthquakes is used to assess the economic performance of a design. A recently developed assembly-based vulnerability analysis is incorporated into the loss estimation.
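The net-asset-value comparison can be sketched as initial benefit minus construction cost minus the present value of expected annual earthquake losses; all figures below are hypothetical, and this is a simplification of the formulation used in the study.

```python
# Hedged sketch of a net-asset-value style design criterion:
# NAV = benefit - construction cost - PV of expected annual losses.

def net_asset_value(benefit, cost, annual_loss, rate, years):
    """NAV with expected annual earthquake loss discounted at `rate`."""
    pv_losses = sum(annual_loss / (1 + rate) ** t
                    for t in range(1, years + 1))
    return benefit - cost - pv_losses

# Two hypothetical designs: a costlier, better-performing one versus a
# cheaper one with higher expected annual earthquake losses.
nav_robust = net_asset_value(benefit=10e6, cost=2.0e6, annual_loss=20e3,
                             rate=0.05, years=50)
nav_cheap = net_asset_value(benefit=10e6, cost=1.5e6, annual_loss=80e3,
                            rate=0.05, years=50)
# The costlier design can still win on NAV once future losses are priced in.
```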

The presented performance-based design framework allows investigation of various design issues and their impact on a structural design. The framework is flexible, readily allowing the incorporation of new methods and concepts in seismic hazard specification, structural analysis, and loss estimation.

Abstract:

Purpose: The paper examines how a number of key themes are introduced in the Masters programme in Engineering for Sustainable Development at Cambridge University through student-centred activities. These themes include dealing with complexity, uncertainty, change, other disciplines, people, environmental limits, whole-life costs, and trade-offs. Design/methodology/approach: The range of exercises and assignments designed to encourage students to test their own assumptions and abilities to develop competencies in these areas is analysed by mapping the key themes onto the formal activities which all students undertake throughout the core MPhil programme. The paper reviews the range of activities designed to help support the formal delivery of the taught programme. These include residential field courses, role plays, change challenges, games, systems thinking, multi-criteria decision-making, awareness of literature from other disciplines, and consultancy projects. An axial coding approach to the analysis of routine feedback questionnaires drawn from recent years has been used to identify how students' own awareness develops. The results of two surveys are also presented which test the students' perceptions of whether the course is providing learning environments to develop awareness and skills in these areas. Findings: Students generally perform well against these tasks, with a significant feature being the mutual support they give to each other in their learning. The paper concludes that, for students from an engineering background, it is a holistic approach to delivering a new way of thinking, combining lectures, class activities, assignments, interactions between class members, and access to material elsewhere in the University, that enables participants to develop their skills in each of the key themes.
Originality/value: The paper provides a reflection on different pedagogical approaches to exploring key sustainability themes and reports students' own perceptions of the value of these kinds of activities. Experiences are shared of running a range of diverse learning activities within a professional-practice Masters programme.

Abstract:

There are a variety of guidelines and methods available to measure and assess survey quality. Most are based on qualitative descriptions; in practice they are not easy to implement, and it is very difficult to make comparisons between surveys. Hence there is a theoretical and pragmatic demand for a mainly quantitative survey assessment tool. This research aimed to meet this need and to contribute to the evaluation and improvement of survey quality. Acknowledging the critical importance of measurement issues in survey research, this thesis starts with a comprehensive introduction to measurement theory and identifies the types of measurement errors associated with measurement procedures through three experiments. It then describes the concepts, guidelines and methods available for measuring and assessing survey quality. Combining these with measurement principles leads to the development of a quantitative, statistically based holistic tool to measure and assess survey quality. The criteria, weights and sub-weights for the assessment tool are determined using Multi-Criteria Decision-Making (MCDM) and a survey questionnaire based on the Delphi method. Finally, the model is applied to a database of surveys constructed to develop methods for the classification, assessment and improvement of survey quality. The model developed in this thesis enables survey researchers and/or commissioners to make a holistic assessment of the value of particular survey(s). It is an Excel-based audit which takes a holistic approach, following all stages of the survey from inception through design, construction, execution, analysis and dissemination. At each stage a set of criteria is applied to assess quality. Scores attained against these assessments are weighted by the importance of the criteria and summed to give an overall assessment of the stage.
The total score for a survey is obtained by combining the scores for every stage, weighted again by the importance of each stage. The advantage of this approach is that it provides a means of survey assessment that can be used diagnostically to assess and improve survey quality.
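The two-level weighted scoring described above (criterion weights within a stage, stage weights across the survey) can be sketched as follows; the stage names, scores and weights are illustrative, not the thesis's actual criteria.

```python
# Two-level weighted scoring: criterion scores are weighted within each
# stage, then stage scores are weighted again to give the total.

def stage_score(scores, weights):
    """Weighted average of criterion scores (0-10) within one stage."""
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

stages = {
    # stage: (criterion scores, criterion weights, stage weight) - illustrative
    "design":    ([8, 6, 9], [0.5, 0.3, 0.2], 0.40),
    "execution": ([7, 7],    [0.6, 0.4],      0.35),
    "analysis":  ([9, 5],    [0.5, 0.5],      0.25),
}

total_weight = sum(sw for _, _, sw in stages.values())
total = sum(stage_score(s, w) * sw
            for s, w, sw in stages.values()) / total_weight
# Low stage scores immediately flag where a survey needs improvement,
# which is the diagnostic use described above.
```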

Abstract:

At present the vast majority of Computer-Aided Engineering (CAE) analysis calculations for microelectronic and microsystems technologies are undertaken using software tools that focus on a single aspect of the physics taking place. For example, the design engineer may use one code to predict the airflow and thermal behavior of an electronic package, another code to predict the stress in solder joints, and yet another code to predict electromagnetic radiation throughout the system. This focus of mesh-based codes on separate parts of the governing physics is essentially due to the numerical technologies used to solve the partial differential equations, combined with the subsequent heritage structure of the software codes. Using different software tools, each of which requires its own model build and meshing, leads to a large investment in time, and hence cost, for each of the simulations. During the last ten years there have been significant developments in the modelling community around multi-physics analysis. These developments are being followed by many of the code vendors, who are now providing multi-physics capabilities in their software tools. This paper illustrates the current capabilities of multi-physics technology and highlights some of the future challenges.

Abstract:

There is a perception that teaching space in universities is a rather scarce resource. However, some studies have revealed that in many institutions it is actually chronically under-used. Often, rooms are occupied only half the time, and even when in use they are often only half full. This is usually measured by the ‘utilization’, defined as the percentage of available ‘seat-hours’ that are employed. Within real institutions, studies have shown that this utilization can often take values as low as 20–40%. One consequence of such a low level of utilization is that space managers are under pressure to make more efficient use of the available teaching space. However, better management is hampered because there does not appear to be a good understanding within space management (near-term planning) of why this happens. This is accompanied, within space planning (long-term planning), by a lack of expertise on how best to accommodate the expected low utilizations. This motivates our two main goals: (i) to understand the factors that drive down utilizations, and (ii) to set up methods to provide better space planning. Here, we provide quantitative evidence that constraints arising from timetabling and location requirements easily have the potential to explain the low utilizations seen in reality. Furthermore, on considering the decision question ‘Can this given set of courses all be allocated in the available teaching space?’ we find that the answer depends on the associated utilization in a way that exhibits threshold behaviour: there is a sharp division between regions in which the answer is ‘almost always yes’ and those of ‘almost always no’.
Through analysis and understanding of the space of potential solutions, our work suggests that better use of space within universities will come about through an understanding of the effects of timetabling constraints and when it is statistically likely that it will be possible for a set of courses to be allocated to a particular space. The results presented here provide a firm foundation for university managers to take decisions on how space should be managed and planned for more effectively. Our multi-criteria approach and new methodology together provide new insight into the interaction between the course timetabling problem and the crucial issue of space planning.
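The definition of utilization above admits a one-line arithmetic check of the "half the time, half full" observation:

```python
# Utilization = percentage of available seat-hours actually used.
# The frequency/occupancy values are the illustrative "rooms occupied half
# the time, and half full when in use" case mentioned in the text.

def utilization(frequency, occupancy):
    """frequency: fraction of hours a room is in use;
    occupancy: average fraction of seats filled when in use."""
    return frequency * occupancy * 100  # percent of seat-hours employed

u = utilization(frequency=0.5, occupancy=0.5)
print(u)  # 25.0 -> squarely inside the reported 20-40% range
```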

Abstract:

Grass biogas/biomethane has been put forward as a renewable energy solution, and it has been shown to perform well in terms of energy balance, greenhouse gas emissions and policy constraints. Biofuel and energy crop solutions are country-specific, and grass biomethane has strong potential in countries with temperate climates and a high proportion of grassland, such as Ireland. For a grass biomethane industry to develop in a country, suitable regions (i.e. those with the highest potential) must be identified. In this paper, factors specifically related to assessing the potential of a grass biogas/biomethane industry are identified and analysed. The potential for grass biogas and grass biomethane is determined on a county-by-county basis using multi-criteria decision analysis. Values are assigned to each county, and ratings and weightings are applied to determine the overall county potential. The potential for grass biomethane with co-digestion of slaughter waste (belly grass) is also determined. The county with the highest potential (Limerick) is analysed in detail and is shown to have ready potential for production of gaseous biofuel to meet either 50% of the vehicle fleet's fuel demand or 130% of the domestic natural gas demand, through 25 facilities at a scale of ca. 30 kt/yr of feedstock. The assessment factors developed in this paper can be used in other resource studies of grass biomethane or other energy crops. © 2010 Elsevier Ltd.
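The ratings-and-weightings step can be sketched as min-max normalization of raw county values followed by a weighted sum; the counties, criterion values and weights below are hypothetical, not the paper's data.

```python
# County-level multi-criteria scoring: raw criterion values are normalized
# to 0-1 ratings, then combined with weights into an overall potential score.

def normalize(values):
    """Min-max rating: best county gets 1.0, worst gets 0.0."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

counties = ["A", "B", "C"]          # hypothetical counties
grassland_area = [120.0, 300.0, 210.0]   # e.g. kha of grassland (assumed)
cattle_density = [0.8, 1.4, 1.0]         # e.g. livestock units/ha (assumed)
weights = [0.6, 0.4]                     # illustrative criterion weights

ratings = [normalize(grassland_area), normalize(cattle_density)]
scores = [sum(w * r[i] for w, r in zip(weights, ratings))
          for i in range(len(counties))]
best = counties[scores.index(max(scores))]
```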

Abstract:

In this paper, taking advantage of the inclusion of a special module on material deprivation in EU-SILC 2009, we provide a comparative analysis of patterns of deprivation. Our analysis identifies six relatively distinct dimensions of deprivation with generally satisfactory overall levels of reliability and mean levels of reliability across countries. Multilevel analysis based on 28 European countries reveals systematic variation in the importance of within- and between-country variation for a range of deprivation dimensions. The basic deprivation dimension is the sole dimension to display a graduated pattern of variation across countries. It also shows the highest correlations with national and household income, the remaining deprivation dimensions, and economic stress, and comes closest to capturing an underlying dimension of generalized deprivation that can provide the basis for a comparative European analysis of exclusion from customary standards of living. A multilevel analysis revealed that a range of household characteristics and household reference person socio-economic factors were related to basic deprivation, and controlling for contextual differences in such factors allowed us to account for substantial proportions of both within- and between-country variance. The addition of macro-economic factors relating to average levels of disposable income and income inequality contributed relatively little further explanatory power. Further analysis revealed a set of significant interactions between micro-level socio-economic attributes and country-level gross national disposable income per capita: the impact of socio-economic differentiation was significantly greater where average income levels were lower or, in other words, the impact of national income levels was greater for more disadvantaged socio-economic groups.
Our analysis supports the suggestion that an emphasis on the primary role of income inequality, to the neglect of differences in absolute levels of income, may be misleading in important respects. (C) 2012 International Sociological Association Research Committee 28 on Social Stratification and Mobility. Published by Elsevier Ltd. All rights reserved.
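The within/between-country variance partition underlying the multilevel analysis is summarized by the intraclass correlation; the variance components below are assumed values for illustration, not estimates from the paper.

```python
# Intraclass correlation (ICC): the share of total deprivation variance
# lying between countries rather than between households within countries.

var_between = 0.12  # between-country variance component (assumed)
var_within = 0.48   # within-country (household) variance component (assumed)

icc = var_between / (var_between + var_within)
# icc ~ 0.2: with these assumed components, 20% of the variance in the
# deprivation score lies between countries.
```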

Abstract:

Here we present the first high-resolution multi-proxy analysis of a rich fen in the central-eastern European lowlands. The fen is located in the young glacial landscape of the Stążki river valley. We investigated the fen's development pathways, asking three main questions: (i) what was the pattern and timing of the peatland's vegetation succession; (ii) how did land use and climate affect the succession in the fen ecosystem; and (iii) to what degree does the reconstructed hydrology for this site correlate with those of other sites in the region in terms of past climate change? Several stages of fen history were determined, beginning with the lake-to-fen transition ca. AD 700. Brown mosses dominated the sampling site from this period to the present. No human impact was found until ca. AD 1700, when the first forest cutting began. Around AD 1890 a more significant disturbance took place: this date marks the clear-cutting of forests and a dramatic opening of the landscape. Deforestation changed the hydrology and chemistry of the mire, as revealed by a shift in local plant and testate amoebae communities. We also compared a potential climatic signal recorded in the peat profile before AD 1700 with other sites from the region. © 2013 John Wiley & Sons, Ltd.