941 results for Multi-Criteria Optimisation


Relevance: 40.00%

Abstract:

From 1992 to 2012, 4.4 billion people were affected by disasters, with almost 2 trillion USD in damages and 1.3 million people killed worldwide. The increasing threat of disasters stresses the need to provide solutions for the challenges faced by disaster managers, such as the logistical deployment of resources required to provide relief to victims. The location of emergency facilities, stock prepositioning, evacuation, inventory management, resource allocation, and relief distribution have been identified as directly impacting the relief provided to victims during a disaster. Managing these factors appropriately is critical to reduce suffering. Disaster management commonly attracts several organisations working alongside each other and sharing resources to cope with the emergency. Coordinating these agencies is a complex task, but there is little research considering multiple organisations, and none actually optimising the number of actors required to avoid shortages and convergence. The aim of this research is to develop a system for disaster management based on a combination of optimisation techniques and geographical information systems (GIS) to aid multi-organisational decision-making. An integrated decision system was created comprising a cartographic model implemented in GIS to discard floodable facilities, combined with two models focused on optimising decisions regarding the location of emergency facilities, stock prepositioning, the allocation of resources and relief distribution, along with the number of actors required to perform these activities. Three in-depth case studies in Mexico were conducted, gathering information from different organisations. The cartographic model proved to reduce the risk of selecting unsuitable facilities. The preparedness and response models showed the capacity to optimise the decisions and the number of organisations required for logistical activities, pointing towards an excess of actors involved in all cases.
The system as a whole demonstrated its capacity to provide integrated support for disaster preparedness and response, and revealed room for improvement for Mexican organisations in flood management.
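The preparedness decisions described above can be illustrated with a deliberately small sketch: candidate sites flagged as floodable are discarded first (the role of the cartographic GIS model), and facilities are then opened greedily until every demand point is covered. All site names, costs and coverage sets below are invented, and the greedy rule stands in for the research's actual optimisation models.

```python
# Hypothetical sketch: discard floodable sites, then open facilities
# greedily until every demand point is within reach. All data invented.

candidates = {  # site -> (floodable?, opening cost, reachable demand points)
    "A": (False, 10, {"d1", "d2"}),
    "B": (True,  4,  {"d2", "d3"}),   # floodable: discarded by the GIS step
    "C": (False, 7,  {"d3", "d4"}),
    "D": (False, 12, {"d1", "d4"}),
}
demand = {"d1", "d2", "d3", "d4"}

# GIS step: remove floodable candidates.
safe = {s: v for s, v in candidates.items() if not v[0]}

# Greedy cover: repeatedly open the site with best coverage per unit cost.
opened, uncovered = [], set(demand)
while uncovered:
    site = max(safe, key=lambda s: len(safe[s][2] & uncovered) / safe[s][1])
    opened.append(site)
    uncovered -= safe[site][2]

print(opened)  # → ['C', 'A']
```

A real model would also preposition stock and assign organisations to each opened facility; the sketch only shows the discard-then-select structure.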

Relevance: 40.00%

Abstract:

High energy efficiency and high performance are the key requirements for Internet of Things (IoT) end-nodes. Exploiting a cluster of multiple programmable processors has recently emerged as a suitable solution to address this challenge. However, one of the main bottlenecks for multi-core architectures is the instruction cache. While private caches suffer from data replication and waste area, fully shared caches lack scalability and form a bottleneck for the operating frequency. Hence, we propose a hybrid solution in which a larger shared cache (L1.5) serves multiple cores connected through a low-latency interconnect to small private caches (L1). This arrangement is still limited by capacity misses in a small L1, so we propose a sequential prefetch from L1 to L1.5 to improve performance with little area overhead. Moreover, to cut the critical path for better timing, we optimised the core instruction fetch stage with non-blocking transfers by adopting a 4 x 32-bit ring-buffer FIFO and adding a pipeline stage for conditional branches. We present a detailed comparison of the performance and energy efficiency of instruction cache architectures recently proposed for Parallel Ultra-Low-Power clusters. On average, when executing a set of real-life IoT applications, our two-level cache improves performance by up to 20% while losing 7% energy efficiency with respect to the private cache. Compared to a shared cache system, it improves performance by up to 17% and keeps the same energy efficiency. Finally, up to 20% timing (maximum frequency) improvement and software control allow the two-level instruction cache with prefetch to adapt to various battery-powered use cases, balancing high performance and energy efficiency.
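As a rough illustration of why a second cache level with sequential prefetching helps, the toy simulator below counts misses for a straight-line instruction trace with and without next-line prefetching. Cache sizes, the line width and the trace are invented, and the model ignores latency, banking and the ring-buffer FIFO described above.

```python
# Toy sketch of a two-level instruction cache with sequential (next-line)
# prefetch; sizes and the access trace are invented for illustration.

def run(trace, l1_lines=2, l15_lines=8, prefetch=True):
    l1, l15 = [], []                 # LRU lists of cached line addresses
    hits = {"l1": 0, "l15": 0, "miss": 0}

    def touch(cache, cap, line):
        if line in cache:
            cache.remove(line)
        cache.append(line)           # most-recently-used at the end
        if len(cache) > cap:
            cache.pop(0)

    for addr in trace:
        line = addr // 4             # 4 instructions per cache line
        if line in l1:
            hits["l1"] += 1
        elif line in l15:
            hits["l15"] += 1
        else:
            hits["miss"] += 1
            if prefetch:             # sequential prefetch of the next line
                touch(l15, l15_lines, line + 1)
        touch(l15, l15_lines, line)
        touch(l1, l1_lines, line)
    return hits

linear = list(range(32))             # straight-line code
print(run(linear, prefetch=False)["miss"], run(linear)["miss"])  # → 8 4
```

For purely sequential code, every other compulsory miss is converted into an L1.5 hit by the next-line prefetch, which is the effect the abstract exploits.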

Relevance: 30.00%

Abstract:

Research on advanced technologies for energy generation contemplates a series of alternatives, introduced both in the investigation of new energy sources and in the improvement and/or development of new components and systems. Even though significant reductions in the amount of emissions are observed, the proposed alternatives require the use of exhaust gas cleaning systems. The results of environmental analyses based on two configurations proposed for urban waste incineration are presented in this paper; the annexation of integer (Boolean) variables to the environomic model makes it possible to define the best gas cleaning routes based on exergetic cost minimisation criteria. In this first part, the results for the analysis of a steam cogeneration system associated with the incineration of municipal solid wastes (MSW) are presented. (c) 2007 Elsevier Ltd. All rights reserved.
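The role of the integer (Boolean) variables can be sketched as follows: each candidate cleaning stage gets a 0/1 variable, and the cheapest combination meeting an emission target is selected. The stages, costs, removal efficiencies and target below are invented, and exhaustive enumeration stands in for the environomic model's actual solver.

```python
# Illustrative sketch of using Boolean (0/1) variables to pick a gas
# cleaning route by exergetic-cost minimisation; all figures invented.
from itertools import product

# Candidate cleaning stages: (name, exergetic cost, emission reduction)
stages = [("cyclone", 2.0, 0.30), ("scrubber", 5.0, 0.50), ("bag filter", 3.0, 0.40)]
required_reduction = 0.60          # hypothetical regulatory target

best = None
for choice in product([0, 1], repeat=len(stages)):   # Boolean route variables
    remaining = 1.0
    for on, (_, _, r) in zip(choice, stages):
        if on:
            remaining *= (1.0 - r)                   # stages act in series
    reduction = 1.0 - remaining
    cost = sum(c for on, (_, c, _) in zip(choice, stages) if on)
    if reduction >= required_reduction and (best is None or cost < best[0]):
        best = (cost, [s[0] for on, s in zip(choice, stages) if on])

print(best)  # → (7.0, ['cyclone', 'scrubber'])
```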

Relevance: 30.00%

Abstract:

In the first part of this paper (Part I), conditions were presented for the gas cleaning technological route in the environomic optimisation of a cogeneration system based on a thermal cycle with municipal solid waste incineration. In this second part, an environomic analysis is presented of a cogeneration system comprising a combined cycle, composed of a gas cycle burning natural gas with a heat recovery steam generator without supplementary firing, and a steam cycle burning municipal solid wastes (MSW), to which a pure back-pressure steam turbine and another of pure condensation are added. This analysis aims to select, over several scenarios, the best atmospheric pollutant emission control routes (rc) according to investment cost, operation and social damage minimisation criteria. A comparison is also performed with the results obtained in the Case Study presented in Part I. (c) 2007 Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

The restructuring of power industries has brought fundamental changes to both power system operation and planning. This paper presents a new planning method that uses a multi-objective optimization (MOOP) technique, as well as human knowledge, to expand the transmission network in open access schemes. The method starts with a candidate pool of feasible expansion plans. The subsequent selection of the best candidates is carried out through a MOOP approach, in which multiple objectives are tackled simultaneously, aiming to integrate market operation and planning as one unified process in the context of a deregulated system. Human knowledge is applied in both stages to ensure that the selection reflects practical engineering and management concerns. The expansion plan from MOOP is assessed against reliability criteria before it is finalized. The proposed method has been tested on the IEEE 14-bus system, and relevant analyses and discussions are presented.
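The MOOP selection stage can be sketched as a non-dominated filter over the candidate pool: a plan survives only if no other plan is at least as good in both objectives and strictly better in one. The plans and the two objectives (cost and expected energy-not-served) below are invented for illustration.

```python
# Minimal sketch of the selection step in a multi-objective approach:
# keep only non-dominated expansion plans. Plans and figures are invented;
# each plan is (name, cost, expected energy-not-served), both to minimise.

plans = [("P1", 100, 5.0), ("P2", 120, 3.5), ("P3", 90, 7.0), ("P4", 130, 4.0)]

def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in one."""
    return a[1] <= b[1] and a[2] <= b[2] and (a[1] < b[1] or a[2] < b[2])

pareto = [p for p in plans if not any(dominates(q, p) for q in plans)]
print([p[0] for p in pareto])  # → ['P1', 'P2', 'P3'] (P4 is dominated by P2)
```

Reliability assessment and human knowledge would then be applied to the surviving candidates, as described in the abstract.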

Relevance: 30.00%

Abstract:

The aims of this study were to analyze the criterion and construct validity of Part II of the protocol for multi-professional centers for the determination of signs and symptoms of temporomandibular disorders (ProTMDMulti) as a measure of TMD severity. The study was conducted on eight asymptomatic subjects (control group, CG) and 30 subjects with articular TMD (TMD group, TMDG), according to the Research Diagnostic Criteria for TMD (RDC/TMD). The ProTMDMulti-Part II was validated against the Helkimo Clinical Dysfunction Index (Di). Construct validity was tested by analysing the ability of the ProTMDMulti-Part II to differentiate the CG from the TMDG and to measure the changes that occurred in the TMDG between the periods before and after TMD treatment. Correlations between the Di and the ProTMDMulti-Part II scores were calculated using the Spearman test. Inter- and intragroup comparisons were made (p<0.05). There was a statistically significant correlation between the Di and the severity scores of the ProTMDMulti-Part II, and a significant difference between the TMDG and the CG regarding the severity of signs and symptoms. The present study provides statistical evidence of the clinical validity of the ProTMDMulti-Part II as a measure of the severity of TMD symptoms.
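The correlation step can be sketched in plain Python: Spearman's coefficient is Pearson's correlation on ranks, which for untied data reduces to 1 - 6*sum(d^2)/(n(n^2-1)). The Di and severity values below are invented; the sketch omits tie handling, which real clinical data would need.

```python
# Hypothetical sketch: Spearman's rank correlation between the Helkimo Di
# and a ProTMDMulti-like severity score, in plain Python. Data invented.

def ranks(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = float(rank)
    return r          # note: no tie handling in this sketch

def spearman(x, y):
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

di_scores = [1, 5, 10, 13, 17, 25]     # Helkimo Di, one per subject
severity = [2, 6, 9, 15, 20, 30]       # severity score
print(spearman(di_scores, severity))   # → 1.0 (perfectly monotone ranks)
```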

Relevance: 30.00%

Abstract:

A major challenge faced by today's white clover breeder is how to manage resources within a breeding program. It is essential to utilise these resources with sufficient flexibility to build on past progress from conventional breeding strategies, but also take advantage of emerging opportunities from molecular breeding tools such as molecular markers and transformation. It is timely to review white clover breeding strategies. This background can then be used as a foundation for considering how to continue conventional plant improvement activities and complement them with molecular breeding opportunities. In this review, conventional white clover breeding strategies relevant to the Australian dryland target population environments are considered. Attention is given to: (i) availability of genetic variation, (ii) characterisation of germplasm collections, (iii) quantitative models for estimation of heritability, (iv) the role of multi-environment trials to accommodate genotype-by-environment interactions, (v) interdisciplinary research to understand adaptation to dryland environments, (vi) breeding and selection strategies, and (vii) cultivar structure. Current achievements in biotechnology with specific reference to white clover breeding in Australia are considered, and computer modelling of breeding programs is discussed as a useful integrative tool for the joint evaluation of conventional and molecular breeding strategies and optimisation of resource use in breeding programs. 
Four areas are identified as future research priorities: (i) capturing the potential genetic diversity among introduced accessions and ecotypes that are adapted to key constraints such as summer moisture stress and the use of molecular markers to assess the genetic diversity, (ii) understanding the underlying physiological/morphological root and shoot mechanisms involved in water use efficiency of white clover, with the objective of identifying appropriate selection criteria, (iii) estimation of quantitative genetic parameters of important morphological/physiological attributes to enable prediction of response to selection in target environments, and (iv) modelling white clover breeding strategies to evaluate the opportunities for integration of molecular breeding strategies with conventional breeding programs.
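The quantitative models mentioned in (iii) above can be illustrated with the breeder's equation: narrow-sense heritability h^2 is the additive fraction of phenotypic variance, and the response to selection is R = h^2 * S. The variance components and selection differential below are invented.

```python
# Illustrative sketch of the quantitative-genetic quantities used to
# predict response to selection (the breeder's equation R = h^2 * S).
# All variance components are invented.

additive_var = 4.0        # V_A, hypothetical
environment_var = 12.0    # V_E, hypothetical
phenotypic_var = additive_var + environment_var

h2 = additive_var / phenotypic_var        # narrow-sense heritability
selection_diff = 2.0                      # S: selected parents vs population mean
response = h2 * selection_diff            # R: expected gain per generation

print(h2, response)   # → 0.25 0.5
```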

Relevance: 30.00%

Abstract:

Doctoral Thesis, Marine Sciences (Marine Biology)

Relevance: 30.00%

Abstract:

Due to usage conditions, hazardous environments or intentional causes, physical and virtual systems are subject to faults in their components, which may affect their overall behaviour. In a 'black-box' agent modelled by a set of propositional logic rules, in which only a subset of components is externally visible, such faults may only be recognised by examining some output function of the agent. A (fault-free) model of the agent's system provides the expected output for a given input; if the real output differs from that predicted output, then the system is faulty. However, some faults may only become apparent in the system output when appropriate inputs are given. A number of problems regarding both testing and diagnosis thus arise, such as testing a fault, testing the whole system, finding possible faults and differentiating them to locate the correct one. The corresponding optimisation problems of finding solutions that require minimum resources are also very relevant in industry, as is minimal diagnosis. In this dissertation we use a well-established set of benchmark circuits to address such diagnosis-related problems, and propose and develop models with different logics that we formalise and generalise as much as possible. We also prove that all techniques generalise to agents and to multiple faults. The developed multi-valued logics extend the usual Boolean logic (suitable for fault-free models) by encoding values with some dependency (usually on faults). Such logics thus allow modelling an arbitrary number of diagnostic theories. Each problem is subsequently solved with CLP solvers that we implement and discuss, together with a new efficient search technique that we present.
We compare our results with other approaches such as SAT (which requires substantial duplication of circuits), showing the effectiveness of constraints over multi-valued logics, and also the adequacy of a general set constraint solver (with special inferences over set functions such as cardinality) on other problems. In addition, for an optimisation problem, we integrate local search with a constructive approach (branch-and-bound) using a variety of logics to improve an existing efficient tool based on SAT and ILP.
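The diagnosis problems described above can be illustrated on a toy circuit: enumerate fault assignments, keep those consistent with the observed output, and retain the subset-minimal ones. The two-gate circuit and observation are invented, and brute force stands in for the dissertation's CLP solvers and search technique.

```python
# Toy sketch of consistency-based diagnosis on a tiny circuit: brute-force
# the fault assignments that explain an observation, then keep the minimal
# ones. The circuit (two AND gates) and observation are invented.
from itertools import product

def circuit(a, b, c, faults):
    """g1 = a AND b, g2 = g1 AND c; a faulty gate outputs the negation."""
    g1 = a and b
    if "g1" in faults:
        g1 = not g1
    g2 = g1 and c
    if "g2" in faults:
        g2 = not g2
    return g2

inputs, observed = (True, True, True), False   # fault-free model predicts True

diagnoses = []
for f1, f2 in product([False, True], repeat=2):
    faults = {g for g, on in zip(("g1", "g2"), (f1, f2)) if on}
    if circuit(*inputs, faults) == observed:
        diagnoses.append(faults)

minimal = [d for d in diagnoses if not any(e < d for e in diagnoses)]
print(minimal)  # → [{'g2'}, {'g1'}]: either single fault explains the output
```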

Relevance: 30.00%

Abstract:

Work presented within the Master's programme in Informatics Engineering (Engenharia Informática), as a partial requirement for obtaining the degree of Master in Informatics Engineering.

Relevance: 30.00%

Abstract:

Earthworks involve the levelling or shaping of a target area through the moving or processing of the ground surface. Most construction projects require earthworks, which are heavily dependent on mechanical equipment (e.g., excavators, trucks and compactors). Often, earthworks are the most costly and time-consuming component of infrastructure construction (e.g., roads, railways and airports), and current pressure for higher productivity and safety highlights the need to optimize earthworks, which is a nontrivial task. Most previous attempts at tackling this problem focus on single-objective optimization of partial processes or aspects of earthworks, overlooking the advantages of a multi-objective and global optimization. This work describes a novel optimization system based on an evolutionary multi-objective approach, capable of globally optimizing several objectives simultaneously and dynamically. The proposed system views an earthwork construction as a production line, where the goal is to optimize resources under two crucial criteria (cost and duration), and focuses the evolutionary search (non-dominated sorting genetic algorithm-II) on compaction allocation, using linear programming to distribute the remaining equipment (e.g., excavators). Several experiments were conducted using real-world data from a Portuguese construction site, showing that the proposed system is quite competitive when compared with current manual earthwork equipment allocation.
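The equipment-distribution sub-problem handled by linear programming can be illustrated with a deliberately tiny instance: split a fleet between two excavation fronts so that the later-finishing front finishes as early as possible. The volumes, production rate and fleet size are invented, and brute-force enumeration stands in for the LP.

```python
# Hypothetical sketch of the allocation sub-problem: split a fleet of
# excavators between two fronts so the slower front finishes as early as
# possible (the duration criterion). Volumes and rates are invented.

volumes = [900.0, 300.0]       # m^3 to excavate at each front
rate = 50.0                    # m^3/h per excavator
fleet = 4                      # excavators available

best = None
for k in range(1, fleet):                       # k machines to front 0
    durations = (volumes[0] / (rate * k),
                 volumes[1] / (rate * (fleet - k)))
    makespan = max(durations)                   # slowest front sets duration
    if best is None or makespan < best[0]:
        best = (makespan, (k, fleet - k))

print(best)   # → (6.0, (3, 1))
```

In the full system this allocation would be re-solved inside each evaluation of a candidate compaction plan, which is why a fast exact method such as LP is used there.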

Relevance: 30.00%

Abstract:

High transverse momentum jets produced in pp collisions at a centre-of-mass energy of 7 TeV are used to measure the transverse energy-energy correlation function and its associated azimuthal asymmetry. The data were recorded with the ATLAS detector at the LHC in 2011 and correspond to an integrated luminosity of 158 pb⁻¹. The selection criteria demand the average transverse momentum of the two leading jets in an event to be larger than 250 GeV. The data at detector level are well described by Monte Carlo event generators. They are unfolded to the particle level and compared with theoretical calculations at next-to-leading-order accuracy. The agreement between data and theory is good, providing a precision test of perturbative Quantum Chromodynamics at large momentum transfers. From this comparison, the strong coupling constant at the Z boson mass is determined to be αs(mZ) = 0.1173 ± 0.0010 (exp.) +0.0065/−0.0026 (theo.).

Relevance: 30.00%

Abstract:

Background and objective: Optimal care of diabetic patients (DPs) decreases the risk of complications. Close blood glucose monitoring can improve patient outcomes and shorten hospital stay. The objective of this pilot study was to evaluate the treatment of hospitalized DPs according to current standards, including their diabetic treatment and drugs to prevent diabetes-related complications (guardian drugs: angiotensin converting enzyme inhibitors (ACEI) or angiotensin II receptor blockers (ARB), antiplatelet drugs, statins). Guidelines of the American Diabetes Association (ADA) [1] were used as reference, as they were the most recent and exhaustive for hospital care. Design: Observational pilot study: analysis of the medical records of all DPs seen by the clinical pharmacists during medical rounds in different hospital units. An assessment was made by assigning points for fulfilling the different criteria according to the ADA and then dividing the total by the maximum achievable points (scale 0-1; 1 = all criteria fulfilled). Setting: Different Internal Medicine and Geriatric Units of the (multi-site) Hôpital du Valais. Main outcome measures: - Completeness of diabetes-related information: type of diabetes, medical history, weight, albuminuria status, renal function, blood pressure, (recent) lipid profile. - Management of blood glucose: HbA1c, glycemic control, plan for treating hyper-/hypoglycaemia. - Presence of guardian drugs if indicated. Results: Medical records of 42 patients in 10 different units were analysed (18 women, 24 men, mean age 75.4 ± 11 years); 41 had type 2 diabetes. - Completeness of diabetes-related information: 0.8 ± 0.1. Information often missing: insulin-dependence (43%) and lipid profile (86%). - Management of blood glucose: 0.5 ± 0.2. 15 patients had a suboptimal glycemic balance (target glycaemia 7.2-11.2 mmol/l, with values >11.2 or <3.8 mmol/l, or HbA1c >7%), and 10 patients had a deregulated balance (more than 10 values >11.2 mmol/l or <3.8 mmol/l, and even values >15 mmol/l). - Presence of guardian drugs if indicated: ACEI/ARB: 19 of 23 patients (82.6%); statin: 16 of 40 patients (40%); antiplatelet drug: 16 of 39 patients (41%). Conclusions: Blood glucose control was insufficient in many DPs, and prescription of statins and antiplatelet drugs was often missing. If confirmed by a larger study, these two points need to be optimised. As it is not always possible and appropriate to make those changes during the hospital stay, a further project should assess and optimise diabetes care across both inpatient and outpatient settings.
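The 0-1 assessment scale described in the Design section can be sketched as a simple points-fulfilled-over-maximum ratio. The criteria list mirrors the completeness items above, but the example record is invented.

```python
# Minimal sketch of the assessment scale described above: points for
# fulfilled criteria divided by the maximum achievable (scale 0-1).
# The criteria list follows the abstract; the example record is invented.

criteria = ["diabetes type", "medical history", "weight",
            "albuminuria status", "renal function",
            "blood pressure", "recent lipid profile"]

def completeness(record):
    fulfilled = sum(1 for c in criteria if record.get(c))
    return fulfilled / len(criteria)

patient = {c: True for c in criteria}
patient["recent lipid profile"] = False      # often missing, per the study

print(round(completeness(patient), 2))   # → 0.86
```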

Relevance: 30.00%

Abstract:

The identity [r]evolution is happening. Who are you, who am I in the information society? In recent years, the convergence of several factors - technological, political, economic - has accelerated a fundamental change in our networked world. On a technological level, information becomes easier to gather, to store, to exchange and to process. The belief that more information brings more security has been a strong political driver to promote information gathering since September 11. Profiling intends to transform information into knowledge in order to anticipate one's behaviour, or needs, or preferences. It can lead to categorizations according to some specific risk criteria, for example, or to direct and personalized marketing. As a consequence, new forms of identities appear. They are not necessarily related to our names anymore. They are based on information, on traces that we leave when we act or interact, when we go somewhere or just stay in one place, or even sometimes when we make a choice. They are related to the SIM cards of our mobile phones, to our credit card numbers, to the pseudonyms that we use on the Internet, to our email addresses, to the IP addresses of our computers, to our profiles... Like traditional identities, these new forms of identities can allow us to distinguish an individual within a group of people, or describe this person as belonging to a community or a category. How far have we moved through this process? The identity [r]evolution is already becoming part of our daily lives. People are eager to share information with their "friends" in social networks like Facebook, in chat rooms, or in Second Life. Customers take advantage of the numerous bonus cards that are made available. Video surveillance is becoming the rule. In several countries, traditional ID documents are being replaced by biometric passports with RFID technologies. 
This raises several privacy issues and might even change the perception of the concept of privacy itself, in particular among the younger generation. In the information society, our (partial) identities become the illusory masks that we choose - or that we are assigned - to interact and communicate with each other. Rights, obligations, responsibilities, even reputation are increasingly associated with these masks. On the one hand, these masks become the key to access restricted information and to use services. On the other hand, in case of fraud or negative reputation, the owner of such a mask can be penalized: doors remain closed, access to services is denied. Hence the current preoccupying growth of impersonation, identity theft and other identity-related crimes. Where is the path of the identity [r]evolution leading us? This booklet gives a glimpse of possible scenarios in the field of identity.

Relevance: 30.00%

Abstract:

Global warming mitigation has recently become a priority worldwide. A large body of literature dealing with energy-related problems has focused on reducing greenhouse gas emissions at an engineering scale. In contrast, the minimization of climate change at a wider macroeconomic level has so far received much less attention. We investigate here the issue of how to mitigate global warming by performing changes in an economy. To this end, we make use of a systematic tool that combines three methods: linear programming, environmentally extended input-output models, and life cycle assessment principles. The problem of identifying key economic sectors that contribute significantly to global warming is posed in mathematical terms as a bi-criteria linear program that seeks to optimize simultaneously the total economic output and the total life cycle CO2 emissions. We have applied this approach to the European Union economy, finding that significant reductions in global warming potential can be attained by regulating specific economic sectors. Our tool is intended to aid policymakers in the design of more effective public policies for achieving the environmental and economic targets sought.
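The modelling ingredients can be sketched with a two-sector example: total output follows the Leontief relation x = (I - A)^-1 y, and life-cycle CO2 is an intensity vector applied to x, giving the two objectives of the bi-criteria program. The coefficients, intensities and demand scenarios below are invented, and a comparison of two demand scenarios stands in for the actual linear program.

```python
# Illustrative sketch of a tiny environmentally extended input-output
# model evaluated on the two criteria (economic output vs life-cycle CO2).
# The 2-sector coefficients, intensities and demands are invented.
import numpy as np

A = np.array([[0.1, 0.2],        # technical coefficients (invented)
              [0.3, 0.1]])
co2 = np.array([0.8, 0.2])       # CO2 intensity per unit output (invented)

def evaluate(demand):
    x = np.linalg.solve(np.eye(2) - A, demand)   # total output x = (I-A)^-1 y
    return x.sum(), co2 @ x                      # (economic output, CO2)

# Shifting final demand toward the cleaner sector 2 keeps total output
# similar but cuts life-cycle CO2 emissions.
for y in (np.array([10.0, 5.0]), np.array([5.0, 10.0])):
    out, emis = evaluate(y)
    print(round(out, 2), round(emis, 2))  # → 23.33 12.67, then 22.67 9.73
```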