907 results for Replacement decision optimization model for group scheduling (RDOM-GS)
Abstract:
Grand Unification Theories (GUTs) predict the unification of three of the fundamental forces and are a possible extension of the Standard Model; some of them also predict neutrino masses and the baryon asymmetry. We consider a minimal non-supersymmetric $SO(10)$ GUT model that can reproduce the observed fermionic masses and mixing parameters of the Standard Model. We calculate the scales of spontaneous symmetry breaking from the GUT to the Standard Model gauge group using two-loop renormalisation group equations. This procedure determines the proton decay rate, the scale of $U(1)_{B-L}$ breaking, which generates cosmic strings, and the right-handed neutrino mass scales. Consequently, the regions of parameter space where thermal leptogenesis is viable are identified and correlated with the fermion masses and mixing, the neutrinoless double beta decay rate, the proton decay rate, and the gravitational wave signal resulting from the network of cosmic strings. We demonstrate that this framework, which can explain the Standard Model fermion masses and mixing and the observed baryon asymmetry, will be highly constrained by the next generation of gravitational wave detectors and neutrino oscillation experiments, which will also constrain the proton lifetime.
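For orientation, the running of the gauge couplings between successive breaking scales is governed by renormalisation group equations whose schematic two-loop form is (Yukawa contributions omitted; the coefficients $b_i$ and $b_{ij}$ depend on the field content at each stage and are not taken from this paper):

$$\mu \frac{\mathrm{d}g_i}{\mathrm{d}\mu} = \frac{b_i}{16\pi^2}\, g_i^3 + \frac{g_i^3}{(16\pi^2)^2} \sum_j b_{ij}\, g_j^2$$

Matching the couplings at each threshold is what fixes the intermediate and $U(1)_{B-L}$ breaking scales, and with them the proton lifetime and the cosmic-string tension.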
Abstract:
In recent years, global supply chains have increasingly suffered from reliability issues due to various external and difficult-to-manage events. This paper builds an integrated approach to the design of a supply chain under the risk of disruption and demand fluctuation. The study is divided into two parts: a mathematical optimization model, to identify the optimal design and customer-facility assignments, and a discrete-event simulation of the resulting network. The first part describes a model in which plant-location decisions are influenced by variables such as distance to customers, the investment needed to open plants, and centralization phenomena that help contain the risk of demand variability (risk pooling). The entire model is built with a proactive approach to disruption risk, assigning each customer two open facilities: one that serves it under normal conditions and a back-up facility that comes into operation when the main facility fails. The study is conducted on a relatively small number of instances due to the computational complexity; a matheuristic approach, presented in Part A of the paper, evaluates the problem with a larger set of players. Once the network is built, a discrete-event supply chain simulation (SCS) is implemented to analyze the stock flow within the facilities' warehouses, the actual impact of disruptions, and the role of the back-up facilities, whose inventories are heavily stressed by the demand surge that disruptions cause. The simulation therefore follows a reactive approach, in which customers are redistributed among facilities according to the interruptions that occur in the system and to the assignments derived from the design model. Lastly, the most important results of the study are reported, analyzing the role of lead time in a reactive approach to disruptions and comparing the two models in terms of costs.
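As a sketch of what such a design model can look like, the following is a deliberately small facility-location MILP with primary and back-up assignments, written with the PuLP library; the data, cost structure, and back-up weighting are invented for illustration and are not the paper's formulation.

import pulp

customers = ["c1", "c2", "c3"]
sites = ["s1", "s2"]
open_cost = {"s1": 100.0, "s2": 120.0}          # fixed cost of opening a site
dist = {("c1", "s1"): 4, ("c1", "s2"): 7,       # customer-site distances
        ("c2", "s1"): 6, ("c2", "s2"): 3,
        ("c3", "s1"): 5, ("c3", "s2"): 5}
backup_weight = 0.2                             # back-up service priced at a fraction

prob = pulp.LpProblem("design_with_backup", pulp.LpMinimize)
y = pulp.LpVariable.dicts("open", sites, cat="Binary")
x = pulp.LpVariable.dicts("primary", (customers, sites), cat="Binary")
b = pulp.LpVariable.dicts("backup", (customers, sites), cat="Binary")

prob += (pulp.lpSum(open_cost[s] * y[s] for s in sites)
         + pulp.lpSum(dist[c, s] * x[c][s] for c in customers for s in sites)
         + backup_weight * pulp.lpSum(dist[c, s] * b[c][s]
                                      for c in customers for s in sites))

for c in customers:
    prob += pulp.lpSum(x[c][s] for s in sites) == 1   # one primary facility
    prob += pulp.lpSum(b[c][s] for s in sites) == 1   # one distinct back-up facility
    for s in sites:
        prob += x[c][s] + b[c][s] <= y[s]             # only open sites serve; primary != back-up

prob.solve()
print(pulp.LpStatus[prob.status], pulp.value(prob.objective))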
Abstract:
Background: The GRACE risk score (GS) is a scoring system with prognostic significance in patients with non-ST-segment elevation myocardial infarction (non-STEMI). Objective: The present study aimed to determine whether end-systolic or end-diastolic epicardial fat thickness (EFT) is more closely associated with high-risk non-STEMI patients according to the GS. Methods: We evaluated 207 patients who had non-STEMI from October 2012 to February 2013, and 162 of them were included in the study (115 males; mean age: 66.6 ± 12.8 years). End-systolic and end-diastolic EFTs were measured echocardiographically. Patients with a high in-hospital GS were categorized as the H-GS group (in-hospital GS > 140), while the remaining patients were categorized as the low-to-moderate-risk group (LM-GS). Results: Systolic and diastolic blood pressures of H-GS patients were lower than those of LM-GS patients, and the average heart rate was higher in this group. End-systolic and end-diastolic EFT were significantly higher in the H-GS group. The echocardiographic assessment of the right and left ventricles showed significantly decreased ejection fraction in both ventricles in the H-GS group. The highest correlation was found between GS and end-diastolic EFT (r = 0.438). Conclusion: End-systolic and end-diastolic EFTs were found to be increased in the H-GS group. However, end-diastolic EFT correlated with GS better than end-systolic EFT did.
Abstract:
We aimed to evaluate the effectiveness and safety of bismuth-containing quadruple therapy plus a postural change after dosing for Helicobacter pylori eradication in gastrectomized patients. We compared 76 gastric stump patients with H. pylori infection (GS group) with 50 non-gastrectomized H. pylori-positive patients who met the treatment indication (controls). The GS group was divided into GS group 1 and GS group 2. All groups were administered bismuth potassium citrate (220 mg), esomeprazole (20 mg), amoxicillin (1.0 g), and furazolidone (100 mg) twice daily for 14 days. GS group 1 maintained a left lateral horizontal position for 30 min after dosing. H. pylori was detected using rapid urease testing and histologic examination of the gastric mucosa before and 3 months after therapy. Mucosal histologic manifestations were evaluated using visual analog scales of the updated Sydney System. GS group 1 had a higher eradication rate than GS group 2 (intention-to-treat [ITT]: P=0.025; per-protocol [PP]: P=0.030) and a rate similar to that of controls. GS group 2 had a lower eradication rate than controls (ITT: P=0.006; PP: P=0.626). Scores for chronic inflammation and activity declined significantly (P<0.001) 3 months after treatment, whereas those for atrophy and intestinal metaplasia showed no significant change. The prevalence of adverse reactions was similar among groups during therapy (P=0.939). A bismuth-containing quadruple therapy regimen plus a postural change after dosing appears to be a relatively safe, effective, economical, and practical method for H. pylori eradication in gastrectomized patients.
Abstract:
This thesis presents several data processing and compression techniques capable of addressing the strict requirements of wireless sensor networks (WSNs). After a general overview of sensor networks, the energy problem is introduced, classifying the different energy-reduction approaches according to the subsystem they try to optimize. To manage the complexity brought by these techniques, a quick overview of the most common middlewares for WSNs is given, describing in detail SPINE2, a framework for data processing in the node environment. The focus then shifts to in-network aggregation techniques, which reduce the data sent by the network nodes in order to prolong the network lifetime as long as possible. Among these techniques, the most promising approach is Compressive Sensing (CS). To investigate this technique, a practical implementation of the algorithm is compared against a simpler aggregation scheme, deriving a mixed algorithm able to successfully reduce the power consumption. The analysis then moves from compression on single nodes to CS for signal ensembles, exploiting the correlations among sensors and nodes to improve compression and reconstruction quality. The two main techniques for signal ensembles, Distributed CS (DCS) and Kronecker CS (KCS), are introduced and compared on a common set of data gathered from real deployments. The best trade-off between reconstruction quality and power consumption is then investigated. The use of CS is also addressed when the signal of interest is sampled at a sub-Nyquist rate, evaluating the reconstruction performance. Finally, group-sparsity CS (GS-CS) is compared with another well-known technique for reconstructing signals from a highly sub-sampled version. These two frameworks are again compared on a real data set, and an insightful analysis of the trade-off between reconstruction quality and lifetime is given.
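A minimal sketch of the compressive-sensing idea discussed above (not the thesis code; sizes, sparsity level, and the greedy recovery method are illustrative choices): a sparse signal is recovered from far fewer random measurements than samples, here via Orthogonal Matching Pursuit with plain NumPy.

import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 64, 5                       # signal length, measurements, sparsity

x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)   # k-sparse signal
A = rng.standard_normal((m, n)) / np.sqrt(m)                  # measurement matrix
y = A @ x                                                     # compressed samples

def omp(A, y, k):
    """Greedy OMP: pick the column most correlated with the residual, k times."""
    support, residual = [], y.copy()
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(A, y, k)
print("relative error:", np.linalg.norm(x - x_hat) / np.linalg.norm(x))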
Abstract:
Accelerated probabilistic modeling algorithms employing the stochastic local search (SLS) technique are considered. A general algorithm scheme and a specific combinatorial optimization method using the “golden section” rule (GS-method) are given. Convergence rates are obtained using Markov chains. An overview of current combinatorial optimization techniques is also presented.
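The abstract does not detail the GS-method itself; as an illustration of the “golden section” rule it is named after, here is the classic golden-section search for the minimum of a one-dimensional unimodal function (a sketch, not the paper's method).

import math

def golden_section_search(f, a, b, tol=1e-6):
    """Shrink [a, b] by the golden ratio until it is narrower than tol."""
    inv_phi = (math.sqrt(5) - 1) / 2           # 1/phi, about 0.618
    c, d = b - inv_phi * (b - a), a + inv_phi * (b - a)
    while (b - a) > tol:
        if f(c) < f(d):                        # minimum lies in [a, d]
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:                                  # minimum lies in [c, b]
            a, c = c, d
            d = a + inv_phi * (b - a)
    return (a + b) / 2

print(golden_section_search(lambda t: (t - 1.3) ** 2, 0.0, 3.0))   # ~1.3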
Abstract:
The present paper proposes a flexible consensus scheme for group decision making, which allows one to obtain a consistent collective opinion from information provided by each expert in terms of multigranular fuzzy estimates. It is based on a linguistic hierarchical model with multigranular sets of linguistic terms, and the choice of the most suitable set is a prerogative of each expert. From the human viewpoint, using such a model is advantageous, since it permits each expert to utilize linguistic terms that reflect more adequately the level of uncertainty intrinsic to his evaluation. From the operational viewpoint, its advantage lies in the fact that it allows one to express the linguistic information in a unique domain, without loss of information, during the discussion process. The proposed consensus scheme assumes that the moderator can intervene in the discussion process in different ways. An intervention can be a request to an expert to update his opinion, or an adjustment of the weight of each expert's opinion. An optimal adjustment can be achieved by executing an optimization procedure that searches for the weights that maximize a corresponding soft consensus index. In order to demonstrate the usefulness of the presented consensus scheme, a technique for multicriteria analysis based on fuzzy preference relation modeling is utilized to solve a hypothetical enterprise strategy planning problem generated with the Balanced Scorecard methodology.
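A toy numeric sketch of the weight-adjustment step (not the paper's linguistic model; the opinions, the index, and the penalty term are invented for illustration): opinions live on a [0, 1] scale, a soft consensus index rewards agreement with the weighted collective opinion, and the weights are optimized numerically.

import numpy as np
from scipy.optimize import minimize

opinions = np.array([0.20, 0.30, 0.35, 0.80])      # hypothetical expert opinions

def soft_consensus(weights, opinions):
    w = np.abs(weights) / np.abs(weights).sum()    # normalize to a convex combination
    collective = w @ opinions
    agreement = 1.0 - w @ np.abs(opinions - collective)
    spread = ((w - 1.0 / w.size) ** 2).sum()       # keeps weight from collapsing onto one expert
    return agreement - spread

# Maximizing the index == minimizing its negative, starting from equal weights.
res = minimize(lambda w: -soft_consensus(w, opinions), x0=np.ones_like(opinions))
w_opt = np.abs(res.x) / np.abs(res.x).sum()
print("adjusted weights:", w_opt.round(3))
print("soft consensus index:", round(soft_consensus(res.x, opinions), 3))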
Abstract:
This paper presents a decision support methodology to help virtual power players (VPPs) in the Smart Grid (SG) context solve the day-ahead energy resource scheduling problem, considering the intensive use of Distributed Generation (DG) and Vehicle-to-Grid (V2G). The main focus is the application of a new hybrid method combining a particle swarm approach and a deterministic technique based on mixed-integer linear programming (MILP) to solve the day-ahead scheduling while minimizing total operation costs from the aggregator's point of view. A realistic mathematical formulation, considering the electric network constraints and the V2G charging and discharging efficiencies, is presented. A full AC power flow calculation is included in the hybrid method so that the network constraints can be taken into account. A case study with a 33-bus distribution network and 1800 V2G resources illustrates the performance of the proposed method.
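A schematic of the hybrid idea (invented data, and a much-simplified swarm loop rather than full PSO; the paper's network model and V2G details are omitted): a swarm searches the binary commitment decisions while a deterministic LP, here written with PuLP, dispatches the committed units at minimum cost.

import numpy as np
import pulp

rng = np.random.default_rng(1)
cost = np.array([20.0, 35.0, 50.0])        # unit marginal costs (invented)
pmax = np.array([30.0, 40.0, 50.0])        # unit capacities
demand = 70.0

def dispatch_cost(commit):
    """Deterministic inner problem: min-cost dispatch of the committed units."""
    if (commit * pmax).sum() < demand:
        return np.inf                       # committed capacity cannot cover demand
    prob = pulp.LpProblem("dispatch", pulp.LpMinimize)
    p = [pulp.LpVariable(f"p{i}", 0, commit[i] * pmax[i]) for i in range(len(pmax))]
    prob += pulp.lpSum(cost[i] * p[i] for i in range(len(p)))
    prob += pulp.lpSum(p) == demand
    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    return pulp.value(prob.objective)

# Simplified swarm-style outer loop over binary commitments (full PSO would
# also track velocities and personal bests).
n_particles, n_units, iters = 8, 3, 20
prob_on = rng.random((n_particles, n_units))
best_commit = np.ones(n_units)
best_cost = dispatch_cost(best_commit)
for _ in range(iters):
    for i in range(n_particles):
        commit = (rng.random(n_units) < prob_on[i]).astype(float)
        c = dispatch_cost(commit)
        if c < best_cost:
            best_commit, best_cost = commit, c
    prob_on = 0.7 * prob_on + 0.3 * best_commit   # drift toward the best commitment

print("best commitment:", best_commit, "cost:", best_cost)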
Abstract:
This paper proposes a particle swarm optimization (PSO) approach to support electricity producers in multiperiod optimal contract allocation. The producer's risk preference is stated by a utility function (U) expressing the trade-off between the expectation and the variance of the return. The expected return and its variance are estimated from a forecasted scenario interval determined by a price-range forecasting model developed by the authors, with a confidence level associated with each forecasted scenario interval. The proposed model makes use of contracts with physical (spot and forward) and financial (options) settlement. PSO performance was evaluated by comparison with a genetic-algorithm-based approach. The model can be used by producers in deregulated electricity markets and is easily adapted to load-serving entities and retailers, as well as to other types of contracts.
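A sketch of the kind of fitness such a swarm would evaluate (not the authors' model; the prices, intervals, and risk-aversion factor are invented): the mean-variance utility of an allocation across price scenarios drawn from per-contract forecast intervals.

import numpy as np

rng = np.random.default_rng(2)
n_scenarios, n_contracts = 500, 3
# Price scenarios sampled uniformly inside per-contract forecast intervals.
low = np.array([40.0, 45.0, 38.0])
high = np.array([60.0, 55.0, 70.0])
prices = rng.uniform(low, high, size=(n_scenarios, n_contracts))
cost = 42.0                                        # production cost per MWh

def utility(alloc, risk_aversion=0.05):
    """U = E[return] - lambda * Var[return] for an allocation in MWh."""
    returns = (prices - cost) @ alloc
    return returns.mean() - risk_aversion * returns.var()

flat = np.array([10.0, 10.0, 10.0])
tilted = np.array([5.0, 20.0, 5.0])                # overweight the narrow interval
print("flat  :", round(utility(flat), 2))
print("tilted:", round(utility(tilted), 2))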
Abstract:
Distributed Energy Resources (DER) scheduling in smart grids presents a new challenge to system operators. The increase in new resources, such as storage systems and demand response programs, results in additional computational effort for optimization problems. On the other hand, since natural resources such as wind and sun can only be forecasted precisely a short time in advance, short-term scheduling is especially relevant, requiring very good performance on large-dimension problems. Traditional techniques such as Mixed-Integer Non-Linear Programming (MINLP) do not cope well with large-scale problems, which can be more appropriately addressed by metaheuristic approaches. This paper proposes a new methodology called Signaled Particle Swarm Optimization (SiPSO) to address the energy resource management problem in the scope of smart grids with intensive use of DER. The proposed methodology's performance is illustrated by a case study with 99 distributed generators, 208 loads, and 27 storage units. The results are compared with those obtained by other methodologies, namely MINLP, Genetic Algorithm, original Particle Swarm Optimization (PSO), Evolutionary PSO, and New PSO. SiPSO's performance is superior to that of the other tested PSO variants, demonstrating its adequacy for large-dimension problems that require a decision in a short period of time.
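For reference, the plain PSO update that such variants build on is shown below on a toy objective (the signaling mechanism itself is not described in the abstract and is not reproduced; the parameters are common textbook choices).

import numpy as np

rng = np.random.default_rng(3)
dim, n_particles, iters = 10, 20, 200
w, c1, c2 = 0.72, 1.49, 1.49                       # inertia and acceleration factors

f = lambda x: np.sum((x - 1.0) ** 2, axis=-1)      # toy objective, minimum at x = 1

x = rng.uniform(-5, 5, (n_particles, dim))         # positions
v = np.zeros_like(x)                               # velocities
pbest, pbest_val = x.copy(), f(x)                  # personal bests
gbest = pbest[np.argmin(pbest_val)]                # global best

for _ in range(iters):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    vals = f(x)
    better = vals < pbest_val
    pbest[better], pbest_val[better] = x[better], vals[better]
    gbest = pbest[np.argmin(pbest_val)]

print("best value:", f(gbest))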
Abstract:
This paper presents a multi-agent simulation model whose goal is to help one specific participant in a multi-criteria group decision-making process. The model has five main types of intervenients: the human participant, who uses the simulation and argumentation support system; the participant agents, one associated with the human participant and the others simulating the other human members of the decision meeting group; the directory agent; the proposal agents, representing the different alternatives for a decision (the alternatives are evaluated based on criteria); and the voting agent, responsible for all voting mechanisms. At this stage, a two-phase algorithm is proposed. In the first phase, each participant agent makes its own evaluation of the proposals under discussion, and the voting agent runs a simulated voting process. In the second phase, after the dissemination of the voting results, each participant agent argues to convince the others to choose one of the possible alternatives. The arguments used to convince a specific participant depend on the agent's knowledge about that participant. This two-phase algorithm is applied iteratively.
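A toy sketch of the two-phase loop (evaluation plus voting, with the argumentation phase reduced to nudging dissenters' criteria weights toward those of the winner's supporters; all agents, criteria, and weights are invented):

import numpy as np

rng = np.random.default_rng(4)
n_agents, n_proposals, n_criteria = 5, 3, 4
criteria = rng.random((n_proposals, n_criteria))        # proposal scores per criterion
weights = rng.dirichlet(np.ones(n_criteria), n_agents)  # each agent weighs criteria differently

for round_ in range(3):
    # Phase 1: each participant agent evaluates the proposals and votes.
    scores = weights @ criteria.T                       # (agents x proposals)
    votes = scores.argmax(axis=1)
    winner = np.bincount(votes, minlength=n_proposals).argmax()
    print(f"round {round_}: votes={votes}, winner={winner}")
    if (votes == winner).all():
        break                                           # consensus reached
    # Phase 2 (stand-in for argumentation): dissenters shift slightly toward
    # the criteria weights of the winner's supporters.
    supporters = weights[votes == winner].mean(axis=0)
    weights = 0.8 * weights + 0.2 * supporters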
Abstract:
Knowledge is central to the modern economy and society. Indeed, the knowledge society has transformed the concept of knowledge and is increasingly aware of the need to overcome the lack of knowledge when it has to make choices or address its problems and dilemmas. One's knowledge is less based on exact facts and more on hypotheses, perceptions, or indications. Even when we use new computational artefacts and novel methodologies for problem solving, such as Group Decision Support Systems (GDSSs), the question of incomplete information is in most situations marginalized. On the other hand, common sense tells us that when a decision is made it is impossible to perceive all the information involved and the nature of its intrinsic quality. Therefore, something has to be done with respect to the information available and the process of its evaluation. It is under this framework that a Multi-valued Extended Logic Programming language is used for knowledge representation and reasoning, leading to a model that embodies the Quality-of-Information (QoI) and its quantification along the several stages of the decision-making process. In this way, it is possible to provide a measure of the value of the QoI that supports the decision itself. The model is presented in the context of a GDSS for VirtualECare, a system aimed at sustaining online healthcare services.
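The abstract does not give the quantification itself; the sketch below assumes one simple convention sometimes paired with extended logic programs (a known value scores 1, an unknown one scores 0, and a value known to lie among card alternatives scores 1/card), with invented healthcare-flavoured evidence.

def qoi(status, card=None):
    """Quality-of-Information score for one piece of information (assumed convention)."""
    if status == "known":
        return 1.0
    if status == "unknown":
        return 0.0
    if status == "incomplete":             # value is one of `card` alternatives
        return 1.0 / card
    raise ValueError(status)

# A decision backed by three pieces of information of varying quality;
# the aggregate QoI is taken here as their mean.
evidence = [("blood_pressure", "known", None),
            ("medication_taken", "incomplete", 3),
            ("last_meal_time", "unknown", None)]
scores = [qoi(s, c) for _, s, c in evidence]
print("per-item QoI:", scores)
print("decision QoI:", sum(scores) / len(scores))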
Abstract:
This thesis presents the Fuzzy Monte Carlo Model for Transmission Power Systems Reliability studies (FMC-TRel) methodology, which is based on statistical failure and repair data of the transmission power system components and uses fuzzy-probabilistic modeling for system component outage parameters. The statistical records allow the fuzzy membership functions of the system component outage parameters to be developed. The proposed hybrid method of fuzzy sets and Monte Carlo simulation, based on the fuzzy-probabilistic models, captures both the randomness and the fuzziness of component outage parameters. Once the system states are obtained, a network contingency analysis is performed to identify any overloading or voltage violation in the network. This is followed by a remedial-action algorithm, based on Optimal Power Flow, that reschedules generation to alleviate constraint violations while avoiding any load curtailment if possible or otherwise minimizing the total load curtailment, for the states identified by the contingency analysis. For the system states that cause load curtailment, an optimization approach is applied to reduce the probability of occurrence of these states while minimizing the costs of achieving that reduction. This methodology is of great importance for supporting the transmission system operator's decision making, namely in the identification of critical components and in the planning of future investments in the transmission power system. A case study based on the IEEE 24-bus Reliability Test System (RTS-1996) is presented to illustrate in detail the application of the proposed methodology.
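A minimal sketch of the fuzzy Monte Carlo idea (not FMC-TRel itself; the components, rates, and series-system assumption are invented): each component's failure probability is a triangular fuzzy number, and every trial first draws a crisp probability from the triangle before sampling the component state.

import numpy as np

rng = np.random.default_rng(5)
# Triangular fuzzy failure probabilities (low, mode, high) per component.
fuzzy_q = np.array([[0.01, 0.02, 0.05],
                    [0.02, 0.04, 0.08]])

n_trials = 100_000
q = rng.triangular(fuzzy_q[:, 0], fuzzy_q[:, 1], fuzzy_q[:, 2],
                   size=(n_trials, len(fuzzy_q)))   # crisp rates per trial
failed = rng.random(q.shape) < q                    # sampled component states

# Toy reliability target: the (series) system fails if any component fails.
system_failure = failed.any(axis=1).mean()
print(f"estimated system failure probability: {system_failure:.4f}")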