982 results for Mixed-integer quadratically-constrained programming
Abstract:
BACKGROUND The rate of avoidable caesarean sections (CS) could be reduced through multifaceted strategies focusing on the involvement of health professionals and compliance with clinical practice guidelines (CPGs). Quality improvement for CS (QICS) programmes based on this approach have been implemented in Canada and Spain. OBJECTIVES Their objectives are as follows: 1) to identify clusters in each setting with similar results in terms of cost-consequences, 2) to investigate whether demographic, clinical or contextual characteristics can distinguish these clusters, and 3) to explore the implementation of QICS in the two regions, in order to identify factors that facilitated changes in practice and reductions in the use of obstetric interventions, as well as the challenges faced by hospitals in implementing the recommendations. METHODS Descriptive study with a quantitative and qualitative approach. 1) Cluster analysis at patient level with data from 16 hospitals in Quebec (Canada) (n = 105,348) and 15 hospitals in Andalusia (Spain) (n = 64,760). The outcome measures are CS and costs. For the costs, we will consider the intervention, delivery and complications in mother and baby, from the hospital perspective. Cluster analysis will be used to identify participants with similar patterns of CS and costs, and t-tests will be used to evaluate whether the clusters differ in terms of characteristics at the hospital level (academic status of hospital, level of care, supply and demand factors) and the patient level (mother's age, parity, gestational age, previous CS, previous pathology, presentation of the baby, baby birth weight). 2) Analysis of in-depth interviews with obstetricians and midwives in hospitals where the QICS were implemented, to explore the differences in delivery-related practices and the importance of the different constructs for positive or negative adherence to CPGs. Dimensions: political/management level, hospital level, health professionals, mothers and their birth partners. DISCUSSION This work sets out a new approach to programme evaluation, using different techniques to make it possible to take into account the specific context in which the programmes were implemented.
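A minimal sketch of the quantitative step described above, assuming a patient-level table with a CS indicator and a cost column; the clustering method (k-means), the number of clusters and the compared characteristic (mother's age) are illustrative assumptions, not taken from the study protocol.

```python
# Sketch: cluster patients on CS and cost, then compare a characteristic
# between clusters with a t-test. Column names and settings are illustrative.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler
from scipy import stats

def cluster_and_compare(df: pd.DataFrame, n_clusters: int = 3) -> pd.DataFrame:
    # Standardize the outcome measures before clustering.
    features = StandardScaler().fit_transform(df[["cs", "cost"]])
    df = df.copy()
    df["cluster"] = KMeans(n_clusters=n_clusters, n_init=10,
                           random_state=0).fit_predict(features)
    # Compare a hypothetical patient-level characteristic between two clusters.
    a = df.loc[df["cluster"] == 0, "mother_age"]
    b = df.loc[df["cluster"] == 1, "mother_age"]
    t, p = stats.ttest_ind(a, b, equal_var=False)
    print(f"Welch t-test on mother_age, clusters 0 vs 1: t={t:.2f}, p={p:.3f}")
    return df
```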
Abstract:
In this paper, we address this problem through the design of a semiactive controller based on mixed H2/H∞ control theory. The vibrations caused by seismic motions are mitigated by a semiactive damper installed at the bottom of the structure. By a semiactive damper, we mean a device that can absorb but cannot inject energy into the system. Sufficient conditions for the design of the desired control are given in terms of linear matrix inequalities (LMIs). A controller that guarantees asymptotic stability and a mixed H2/H∞ performance is then developed. An algorithm is proposed to handle the semiactive nature of the actuator. The performance of the controller is experimentally evaluated in a real-time hybrid testing facility that consists of a physical specimen (a small-scale magnetorheological damper) and a numerical model (a large-scale three-story building).
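As a hedged illustration of the LMI machinery such designs rest on, the sketch below solves a simple Lyapunov-stability LMI feasibility problem for an illustrative two-state system using CVXPY; it is not the paper's mixed H2/H∞ synthesis, and the system matrix is assumed.

```python
# Minimal LMI feasibility sketch: find P > 0 with A'P + PA < 0 for an
# illustrative 2-state system; not the paper's H2/H-infinity controller design.
import numpy as np
import cvxpy as cp

A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])            # assumed (stable) system matrix
P = cp.Variable((2, 2), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(2),                     # P positive definite
               A.T @ P + P @ A << -eps * np.eye(2)]      # Lyapunov LMI
prob = cp.Problem(cp.Minimize(0), constraints)           # pure feasibility
prob.solve()
print("status:", prob.status)
print("P =\n", P.value)
```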
Abstract:
The evaluation of large projects raises well-known difficulties because, by definition, such projects modify the current price system; their public evaluation presents additional difficulties because they also modify the shadow prices that would exist without the project. This paper first analyzes the basic methodologies applied until the late 1980s, based on the integration of projects into optimization models or, alternatively, on iterative procedures with information exchange between two organizational levels. Newer methodologies applied since then are based on variational inequalities, bilevel programming and linear or nonlinear complementarity. Their foundations and their different applications to project evaluation are explored. These new tools are in fact closely related to one another and can handle more complex cases involving, for example, the reaction of agents to policies or the existence of multiple agents in an environment characterized by common functions representing demands or constraints on polluting emissions.
Abstract:
As commonly held, the European Security and Defence Policy (ESDP) suffers from a “double democratic deficit”: the EP has a marginal role in the ESDP-making process, and national parliaments remain unable to hold their own governments to account. Pressure from these two institutions was therefore exerted during the Convention on the Future of Europe to improve democratic oversight of this rapidly evolving policy. This paper investigates the innovations included in the Constitutional Treaty, focusing specifically on the new role granted to the EP. It shows that even though this text does not substantially modify the inter-institutional balance of powers in the ESDP area, the EP may take advantage of some of its articles to become an actor in the ESDP-control process in the 'living constitution'.
Abstract:
This thesis proposes a set of adaptive broadcast solutions and an adaptive data replication solution to support the deployment of P2P applications. P2P applications are an emerging type of distributed application that runs on top of P2P networks. Typical P2P applications are video streaming, file sharing, etc. While interesting because they are fully distributed, P2P applications suffer from several deployment problems due to the nature of the environment in which they run. Indeed, defining an application on top of a P2P network often means defining an application where peers contribute resources in exchange for their ability to use the P2P application. For example, in a P2P file-sharing application, while the user is downloading a file, the P2P application is in parallel serving that file to other users. Such peers could have limited hardware resources, e.g., CPU, bandwidth and memory, or the end user could decide a priori to limit the resources dedicated to the P2P application. In addition, a P2P network is typically immersed in an unreliable environment, where communication links and processes are subject to message losses and crashes, respectively. To support P2P applications, this thesis proposes a set of services that address some underlying constraints related to the nature of P2P networks. The proposed services include a set of adaptive broadcast solutions and an adaptive data replication solution that can be used as the basis of several P2P applications. Our data replication solution makes it possible to increase availability and to reduce communication overhead. The broadcast solutions aim at providing a communication substrate encapsulating one of the key communication paradigms used by P2P applications: broadcast. Our broadcast solutions typically aim at offering reliability and scalability to some upper layer, be it an end-to-end P2P application or another system-level layer, such as a data replication layer. Our contributions are organized in a protocol stack made of three layers. In each layer, we propose a set of adaptive protocols that address specific constraints imposed by the environment. Each protocol is evaluated through a set of simulations. The adaptiveness of our solutions relies on the fact that they take the constraints of the underlying system into account in a proactive manner. To model these constraints, we define an environment approximation algorithm that gives us an approximate view of the system or of part of it. This approximate view includes the topology and the reliability of the components, expressed in probabilistic terms. To adapt to the underlying system constraints, the proposed broadcast solutions route messages through tree overlays chosen to maximize broadcast reliability. Here, broadcast reliability is expressed as a function of the reliability of the selected paths and of the use of available resources. These resources are modeled in terms of quotas of messages reflecting the receiving and sending capacities at each node. To allow deployment in a large-scale system, we take into account the available memory at processes by limiting the view they have to maintain about the system. Using this partial view, we propose three scalable broadcast algorithms, which are based on a propagation overlay that converges toward the global tree overlay and adapts to some constraints of the underlying system.
At a higher level, this thesis also proposes a data replication solution that is adaptive both in terms of replica placement and in terms of request routing. At the routing level, this solution takes the unreliability of the environment into account in order to maximize the reliable delivery of requests. At the replica placement level, the dynamically changing origin and frequency of read/write requests are analyzed in order to define a set of replicas that minimizes communication cost.
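A minimal sketch of the broadcast-layer idea described in this abstract, assuming independent link success probabilities: running Dijkstra on -log(p) edge weights yields a tree rooted at the source that maximizes source-to-node path reliability. The topology and reliability values are illustrative, not taken from the thesis.

```python
# Sketch: build a broadcast tree maximizing path reliability, assuming
# independent link success probabilities; -log(p) weights turn the product
# of reliabilities into a sum, so Dijkstra finds the most reliable paths.
import heapq
import math

def max_reliability_tree(links, source):
    """links: {node: [(neighbor, reliability), ...]}; returns {node: parent}."""
    dist = {source: 0.0}
    parent = {source: None}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, math.inf):
            continue                     # stale heap entry
        for v, p in links.get(u, []):
            nd = d - math.log(p)
            if nd < dist.get(v, math.inf):
                dist[v] = nd
                parent[v] = u
                heapq.heappush(heap, (nd, v))
    return parent

# Toy topology with illustrative link reliabilities.
links = {"s": [("a", 0.9), ("b", 0.6)],
         "a": [("b", 0.95), ("c", 0.8)],
         "b": [("c", 0.7)]}
print(max_reliability_tree(links, "s"))
```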
Abstract:
When a new treatment is compared to an established one in a randomized clinical trial, it is standard practice to statistically test for non-inferiority rather than for superiority. When the endpoint is binary, one usually compares two treatments using either an odds-ratio or a difference of proportions. In this paper, we propose a mixed approach which uses both concepts. One first defines the non-inferiority margin using an odds-ratio and one ultimately proves non-inferiority statistically using a difference of proportions. The mixed approach is shown to be more powerful than the conventional odds-ratio approach when the efficacy of the established treatment is known (with good precision) and high (e.g. a success rate above 56%). The gain in power achieved may in turn lead to a substantial reduction in the sample size needed to prove non-inferiority. The mixed approach can be generalized to ordinal endpoints.
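A small worked sketch of the margin conversion underlying the mixed approach, assuming the success rate p0 of the established treatment is known: an odds-ratio margin is mapped to the proportion scale, and non-inferiority is then expressed as a difference of proportions. The numerical values are illustrative.

```python
# Sketch: translate an odds-ratio non-inferiority margin into a
# difference-of-proportions margin at a given reference success rate p0.
def or_margin_to_difference(p0: float, or_margin: float) -> float:
    odds0 = p0 / (1.0 - p0)
    # Lowest acceptable success rate for the new treatment under the OR margin.
    p1 = (or_margin * odds0) / (1.0 + or_margin * odds0)
    return p0 - p1                      # margin on the proportion scale

p0 = 0.80        # assumed success rate of the established treatment
delta_or = 0.5   # assumed odds-ratio non-inferiority margin
print(f"difference margin: {or_margin_to_difference(p0, delta_or):.3f}")  # ~0.133
```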
Abstract:
BACKGROUND: Mixed medullary-follicular thyroid carcinoma denotes a rare and heterogeneous group of tumors displaying morphological and immunophenotypical features of both origins within the same lesion. METHOD: We report a case of a 41-year-old woman with a lump in the right side of the neck, increasing in pain and size over several weeks. Serum levels of calcitonin (1140 ng/L) and carcinoembryonic antigen (288 microg/L) were very high. Fine-needle aspiration cytology suggested a diagnosis of medullary thyroid carcinoma. Total thyroidectomy, along with bilateral functional neck and mediastinal lymph-node dissection, was performed. RESULTS: The histopathological examination yielded a diagnosis of medullary carcinoma in the right thyroid lobe, closely intermingled with a nonencapsulated classical papillary carcinoma. One ipsilateral lymph node showed micrometastasis of the medullary component. CONCLUSION: When compared with other cases reported in the literature, this particular presentation should be recognized, provided the required morphologic and functional criteria are used. The treatment is mostly surgical, driven by the medullary component. The presence of micrometastasis in one ipsilateral cervical lymph node underlines the importance of cervicomediastinal lymph-node dissection and of a careful search for metastatic disease.
Abstract:
Business process designers take into account the resources that their processes will need, but, due to the variable cost of certain parameters (such as energy) or other circumstances, this scheduling must be done at business process enactment time. In this report we formalize the energy-aware resource cost, including time- and usage-dependent rates. We also present a constraint programming approach and an auction-based approach to solving this problem, together with a comparison of the two approaches and of the proposed algorithms for solving them.
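A minimal sketch of an energy-aware resource cost of the kind formalized here, assuming one activity, a usage-dependent rate and a time-of-day energy tariff; the cheapest start slot is found by enumeration. All names and values are illustrative, and this is neither the constraint programming nor the auction-based approach.

```python
# Sketch: total cost of running one activity combines a usage-dependent rate
# with a time-dependent energy tariff; the cheapest start slot is enumerated.
USAGE_RATE = 2.0                                   # cost per slot of resource usage
TARIFF = [0.10] * 8 + [0.30] * 8 + [0.15] * 8      # illustrative hourly energy price
ENERGY_PER_SLOT = 5.0                              # energy drawn per active slot

def activity_cost(start: int, duration: int) -> float:
    slots = range(start, start + duration)
    energy_cost = sum(ENERGY_PER_SLOT * TARIFF[t] for t in slots)
    return duration * USAGE_RATE + energy_cost

duration = 4
best_start = min(range(len(TARIFF) - duration + 1),
                 key=lambda s: activity_cost(s, duration))
print(best_start, round(activity_cost(best_start, duration), 2))
```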
Abstract:
In a number of programs for gene structure prediction in higher eukaryotic genomic sequences, exon prediction is decoupled from gene assembly: a large pool of candidate exons is predicted and scored from features located in the query DNA sequence, and candidate genes are assembled from such a pool as sequences of nonoverlapping frame-compatible exons. Genes are scored as a function of the scores of the assembled exons, and the highest scoring candidate gene is assumed to be the most likely gene encoded by the query DNA sequence. Considering additive gene scoring functions, currently available algorithms to determine such a highest scoring candidate gene run in time proportional to the square of the number of predicted exons. Here, we present an algorithm whose running time grows only linearly with the size of the set of predicted exons. Polynomial algorithms rely on the fact that, while scanning the set of predicted exons, the highest scoring gene ending in a given exon can be obtained by appending the exon to the highest scoring among the highest scoring genes ending at each compatible preceding exon. The algorithm presented here relies on the simple fact that this highest scoring gene can be stored and updated, which requires scanning the set of predicted exons simultaneously by increasing acceptor and donor position. On the other hand, the algorithm described here does not assume an underlying gene structure model. Indeed, the definition of valid gene structures is externally specified in the so-called Gene Model. The Gene Model simply specifies which gene features are allowed immediately upstream of which other gene features in valid gene structures. This allows great flexibility in formulating the gene identification problem; in particular, it allows for multiple-gene two-strand predictions and for considering gene features other than coding exons (such as promoter elements) in valid gene structures.
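A simplified sketch of the linear-scan assembly idea, assuming exons are given as (acceptor, donor, score) tuples and ignoring reading-frame compatibility and the Gene Model: sweeping exons by increasing acceptor and donor position while keeping the best upstream gene score makes the assembly step linear after sorting.

```python
# Simplified sketch: sweep candidate exons by position, keep the best gene
# score achievable upstream of the current acceptor, and extend it. Frame
# compatibility and the Gene Model are omitted for brevity.
def best_gene_score(exons):
    events = sorted(exons, key=lambda e: e[0])    # by acceptor position
    finished = sorted(exons, key=lambda e: e[1])  # by donor position
    best_upstream = 0.0                           # best gene ending before current acceptor
    best_overall = 0.0
    best_ending = {}                              # donor position -> best gene score ending there
    j = 0
    for acc, don, score in events:
        # Fold in every exon whose donor lies strictly upstream of this acceptor.
        while j < len(finished) and finished[j][1] < acc:
            d = finished[j][1]
            best_upstream = max(best_upstream, best_ending.get(d, 0.0))
            j += 1
        total = best_upstream + score
        best_ending[don] = max(best_ending.get(don, 0.0), total)
        best_overall = max(best_overall, total)
    return best_overall

# Illustrative candidate exons: (acceptor, donor, score).
print(best_gene_score([(10, 50, 3.0), (60, 90, 2.5), (40, 120, 4.0), (130, 160, 1.5)]))  # 7.0
```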
Abstract:
Gingival metastases are infrequent and invariably associated with widespread disease and a poor prognosis. Because of their unremarkable clinical appearance, they can be difficult to distinguish from more common gingival hyperplastic or reactive lesions, such as pyogenic granuloma, peripheral giant cell granuloma, and peripheral ossifying granuloma. We report here an unusual case of a 36-year-old man with a mixed testicular germ cell tumor presenting as a metastatic pure choriocarcinoma involving the maxillary gingiva, extending from the first left premolar to the second left maxillary molar and mimicking a 'benign-looking' gingival mass. Gingival metastases may be the first manifestation of widespread metastatic disease, and therefore particular attention must be paid to gingival lesions associated with atypical clinical symptoms and/or signs.
Abstract:
Audit report on Highway Safety Projects administered by The Integer Group Midwest for the year ended September 30, 2006
Abstract:
We perform an experimental test of Maskin's canonical mechanism for Nash implementation, using 3 subjects in non-repeated groups, as well as 3 outcomes, states of nature, and integer choices. We find that this mechanism successfully implements the desired outcome a large majority of the time, and an embedded comprehension test indicates that subjects were generally able to comprehend their decision tasks. Performance can also be improved by imposing a fine on non-designated dissidents. We offer some explanations for the imperfect implementation, including risk preferences, the possibilities that agents have for collusion, and the mixed strategy equilibria of the game.
Abstract:
Models incorporating more realistic representations of customer behavior, such as customers choosing from an offer set, have recently become popular in assortment optimization and revenue management. The dynamic program for these models is intractable and is approximated by a deterministic linear program called the CDLP, which has an exponential number of columns. However, when the segment consideration sets overlap, the CDLP is difficult to solve. Column generation has been proposed, but finding an entering column has been shown to be NP-hard. In this paper we propose a new approach, called SDCP, to solving the CDLP based on segments and their consideration sets. SDCP is a relaxation of CDLP and hence forms a looser upper bound on the dynamic program, but it coincides with CDLP for the case of non-overlapping segments. If the number of elements in a consideration set for a segment is not very large, SDCP can be applied to any discrete-choice model of consumer behavior. We tighten the SDCP bound by (i) simulations, called the randomized concave programming (RCP) method, and (ii) adding cuts to a recent compact formulation of the problem for a latent multinomial-choice model of demand (SBLP+). This latter approach turns out to be very effective, essentially obtaining the CDLP value, and shows excellent revenue performance in simulations, even for overlapping segments. By formulating the problem as a separation problem, we give insight into why CDLP is easy for the MNL with non-overlapping consideration sets and why generalizations of the MNL pose difficulties. We perform numerical simulations to determine the revenue performance of all the methods on reference data sets from the literature.
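A minimal sketch of the multinomial logit (MNL) choice probabilities that these choice-based formulations build on, assuming a single segment with given product utilities and a no-purchase option; it is a building block only, not the CDLP or SDCP formulation itself.

```python
# Sketch: MNL purchase probabilities for one segment offered a subset of
# products; utilities and the offer set are illustrative.
import math

def mnl_probabilities(utilities, offer_set, u_no_purchase=0.0):
    weights = {j: math.exp(utilities[j]) for j in offer_set}
    denom = math.exp(u_no_purchase) + sum(weights.values())
    probs = {j: w / denom for j, w in weights.items()}
    probs["no_purchase"] = math.exp(u_no_purchase) / denom
    return probs

utilities = {"A": 1.0, "B": 0.5, "C": -0.2}
print(mnl_probabilities(utilities, offer_set={"A", "C"}))
```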
Abstract:
Canonical correspondence analysis and redundancy analysis are two methods of constrained ordination regularly used in the analysis of ecological data when several response variables (for example, species abundances) are related linearly to several explanatory variables (for example, environmental variables, spatial positions of samples). In this report I demonstrate the advantages of the fuzzy coding of explanatory variables: first, nonlinear relationships can be diagnosed; second, more variance in the responses can be explained; and third, in the presence of categorical explanatory variables (for example, years, regions) the interpretation of the resulting triplot ordination is unified because all explanatory variables are measured at a categorical level.
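A minimal sketch of fuzzy coding for a continuous explanatory variable, assuming three categories with piecewise-linear memberships that sum to one; the hinge values are illustrative.

```python
# Sketch: recode a continuous explanatory variable into three fuzzy categories
# (low, medium, high) whose memberships sum to 1, using piecewise-linear
# membership between illustrative hinge points.
def fuzzy_code(x, low_hinge, mid_hinge, high_hinge):
    if x <= low_hinge:
        return (1.0, 0.0, 0.0)
    if x >= high_hinge:
        return (0.0, 0.0, 1.0)
    if x <= mid_hinge:
        m = (x - low_hinge) / (mid_hinge - low_hinge)
        return (1.0 - m, m, 0.0)
    m = (x - mid_hinge) / (high_hinge - mid_hinge)
    return (0.0, 1.0 - m, m)

# e.g. an environmental variable with hinges at its minimum, median and maximum.
print(fuzzy_code(6.8, low_hinge=4.0, mid_hinge=7.0, high_hinge=9.5))
```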