894 results for Many-to-many-assignment problem


Relevance: 100.00%

Abstract:

The Thermodynamic Bethe Ansatz (TBA) analysis is carried out for the extended-CP^N class of integrable 2-dimensional Non-Linear Sigma Models related to the low energy limit of the AdS_4xCP^3 type IIA superstring theory. The principal aim of this program is to obtain further non-perturbative consistency checks of the S-matrix proposed to describe the scattering processes between the fundamental excitations of the theory, by analyzing the structure of the Renormalization Group flow. As a noteworthy byproduct we eventually obtain a novel class of TBA models which fits into the known classification, but with several important differences. The TBA framework allows the evaluation of some exact quantities related to the conformal UV limit of the model: the effective central charge, the conformal dimension of the perturbing operator, and the field content of the underlying CFT. Knowledge of these physical quantities has made it possible to conjecture a perturbed-CFT realization of the integrable models in terms of coset Kac-Moody CFTs. The set of numerical tools and programs developed ad hoc to solve the problem at hand is also discussed in some detail, with references to the code.
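As a generic illustration of the framework (the standard TBA system for a theory with particle masses m_a and scattering kernels phi_ab, not the specific extended-CP^N equations of the thesis), the pseudo-energies and the effective central charge take the form:

```latex
% Standard TBA equations at finite size R; L_a = ln(1 + e^{-epsilon_a}).
\begin{align}
  \epsilon_a(\theta) &= m_a R \cosh\theta
    - \sum_b \int \frac{d\theta'}{2\pi}\,
      \phi_{ab}(\theta-\theta')\,
      \ln\!\left(1+e^{-\epsilon_b(\theta')}\right), \\[4pt]
  c_{\mathrm{eff}}(R) &= \frac{3}{\pi^2} \sum_a \int d\theta\,
      m_a R \cosh\theta\,
      \ln\!\left(1+e^{-\epsilon_a(\theta)}\right),
\end{align}
```

with the UV value of c_eff obtained in the limit R -> 0, which is how the effective central charge mentioned above is extracted numerically.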

Relevance: 100.00%

Abstract:

Decomposition-based approaches are recalled from both the primal and the dual point of view. The possibility of building partially disaggregated reduced master problems is investigated. This extends the idea of aggregated-versus-disaggregated formulations to a gradual choice among alternative levels of aggregation. Partial aggregation is applied to the linear multicommodity minimum-cost flow problem. The possibility of having only partially aggregated bundles opens a wide range of alternatives with different trade-offs between the number of iterations and the computation required to solve each master problem. This trade-off is explored for several sets of instances, and the results are compared with those obtained by directly solving the natural node-arc formulation. An iterative solution process for the route assignment problem is proposed, based on the well-known Frank-Wolfe algorithm. In order to provide a first feasible solution to the Frank-Wolfe algorithm, a linear multicommodity minimum-cost flow problem is solved to optimality by using the decomposition techniques mentioned above. Solutions of this problem are useful for network orientation and design, especially in relation to public transportation systems such as Personal Rapid Transit. A single-commodity robust network design problem is then addressed. Here, an undirected graph with edge costs is given together with a discrete set of balance matrices representing different supply/demand scenarios. The goal is to determine the minimum-cost installation of capacities on the edges such that the flow exchange is feasible in every scenario. A set of new instances that are computationally hard for the natural flow formulation is solved by means of a new heuristic algorithm. Finally, an efficient decomposition-based heuristic approach for a large-scale stochastic unit commitment problem is presented. The addressed real-world stochastic problem employs at its core a deterministic unit commitment planning model developed by the California Independent System Operator (ISO).
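The Frank-Wolfe loop for route assignment can be sketched on a toy network: one origin-destination pair whose demand is split over two parallel links. The linear-congestion cost coefficients below are illustrative, not taken from the thesis; the structure (linearized subproblem solved by all-or-nothing assignment, predetermined step size) is the standard algorithm.

```python
# Minimal Frank-Wolfe sketch for route assignment: one origin-destination
# pair with demand d split over two parallel links. Link travel times use
# an illustrative linear form t_i(x) = a_i + b_i * x (hypothetical data).

def frank_wolfe(d=10.0, a=(1.0, 2.0), b=(0.5, 0.25), iters=200):
    x = [d, 0.0]  # start from a feasible flow: all demand on link 0
    for k in range(iters):
        # Current travel times (gradient of the Beckmann objective)
        t = [a[i] + b[i] * x[i] for i in range(2)]
        # Linearized subproblem: all-or-nothing assignment to cheapest link
        y = [d, 0.0] if t[0] <= t[1] else [0.0, d]
        # Predetermined step size 2/(k+2) guarantees convergence
        step = 2.0 / (k + 2)
        x = [(1 - step) * x[i] + step * y[i] for i in range(2)]
    return x

flows = frank_wolfe()
# At user equilibrium both used links end up with (nearly) equal travel times.
```

The same loop scales to real networks by replacing the two-link minimum with a shortest-path computation per commodity, which is where the multicommodity min-cost flow solution above supplies the initial feasible point.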

Relevance: 100.00%

Abstract:

This work deals with the car sequencing (CS) problem, a combinatorial optimization problem for sequencing mixed-model assembly lines. The aim is to find a production sequence for different variants of a common base product such that work overload of the respective line operators is avoided or minimized. The variants are distinguished by certain options (e.g., sun roof yes/no) and therefore require different processing times at the stations of the line. CS introduces a so-called sequencing rule H:N for each option, which restricts the occurrence of this option to at most H in any N consecutive variants. It seeks a sequence that leads to no, or a minimum number of, sequencing rule violations. In this work, the suitability of CS for workload-oriented sequencing is analyzed; to this end, its solution quality is compared in experiments with that of the related mixed-model sequencing problem. A new sequencing rule generation approach as well as a new lower bound for the problem are presented. Different exact and heuristic solution methods for CS are developed and their efficiency is shown in experiments. Furthermore, CS is adjusted and applied to a resequencing problem with pull-off tables.
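The H:N sequencing rules can be checked with a sliding window. The sketch below counts one violation per window that exceeds its limit, which is one common counting scheme (the thesis may count violations differently); the variant encoding and example data are made up for illustration.

```python
# Count sequencing-rule violations for car sequencing H:N rules: an option
# with rule H:N may occur at most H times in any N consecutive positions.
# Variants are encoded as sets of option names (illustrative encoding).

def count_violations(sequence, rules):
    """rules: dict mapping option -> (H, N); returns the total number of
    windows whose option count exceeds the limit H."""
    violations = 0
    for option, (h, n) in rules.items():
        occ = [1 if option in variant else 0 for variant in sequence]
        for start in range(len(occ) - n + 1):
            if sum(occ[start:start + n]) > h:
                violations += 1
    return violations

rules = {"sunroof": (1, 2)}                  # at most 1 in any 2 consecutive cars
seq_bad = [{"sunroof"}, {"sunroof"}, set()]  # two sunroof cars back to back
seq_ok = [{"sunroof"}, set(), {"sunroof"}]   # sunroof cars spaced out
bad, ok = count_violations(seq_bad, rules), count_violations(seq_ok, rules)
```

Here `bad` is 1 (the first window holds two sunroof cars) and `ok` is 0, matching the intuition that spacing option-heavy variants avoids rule violations.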

Relevance: 100.00%

Abstract:

This thesis is on the flavor problem of Randall-Sundrum models and their strongly coupled dual theories. These models are particularly well motivated extensions of the Standard Model, because they simultaneously address the gauge hierarchy problem and the hierarchies in the quark masses and mixings. In order to put this into context, special attention is given to the concepts underlying theories which can explain the hierarchy problem and the flavor structure of the Standard Model (SM). The AdS/CFT duality is introduced and its implications for the Randall-Sundrum model with fermions in the bulk and general bulk gauge groups are investigated. It will be shown that the different terms in the general 5D propagator of a bulk gauge field can be related to the corresponding diagrams of the strongly coupled dual, which allows for a deeper understanding of the origin of the flavor changing neutral currents generated by the exchange of the Kaluza-Klein excitations of these bulk fields. In the numerical analysis, different observables which are sensitive to corrections from the tree-level exchange of these resonances will be presented on the basis of updated experimental data from the Tevatron and LHC experiments. This includes electroweak precision observables, namely corrections to the S and T parameters followed by corrections to the Zbb vertex; flavor changing observables with flavor changes at one vertex, viz. BR(Bd -> mu+ mu-) and BR(Bs -> mu+ mu-), and at two vertices, viz. S_psiphi and |eps_K|; as well as bounds from direct detection experiments. The analysis will show that all of these bounds can be brought into agreement with a new physics scale Lambda_NP in the TeV range, except for the CP-violating quantity |eps_K|, which requires Lambda_NP = O(10) TeV in the absence of fine-tuning.
The numerous modifications of the Randall-Sundrum model in the literature which try to attenuate this bound are reviewed and categorized. Subsequently, a novel solution to this flavor problem, based on an extended color gauge group in the bulk, and its thorough implementation in the RS model will be presented, as well as an analysis of the observables mentioned above in the extended model. This solution is especially motivated from the point of view of the strongly coupled dual theory, and the implications for strongly coupled models of new physics which do not possess a holographic dual are examined. Finally, the top quark plays a special role in models with a geometric explanation of flavor hierarchies, and the predictions in the Randall-Sundrum model, with and without the proposed extension, for the forward-backward asymmetry A_FB^t in top pair production are computed.

Relevance: 100.00%

Abstract:

In the business literature, the conflicts among workers, shareholders, and management have been studied mostly within the frame of stakeholder theory. Stakeholder theory recognizes this issue as an agency problem and tries to solve it by establishing a contractual relationship between the agent and the principals. However, as Marcoux pointed out, the appropriateness of the contract as a medium for reducing the agency problem should be questioned. As an alternative, the cooperative model minimizes agency costs by integrating the roles of workers, owners, and management. Mondragon Corporation is a successful example of the cooperative model, which grew into the sixth-largest corporation in Spain. However, the cooperative model has long been ignored in discussions of corporate governance, mainly because its success is extremely difficult to duplicate in practice. This thesis hopes to revitalize the scholarly examination of cooperatives by developing a new model that overcomes the fundamental problem of the cooperative model: limited access to capital markets. By dividing the ownership interest into a financial interest and a control interest, the dual ownership structure allows cooperatives to issue stock in the capital market by making a financial product out of the financial interest.

Relevance: 100.00%

Abstract:

Determining the formation temperature of minerals using fluid inclusions is a crucial step in understanding rock-forming scenarios. Unfortunately, fluid inclusions in minerals formed at low temperature, such as gypsum, are commonly in a metastable monophase liquid state. To overcome this problem, ultra-short laser pulses can be used to induce vapor bubble nucleation, thus creating a stable two-phase fluid inclusion appropriate for subsequent measurements of the liquid-vapor homogenization temperature, T_h. In this study we evaluate the applicability of T_h data to accurately determine gypsum formation temperatures. We used fluid inclusions in synthetic gypsum crystals grown in the laboratory at different temperatures between 40 degrees C and 80 degrees C under atmospheric pressure conditions. We found an asymmetric distribution of the T_h values, which are systematically lower than the actual crystal growth temperatures, T_g; this is due to (1) the effect of surface tension on liquid-vapor homogenization, and (2) plastic deformation of the inclusion walls due to internal tensile stress occurring in the metastable state of the inclusions. Based on this understanding, we have determined growth temperatures of natural giant gypsum crystals from Naica (Mexico), yielding 47 +/- 1.5 degrees C for crystals grown in the Cave of Swords (120 m below surface) and 54.5 +/- 2 degrees C for giant crystals grown in the Cave of Crystals (290 m below surface). These results support the earlier hypothesis that the population and the size of the Naica crystals were controlled by temperature. In addition, this experimental method opens a door to determining the growth temperature of minerals forming in low-temperature environments.

Relevance: 100.00%

Abstract:

Avoidance of excessively deep sedation levels is problematic in intensive care patients. Electrophysiologic monitoring may offer an approach to solving this problem. Since electroencephalogram (EEG) responses to different sedation regimens vary, we assessed electrophysiologic responses to two sedative drug regimens in 10 healthy volunteers. Dexmedetomidine/remifentanil (dex/remi group) and midazolam/remifentanil (mida/remi group) were infused 7 days apart. Each combination of medications was given at stepwise intervals to reach Ramsay scores (RS) 2, 3, and 4. Resting EEG, bispectral index (BIS), and the N100 amplitudes of long-latency auditory evoked potentials (ERP) were recorded at each level of sedation. During dex/remi, the resting EEG was characterized by a recurrent high-power low-frequency pattern which became more pronounced at deeper levels of sedation. The BIS index decreased uniformly only in the dex/remi group (from 94 +/- 3 at baseline to 58 +/- 14 at RS 4) compared to the mida/remi group (from 94 +/- 2 to 76 +/- 10; P = 0.029 between groups). The ERP amplitudes decreased from 5.3 +/- 1.3 at baseline to 0.4 +/- 1.1 at RS 4 (P = 0.003) only in the mida/remi group. We conclude that ERPs in volunteers sedated with dex/remi, in contrast to mida/remi, indicate a cortical response to acoustic stimuli even when sedation reaches deeper levels. Consequently, ERPs can monitor sedation with midazolam but not with dexmedetomidine; the reverse is true for BIS.

Relevance: 100.00%

Abstract:

Vaccines with limited ability to prevent HIV infection may positively impact the HIV/AIDS pandemic by preventing secondary transmission and disease in vaccine recipients who become infected. To evaluate the impact of vaccination on secondary transmission and disease, efficacy trials assess vaccine effects on HIV viral load and other surrogate endpoints measured after infection. A standard test that compares the distribution of viral load between the infected subgroups of vaccine and placebo recipients does not assess a causal effect of vaccine, because the comparison groups are selected after randomization. To address this problem, we formulate clinically relevant causal estimands using the principal stratification framework developed by Frangakis and Rubin (2002), and propose a class of logistic selection bias models whose members identify the estimands. Given a selection model in the class, procedures are developed for testing and estimation of the causal effect of vaccination on viral load in the principal stratum of subjects who would be infected regardless of randomization assignment. We show how the procedures can be used for a sensitivity analysis that quantifies how the causal effect of vaccination varies with the presumed magnitude of selection bias.
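The sensitivity-analysis idea can be illustrated with a toy computation: under a logistic selection model, each infected placebo recipient is reweighted by a factor proportional to exp(beta * y), where beta is the unidentifiable selection-bias parameter, and the causal effect estimate is traced out as beta varies (beta = 0 recovering the naive infected-only comparison). This is a hedged sketch of the general weighting idea, not the paper's exact estimator, and the viral-load data below are made up.

```python
# Sensitivity-analysis sketch: reweight infected placebo recipients by
# exp(beta * y) and compare the weighted placebo mean with the vaccine mean.
import math

def weighted_mean(ys, beta):
    """Mean of ys under selection weights proportional to exp(beta * y)."""
    w = [math.exp(beta * y) for y in ys]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

placebo_vl = [4.8, 5.1, 5.6, 6.0, 6.3]   # hypothetical log10 viral loads
vaccine_vl = [4.2, 4.5, 4.9, 5.2]

for beta in (-1.0, 0.0, 1.0):
    effect = sum(vaccine_vl) / len(vaccine_vl) - weighted_mean(placebo_vl, beta)
    print(f"beta={beta:+.1f}: estimated vaccine effect on viral load = {effect:.2f}")
```

Plotting the effect against beta over a plausible range is exactly the kind of curve a sensitivity analysis reports: how large the presumed selection bias must be before the conclusion changes.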

Relevance: 100.00%

Abstract:

Fuel cells are a promising alternative energy technology. One of the biggest problems in fuel cells is water management. A better understanding of the wettability characteristics of fuel cells is needed to alleviate this problem. Contact angle data on the gas diffusion layers (GDL) of fuel cells can be used to characterize the wettability of the GDL. A contact angle measurement program has been developed to measure the contact angle of sessile drops from drop images. Digitization of drop images induces pixel errors in the contact angle measurement process; the resulting uncertainty in the contact angle measurement has been analyzed. An experimental apparatus has been developed for contact angle measurements at different temperatures, with the feature to measure advancing and receding contact angles on the gas diffusion layers of fuel cells.
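For a small sessile drop approximated as a spherical cap, the contact angle follows from just two image measurements, the drop height h and the base (contact) radius a, via theta = 2 * atan(h / a). A real measurement program fits the full digitized drop profile; this is only the geometric relation, shown with illustrative numbers:

```python
# Spherical-cap contact angle from drop height and base radius.
# theta = 2 * atan(h / a); a hemispherical drop (h == a) gives 90 degrees.
import math

def contact_angle_deg(height, base_radius):
    return math.degrees(2.0 * math.atan2(height, base_radius))

hemisphere = contact_angle_deg(1.0, 1.0)   # ~90 degrees
flat_drop = contact_angle_deg(0.5, 1.0)    # flatter drop, smaller angle
```

Because h and a are read off in pixels, a one-pixel error in either propagates directly into theta, which is the digitization uncertainty the abstract refers to; comparing `contact_angle_deg(h + 1, a)` with `contact_angle_deg(h, a)` gives a quick per-pixel sensitivity estimate.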

Relevance: 100.00%

Abstract:

Writing unit tests for legacy systems is a key maintenance task. When writing tests for object-oriented programs, objects need to be set up and the expected effects of executing the unit under test need to be verified. If developers lack internal knowledge of a system, the task of writing tests is non-trivial. To address this problem, we propose an approach that exposes side effects detected in example runs of the system and uses these side effects to guide the developer when writing tests. We introduce a visualization called Test Blueprint, through which we identify what the required fixture is and what assertions are needed to verify the correct behavior of a unit under test. The dynamic analysis technique that underlies our approach is based on both tracing method executions and on tracking the flow of objects at runtime. To demonstrate the usefulness of our approach we present results from two case studies.
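The core side-effect-detection idea can be sketched in a few lines: snapshot an object's state before and after an example run, and turn the observed differences into candidate assertions for a unit test. The `Counter` class and the attribute-diff strategy below are hypothetical stand-ins for the tracing machinery described above, not the Test Blueprint implementation itself.

```python
# Detect side effects of an example run by diffing object state, then use
# the diff to suggest test assertions (illustrative sketch).

class Counter:
    def __init__(self):
        self.value = 0
    def increment(self):
        self.value += 1

def detect_side_effects(obj, action):
    """Run action(obj) and return {attr: (before, after)} for changed attrs."""
    before = dict(vars(obj))
    action(obj)
    after = dict(vars(obj))
    return {k: (before.get(k), after[k])
            for k in after if before.get(k) != after[k]}

effects = detect_side_effects(Counter(), lambda c: c.increment())
# A diff like {'value': (0, 1)} suggests both the fixture (a fresh Counter)
# and the assertion (counter.value == 1) for the generated test.
```

This mirrors the blueprint's two ingredients: the "before" state identifies the required fixture, and the "after" diff identifies which assertions verify the unit's effects.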

Relevance: 100.00%

Abstract:

The operational dock assignment problem, e.g. at a distribution or cross-docking center, is a logistics problem in which arriving and departing vehicles must be assigned, in time and space, to the inbound and outbound docks such that handling is as cost-efficient as possible. Previous work on the dock assignment problem leaves aspects of cooperation out of consideration. This contribution presents a method that overcomes the drawback of one-sidedly optimal dock assignments. It draws on combinatorial auctions and models the dock assignment problem as an allocation problem in which carriers compete for bundles of consecutive unit time intervals at the docks. A Vickrey-Clarke-Groves mechanism ensures both the incentive compatibility and the individual rationality of the auction procedure. The method was implemented in ILOG OPL Studio 3.6.1, and the results obtained on test data show that running times are low enough to employ the method for operational (short-term) planning, making transport logistics processes more economical for all parties involved.
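The auction mechanics can be illustrated on a tiny instance: carriers bid on bundles of unit time slots at a single dock, a brute-force winner determination maximizes total reported value over conflict-free selections, and each winner's VCG payment is the externality it imposes on the others. The bidders, slot sets, and values below are made up; a real instance would use the OPL model mentioned above for winner determination.

```python
# Minimal VCG sketch for slot-bundle bidding at one dock (illustrative).
from itertools import combinations

def best_allocation(bids, exclude=None):
    """bids: dict bidder -> (set_of_slots, value).
    Returns (max_welfare, winning_bidders) over conflict-free selections."""
    names = [b for b in bids if b != exclude]
    best = (0.0, ())
    for r in range(len(names) + 1):
        for combo in combinations(names, r):
            slots = [s for b in combo for s in bids[b][0]]
            if len(slots) == len(set(slots)):          # no slot used twice
                welfare = sum(bids[b][1] for b in combo)
                if welfare > best[0]:
                    best = (welfare, combo)
    return best

bids = {"A": ({1, 2}, 10.0), "B": ({2, 3}, 8.0),
        "C": ({3, 4}, 7.0), "D": ({4}, 3.0)}
welfare, winners = best_allocation(bids)
payments = {}
for w in winners:
    welfare_without_w, _ = best_allocation(bids, exclude=w)
    others_with_w = welfare - bids[w][1]
    payments[w] = welfare_without_w - others_with_w    # imposed externality
```

Charging the externality rather than the bid is what makes truthful bidding a dominant strategy (incentive compatibility), and since the payment never exceeds the bid, participation is individually rational.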

Relevance: 100.00%

Abstract:

OBJECTIVE: To relate volumetric magnetic resonance imaging (MRI) findings to hypothermia therapy and neurosensory impairments. STUDY DESIGN: Newborns ≥36 weeks' gestation with hypoxic-ischemic encephalopathy who participated in the National Institute of Child Health and Human Development hypothermia randomized trial at our center were eligible. We determined the relationship between hypothermia treatment and usual care (control) to absolute and relative cerebral tissue volumes. Furthermore, we correlated brain volumes with death or neurosensory impairments at 18 to 22 months. RESULT: Both treatment groups were comparable before randomization. Total brain tissue volumes did not differ in relation to treatment assignment. However, relative volumes of subcortical white matter were significantly larger in hypothermia-treated than control infants. Furthermore, relative total brain volumes correlated significantly with death or neurosensory impairments. Relative volumes of the cortical gray and subcortical white matter also correlated significantly with Bayley Scales psychomotor development index. CONCLUSION: Selected volumetric MRI findings correlated with hypothermia therapy and neurosensory impairments. Larger studies using MRI brain volumes as a secondary outcome measure are needed.

Relevance: 100.00%

Abstract:

Genetic improvement of native crops is a new and promising strategy to combat hunger in the developing world. Tef is the major staple food crop for approximately 50 million people in Ethiopia. As an indigenous cereal, it is well adapted to diverse climatic and soil conditions; however, its productivity is extremely low mainly due to susceptibility to lodging. Tef has a tall and weak stem, liable to lodge (or fall over), which is aggravated by wind, rain, or application of nitrogen fertilizer. To circumvent this problem, the first semi-dwarf lodging-tolerant tef line, called kegne, was developed from an ethyl methanesulphonate (EMS)-mutagenized population. The response of kegne to microtubule-depolymerizing and -stabilizing drugs, as well as subsequent gene sequencing and segregation analysis, suggests that a defect in the α-Tubulin gene is functionally and genetically tightly linked to the kegne phenotype. In diploid species such as rice, homozygous mutations in α-Tubulin genes result in extreme dwarfism and weak stems. In the allotetraploid tef, only one homeologue is mutated, and the presence of the second intact α-Tubulin gene copy confers the agriculturally beneficial semi-dwarf and lodging-tolerant phenotype. Introgression of kegne into locally adapted and popular tef cultivars in Ethiopia will increase the lodging tolerance in the tef germplasm and, as a result, will improve the productivity of this valuable crop.

Relevance: 100.00%

Abstract:

We present a real-world staff-assignment problem that was reported to us by a provider of an online workforce scheduling software. The problem consists of assigning employees to work shifts subject to a large variety of requirements related to work laws, work shift compatibility, workload balancing, and personal preferences of employees. A target value is given for each requirement, and all possible deviations from these values are associated with acceptance levels. The objective is to minimize the total number of deviations in ascending order of the acceptance levels. We present an exact lexicographic goal programming MILP formulation and an MILP-based heuristic. The heuristic consists of two phases: in the first phase a feasible schedule is built and in the second phase parts of the schedule are iteratively re-optimized by applying an exact MILP model. A major advantage of such MILP-based approaches is the flexibility to account for additional constraints or modified planning objectives, which is important as the requirements may vary depending on the company or planning period. The applicability of the heuristic is demonstrated for a test set derived from real-world data. Our computational results indicate that the heuristic is able to devise optimal solutions to non-trivial problem instances, and outperforms the exact lexicographic goal programming formulation on medium- and large-sized problem instances.
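The lexicographic objective can be shown on a tiny brute-force instance: among all feasible schedules, first minimize the number of worst-level deviations, then, among those, the next level, and so on, which in Python is simply tuple comparison. The two employees, three shifts, and the requirement checks below are made up; the real problem is solved with the MILP approaches described above.

```python
# Lexicographic minimization of deviation counts by acceptance level,
# brute-forced over all 0/1 schedules of a toy instance (illustrative).
from itertools import product

EMPLOYEES = ("Ann", "Ben")
SHIFTS = (0, 1, 2)

def deviations(schedule):
    """Deviation counts per acceptance level (level 1 = least acceptable)."""
    level1 = sum(1 for e in EMPLOYEES                       # rest requirement:
                 if schedule[(e, 1)] and schedule[(e, 2)])  # not both shifts 1&2
    target = {"Ann": 2, "Ben": 1}                           # workload targets
    level2 = sum(abs(sum(schedule[(e, s)] for s in SHIFTS) - target[e])
                 for e in EMPLOYEES)
    return (level1, level2)

def lexicographic_best():
    best, best_dev = None, None
    for bits in product((0, 1), repeat=len(EMPLOYEES) * len(SHIFTS)):
        schedule = dict(zip(product(EMPLOYEES, SHIFTS), bits))
        if any(sum(schedule[(e, s)] for e in EMPLOYEES) == 0 for s in SHIFTS):
            continue                          # feasibility: every shift covered
        dev = deviations(schedule)            # tuples compare lexicographically
        if best_dev is None or dev < best_dev:
            best, best_dev = schedule, dev
    return best, best_dev

schedule, dev = lexicographic_best()
```

The MILP counterpart replaces the enumeration with a sequence of solves (or a single weighted goal program), but the ordering principle on the deviation vector is the same.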

Relevance: 100.00%

Abstract:

Architectural decisions can be interpreted as structural and behavioral constraints that must be enforced in order to guarantee overarching qualities in a system. Enforcing those constraints in a fully automated way is often challenging and not well supported by current tools. Current approaches for checking architecture conformance either lack usability or offer poor options for adaptation. To overcome this problem, we analyze the current state of practice and propose an approach based on an extensible, declarative, and empirically grounded specification language. This solution aims at reducing the overall cost of setting up and maintaining an architectural conformance monitoring environment by decoupling the conceptual representation of a user-defined rule from the technical specification prescribed by the underlying analysis tools. By using a declarative language, we are able to write tool-agnostic rules that are simple enough to be understood by untrained stakeholders and, at the same time, can be automatically processed by a conformance checking validator. Besides addressing the issue of cost, we also investigate opportunities for increasing the value of conformance checking results by assisting the user towards full alignment of the implementation with its architecture. In particular, we show the benefits of providing actionable results by introducing a technique which automatically selects optimal repairing solutions by means of simulation and profit-based quantification. We perform various case studies to show how our approach can be successfully adopted to support truly diverse industrial projects. We also investigate the dynamics involved in choosing and adopting a new automated conformance checking solution within an industrial context. Our approach reduces the cost of conformance checking by avoiding the need for explicit management of the involved validation tools. The user can define rules using a convenient high-level DSL which automatically adapts to emerging analysis requirements. Increased usability and modular customization ensure lower costs and a shorter feedback loop.
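The decoupling of a declarative rule from the analysis tooling can be illustrated in a few lines: the rule is plain data (which layers must not depend on which), and a small validator checks dependency facts, however they were extracted, against it. The rule format, layer names, and example dependencies below are hypothetical, not the thesis's actual DSL.

```python
# Tool-agnostic conformance check: rules are data, dependencies are facts.

RULES = {"domain": {"ui", "persistence"}}   # domain must not depend on these

DEPENDENCIES = [                            # (from_layer, to_layer) facts,
    ("ui", "domain"),                       # e.g. produced by a code analyzer
    ("domain", "persistence"),
    ("persistence", "domain"),
]

def check(rules, dependencies):
    """Return the dependency facts that violate some rule."""
    return [(src, dst) for src, dst in dependencies
            if dst in rules.get(src, set())]

violations = check(RULES, DEPENDENCIES)
```

Because the rule never mentions the extraction tool, swapping the static analyzer that produces the dependency facts leaves the rules, and the feedback a stakeholder reads, unchanged, which is exactly the cost reduction argued for above.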