984 results for Linear Optimization


Relevance: 20.00%

Publisher:

Abstract:

Following the introduction of single-metal deposition (SMD), a simplified fingermark detection technique based on multi-metal deposition (MMD), optimization studies were conducted. The different parameters of the original formula were tested, and the results were evaluated on the contrast and overall aspect of the enhanced fingermarks. A new SMD formula was established from the best-performing parameters. Interestingly, even substantial deviations from the base parameters did not significantly affect the outcome of the enhancement, demonstrating that SMD is a very robust technique. Finally, the optimized SMD was compared with MMD on different surfaces; SMD produced results comparable to MMD, thus validating the technique.

Relevance: 20.00%

Publisher:

Abstract:

The least-squares optimization problem is presented as an important class of unconstrained minimization problems. The importance of this class derives from its well-known applications to parameter estimation in regression analysis and to the solution of systems of nonlinear equations. We review linear least-squares optimization methods and some standard linearization techniques. We then study the main gradient-based methods for general nonlinear problems: Newton's method and its modifications, including the most widely used quasi-Newton methods (DFP and BFGS). Gradient methods specific to least-squares problems are then introduced: Gauss-Newton and Levenberg-Marquardt. A variety of examples selected from the literature is presented to test the different methods using MATLAB routines. A comparative analysis of the algorithms, based on these computational experiments, exhibits the advantages and disadvantages of the different methods.
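
The abstract's computational tests used MATLAB routines; as a rough SciPy analogue, the sketch below fits a hypothetical exponential-decay model by nonlinear least squares with the Levenberg-Marquardt method (all data here are synthetic, not from the study).

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic data for a hypothetical model y = a * exp(-b * t) + noise.
rng = np.random.default_rng(42)
t = np.linspace(0, 4, 50)
y = 2.5 * np.exp(-1.3 * t) + 0.05 * rng.standard_normal(t.size)

def residuals(p):
    a, b = p
    return a * np.exp(-b * t) - y

# method="lm" selects a Levenberg-Marquardt implementation (MINPACK);
# method="trf" would give a trust-region alternative.
fit = least_squares(residuals, x0=[1.0, 1.0], method="lm")
print("estimated (a, b):", fit.x)   # should be close to (2.5, 1.3)
```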

Relevance: 20.00%

Publisher:

Abstract:

We present a polyhedral framework for establishing general structural properties of optimal solutions to stochastic scheduling problems, where multiple job classes vie for service resources: the existence of an optimal priority policy in a given family, characterized by a greedoid (whose feasible class subsets may receive higher priority), where optimal priorities are determined by class-ranking indices, under restricted linear performance objectives (partial indexability). This framework extends that of Bertsimas and Niño-Mora (1996), which explained the optimality of priority-index policies under all linear objectives (general indexability). We show that if the performance measures satisfy partial conservation laws (with respect to the greedoid), which extend previous generalized conservation laws, then the problem admits a strong LP relaxation over a so-called extended greedoid polytope, which has strong structural and algorithmic properties. We present an adaptive-greedy algorithm (which extends Klimov's), taking as input the linear objective coefficients, which (1) determines whether the optimal LP solution is achievable by a policy in the given family; and (2) if so, computes a set of class-ranking indices that characterize optimal priority policies in the family. In the special case of project scheduling, we show that, under additional conditions, the optimal indices can be computed separately for each project (index decomposition). We further apply the framework to the important restless bandit model (two-action Markov decision chains), obtaining new index policies that extend Whittle's (1988), and simple sufficient conditions for their validity. These results highlight the power of polyhedral methods (the so-called achievable region approach) in dynamic and stochastic optimization.
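
The restless-bandit indices mentioned above extend Whittle's (1988). As a toy illustration of the underlying idea only (not the paper's adaptive-greedy algorithm), the sketch below computes Whittle indices for a hypothetical two-state arm by binary-searching for the passivity subsidy at which the active and passive actions become indifferent; all transition and reward numbers are invented.

```python
import numpy as np

def q_values(P_a, P_p, r_a, r_p, lam, beta=0.95, tol=1e-9):
    """Discounted value iteration with subsidy lam paid for being passive."""
    V = np.zeros(len(r_a))
    while True:
        Q_a = r_a + beta * P_a @ V
        Q_p = (r_p + lam) + beta * P_p @ V
        V_new = np.maximum(Q_a, Q_p)
        if np.max(np.abs(V_new - V)) < tol:
            return Q_a, Q_p
        V = V_new

def whittle_index(s, P_a, P_p, r_a, r_p, lo=-10.0, hi=10.0, iters=60):
    """Binary search for the indifference subsidy of state s (assumes indexability)."""
    for _ in range(iters):
        lam = 0.5 * (lo + hi)
        Q_a, Q_p = q_values(P_a, P_p, r_a, r_p, lam)
        if Q_a[s] > Q_p[s]:
            lo = lam          # active still preferred: raise the subsidy
        else:
            hi = lam
    return 0.5 * (lo + hi)

# Hypothetical two-state arm: state 1 is "good" (higher active reward).
P_a = np.array([[0.7, 0.3], [0.4, 0.6]])   # transitions when the arm is served
P_p = np.array([[0.9, 0.1], [0.2, 0.8]])   # transitions when the arm rests
r_a = np.array([0.0, 1.0])                 # active rewards per state
r_p = np.array([0.0, 0.0])                 # passive rewards per state

for s in (0, 1):
    print(f"Whittle index of state {s}: {whittle_index(s, P_a, P_p, r_a, r_p):.4f}")
```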

Relevance: 20.00%

Publisher:

Abstract:

Monitoring and management of intracranial pressure (ICP) and cerebral perfusion pressure (CPP) is a standard of care after traumatic brain injury (TBI). However, the pathophysiology of so-called secondary brain injury, i.e., the cascade of potentially deleterious events that occur in the early phase following the initial cerebral insult after TBI, is complex, involving a subtle interplay between cerebral blood flow (CBF), oxygen delivery and utilization, and the supply of the main cerebral energy substrate (glucose) to the injured brain. Regulation of this interplay depends on the type of injury and may vary individually and over time. In this setting, patient management can be a challenging task in which standard ICP/CPP monitoring may become insufficient to prevent secondary brain injury. Growing clinical evidence demonstrates that so-called multimodal brain monitoring, including brain tissue oxygen (PbtO2), cerebral microdialysis, and transcranial Doppler, among others, might help to optimize CBF and the delivery of oxygen and energy substrate at the bedside, thereby improving the management of secondary brain injury. Looking beyond ICP and CPP and applying a multimodal therapeutic approach to optimize CBF, oxygen delivery, and brain energy supply may eventually improve the overall care of patients with head injury. This review summarizes some of the important pathophysiological determinants of secondary cerebral damage after TBI and discusses novel approaches to optimizing CBF and providing adequate oxygen and energy supply to the injured brain using multimodal brain monitoring.

Relevance: 20.00%

Publisher:

Abstract:

We present a new unifying framework for investigating throughput-WIP (work-in-process) optimal control problems in queueing systems, based on reformulating them as linear programming (LP) problems with special structure: we show that if a throughput-WIP performance pair in a stochastic system satisfies the Threshold Property we introduce in this paper, then the problem of optimizing a linear objective of throughput-WIP performance can be reformulated as a (semi-infinite) LP problem over a polygon with special structure (a threshold polygon). The strong structural properties of such polygons explain the optimality of threshold policies for optimizing linear performance objectives: their vertices correspond to the performance pairs of threshold policies. We analyze in this framework the versatile input-output queueing intensity control model introduced by Chen and Yao (1990), obtaining a variety of new results, including (a) an exact reformulation of the control problem as an LP problem over a threshold polygon; (b) an analytical characterization of the Min WIP function (giving the minimum WIP level required to attain a target throughput level); (c) an LP Value Decomposition Theorem that relates the objective value under an arbitrary policy with that of a given threshold policy (thus revealing the LP interpretation of Chen and Yao's optimality conditions); (d) diminishing-returns and invariance properties of throughput-WIP performance, which underlie threshold optimality; and (e) a unified treatment of the time-discounted and time-average cases.
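
As a minimal numerical illustration of threshold control (assuming the simplest input-control instance, not the full Chen-Yao model), the sketch below shuts off arrivals to an M/M/1 queue whenever WIP reaches a threshold K and computes the resulting steady-state throughput-WIP pair. Sweeping K traces out the performance pairs that the framework identifies with the vertices of the threshold polygon, and the shrinking throughput gains illustrate the diminishing-returns property.

```python
import numpy as np

def mm1_threshold(lam, mu, K):
    """Steady-state throughput and mean WIP of an M/M/1 queue whose
    input is shut off whenever WIP reaches the threshold K
    (i.e., an M/M/1/K birth-death chain)."""
    rho = lam / mu
    p = np.array([rho**n for n in range(K + 1)])
    p /= p.sum()
    throughput = lam * (1 - p[-1])   # accepted rate = departure rate in steady state
    wip = np.dot(np.arange(K + 1), p)
    return throughput, wip

for K in (1, 2, 5, 10, 20):
    t, w = mm1_threshold(lam=0.9, mu=1.0, K=K)
    print(f"K={K:2d}  throughput={t:.4f}  mean WIP={w:.4f}")
```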

Relevance: 20.00%

Publisher:

Abstract:

We present an exact test of whether two random variables with known bounds on their support are negatively correlated. The alternative hypothesis is that they are not negatively correlated. No assumptions are made on the underlying distributions. We show by example that the Spearman rank correlation test, the competing exact test of correlation in nonparametric settings, rests on an additional assumption about the data-generating process without which it is not valid as a test for correlation. We then show how to test the significance of the slope in a linear regression analysis that involves a single independent variable and where outcomes of the dependent variable belong to a known bounded set.
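
The abstract does not spell out the exact test's construction. For contrast, the generic sketch below runs a standard permutation test for the regression slope on bounded synthetic data; note that, like the Spearman test criticized above, it is exact only under the stronger null hypothesis of independence, which is precisely the kind of additional assumption the paper flags.

```python
import numpy as np

rng = np.random.default_rng(0)

def slope_perm_test(x, y, n_perm=10_000):
    """Two-sided permutation test for the OLS slope of y on x.
    A generic nonparametric comparator, not the paper's bounded-support test."""
    def slope(x, y):
        return np.cov(x, y, bias=True)[0, 1] / np.var(x)
    obs = slope(x, y)
    perm = np.array([slope(x, rng.permutation(y)) for _ in range(n_perm)])
    return obs, np.mean(np.abs(perm) >= abs(obs))

# Hypothetical data: outcomes clipped to the known bounded set [0, 1].
x = rng.uniform(0, 1, 40)
y = np.clip(0.5 - 0.4 * x + rng.normal(0, 0.2, 40), 0, 1)
b, p = slope_perm_test(x, y)
print(f"slope={b:.3f}, permutation p-value={p:.4f}")
```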

Relevance: 20.00%

Publisher:

Abstract:

This paper presents a test of the predictive validity of various classes of QALY models (i.e., linear, power, and exponential models). We first estimated TTO utilities for 43 EQ-5D chronic health states, and these states were then embedded in health profiles. The chronic TTO utilities were then used to predict the responses to TTO questions with health profiles. We find that the power QALY model clearly outperforms the linear and exponential QALY models; the optimal power coefficient is 0.65. Our results suggest that TTO-based QALY calculations may be biased. This bias can be avoided by using a power QALY model.
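
A small sketch of what such a power adjustment can look like, assuming the duration weight w(t) = t^r is applied to cumulative time within a profile (a common formulation; the paper's exact specification may differ). With r = 1 the linear QALY model is recovered.

```python
def profile_value(qualities, durations, r=1.0):
    """Value of a health profile under a power QALY model: each period is
    weighted by the increment of w(t) = t**r over cumulative time."""
    value, t = 0.0, 0.0
    for q, d in zip(qualities, durations):
        value += q * ((t + d)**r - t**r)
        t += d
    return value

# Hypothetical profile: 5 years at utility 0.8, then 5 years at 0.6.
print(profile_value([0.8, 0.6], [5, 5]))          # linear QALY model
print(profile_value([0.8, 0.6], [5, 5], r=0.65))  # power model, r = 0.65
```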

Relevance: 20.00%

Publisher:

Abstract:

We propose a novel compressed sensing technique to accelerate the magnetic resonance imaging (MRI) acquisition process. The method, coined spread spectrum MRI or simply s²MRI, consists of premodulating the signal of interest by a linear chirp before random k-space under-sampling, and then reconstructing the signal with nonlinear algorithms that promote sparsity. The effectiveness of the procedure is theoretically underpinned by the optimization of the coherence between the sparsity and sensing bases. The proposed technique is thoroughly studied by means of numerical simulations, as well as phantom and in vivo experiments on a 7T scanner. Our results suggest that s²MRI performs better than state-of-the-art variable-density k-space under-sampling approaches.
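
A minimal 1D sketch of the spread-spectrum pipeline under simplifying assumptions: a signal sparse in the canonical basis (where Fourier sampling is already incoherent, so the chirp mainly mirrors the pipeline rather than being essential), noiseless measurements, and reconstruction by plain iterative soft thresholding (ISTA) instead of the paper's solvers.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, m = 256, 8, 64                 # signal length, sparsity, measurements

# Hypothetical sparse test signal (canonical basis).
x_true = np.zeros(n, dtype=complex)
x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)

chirp = np.exp(1j * np.pi * np.arange(n) ** 2 / n)   # linear-chirp premodulation
mask = rng.choice(n, size=m, replace=False)          # random k-space locations

def A(x):      # forward model: chirp modulation, unitary DFT, subsampling
    return np.fft.fft(chirp * x, norm="ortho")[mask]

def AH(y):     # adjoint of the forward model
    z = np.zeros(n, dtype=complex)
    z[mask] = y
    return np.conj(chirp) * np.fft.ifft(z, norm="ortho")

y = A(x_true)  # noiseless measurements

# ISTA for min_x 0.5*||y - A x||^2 + lam*||x||_1; step 1 is safe since ||A|| <= 1.
lam, x = 0.01, np.zeros(n, dtype=complex)
for _ in range(300):
    g = x + AH(y - A(x))                                           # gradient step
    x = np.maximum(np.abs(g) - lam, 0) * np.exp(1j * np.angle(g))  # soft threshold

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```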

Relevance: 20.00%

Publisher:

Abstract:

The network choice revenue management problem models customers as choosing from an offer set, and the firm decides the best subset to offer at any given moment to maximize expected revenue. The resulting dynamic program for the firm is intractable and is approximated by a deterministic linear program called the CDLP, which has an exponential number of columns. However, under the choice-set paradigm, when the segment consideration sets overlap, the CDLP is difficult to solve: column generation has been proposed, but finding an entering column has been shown to be NP-hard. In this paper, starting with a concave program formulation based on segment-level consideration sets called SDCP, we add a class of constraints called product constraints that project onto subsets of intersections. In addition, we propose a natural direct tightening of the SDCP called ?SDCP, and compare the performance of both methods on the benchmark data sets in the literature. Both the product constraints and the ?SDCP method are very simple and easy to implement, and both are applicable to the case of overlapping segment consideration sets. In our computational testing on the benchmark data sets, SDCP with product constraints achieves the CDLP value at a fraction of the CPU time taken by column generation, and we believe it is a very promising approach for quickly approximating the CDLP when segment consideration sets overlap and the consideration sets themselves are relatively small.
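
For concreteness, the sketch below solves a toy single-leg CDLP with made-up choice probabilities (hypothetical numbers: two fare products, four offer sets): choose how long to offer each set so as to maximize expected revenue subject to capacity and horizon constraints. The network version in the paper differs in having many resources and exponentially many columns.

```python
import numpy as np
from scipy.optimize import linprog

T, C = 100, 30                       # horizon (periods) and seat capacity
fares = np.array([100.0, 60.0])
# P[S][j]: probability of selling product j per period when set S is offered.
# Rows: offer nothing, fare 1 only, fare 2 only, both fares.
P = np.array([[0.00, 0.00],
              [0.30, 0.00],
              [0.00, 0.55],
              [0.20, 0.40]])
revenue_rate = P @ fares             # expected revenue per period per offer set
capacity_rate = P.sum(axis=1)        # expected seats sold per period

res = linprog(-revenue_rate,         # linprog minimizes, so negate revenue
              A_ub=np.vstack([capacity_rate, np.ones(4)]),
              b_ub=[C, T],
              bounds=[(0, None)] * 4, method="highs")
print("optimal expected revenue:", -res.fun)
print("time spent on each offer set:", res.x)
```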

Relevance: 20.00%

Publisher:

Abstract:

Research on judgment and decision making presents a confusing picture of human abilities. For example, much research has emphasized the dysfunctional aspects of judgmental heuristics, and yet other findings suggest that these can be highly effective. A further line of research has modeled judgment as resulting from "as if" linear models. This paper illuminates the distinctions between these approaches by providing a common analytical framework based on the central theoretical premise that understanding human performance requires specifying how characteristics of the decision rules people use interact with the demands of the tasks they face. Our work synthesizes the analytical tools of lens model research with novel methodology developed to specify the effectiveness of heuristics in different environments, and it allows direct comparisons between the different approaches. We illustrate with both theoretical analyses and simulations. We further link our results to the empirical literature by a meta-analysis of lens model studies, estimating both human and heuristic performance in the same tasks. Our results highlight the trade-off between linear models and heuristics: whereas the former are cognitively demanding, the latter are simple to use. However, heuristics require knowledge, and thus "maps", of when and which heuristic to employ.
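
The lens model tools referred to above decompose judgmental achievement via the Tucker lens model equation, r_a = G·R_e·R_s + C·√(1−R_e²)·√(1−R_s²). Below is a synthetic sketch (hypothetical cue weights and noise levels, not data from the meta-analysis) that verifies the identity numerically.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
X = np.column_stack([np.ones(n), rng.standard_normal((n, 3))])  # intercept + cues
env = X @ [0.0, 0.6, 0.3, 0.1] + 0.5 * rng.standard_normal(n)   # criterion
jud = X @ [0.0, 0.5, 0.4, 0.0] + 0.7 * rng.standard_normal(n)   # judgments

fit = lambda y: X @ np.linalg.lstsq(X, y, rcond=None)[0]        # OLS predictions
r = lambda a, b: np.corrcoef(a, b)[0, 1]

yhat_e, yhat_s = fit(env), fit(jud)
Re, Rs = r(env, yhat_e), r(jud, yhat_s)      # environment / judge predictability
G = r(yhat_e, yhat_s)                        # matching of the two linear models
C = r(env - yhat_e, jud - yhat_s)            # residual (unmodeled) matching

print(f"achievement r_a = {r(env, jud):.3f}")
print(f"lens-model identity = "
      f"{G*Re*Rs + C*np.sqrt(1-Re**2)*np.sqrt(1-Rs**2):.3f}")
```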

Relevance: 20.00%

Publisher:

Abstract:

Linear programming, with its specializations to the transportation problem and the resource-assignment problem, is nowadays studied in many degree programs that include a course in Operational Research. It is, ultimately, a problem of computing constrained extrema, whether maxima or minima, with very particular characteristics and great symbolic elegance. The transportation and assignment problems can also be solved as linear programming problems via the Simplex algorithm, although dedicated algorithms of far greater simplicity are preferable: the transportation algorithm and the Hungarian algorithm, respectively. To make clear what is really at stake, two cases of finding extrema and constrained extrema are considered here, at the level of the final years of secondary education.
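
For the assignment problem, a Hungarian-style solver is available off the shelf. The sketch below, with a made-up 4x4 cost matrix, uses scipy.optimize.linear_sum_assignment rather than a hand-coded Hungarian algorithm.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical costs: cost[i][j] = cost of assigning worker i to task j.
cost = np.array([[9, 2, 7, 8],
                 [6, 4, 3, 7],
                 [5, 8, 1, 8],
                 [7, 6, 9, 4]])
rows, cols = linear_sum_assignment(cost)   # Hungarian-style solver
print("assignment:", list(zip(rows, cols)))
print("total cost:", cost[rows, cols].sum())
```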

Relevance: 20.00%

Publisher:

Abstract:

The effectiveness of dead mulch in erosion control can be assessed through two main indicators: the percentage of soil covered by crop residues and the persistence of those residues on the surface over time. Tillage, in turn, can significantly influence both indicators. The study was carried out in the field in the municipality of Eldorado do Sul, in the Depressão Central region of Rio Grande do Sul, Brazil. The persistence of the residue cover was evaluated during a fallow period from May 1989 to April 1990, following a soybean crop. The soybean residues were managed under no tillage, chisel ploughing, and disking. The percentage of soil cover by crop residues was quantified by the photographic method and by the line-transect method. The soybean crop produced a small amount of mulch of low durability. Leaving the residues on the surface without tillage was the treatment that yielded the best correlation (R²) between the cover indices obtained by the two methods. In the areas under disking or chiselling, the cover indices obtained by the photographic method were higher than those from the line transect, whereas in the no-till area the two methods gave similar results.

Relevance: 20.00%

Publisher:

Abstract:

In recent years there has been explosive growth in the development of adaptive and data-driven methods. One efficient data-driven approach is based on statistical learning theory, or SLT (Vapnik 1998). The theory rests on the Structural Risk Minimisation (SRM) principle and has a solid statistical background. When applying SRM we try not only to reduce the training error (to fit the available data with a model) but also to reduce the complexity of the model and thereby the generalisation error. Many nonlinear learning procedures recently developed in neural networks and statistics can be understood and interpreted in terms of the structural risk minimisation inductive principle. A recent methodology based on SRM is the Support Vector Machine (SVM). At present SLT is still under intensive development, and SVMs are finding new areas of application (www.kernel-machines.org). SVMs develop robust, nonlinear data models with excellent generalisation ability, which is very important for both monitoring and forecasting. SVMs perform extremely well when the input space is high-dimensional and the training data set is not big enough to develop a corresponding nonlinear model. Moreover, SVMs use only support vectors to derive decision boundaries. This opens the way to sampling optimisation, estimation of noise in data, quantification of data redundancy, etc. A presentation of SVMs for spatially distributed data is given in (Kanevski and Maignan 2004).
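
A minimal scikit-learn sketch of the points above, on a synthetic nonlinear problem: an RBF-kernel SVM fits the data, and its decision boundary is determined by the support vectors alone.

```python
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic nonlinear two-class problem.
X, y = make_moons(n_samples=400, noise=0.25, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
print("support vectors used:", clf.support_vectors_.shape[0], "of", len(X_tr))
```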

Relevance: 20.00%

Publisher:

Abstract:

Crushed seeds of the Moringa oleifera tree have been used traditionally as natural flocculants to clarify drinking water. We previously showed that one of the seed peptides mediates both the sedimentation of suspended particles such as bacterial cells and a direct bactericidal activity, raising the possibility that the two activities might be related. In this study, conformational modeling of the peptide was coupled with a functional analysis of synthetic derivatives. This indicated that partly overlapping structural determinants mediate the sedimentation and antibacterial activities. Sedimentation requires a positively charged, glutamine-rich portion of the peptide that aggregates bacterial cells. The bactericidal activity was localized to a sequence prone to form a helix-loop-helix structural motif. Amino acid substitution showed that the bactericidal activity requires hydrophobic proline residues within the protruding loop. Vital dye staining indicated that treatment with peptides containing this motif results in bacterial membrane damage. Assembly of multiple copies of this structural motif into a branched peptide enhanced the antibacterial activity: low concentrations effectively kill bacteria such as Pseudomonas aeruginosa and Streptococcus pyogenes without displaying a toxic effect on human red blood cells. This study thus identifies a synthetic peptide with potent antibacterial activity against specific human pathogens. It also suggests partly distinct molecular mechanisms for the two activities: sedimentation may result from coupled flocculation and coagulation effects, while the bactericidal activity would require bacterial membrane destabilization by a hydrophobic loop.