863 results for Linear Mixed Integer Multicriteria Optimization


Relevance: 100.00%

Abstract:

Despite major progress, currently available treatment options for patients suffering from schizophrenia remain suboptimal. Antipsychotic medication is one such option, and is helpful in acute phases of the disease. However, antipsychotics cause significant side effects that often require additional medication and can even trigger the discontinuation of treatment. Taken together with the fact that 20-30% of patients are medication-resistant, it is clear that new care options should be developed for patients with schizophrenia. Besides medication, an emerging option for treating psychiatric symptoms is neurofeedback. This technique has proven efficacy in other disorders and, more importantly, has also proven feasible in patients with schizophrenia. One of its major advantages is that it allows influencing brain states that would otherwise be inaccessible, i.e. the physiological markers underlying psychotic symptoms. EEG resting-state microstates are a very interesting electrophysiological marker of schizophrenia symptoms. Specifically, one class of resting-state microstates, microstate class D, has consistently been found to be temporally shortened in patients with schizophrenia compared to controls, and this shortening is correlated with the presence of positive psychotic symptoms. Within the scope of biological psychiatry, appropriate treatment of psychotic symptoms can be expected to modify the underlying physiological markers accompanying the behavioral manifestations of a disease. We reason that if abnormal temporal parameters of resting-state microstates are related to positive symptoms in schizophrenia, regulating this EEG feature might be helpful as a treatment for patients. The goal of this thesis was to prove the feasibility of self-regulating the contribution of microstate class D via neurofeedback.
Given that no other study has attempted to regulate microstates via neurofeedback, we first tested its feasibility in a population of healthy subjects. In the first paper we describe the methodological characteristics of the neurofeedback protocol and its implementation. Neurofeedback performance was assessed by means of linear mixed-effects modeling, which provided a complete profile of the training response within and between subjects. The protocol included 20 training sessions, and each session contained three conditions: baseline (resting-state) and two active conditions, training (auditory feedback upon self-regulation performance) and transfer (self-regulation with no feedback). With linear modeling we obtained performance indices for each of them as follows: baseline carryover (time-dependent baseline increments), and learning and aptitude for each of the active conditions. Learning refers to the time-dependent increase or decrease of the microstate class D contribution during each active condition, and aptitude refers to the constant difference in the microstate class D contribution between each active condition and baseline, independent of time. These indices are discussed in terms of tailoring neurofeedback treatment to individual profiles so that it can be applied in future studies or clinical practice. In our sample, neurofeedback proved feasible, as all participants showed positive results in at least one of the aforementioned learning indices. Furthermore, between subjects we observed that the contribution of microstate class D across sessions increased by 0.42% during baseline, 1.93% during training trials, and 1.83% during transfer. This range is expected to be effective in treating psychotic symptoms in patients. In the second paper presented in this thesis, we explored possible predictors of neurofeedback success among psychological variables measured with questionnaires.
An interesting finding was the negative correlation between "motivational incongruence" and some of the neurofeedback performance indices. Even though this finding requires replication, we discuss it in terms of the interference of incompatible psychological processes with the requirements of neurofeedback training. In the third paper, we present a meta-analysis of all available studies relating resting-state microstate abnormalities to schizophrenia. We obtained medium effect sizes for two microstate classes, namely C and D. Combining the meta-analysis results with the fact that microstate class D abnormalities are correlated with the presence of positive symptoms in patients with schizophrenia, these results add further support for training this particular microstate. Overall, the results obtained in this study encourage the implementation of this protocol in a population of patients with schizophrenia. However, future studies will have to show whether patients are able to successfully self-regulate the contribution of microstate class D and, if so, whether this regulation has an impact on symptomatology.
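The learning and aptitude indices described above can be illustrated with a simplified sketch — plain per-subject least squares rather than the full linear mixed-effects model used in the thesis, with entirely hypothetical numbers:

```python
import numpy as np

def performance_indices(sessions, baseline, active):
    """Simplified per-subject indices for one active condition.

    learning - slope of the active-condition microstate D contribution
               across sessions (time-dependent change)
    aptitude - mean offset of the active condition above baseline,
               independent of time
    """
    learning = float(np.polyfit(sessions, active, 1)[0])   # slope per session
    aptitude = float(np.mean(active) - np.mean(baseline))  # constant offset
    return learning, aptitude

# Hypothetical subject: contribution of microstate class D (%) over 20 sessions.
sessions = np.arange(1, 21)
baseline = np.full(20, 60.0)       # flat baseline contribution
training = 61.0 + 0.1 * sessions   # steady improvement during training

learning, aptitude = performance_indices(sessions, baseline, training)
print(f"learning = {learning:.2f} %/session, aptitude = {aptitude:.2f} %")
```

A mixed-effects model additionally pools these estimates across subjects with random effects, which is what yields the within- and between-subject profile reported in the paper.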

Relevance: 100.00%

Abstract:

Introduction: Seizures are harmful to the neonatal brain; this compels many clinicians and researchers to persevere in optimizing every aspect of the management of neonatal seizures. Aims: To delineate the seizure profile in non-cooled versus cooled neonates with hypoxic-ischaemic encephalopathy (HIE) and in neonates with stroke, to assess the response of seizure burden to phenobarbitone, and to quantify the degree of electroclinical dissociation (ECD) of seizures. Methods: Multichannel video-EEG was used in this research study as the gold standard to detect seizures, allowing accurate quantification of seizure burden in term neonates. The entire EEG recording for each neonate was independently reviewed by at least one experienced neurophysiologist. Data were expressed as medians and interquartile ranges. Linear mixed model results were presented as mean (95% confidence interval); p values <0.05 were deemed significant. Results: Seizure burden in cooled neonates was lower than in non-cooled neonates [60 (39-224) vs 203 (141-406) minutes; p=0.027]. Seizure burden was lower in cooled neonates with moderate HIE than in those with severe HIE [49 (26-89) vs 162 (97-262) minutes; p=0.020]. In neonates with stroke, the background pattern showed suppression over the infarcted side and seizures demonstrated a characteristic pattern. Compared with 10 mg/kg, phenobarbitone doses of 20 mg/kg reduced seizure burden (p=0.004). Seizure burden was reduced within 1 hour of phenobarbitone administration [mean (95% confidence interval): -14 (-20 to -8) minutes/hour; p<0.001], but seizures returned to pre-treatment levels within 4 hours (p=0.064). The ECD index in cooled neonates with HIE, non-cooled neonates with HIE, neonates with stroke, and neonates with other diagnoses was 88%, 94%, 64% and 75%, respectively. Conclusions: Further research exploring treatment effects on seizure burden in the neonatal brain is required. A change to our current treatment strategy is warranted as we continue to strive for more effective seizure control, anchored in the use of multichannel EEG as the surveillance tool.
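As a rough illustration of the ECD quantification, the sketch below assumes the ECD index is the percentage of seizures detected on EEG without a clinical correlate; the definition and the counts are hypothetical, not taken from the study:

```python
def ecd_index(electrographic_only, total_seizures):
    """Percentage of seizures seen on EEG without a clinical correlate."""
    return 100.0 * electrographic_only / total_seizures

# Hypothetical counts: 22 of 25 detected seizures had no clinical correlate.
print(f"ECD index = {ecd_index(22, 25):.0f}%")
```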

Relevance: 100.00%

Abstract:

Maintenance of transport infrastructure assets is widely advocated as key to minimizing current and future costs of the transportation network. While effective maintenance decisions are often the result of engineering skills and practical knowledge, efficient decisions must also account for the net result over an asset's life-cycle. One essential aspect of the long-term perspective on transport infrastructure maintenance is proactively estimating maintenance needs. For immediate maintenance actions, support tools that can prioritize potential maintenance candidates are important for an efficient maintenance strategy. This dissertation consists of five individual research papers presenting a microdata analysis approach to transport infrastructure maintenance. Microdata analysis is a multidisciplinary field in which large quantities of data are collected, analyzed, and interpreted to improve decision-making. Increased access to transport infrastructure data enables a deeper understanding of causal effects and the possibility of making predictions of future outcomes. The microdata analysis approach covers the complete process from data collection to actual decisions and is therefore well suited to the task of improving efficiency in transport infrastructure maintenance. Statistical modeling was the analysis method selected in this dissertation, providing solutions to the problems presented in each of the five papers. In Paper I, a time-to-event model was used to estimate remaining road pavement lifetimes in Sweden. In Paper II, an extension of the model in Paper I assessed the impact of latent variables on road lifetimes, identifying the sections in a road network that are weaker owing to, e.g., subsoil conditions or undetected heavy traffic. The study in Paper III incorporated a probabilistic parametric distribution as a representation of road lifetimes into an equation for the marginal cost of road wear. 
Differentiated road wear marginal costs for heavy and light vehicles are an important information basis for decisions on vehicle miles traveled (VMT) taxation policies. In Paper IV, a distribution-based clustering method was used to distinguish road segments that are deteriorating from road segments with a stationary road condition. Within railway networks, temporary speed restrictions are often imposed because of maintenance and must be addressed in order to maintain punctuality. The study in Paper V evaluated the empirical effect of speed restrictions on running time on a Norwegian railway line using a generalized linear mixed model.
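A parametric lifetime distribution of the kind used in the time-to-event papers can be sketched with a Weibull model; the shape and scale values below are hypothetical, not estimates from the dissertation:

```python
import math

def weibull_survival(t, shape, scale):
    """P(lifetime > t) under a Weibull(shape, scale) model."""
    return math.exp(-((t / scale) ** shape))

def weibull_median(shape, scale):
    """Median lifetime: scale * ln(2)^(1/shape)."""
    return scale * math.log(2.0) ** (1.0 / shape)

# Hypothetical pavement-lifetime parameters (years).
shape, scale = 2.5, 18.0
print(f"P(lifetime > 15 y) = {weibull_survival(15.0, shape, scale):.3f}")
print(f"median lifetime    = {weibull_median(shape, scale):.1f} y")
```

A shape parameter above 1 encodes wear-out (an increasing hazard), which is the usual assumption for pavement deterioration.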

Relevance: 100.00%

Abstract:

Endemic zoonotic diseases remain a serious but poorly recognised problem in affected communities in developing countries. Despite the overall burden of zoonoses on human and animal health, information about their impacts in endemic settings is lacking, and most of these diseases continue to be neglected. The non-specific clinical presentation of these diseases has been identified as a major challenge to their identification (even with good laboratory diagnosis) and control. The signs in animals and symptoms in humans are easily confused with those of non-zoonotic diseases, leading to widespread misdiagnosis in areas where diagnostic capacity is limited. The communities most affected by these diseases live in close proximity with the animals they depend on for their livelihood, which further complicates the understanding of the epidemiology of zoonoses. This thesis reviewed the pattern of reporting of zoonotic pathogens that cause febrile illness in malaria endemic countries, and evaluated the recognition of animal associations among other risk factors in the transmission and management of zoonoses. The findings of the review chapter were further investigated in the subsequent chapters through a laboratory study of risk factors for bovine leptospirosis and of exposure patterns of livestock coxiellosis. The review covered 840 articles that were part of a larger review of zoonotic pathogens that cause human fever. The review process involved three main steps: filtering and reference classification, identification of abstracts that describe risk factors, and data extraction and summary analysis. Abstracts of the 840 references were transferred into a Microsoft Excel spreadsheet, where several subsets of abstracts were generated using Excel filters and text searches to classify the content of each abstract. 
Data were then extracted and summarised to describe geographical patterns of the pathogens reported and to determine how frequently animal-related risk factors were considered among studies that investigated risk factors for zoonotic pathogen transmission. Subsequently, a seroprevalence study of bovine leptospirosis in northern Tanzania was undertaken in the second chapter of this thesis. The study involved screening serum samples, obtained from an abattoir survey and a cross-sectional study (Bacterial Zoonoses Project), for antibodies against Leptospira serovar Hardjo. The data were analysed using generalised linear mixed models (GLMMs) to identify risk factors for cattle infection. The final chapter analysed Q fever data, also obtained from the Bacterial Zoonoses Project, to determine exposure patterns across livestock species using GLMMs. Leptospira spp. (10.8%, 90/840) and Rickettsia spp. (10.7%, 86/840) were identified as the most frequently reported zoonotic pathogens causing febrile illness, while Rabies virus (0.4%, 3/840) and Francisella spp. (0.1%, 1/840) were the least reported across malaria endemic countries. The majority of the pathogens were reported in Asia, and the frequency of reporting appears higher in areas where outbreaks are most often reported. It was also observed that animal-related risk factors are not often considered among other risk factors for zoonotic pathogens that cause human fever in malaria endemic countries. The seroprevalence study indicated that Leptospira serovar Hardjo is widespread in the cattle population of northern Tanzania, and that animal husbandry system and age are the two most important risk factors influencing seroprevalence. Cattle in pastoral systems and adult cattle were significantly more likely to be seropositive than non-pastoral and young animals, respectively, while there was no significant effect of cattle breed or sex. 
Exposure patterns of Coxiella burnetii appear to differ between livestock species. While several risk factors were identified for goats (animal husbandry system, age and sex) and sheep (animal husbandry system and sex), none were identified for cattle. In addition, there was no evidence of a significant influence of mixed livestock-keeping on animal coxiellosis. Zoonotic agents that cause human fever are common in developing countries, yet the role of animals in the transmission of zoonotic pathogens that cause febrile illness is not fully recognised and appreciated. Since Leptospira spp. and C. burnetii are among the most frequently reported pathogens causing human fever across malaria endemic countries, and are also prevalent in livestock populations, control and preventive measures that recognise animals as a source of infection would be very important, especially in livestock-keeping communities where people live in close proximity with their animals.
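As a simplified, single-factor illustration of the kind of risk-factor contrast the GLMMs quantify (the full models additionally include random effects, e.g. for herd or village, and the counts below are invented):

```python
import math

def odds_ratio(exposed_pos, exposed_neg, unexposed_pos, unexposed_neg):
    """Odds ratio of seropositivity for exposed vs unexposed animals."""
    return (exposed_pos * unexposed_neg) / (exposed_neg * unexposed_pos)

def or_wald_ci(a, b, c, d, z=1.96):
    """Approximate 95% Wald confidence interval for a 2x2-table odds ratio."""
    log_or = math.log(odds_ratio(a, b, c, d))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return math.exp(log_or - z * se), math.exp(log_or + z * se)

# Invented 2x2 table: seropositive/seronegative in pastoral vs non-pastoral cattle.
a, b, c, d = 40, 60, 10, 90
lo, hi = or_wald_ci(a, b, c, d)
print(f"OR = {odds_ratio(a, b, c, d):.1f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```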

Relevance: 100.00%

Abstract:

The myogenic differentiation 1 gene (MYOD1) has a key role in skeletal muscle differentiation and composition through its regulation of the expression of several muscle-specific genes. We first used a general linear mixed model approach to evaluate the association of MYOD1 expression levels with individual beef tenderness phenotypes. MYOD1 mRNA levels, measured by quantitative polymerase chain reaction in 136 Nelore steers, were significantly associated (P ≤ 0.01) with Warner–Bratzler shear force, measured on the longissimus dorsi muscle after 7 and 14 days of beef aging. Transcript abundance of the muscle regulatory gene MYOD1 was lower in animals with more tender beef. We also performed a coexpression network analysis using whole-transcriptome sequence data generated from 30 samples of longissimus muscle tissue to identify genes that are potentially regulated by MYOD1. The effect of MYOD1 gene expression on beef tenderness may emerge from its function as an activator of muscle-specific gene transcription, such as for the serum response factor (C-fos serum response element-binding transcription factor) gene (SRF), which determines muscle tissue development, composition, growth and maturation.
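A minimal sketch of the coexpression idea — pairwise Pearson correlation on hypothetical expression values, where the study used whole-transcriptome data and a dedicated network method:

```python
import numpy as np

def coexpression_edges(expr, threshold=0.8):
    """Return gene pairs whose expression profiles correlate with |r| >= threshold."""
    genes = list(expr)
    mat = np.corrcoef([expr[g] for g in genes])
    return [(genes[i], genes[j])
            for i in range(len(genes)) for j in range(i + 1, len(genes))
            if abs(mat[i, j]) >= threshold]

# Hypothetical normalized expression across five muscle samples.
expr = {
    "MYOD1": [1.0, 2.0, 3.0, 4.0, 5.0],
    "SRF":   [2.0, 4.0, 6.0, 8.0, 10.0],  # perfectly tracks MYOD1
    "OTHER": [5.0, 1.0, 4.0, 2.0, 3.0],   # unrelated profile
}
print(coexpression_edges(expr))
```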

Relevance: 100.00%

Abstract:

We present an initial version of a solution under development for estimating the desired effects through the univariate animal model, using two distinct approaches to obtain the best linear unbiased predictor (BLUP) of the model parameters.
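One classical route to the BLUP is solving Henderson's mixed model equations directly; a toy sketch with invented data, a single fixed effect, and an identity relationship matrix (i.e. unrelated animals — not the abstract's actual data or implementation):

```python
import numpy as np

# Toy univariate animal model: y = X b + Z u + e.
y = np.array([4.0, 6.0, 5.0, 7.0])          # phenotypes
X = np.ones((4, 1))                         # overall mean (fixed effect)
Z = np.eye(4)                               # one record per animal
lam = 2.0                                   # variance ratio sigma_e^2 / sigma_a^2
Ainv = np.eye(4)                            # inverse relationship matrix (identity here)

# Henderson's mixed model equations:
# [ X'X   X'Z            ] [b]   [X'y]
# [ Z'X   Z'Z + lam*Ainv ] [u] = [Z'y]
lhs = np.block([[X.T @ X, X.T @ Z],
                [Z.T @ X, Z.T @ Z + lam * Ainv]])
rhs = np.concatenate([X.T @ y, Z.T @ y])
sol = np.linalg.solve(lhs, rhs)
b_hat, u_hat = sol[0], sol[1:]              # BLUE of b, BLUP of breeding values
print("mean estimate:", b_hat, "breeding values:", u_hat)
```

With pedigree data, Ainv would be the inverse of the numerator relationship matrix rather than the identity.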

Relevance: 60.00%

Abstract:

Screening of topologies developed by hierarchical heuristic procedures can be carried out by comparing their optimal performance. In this work we explore single-objective process optimization using two algorithms, simulated annealing and tabu search, and four different objective functions: two of the net present value type, one of which includes environmental costs, and two of the global potential impact type. The hydrodealkylation of toluene to produce benzene was used as the case study, considering five topologies of different complexity, obtained mainly by including or excluding liquid recycling and heat integration. The performance of the algorithms together with the objective functions was observed, analyzed and discussed from various perspectives: average deviation of results for each algorithm, capacity for producing high-purity product, screening of topologies, robustness of the objective functions in screening topologies, trade-offs between economic and environmental objective functions, and variability of optimum solutions.
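The simulated-annealing side of the comparison can be sketched generically — a toy one-dimensional objective standing in for the flowsheet models actually optimized in the paper:

```python
import math
import random

def simulated_annealing(f, x0, t0=10.0, cooling=0.995, steps=2000, seed=0):
    """Minimize f by random perturbations, accepting uphill moves with
    probability exp(-delta/T) while the temperature T decays."""
    rng = random.Random(seed)
    x, t = x0, t0
    best_x, best_f = x, f(x)
    for _ in range(steps):
        cand = x + rng.uniform(-0.5, 0.5)
        delta = f(cand) - f(x)
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x = cand
            if f(x) < best_f:
                best_x, best_f = x, f(x)
        t *= cooling
    return best_x, best_f

# Toy objective standing in for, e.g., a negated net present value.
objective = lambda x: (x - 2.0) ** 2
best_x, best_f = simulated_annealing(objective, x0=7.0)
print(f"best x = {best_x:.3f}, objective = {best_f:.4f}")
```

Tabu search differs in that it moves greedily among neighbours while forbidding recently visited moves, rather than accepting uphill moves probabilistically.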

Relevance: 50.00%

Abstract:

Signal integrity in high-speed interconnected digital systems, when assessed through the simulation of physical (transistor-level) models, is computationally expensive (e.g., in CPU run time and memory storage) and requires the disclosure of physical details of the device's internal structure. This scenario increases interest in the alternative of behavioural modelling, which describes the operating characteristics of the device from the observation of the electrical input/output (I/O) signals. The I/O interfaces in memory chips, which contribute most to the computational load, perform complex functions and therefore include a large number of pins. In particular, output buffers inevitably distort the signals owing to their dynamic and nonlinear behaviour. They are therefore the critical point in integrated circuits (ICs) for guaranteeing reliable transmission in high-speed digital communications. In this doctoral work, the previously neglected nonlinear dynamic effects of the output buffer are studied and modelled efficiently in order to reduce the complexity of parametric black-box modelling, thereby improving the standard IBIS model. This is achieved by following a semi-physical approach that combines the formulation characteristics of the black-box model, the analysis of the electrical signals observed at the I/O, and properties of the buffer's physical structure under practical operating conditions. This approach leads to a physically inspired behavioural model-building process that overcomes the problems of previous approaches, optimizing the resources used in the different stages of model generation (i.e., characterization, formulation, extraction and implementation) to simulate the nonlinear dynamic behaviour of the buffer. 
Consequently, the most significant contribution of this thesis is the development of a new two-port analogue behavioural model suitable for overclocking simulation, which is of particular interest for the most recent uses of I/O interfaces for high-data-rate memories. The effectiveness and accuracy of the behavioural models developed and implemented are qualitatively and quantitatively assessed by comparing the numerical results of their function extraction and transient simulation with the corresponding state-of-the-art reference model, IBIS.

Relevance: 50.00%

Abstract:

Two fundamental processes usually arise in the production planning of many industries. The first consists of deciding how many final products of each type to produce in each period of a planning horizon, the well-known lot sizing problem. The other consists of cutting raw materials in stock to produce smaller parts used in the assembly of final products, the well-studied cutting stock problem. In this paper the decision variables of these two problems are made dependent on each other in order to obtain a globally optimal solution. Setups, typically present in lot sizing problems, are relaxed together with the integer frequencies of cutting patterns in the cutting problem. Therefore, a large-scale linear optimization problem arises, which is solved exactly by a column generation technique. It is worth noting that this combined problem still takes into account the trade-off between storage costs (for final products and parts) and trim losses (in the cutting process). We present several sets of computational tests, analyzed over three different scenarios. The results show that, by combining the problems and using an exact method, it is possible to obtain significant gains compared with the usual industrial practice of solving them in sequence. (C) 2010 The Franklin Institute. Published by Elsevier Ltd. All rights reserved.
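The column generation idea can be sketched on a stand-alone cutting stock instance, with scipy's LP solver as the restricted master and an unbounded knapsack as the pricing problem; the instance data are invented and this is only the cutting side of the combined model:

```python
import numpy as np
from scipy.optimize import linprog

def solve_master(columns, demand):
    """LP relaxation: min total rolls s.t. pattern coverage >= demand."""
    A = np.array(columns, dtype=float).T           # items x patterns
    res = linprog(c=np.ones(A.shape[1]),
                  A_ub=-A, b_ub=-np.array(demand, dtype=float),
                  method="highs")
    duals = -res.ineqlin.marginals                 # duals of the coverage rows
    return res, duals

def best_pattern(sizes, width, duals):
    """Pricing: unbounded knapsack maximizing dual value within the roll width."""
    value = [0.0] * (width + 1)
    choice = [None] * (width + 1)
    for w in range(1, width + 1):
        for i, s in enumerate(sizes):
            if s <= w and value[w - s] + duals[i] > value[w]:
                value[w] = value[w - s] + duals[i]
                choice[w] = i
    pattern, w = [0] * len(sizes), width
    while choice[w] is not None:                   # rebuild the chosen pattern
        pattern[choice[w]] += 1
        w -= sizes[choice[w]]
    return pattern, value[width]

# Invented instance: rolls of width 10; item sizes and demands.
sizes, width, demand = [6, 4], 10, [2, 2]
columns = [[width // s if j == i else 0 for j, _ in enumerate(sizes)]
           for i, s in enumerate(sizes)]           # trivial one-item starting patterns
while True:
    res, duals = solve_master(columns, demand)
    pattern, val = best_pattern(sizes, width, duals)
    if val <= 1.0 + 1e-9:                          # reduced cost 1 - val >= 0: optimal
        break
    columns.append(pattern)
print(f"LP lower bound: {res.fun:.2f} rolls")
```

The combined lot-sizing/cutting model of the paper embeds this pricing loop in a larger master LP, but the mechanics — solve the restricted master, read the duals, generate the best-priced pattern — are the same.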


Relevance: 50.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 50.00%

Abstract:

This thesis investigates decomposition and reformulation for solving integer linear programming problems. This method is often a very successful approach computationally, producing high-quality solutions for well-structured combinatorial optimization problems such as vehicle routing, cutting stock, p-median and generalized assignment. However, until now the method has always been tailored to the specific problem under investigation. The principal innovation of this thesis is a new framework able to apply this concept to a generic MIP problem. The new approach is thus capable of auto-decomposition and auto-reformulation of the input problem, applicable as a black-box solution algorithm that works as a complement and alternative to standard solution techniques. The idea of decomposing and reformulating (usually called Dantzig-Wolfe decomposition, DWD, in the literature) is, given a MIP, to convexify one or more subsets of constraints (the slaves) and to work on the partially convexified polyhedra obtained. For a given MIP, several decompositions can be defined, depending on which sets of constraints we want to convexify. In this thesis we mainly reformulate MIPs using two sets of variables: the original variables and the extended variables (representing the exponentially many extreme points). The master constraints consist of the original constraints not included in any slave, plus the convexity constraint(s) and the linking constraints (ensuring that each original variable can be viewed as a linear combination of extreme points of the slaves). The solution procedure consists of iteratively solving the reformulated MIP (the master) and checking (pricing) whether a variable with negative reduced cost exists; if so, it is added to the master, which is solved again (column generation), and otherwise the procedure stops. 
The advantage of using DWD is that the reformulated relaxation gives bounds stronger than the original LP relaxation; in addition, it can be incorporated into a branch-and-bound scheme (branch-and-price) in order to solve the problem to optimality. If the computational time for the pricing problem is reasonable, this leads in practice to a substantial speed-up in solution time, especially when the convex hull of the slaves is easy to compute, usually because of its special structure.
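The pricing test at the heart of the procedure is a reduced-cost computation; a minimal sketch with invented duals and candidate columns:

```python
import numpy as np

def most_negative_reduced_cost(costs, columns, duals):
    """Reduced cost of column j is c_j - y^T a_j; return (index, reduced cost)
    of the best candidate, or None if no column prices out."""
    reduced = [c - float(np.dot(duals, a)) for c, a in zip(costs, columns)]
    j = int(np.argmin(reduced))
    return (j, reduced[j]) if reduced[j] < -1e-9 else None

# Invented master duals and candidate slave columns.
duals = np.array([1.0, 0.5])
costs = [1.0, 1.0, 1.0]
columns = [np.array([1, 0]), np.array([0, 2]), np.array([1, 1])]
print(most_negative_reduced_cost(costs, columns, duals))  # -> (2, -0.5)
```

In the full framework this check is run after each master solve; a None result certifies that the current restricted master is optimal for the reformulated relaxation.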

Relevance: 50.00%

Abstract:

Linear programs, or LPs, are often used in optimization problems, such as improving manufacturing efficiency or maximizing the yield from limited resources. The most common method for solving LPs is the simplex method, which will yield a solution, if one exists, but over the real numbers. From a purely numerical standpoint it will be an optimal solution, but quite often we desire an optimal integer solution. A linear program in which the variables are also constrained to be integers is called an integer linear program, or ILP. The focus of this report is to present a parallel algorithm for solving ILPs. We discuss a serial algorithm using a breadth-first branch-and-bound search to check the feasible solution space, and then extend it into a parallel algorithm using a client-server model. In parallel mode, the search may not be truly breadth-first, depending on the solution time for each node in the solution tree. Our search takes advantage of pruning, often resulting in super-linear improvements in solution time. Finally, we present results from sample ILPs, describe a few modifications to enhance the algorithm and improve solution time, and offer suggestions for future work.
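The serial breadth-first branch-and-bound can be sketched with scipy's LP solver providing the relaxation bounds; the toy ILP below is invented, not one of the report's sample problems, and the report's parallel client-server version distributes exactly these node evaluations:

```python
from collections import deque
import numpy as np
from scipy.optimize import linprog

def bfs_branch_and_bound(c, A_ub, b_ub):
    """Maximize c @ x over nonnegative integers, breadth-first with LP bounds."""
    best_val, best_x = -np.inf, None
    queue = deque([[(0, None)] * len(c)])        # per-variable (lower, upper) bounds
    while queue:
        bounds = queue.popleft()
        res = linprog(-np.array(c, dtype=float), A_ub=A_ub, b_ub=b_ub,
                      bounds=bounds, method="highs")
        if res.status != 0 or -res.fun <= best_val + 1e-9:
            continue                             # infeasible, or pruned by LP bound
        frac = [abs(v - round(v)) for v in res.x]
        if max(frac) < 1e-6:                     # integral LP solution: new incumbent
            best_val, best_x = -res.fun, [int(round(v)) for v in res.x]
            continue
        i = int(np.argmax(frac))                 # branch on the most fractional var
        lo, hi = bounds[i]
        down, up = list(bounds), list(bounds)
        down[i] = (lo, int(np.floor(res.x[i])))
        up[i] = (int(np.ceil(res.x[i])), hi)
        queue.extend([down, up])
    return best_val, best_x

# Invented ILP: max 5x + 4y s.t. 6x + 4y <= 24, x + 2y <= 6, x, y >= 0 integer.
val, sol = bfs_branch_and_bound([5, 4], [[6, 4], [1, 2]], [24, 6])
print(f"optimum {val} at {sol}")
```

The FIFO queue gives the breadth-first order; swapping it for a priority queue keyed on the LP bound would give best-first search instead.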

Relevance: 50.00%

Abstract:

Purpose – The purpose of this research is to develop a holistic approach that maximizes the customer service level while minimizing the logistics cost, by using an integrated multiple criteria decision making (MCDM) method for the contemporary transshipment problem. Unlike the prevalent optimization techniques, the proposed integrated approach considers both quantitative and qualitative factors in order to maximize the benefits of service deliverers and customers under uncertain environments. Design/methodology/approach – This paper proposes a fuzzy-based integer linear programming model, built on the existing literature and validated with an example case. The model integrates the developed fuzzy modification of the analytic hierarchy process (FAHP) and solves the multi-criteria transshipment problem. Findings – This paper provides several novel insights into how to transform a company from a cost-based model to a service-dominated model by using an integrated MCDM method. It suggests that the contemporary customer-driven supply chain maintains and increases its competitiveness in two ways: optimizing cost and providing the best service simultaneously. Research limitations/implications – This research used one illustrative industry case to exemplify the developed method. Given the generality of the research findings and the complexity of the transshipment service network, more cases across multiple industries are necessary to further validate the research output. Practical implications – The paper includes implications for the evaluation and selection of transshipment service suppliers, the construction of an optimal transshipment network, and the management of that network. Originality/value – The major advantages of this generic approach are that quantitative and qualitative factors under a fuzzy environment are considered simultaneously, and that the viewpoints of both service deliverers and customers are taken into account. 
It is therefore believed to be useful and applicable for the design of transshipment service networks.
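The crisp core of the AHP step can be sketched with the geometric-mean method on an invented, perfectly consistent pairwise comparison matrix; the paper's fuzzy extension (FAHP) replaces these crisp judgements with fuzzy numbers before deriving weights:

```python
import numpy as np

def ahp_weights(pairwise):
    """Criterion weights by the row-wise geometric-mean method."""
    gm = np.prod(pairwise, axis=1) ** (1.0 / pairwise.shape[1])
    return gm / gm.sum()

# Invented, consistent judgements for three criteria
# (e.g. cost, delivery time, service quality): entry [i, j] = importance of i over j.
pairwise = np.array([[1.0, 2.0, 6.0],
                     [0.5, 1.0, 3.0],
                     [1 / 6, 1 / 3, 1.0]])
print(ahp_weights(pairwise))   # -> approximately [0.6, 0.3, 0.1]
```

These weights would then feed the qualitative side of the integer linear programming model as objective coefficients.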