959 results for Linear program model


Relevance: 80.00%

Abstract:

Background: Written material is often inaccessible for people with aphasia. The format of written material needs to be adapted to enable people with aphasia to read with understanding. Aims: This study aimed to further explore some issues raised in Rose, Worrall, and MacKenna (2003) concerning the effects of aphasia-friendly formats on the reading comprehension of people with aphasia. It was hypothesised that people with aphasia would comprehend significantly more paragraphs formatted in an aphasia-friendly manner than control paragraphs. This study also aimed to investigate whether each single aspect of aphasia-friendly formatting (i.e., simplified vocabulary and syntax, large print, increased white space, and pictures) used in isolation would result in increased comprehension compared to control paragraphs. Other aims were to compare the effect of aphasia-friendly formatting with the effects of each single adaptation, and to investigate whether the effects of aphasia-friendly formats were related to aphasia severity. Methods & Procedures: Participants with mild to moderately severe aphasia (N = 9) read a battery of 90 paragraphs and selected the best word or phrase from a choice of four to complete each paragraph. A linear mixed model (p < .05) was used to analyse the differences in reading comprehension with each paragraph format across three reading grade levels. Outcomes & Results: People with aphasia comprehended significantly more aphasia-friendly paragraphs than control paragraphs. They also comprehended significantly more paragraphs with each of the following single adaptations: simplified vocabulary and syntax, large print, and increased white space. Although people with aphasia tended to comprehend more paragraphs with pictures added than control paragraphs, this difference was not significant. No significant correlation between aphasia severity and the effect of aphasia-friendly formatting was found. Conclusion: This study supports the idea that aphasia-friendly formats increase the reading comprehension of people with aphasia. It suggests that adding pictures, particularly Clip Art pictures, may not significantly improve the reading comprehension of people with aphasia. These findings have implications for all written communication with people with aphasia, both in the clinical setting and in the wider community. Applying these findings may enable people with aphasia to have equal access to written information and to participate in society.

Relevance: 80.00%

Abstract:

Visualising data for exploratory analysis is a big challenge in scientific and engineering domains where there is a need to gain insight into the structure and distribution of the data. Typically, visualisation methods like principal component analysis and multi-dimensional scaling are used, but it is difficult to incorporate prior knowledge about the structure of the data into the analysis. In this technical report we discuss a complementary approach based on an extension of a well-known non-linear probabilistic model, the Generative Topographic Mapping. We show that by including prior information about the covariance structure in the model, we are able to improve both the data visualisation and the model fit.
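
For orientation, here is a minimal sketch of the two baseline visualisation methods the abstract names, applied with scikit-learn to a stock dataset; the data and parameters are illustrative assumptions, not taken from the report.

```python
# Baseline visualisations mentioned in the abstract: PCA (linear) and
# MDS (distance-preserving). Toy data stand in for the report's datasets.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.manifold import MDS

X = load_iris().data                                   # 150 samples, 4 features
X_pca = PCA(n_components=2).fit_transform(X)           # linear projection
X_mds = MDS(n_components=2, random_state=0).fit_transform(X)  # preserves pairwise distances
print(X_pca.shape, X_mds.shape)                        # (150, 2) (150, 2)
```

Neither method offers a direct way to encode prior knowledge about covariance structure, which is the gap the report's GTM extension targets.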

Relevance: 80.00%

Abstract:

Physical distribution plays an important role in contemporary logistics management. Both customer satisfaction and company competitiveness can be enhanced if the distribution problem is solved optimally. The multi-depot vehicle routing problem (MDVRP) is a practical logistics distribution problem consisting of three critical issues: customer assignment, customer routing, and vehicle sequencing. According to the literature, existing solution approaches for the MDVRP are not satisfactory because some unrealistic assumptions were made on the first sub-problem of the MDVRP, i.e., the customer assignment problem. To refine the approaches, the focus of this paper is confined to this problem only. This paper formulates the customer assignment problem as a minimax-type integer linear programming model with the objective of minimizing the cycle time of the depots, where setup times are explicitly considered. Since the model is proven to be NP-complete, a genetic algorithm is developed for solving the problem. The efficiency and effectiveness of the genetic algorithm are illustrated by a numerical example.
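
A hedged sketch of a minimax-type assignment model in this spirit, using PuLP: assign each customer to exactly one depot so that the largest depot cycle time (setup plus service) is minimized. The data, and the way setup and service times enter, are invented for illustration and may differ from the paper's formulation.

```python
# Minimax customer-assignment sketch: minimise the maximum depot cycle time.
import pulp

customers = ["c1", "c2", "c3", "c4"]
depots = ["d1", "d2"]
service = {"c1": 3, "c2": 5, "c3": 2, "c4": 4}   # assumed per-customer service times
setup = {"d1": 1, "d2": 2}                        # assumed per-depot setup times

prob = pulp.LpProblem("minimax_customer_assignment", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (customers, depots), cat="Binary")
T = pulp.LpVariable("max_cycle_time", lowBound=0)

prob += T                                          # objective: the max cycle time
for c in customers:                                # each customer assigned once
    prob += pulp.lpSum(x[c][d] for d in depots) == 1
for d in depots:                                   # linearised max: every depot <= T
    prob += setup[d] + pulp.lpSum(service[c] * x[c][d] for c in customers) <= T

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print({c: [d for d in depots if x[c][d].value() == 1] for c in customers})
```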

Relevance: 80.00%

Abstract:

Group decision making is the study of identifying and selecting alternatives based on the values and preferences of the decision makers. Making a decision implies that there are several alternative choices to be considered. This paper uses the concept of Data Envelopment Analysis to introduce a new mathematical method for selecting the best alternative in a group decision-making environment. The introduced model is a multi-objective function which is converted into a multi-objective linear programming model, from which the optimal solution is obtained. A numerical example shows how the new model can be applied to rank the alternatives or to choose a subset of the most promising alternatives.
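
For readers unfamiliar with DEA, here is a minimal sketch of the classic CCR efficiency LP (Charnes, Cooper, and Rhodes) that DEA-based selection methods build on; the data are toy values, and the paper's multi-objective model is more elaborate than this single-objective version.

```python
# CCR (input-oriented, multiplier form): for each DMU o, maximise the
# weighted output u'y_o subject to v'x_o = 1 and u'y_j - v'x_j <= 0 for all j.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 2.0]])  # inputs, one row per DMU
Y = np.array([[1.0], [1.0], [1.5]])                  # outputs, one row per DMU
n, m = X.shape
s = Y.shape[1]

def ccr_efficiency(o):
    c = np.concatenate([-Y[o], np.zeros(m)])         # linprog minimises, so negate
    A_ub = np.hstack([Y, -X])                        # u'y_j - v'x_j <= 0
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)  # v'x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun                                  # efficiency score in (0, 1]

print([round(ccr_efficiency(o), 3) for o in range(n)])
```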

Relevance: 80.00%

Abstract:

The generalised transportation problem (GTP) is an extension of the linear Hitchcock transportation problem. However, it does not have the unimodularity property, which means that a linear programming solution (e.g., via the simplex method) cannot be guaranteed to be integer. This is a major difference between the GTP and the Hitchcock transportation problem. Although some special algorithms, such as the generalised stepping-stone method, have been developed, they are based on the linear programming model and relax the integer solution requirement of the GTP. This paper proposes a genetic algorithm (GA) to solve the GTP, and a numerical example is presented to show the algorithm and its efficiency.
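
A toy sketch of a GA for this kind of problem: evolve an integer shipment matrix directly, with a penalty for violating supply and demand. The encoding, operators, and penalty weight are illustrative assumptions and do not reproduce the paper's algorithm.

```python
# GA sketch for a 2x2 integer transportation-style instance.
import numpy as np

rng = np.random.default_rng(0)
cost = np.array([[4, 6], [5, 3]])                  # assumed unit shipping costs
supply, demand = np.array([5, 7]), np.array([6, 6])

def fitness(x):                                     # lower is better
    infeasibility = (np.abs(x.sum(axis=1) - supply).sum()
                     + np.abs(x.sum(axis=0) - demand).sum())
    return (cost * x).sum() + 100 * infeasibility   # penalise violated constraints

pop = rng.integers(0, 8, size=(30, 2, 2))           # random integer matrices
for _ in range(200):
    scores = np.array([fitness(x) for x in pop])
    parents = pop[np.argsort(scores)[:10]]          # truncation selection
    pa = parents[rng.integers(0, 10, size=30)]
    pb = parents[rng.integers(0, 10, size=30)]
    children = np.where(rng.random(pa.shape) < 0.5, pa, pb)   # uniform crossover
    mutate = rng.random(children.shape) < 0.2                  # integer mutation
    children = np.clip(children + mutate * rng.integers(-1, 2, children.shape), 0, None)
    pop = children

best = min(pop, key=fitness)
print(best, fitness(best))                          # integer plan and its cost
```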

Relevance: 80.00%

Abstract:

It is widely supposed that things tend to look blurred when they are moving fast. Previous work has shown that this is true for sharp edges but, paradoxically, blurred edges look sharper when they are moving than when stationary. This is 'motion sharpening'. We show that blurred edges also look up to 50% sharper when they are presented briefly (8-24 ms) than at longer durations (100-500 ms) without motion. This argues strongly against high-level models of sharpening based specifically on compensation for motion blur. It also argues against a recent, low-level, linear filter model that requires motion to produce sharpening. No linear filter model can explain our finding that sharpening was similar for sinusoidal and non-sinusoidal gratings, since linear filters can never distort sine waves. We also conclude that the idea of a 'default' assumption of sharpness is not supported by experimental evidence. A possible source of sharpening is a nonlinearity in the contrast response of early visual mechanisms to fast or transient temporal changes, perhaps based on the magnocellular (M-cell) pathway. Our finding that sharpening is not diminished at low contrast sets strong constraints on the nature of the nonlinearity.
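
The claim that no linear filter can distort a sine wave is the standard eigenfunction property of linear shift-invariant systems; stated once (textbook material, not the paper's own derivation):

```latex
% A sinusoid passed through any linear shift-invariant filter with
% frequency response H is only scaled and phase-shifted, never reshaped;
% sharpening a sine-wave grating therefore requires adding harmonics,
% i.e. a nonlinearity.
x(t) = \sin(\omega t)
\;\longmapsto\;
y(t) = \lvert H(\omega)\rvert\,\sin\!\bigl(\omega t + \arg H(\omega)\bigr)
```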

Relevance: 80.00%

Abstract:

Context: Subclinical hypothyroidism (SCH) and cognitive dysfunction are both common in the elderly and have been linked. It is important to determine whether T4 replacement therapy in SCH confers cognitive benefit. Objective: Our objective was to determine whether administration of T4 replacement to achieve biochemical euthyroidism in subjects with SCH improves cognitive function. Design and Setting: We conducted a double-blind, placebo-controlled randomized controlled trial in the context of United Kingdom primary care. Patients: Ninety-four subjects aged 65 yr and over (57 females, 37 males) with SCH were recruited from a population of 147 identified by screening. Intervention: T4 or placebo was given at an initial dosage of one tablet of either placebo or 25 µg T4 per day for 12 months. Thyroid function tests were performed at 8-weekly intervals, with dosage adjusted in one-tablet increments to achieve TSH within the reference range for subjects in the treatment arm. Fifty-two subjects received T4 (31 females, 21 males; mean age 73.5 yr, range 65–94 yr); 42 subjects received placebo (26 females, 16 males; mean age 74.2 yr, range 66–84 yr). Main Outcome Measures: Mini-Mental State Examination, Middlesex Elderly Assessment of Mental State (covering orientation, learning, memory, numeracy, perception, attention, and language skills), and Trail-Making A and B were administered. Results: Eighty-two percent and 84% of the T4 group achieved euthyroidism at 6 and 12 months, respectively. Cognitive function scores at baseline and 6 and 12 months were as follows: Mini-Mental State Examination T4 group, 28.26, 28.9, and 28.28, and placebo group, 28.17, 27.82, and 28.25 [not significant (NS)]; Middlesex Elderly Assessment of Mental State T4 group, 11.72, 11.67, and 11.78, and placebo group, 11.21, 11.47, and 11.44 (NS); Trail-Making A T4 group, 45.72, 47.65, and 44.52, and placebo group, 50.29, 49.00, and 46.97 (NS); and Trail-Making B T4 group, 110.57, 106.61, and 96.67, and placebo group, 131.46, 119.13, and 108.38 (NS). Linear mixed-model analysis demonstrated no significant changes in any of the measures of cognitive function over time and no between-group difference in cognitive scores at 6 and 12 months. Conclusions: This RCT provides no evidence for treating elderly subjects with SCH with T4 replacement therapy to improve cognitive function.

Relevance: 80.00%

Abstract:

To carry out stability and voltage regulation studies on more electric aircraft systems, in which there is a preponderance of multi-pulse, rectifier-fed motor-drive equipment, average dynamic models of the rectifier converters are required. Existing methods are difficult to apply to anything other than single converters with a low pulse number. Therefore, an efficient, compact method for deriving the approximate, linear, average model of 6- and 12-pulse rectifiers, based on the assumption of a small overlap angle, is presented. The models are validated against detailed simulations and laboratory prototypes.
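
For context, the textbook average-value relation for a 6-pulse bridge under the small-overlap assumption (a standard result, not necessarily the exact model derived in the paper) is:

```latex
% Mean DC output: ideal term minus the commutation (overlap) drop, which
% behaves like a lossless series resistance. V_{LL}: rms line-to-line
% voltage; \alpha: firing angle (0 for a diode bridge); L_c: commutation
% inductance; I_d: DC current.
\bar V_d = \frac{3\sqrt{2}}{\pi}\,V_{LL}\cos\alpha \;-\; \frac{3\,\omega L_c}{\pi}\,I_d
```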

Relevance: 80.00%

Abstract:

The purpose of this study was to investigate cortisol levels, as a function of the hypothalamic-pituitary-adrenal (HPA) axis, in relation to alexithymia in patients with somatoform disorders (SFD). Diurnal salivary cortisol was sampled in 32 patients with SFD who also underwent a psychiatric examination and filled in questionnaires (Toronto Alexithymia Scale, TAS; Screening for Somatoform Symptoms, SOMS; Hamilton Depression Scale, HAMD). The mean TAS total score in the sample was 55.6 ± 9.6, with 32% of patients classified as alexithymic on the basis of their TAS scores. Depression scores were moderate (HAMD = 13.2; Beck Depression Inventory, BDI = 16.5). The patients' alexithymia scores (TAS subscale 'Difficulty identifying feelings') correlated significantly positively with their somatization scale scores (Symptom Checklist-90 Revised, SCL-90-R), r = 0.3438 (P < 0.05), and with their scores on the Global Severity Index (GSI) of the SCL-90-R, r = 0.781 (P < 0.01). Regression analysis was performed with cortisol variables as the dependent variables. Cortisol levels [measured by the area under the curve-ground (AUC-G), area under the curve-increase (AUC-I), and morning cortisol (MCS)] were best predicted in a multiple linear regression model by lower depression scores (HAMD) and more psychopathological symptoms (SCL-90-R). No significant correlations were found between the patients' alexithymia scores (TAS) and cortisol levels. The healthy control group (n = 25) demonstrated significantly higher cortisol levels than did the patients with SFD (in both tests P < 0.001 for AUC-G and AUC-I). However, the two groups did not differ in terms of their mean morning cortisol levels (P > 0.05). The results suggest that pre-existing hypocortisolism might be associated with SFD.

Relevance: 80.00%

Abstract:

Data envelopment analysis (DEA), as introduced by Charnes, Cooper, and Rhodes (1978), is a linear programming technique that has been widely used to evaluate the relative efficiency of a set of homogeneous decision making units (DMUs). In many real applications, the input-output variables cannot be precisely measured. This is particularly important when assessing the efficiency of DMUs using DEA, since the efficiency scores of inefficient DMUs are very sensitive to possible data errors. Hence, several approaches have been proposed to deal with imprecise data. Perhaps the most popular fuzzy DEA model is based on the α-cut. One drawback of the α-cut approach is that it cannot include all information about uncertainty. This paper aims to introduce an alternative linear programming model that can include some uncertainty information from the intervals within the α-cut approach. We introduce the concept of a "local α-level" to develop a multi-objective linear programming model to measure the efficiency of DMUs under uncertainty. An example is given to illustrate the use of this method.
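
A small sketch of the α-cut idea the paper builds on: at level α, a triangular fuzzy number (a, b, c) collapses to an interval, and fuzzy DEA models then work with these interval data. The numbers are illustrative.

```python
def alpha_cut(a, b, c, alpha):
    """Interval of the triangular fuzzy number (a, b, c) at level alpha."""
    return (a + alpha * (b - a), c - alpha * (c - b))

print(alpha_cut(2.0, 3.0, 5.0, 0.0))   # (2.0, 5.0): the full support
print(alpha_cut(2.0, 3.0, 5.0, 1.0))   # (3.0, 3.0): the crisp core
```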

Relevance: 80.00%

Abstract:

One of the major challenges in measuring efficiency in terms of resources and outcomes is the assessment of the evolution of units over time. Although Data Envelopment Analysis (DEA) has been applied to time series datasets, DEA models, by construction, form the reference set for inefficient units (lambda values) based on their distance from the efficient frontier, that is, in a spatial manner. However, when dealing with temporal datasets, the proximity in time between units should also be taken into account, since it reflects the structural resemblance among time periods of a unit that evolves. In this paper, we propose a two-stage spatiotemporal DEA approach which captures both the spatial and temporal dimensions through a multi-objective programming model. In the first stage, DEA is solved iteratively, extracting for each unit only previous DMUs as peers in its reference set. In the second stage, the lambda values derived from the first stage are fed into a Multiobjective Mixed Integer Linear Programming model, which filters peers in the reference set based on weights assigned to the spatial and temporal dimensions. The approach is demonstrated on a real-world example drawn from software development.
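
An illustrative sketch of the second-stage weighting idea only (not the paper's Multiobjective Mixed Integer Linear Programming model): score each candidate peer by a weighted mix of its stage-one lambda value and its temporal proximity, then retain the top-k. The weights and data are invented.

```python
import numpy as np

lambdas = np.array([0.05, 0.40, 0.25, 0.30])   # stage-one reference weights (assumed)
periods = np.array([1, 5, 7, 8])                # time index of each candidate peer
t_now, w_spatial, w_temporal, k = 9, 0.6, 0.4, 2

proximity = 1.0 / (1.0 + (t_now - periods))     # closer in time scores higher
score = w_spatial * lambdas + w_temporal * proximity
peers = np.argsort(score)[::-1][:k]             # indices of the retained peers
print(peers, score.round(3))
```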

Relevance: 80.00%

Abstract:

Firms worldwide are taking major initiatives to reduce the carbon footprint of their supply chains in response to growing governmental and consumer pressures. In real life, these supply chains face stochastic and non-stationary demand, but most studies of the inventory lot-sizing problem with emission concerns consider deterministic demand. In this paper, we study the inventory lot-sizing problem under non-stationary stochastic demand with emission and cycle service level constraints, considering a carbon cap-and-trade regulatory mechanism. Using a mixed integer linear programming model, this paper investigates the effects of emission parameters and product- and system-related features on supply chain performance through extensive computational experiments designed to cover general business settings rather than a specific scenario. Results show that cycle service level and demand coefficient of variation have significant impacts on total cost and emission irrespective of the level of demand variability, while the impact of the product's demand pattern is significant only at lower levels of demand variability. Finally, results also show that an increasing carbon price reduces total cost, total emission, and total inventory, and that the scope for emission reduction through a higher carbon price is greater at higher levels of cycle service level and demand coefficient of variation. The analysis of the results helps supply chain managers make the right decisions in different demand and service level situations.
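
As a stylised, single-item illustration of the cap-and-trade mechanism described above (assumed symbols, not the paper's full stochastic MILP), the objective charges, or credits, the deviation of total emission from the cap at the carbon price:

```latex
% y_t: setup indicator; q_t: lot size; I_t: end-of-period inventory;
% K, h: setup and holding costs; p: carbon price; E^{cap}: emission cap;
% \hat e, e, \tilde e: emission coefficients of setup, production, storage.
\min \sum_{t=1}^{T}\bigl(K\,y_t + h\,I_t\bigr)
  \;+\; p\Bigl(\sum_{t=1}^{T}\bigl(\hat e\,y_t + e\,q_t + \tilde e\,I_t\bigr) - E^{cap}\Bigr)
```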

Relevance: 80.00%

Abstract:

Permeable reactive barriers (PRBs) are constructed from soil solid amendments to support the growth of bacteria that are capable of degrading organic contaminants. The objective of this study was to identify low-cost soil solid amendments that could retard the movement of trichloroethylene (TCE) while serving as long-lived carbon sources to foster its biodegradation in shallow groundwater through the use of a PRB. Natural amendments high in organic carbon content, such as eucalyptus mulch, compost, wetland peat, and organic humus, were compared based on their geophysical characteristics, such as pHw, porosity, and total organic carbon (TOC), as well as their TCE sorption potentials. The pHw values were within the neutral range except for pine bark mulch and wetland peat. All other geophysical characteristics of the amendments showed suitability for use in a PRB. While the Freundlich model showed a better fit for compost and pine bark mulch, the linear sorption model was adequate for eucalyptus mulch, wetland peat, and Everglades muck within the concentration range studied (0.2-0.8 mg/L TCE). Based on these results, two composts and eucalyptus mulch were selected for laboratory column experiments to evaluate their effectiveness at creating and maintaining conditions suitable for anaerobic TCE dechlorination. The columns were monitored for pH, ORP, TCE degradation, longevity of nutrients, and soluble TOC to support TCE dechlorination. Native bacteria in the columns had the ability to convert TCE to DCEs; however, inoculation with the TCE-degrading culture greatly increased the rate of biodegradation. This caused a significant increase in by-product concentrations, mostly in the form of DCEs and VC, followed by a slow degradation to ethylene. Of the tested amendments, eucalyptus mulch was the most effective at supporting TCE dechlorination. The experimental results of the sequential TCE dechlorination that took place in the eucalyptus mulch and commercial compost (Savannah River Site) columns were then simulated using the Hydrus-1D model. The simulations showed a good fit with the experimental data. The results suggest that sorption and degradation were the dominant fate and transport mechanisms for TCE and DCEs in the columns, supporting the use of these amendments in a permeable reactive barrier to remediate TCE.
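
The two isotherms compared in the study take the standard forms below, with q the sorbed and C the aqueous concentration; the linear model is the Freundlich model with exponent 1/n = 1:

```latex
q = K_F\,C^{1/n} \quad\text{(Freundlich)}
\qquad
q = K_d\,C \quad\text{(linear)}
```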

Relevance: 80.00%

Abstract:

This research is motivated by the need to consider lot sizing while accepting customer orders in a make-to-order (MTO) environment, in which each customer order must be delivered by its due date. The job shop is the typical operation model used in an MTO operation, where the production planner must make three concurrent decisions: order selection, lot sizing, and job scheduling. These decisions are usually treated separately in the literature and are mostly addressed with heuristic solutions. The first phase of the study is focused on a formal definition of the problem. Mathematical programming techniques are applied to model this problem in terms of its objective, decision variables, and constraints. A commercial solver, CPLEX, is applied to solve the resulting mixed-integer linear programming model with small instances in order to validate the mathematical formulation. The computational results show that solving problems of industrial size with a commercial solver is not practical. The second phase of this study is focused on the development of an effective solution approach for large-scale instances of this problem. The proposed solution approach is an iterative process involving three sequential decision steps: order selection, lot sizing, and lot scheduling. A range of simple sequencing rules is identified for each of the three subproblems. Using computer simulation as the tool, an experiment is designed to evaluate their performance against a set of system parameters. For order selection, the proposed weighted-most-profit rule performs best. The shifting bottleneck and the earliest operation finish time are both the best scheduling rules. For lot sizing, the proposed minimum-cost-increase heuristic, based on the Dixon-Silver method, performs best when the demand-to-capacity ratio at the bottleneck machine is high. The proposed minimum-cost heuristic, based on the Wagner-Whitin algorithm, is the best lot-sizing heuristic for shops with a low demand-to-capacity ratio. The proposed heuristic is applied to an industrial case to further evaluate its performance. The results show it can improve total profit by an average of 16.62%. This research contributes to the production planning research community a complete mathematical definition of the problem and an effective solution approach for solving the problem at industry scale.
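
Since the best lot-sizing heuristic for low demand-to-capacity shops builds on the Wagner-Whitin algorithm, here is a compact sketch of that classic dynamic program; the demands and costs are invented, and the proposed minimum-cost heuristic itself is not reproduced.

```python
# Wagner-Whitin: pick production periods minimising setup + holding cost
# for a deterministic demand stream (single item, no capacity limit).
def wagner_whitin(demand, setup, hold):
    T = len(demand)
    best = [0.0] + [float("inf")] * T        # best[t]: min cost to cover periods < t
    for t in range(1, T + 1):
        for j in range(t):                    # last setup in period j covers j..t-1
            holding = sum(hold * (k - j) * demand[k] for k in range(j, t))
            best[t] = min(best[t], best[j] + setup + holding)
    return best[T]

print(wagner_whitin([20, 50, 10, 50], setup=100, hold=1))
```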

Relevance: 80.00%

Abstract:

This research studies the hybrid flow shop problem with parallel batch-processing machines in one stage and discrete-processing machines in the other stages, processing jobs of arbitrary sizes. The objective is to minimize the makespan for a set of jobs. The problem is denoted as FF: batch1,sj:Cmax. The problem is formulated as a mixed-integer linear program. The commercial solver AMPL/CPLEX is used to solve problem instances to optimality. Experimental results show that AMPL/CPLEX requires considerable time to find the optimal solution for even a small problem, e.g., a 6-job instance requires 2 hours on average. A bottleneck-first-decomposition (BFD) heuristic is proposed in this study to overcome the computational time problem encountered when using the commercial solver. The proposed BFD heuristic is inspired by the shifting bottleneck heuristic. It decomposes the entire problem into three sub-problems and schedules the sub-problems one by one. The proposed BFD heuristic consists of four major steps: formulating sub-problems, prioritizing sub-problems, solving sub-problems, and re-scheduling. For solving the sub-problems, two heuristic algorithms are proposed: one for scheduling a hybrid flow shop with discrete-processing machines, and the other for scheduling parallel batching machines (single stage). Both consider job arrival and delivery times. A designed experiment is conducted to evaluate the effectiveness of the proposed BFD, which is further compared against a set of common heuristics including a randomized greedy heuristic and five dispatching rules. The results show that the proposed BFD heuristic outperforms all these algorithms. To evaluate the quality of the heuristic solution, a procedure is developed to calculate a lower bound on the makespan for the problem under study. The lower bound obtained is tighter than other bounds developed for related problems in the literature. A meta-search approach based on the Genetic Algorithm concept is developed to evaluate the benefit of further improving the solution obtained from the proposed BFD heuristic. The experiment indicates that it reduces the makespan by 1.93% on average within negligible time when the problem size is less than 50 jobs.
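
As a flavour of the batching decisions inside such a decomposition, here is a simple first-fit-decreasing batch-formation sketch for a single batch-processing machine with jobs of arbitrary sizes; it is an illustrative stand-in, not the proposed BFD heuristic, and the capacity and job sizes are invented.

```python
# First-fit decreasing: place each job (largest first) into the first
# batch with enough remaining capacity, opening a new batch if none fits.
def form_batches(sizes, capacity):
    batches = []                              # each batch is a list of job sizes
    for size in sorted(sizes, reverse=True):
        for batch in batches:
            if sum(batch) + size <= capacity:
                batch.append(size)
                break
        else:
            batches.append([size])
    return batches

print(form_batches([4, 8, 1, 4, 2, 1], capacity=10))  # [[8, 2], [4, 4, 1, 1]]
```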