980 results for Optimal Component Proportions
Abstract:
Component commonality - the use of the same version of a component across multiple products - is increasingly regarded as a promising way to offer high external variety while retaining low internal variety in operations. However, increasing commonality has both positive and negative cost effects, so optimization approaches are required to identify the optimal commonality level. As components influence, to a greater or lesser extent, nearly every process step along the supply chain, it is not surprising that a multitude of divergent commonality problems has been investigated in the literature, each with a specific algorithm tailored to the respective commonality problem considered. The present paper aims at a general framework that is flexible and efficient enough to be applied to a wide range of commonality problems. Such a procedure, based on a two-stage graph approach, is presented and tested. Finally, the flexibility of the procedure is demonstrated by customizing the framework to account for different types of commonality problems.
Abstract:
The cutoff frequencies of an EMI filter are normally given by the noise attenuation requirements the filter has to fulfill. In order to select the component values of the filter elements, i.e. inductances and capacitances, an additional design criterion is needed. In this paper the effects of the EMI filter input and output impedances are considered. The input impedance influences the filter's effect on the system displacement power factor, and the output impedance plays a key role in the system stability. The effects of filter element values, the number of filter stages, as well as additional damping networks are considered, and a design procedure is provided. For this analysis a two-port description of the input filters employing ABCD-parameters is used.
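The ABCD-parameter description of cascaded filter stages mentioned in the abstract can be sketched as follows. The component values, load impedance, and evaluation frequency below are illustrative assumptions, not taken from the paper:

```python
import math

def series_L(L, w):
    # ABCD matrix of a series inductor: voltage drops across jwL
    return [[1, 1j * w * L], [0, 1]]

def shunt_C(C, w):
    # ABCD matrix of a shunt capacitor: current is shunted through jwC
    return [[1, 0], [1j * w * C, 1]]

def cascade(*stages):
    # Cascaded two-ports multiply: [A B; C D] = M1 * M2 * ...
    M = [[1, 0], [0, 1]]
    for S in stages:
        M = [[M[0][0] * S[0][0] + M[0][1] * S[1][0],
              M[0][0] * S[0][1] + M[0][1] * S[1][1]],
             [M[1][0] * S[0][0] + M[1][1] * S[1][0],
              M[1][0] * S[0][1] + M[1][1] * S[1][1]]]
    return M

def input_impedance(M, Z_load):
    # For a terminated two-port: Zin = (A*ZL + B) / (C*ZL + D)
    (A, B), (C, D) = M
    return (A * Z_load + B) / (C * Z_load + D)

# Hypothetical single-stage LC filter: L = 100 uH, C = 10 uF, 50-ohm load
w = 2 * math.pi * 10e3          # evaluate at 10 kHz
M = cascade(series_L(100e-6, w), shunt_C(10e-6, w))
Zin = input_impedance(M, 50.0)
```

Adding stages or damping networks is then just extra factors in the `cascade` call, which is the convenience the two-port formulation buys.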
Abstract:
Leakage power consumption is a component of the total power consumption in data centers that is not traditionally considered when setting the setpoint temperature of the room. However, this power component, which increases with temperature, can determine both the savings achievable through careful management of the cooling system and the reliability of the system. The work presented in this paper identifies the need to address leakage power in order to achieve substantial savings in the energy consumption of servers. In particular, our work shows that, by careful detection and management of two working regions (low and high impact of thermal-dependent leakage), the energy consumption of the data center can be optimized by a reduction of the cooling budget.
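The trade-off the abstract describes - leakage rising with temperature while cooling cost falls - can be illustrated with a toy setpoint sweep. The functional forms and every parameter below are assumptions for illustration only, not the paper's model:

```python
import math

def leakage_power(T, base=100.0, k=0.1, T_ref=20.0):
    # Toy model: leakage grows exponentially with temperature (assumed form)
    return base * math.exp(k * (T - T_ref))

def cooling_power(T, it_load=1000.0, cop_slope=0.1, T_ref=20.0):
    # Toy model: cooling efficiency (COP) improves at warmer setpoints,
    # so the power drawn to remove a fixed IT load falls
    cop = 2.0 + cop_slope * (T - T_ref)
    return it_load / cop

def total_power(T):
    return leakage_power(T) + cooling_power(T)

# Sweep candidate room setpoints (deg C) and pick the minimum-energy one
best_T = min(range(15, 36), key=total_power)
```

With these assumed parameters the optimum lands in the interior of the range: raising the setpoint further would save cooling power but lose more to leakage, which is exactly why a temperature-blind setpoint can be suboptimal.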
Abstract:
This paper discusses a model based on the agency theory to analyze the optimal transfer of construction risk in public works contracts. The base assumption is that of a contract between a principal (public authority) and an agent (firm), where the payment mechanism is linear and contains an incentive mechanism to enhance the effort of the agent to reduce construction costs. A theoretical model is proposed starting from a cost function with a random component and assuming that both the public authority and the firm are risk averse. The main outcome of the paper is that the optimal transfer of construction risk will be lower when the variance of errors in cost forecast, the risk aversion of the firm, and the marginal cost of public funds are larger, while the optimal transfer of construction risk will grow when the variance of errors in cost monitoring and the risk aversion of the public authority are larger.
Abstract:
Piotr Omenzetter and Simon Hoell’s work within the Lloyd’s Register Foundation Centre for Safety and Reliability Engineering at the University of Aberdeen is supported by Lloyd’s Register Foundation. The Foundation helps to protect life and property by supporting engineering-related education, public engagement and the application of research.
Abstract:
Background. The present paper describes a component of a large population-level cost-effectiveness study that aimed to identify the averted burden and economic efficiency of current and optimal treatment for the major mental disorders. This paper reports on the findings for the anxiety disorders (panic disorder/agoraphobia, social phobia, generalized anxiety disorder, post-traumatic stress disorder and obsessive-compulsive disorder). Method. Outcome was calculated as averted 'years lived with disability' (YLD), a population summary measure of disability burden. Costs were the direct health care costs in 1997-8 Australian dollars. The cost per YLD averted (efficiency) was calculated for those already in contact with the health system for a mental health problem (current care) and for a hypothetical optimal care package of evidence-based treatment for this same group. Data sources included the Australian National Survey of Mental Health and Well-being and published treatment effects and unit costs. Results. Current coverage was around 40% for most disorders with the exception of social phobia at 21%. Receipt of interventions consistent with evidence-based care ranged from 32% of those in contact with services for social phobia to 64% for post-traumatic stress disorder. The cost of this care was estimated at $400 million, resulting in a cost per YLD averted ranging from $7761 for generalized anxiety disorder to $34 389 for panic/agoraphobia. Under optimal care, costs remained similar but health gains were increased substantially, reducing the cost per YLD to < $20 000 for all disorders. Conclusions. Evidence-based care for anxiety disorders would produce greater population health gain at a similar cost to that of current care, resulting in a substantial increase in the cost-effectiveness of treatment.
Abstract:
The problem of regression under Gaussian assumptions is treated generally. The relationship between Bayesian prediction, regularization and smoothing is elucidated. The ideal regression is the posterior mean and its computation scales as O(n³), where n is the sample size. We show that the optimal m-dimensional linear model under a given prior is spanned by the first m eigenfunctions of a covariance operator, which is a trace-class operator. This is an infinite dimensional analogue of principal component analysis. The importance of Hilbert space methods to practical statistics is also discussed.
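The O(n³) posterior-mean computation the abstract refers to can be sketched with a minimal Gaussian-process regression: the cubic cost comes from solving the n x n kernel system. The RBF kernel, its length scale, and the noise level below are illustrative assumptions:

```python
import math

def rbf(x, y, ell=1.0):
    # Squared-exponential covariance (an assumed kernel choice)
    return math.exp(-0.5 * (x - y) ** 2 / ell ** 2)

def solve(A, b):
    # Naive Gaussian elimination with partial pivoting -- O(n^3),
    # the cost the abstract attributes to the ideal regression
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c]
                              for c in range(r + 1, n))) / M[r][r]
    return x

def posterior_mean(xs, ys, x_star, noise=1e-6):
    # m(x*) = k(x*, X) (K + noise * I)^{-1} y
    K = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    alpha = solve(K, ys)
    return sum(rbf(x_star, xi) * ai for xi, ai in zip(xs, alpha))
```

With near-zero noise the posterior mean interpolates the training points, which makes the connection to smoothing and regularization concrete: increasing `noise` trades interpolation for smoothness.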
Abstract:
It is indisputable that printed circuit boards (PCBs) play a vital role in our daily lives. With the ever-increasing applications of PCBs, one of the crucial ways to increase a PCB manufacturer's competitiveness in terms of operation efficiency is to minimize the production time so that the products can be introduced to the market sooner. Optimal Production Planning for PCB Assembly is the first book to focus on the optimization of the PCB assembly lines' efficiency. This is done by:
• integrating the component sequencing and the feeder arrangement problems together for both the pick-and-place machine and the chip shooter machine;
• constructing mathematical models and developing an efficient and effective heuristic solution approach for the integrated problems for both types of placement machines, the line assignment problem, and the component allocation problem; and
• developing a prototype of the PCB assembly planning system.
The techniques proposed in Optimal Production Planning for PCB Assembly will enable process planners in the electronics manufacturing industry to improve the assembly line's efficiency in their companies. Graduate students in operations research can familiarise themselves with the techniques and the applications of mathematical modeling after reading this advanced introduction to optimal production planning for PCB assembly.
Abstract:
This paper focuses on minimizing printed circuit board (PCB) assembly time for a chip shooter machine, which has a movable feeder carrier holding components, a movable X–Y table carrying a PCB, and a rotary turret with multiple assembly heads. The assembly time of the machine depends on two inter-related optimization problems: the component sequencing problem and the feeder arrangement problem. Nevertheless, they were often regarded as two individual problems and solved separately. This paper proposes two complete mathematical models for the integrated problem of the machine. The models are verified by two commercial packages. Finally, a hybrid genetic algorithm previously developed by the authors is presented to solve the model. The algorithm not only generates the optimal solutions quickly for small-sized problems, but also outperforms the genetic algorithms developed by other researchers in terms of total assembly time.
Abstract:
A chip shooter machine for electronic component assembly has a movable feeder carrier, a movable X–Y table carrying a printed circuit board (PCB), and a rotary turret with multiple assembly heads. This paper presents a hybrid genetic algorithm (HGA) to optimize the sequence of component placements and the arrangement of component types to feeders simultaneously for a chip shooter machine, that is, the component scheduling problem. The objective of the problem is to minimize the total assembly time. The GA developed in the paper hybridizes different search heuristics including the nearest-neighbor heuristic, the 2-opt heuristic, and an iterated swap procedure, which is a new improved heuristic. Compared with the results obtained by other researchers, the performance of the HGA is superior in terms of the assembly time.
Scope and purpose: When assembling the surface mount components on a PCB, it is necessary to obtain the optimal sequence of component placements and the best arrangement of component types to feeders simultaneously in order to minimize the total assembly time. Since it is very difficult to obtain the optimality, a GA hybridized with several search heuristics is developed. The type of machines being studied is the chip shooter machine. This paper compares the algorithm with a simple GA. It shows that the performance of the algorithm is superior to that of the simple GA in terms of the total assembly time.
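The 2-opt heuristic named in the abstract can be sketched on a toy placement-sequencing instance treated as a travelling-salesman-style tour; the point coordinates are illustrative and the sketch omits the feeder-assignment side of the real problem:

```python
import math

def tour_length(points, order):
    # Total travel distance for visiting points in the given cyclic order
    return sum(math.dist(points[order[i]], points[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def two_opt(points, order):
    # 2-opt: repeatedly reverse a segment whenever that shortens the tour,
    # until no improving reversal remains (a local optimum)
    improved = True
    order = order[:]
    while improved:
        improved = False
        for i in range(1, len(order) - 1):
            for j in range(i + 1, len(order)):
                cand = order[:i] + order[i:j + 1][::-1] + order[j + 1:]
                if tour_length(points, cand) < tour_length(points, order) - 1e-12:
                    order, improved = cand, True
    return order
```

Hybridizing this with a GA, as the abstract describes, typically means applying such a local-improvement pass to offspring tours so the population search escapes the local optima where 2-opt alone stalls.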
Abstract:
With new and emerging e-business technologies to transform business processes, it is important to understand how those technologies will affect the performance of a business. Will the overall business process be cheaper, faster and more accurate or will a sub-optimal change have been implemented? The use of simulation to model the behaviour of business processes is well established, and it has been applied to e-business processes to understand their performance in terms of measures such as lead-time, cost and responsiveness. This paper introduces the concept of simulation components that enable simulation models of e-business processes to be built quickly from generic e-business templates. The paper demonstrates how these components were devised, as well as the results from their application through case studies.
Abstract:
Objective: To independently evaluate the impact of the second phase of the Health Foundation's Safer Patients Initiative (SPI2) on a range of patient safety measures. Design: A controlled before and after design. Five substudies: survey of staff attitudes; review of case notes from high risk (respiratory) patients in medical wards; review of case notes from surgical patients; indirect evaluation of hand hygiene by measuring hospital use of handwashing materials; measurement of outcomes (adverse events, mortality among high risk patients admitted to medical wards, patients' satisfaction, mortality in intensive care, rates of hospital acquired infection). Setting: NHS hospitals in England. Participants: Nine hospitals participating in SPI2 and nine matched control hospitals. Intervention: The SPI2 intervention was similar to the SPI1, with somewhat modified goals, a slightly longer intervention period, and a smaller budget per hospital. Results: One of the scores (organisational climate) showed a significant (P=0.009) difference in rate of change over time, which favoured the control hospitals, though the difference was only 0.07 points on a five point scale. Results of the explicit case note reviews of high risk medical patients showed that certain practices improved over time in both control and SPI2 hospitals (and none deteriorated), but there were no significant differences between control and SPI2 hospitals. Monitoring of vital signs improved across control and SPI2 sites. This temporal effect was significant for monitoring the respiratory rate at both the six hour (adjusted odds ratio 2.1, 99% confidence interval 1.0 to 4.3; P=0.010) and 12 hour (2.4, 1.1 to 5.0; P=0.002) periods after admission. There was no significant effect of SPI for any of the measures of vital signs. Use of a recommended system for scoring the severity of pneumonia improved from 1.9% (1/52) to 21.4% (12/56) of control and from 2.0% (1/50) to 41.7% (25/60) of SPI2 patients.
This temporal change was significant (7.3, 1.4 to 37.7; P=0.002), but the difference in difference was not significant (2.1, 0.4 to 11.1; P=0.236). There were no notable or significant changes in the pattern of prescribing errors, either over time or between control and SPI2 hospitals. Two items of medical history taking (exercise tolerance and occupation) showed significant improvement over time, across both control and SPI2 hospitals, but no additional SPI2 effect. The holistic review showed no significant changes in error rates either over time or between control and SPI2 hospitals. The explicit case note review of perioperative care showed that adherence rates for two of the four perioperative standards targeted by SPI2 were already good at baseline, exceeding 94% for antibiotic prophylaxis and 98% for deep vein thrombosis prophylaxis. Intraoperative monitoring of temperature improved over time in both groups, but this was not significant (1.8, 0.4 to 7.6; P=0.279), and there were no additional effects of SPI2. A dramatic rise in consumption of soap and alcohol hand rub was similar in control and SPI2 hospitals (P=0.760 and P=0.889, respectively), as was the corresponding decrease in rates of Clostridium difficile and meticillin resistant Staphylococcus aureus infection (P=0.652 and P=0.693, respectively). Mortality rates of medical patients included in the case note reviews in control hospitals increased from 17.3% (42/243) to 21.4% (24/112), while in SPI2 hospitals they fell from 10.3% (24/233) to 6.1% (7/114) (P=0.043). Fewer than 8% of deaths were classed as avoidable; changes in proportions could not explain the divergence of overall death rates between control and SPI2 hospitals. There was no significant difference in the rate of change in mortality in intensive care. Patients' satisfaction improved in both control and SPI2 hospitals on all dimensions, but again there were no significant changes between the two groups of hospitals. 
Conclusions: Many aspects of care are already good or improving across the NHS in England, suggesting considerable improvements in quality across the board. These improvements are probably due to contemporaneous policy activities relating to patient safety, including those with features similar to the SPI, and the emergence of professional consensus on some clinical processes. This phenomenon might have attenuated the incremental effect of the SPI, making it difficult to detect. Alternatively, the full impact of the SPI might be observable only in the longer term. The conclusion of this study could have been different if concurrent controls had not been used.
Abstract:
The surface epithelial cells of the stomach represent a major component of the gastric barrier. A cell culture model of the gastric epithelial cell surface would prove useful for biopharmaceutical screening of new chemical entities and dosage forms. Primary cultures of guinea pig gastric mucous epithelial cells were grown on filter inserts (Transwells®) for 3 days. Tight-junction formation, assessed by transepithelial electrical resistance (TEER) and permeability of mannitol and fluorescein, was enhanced when collagen IV rather than collagen I was used to coat the polycarbonate filter. TEER for cells grown on collagen IV was close to that obtained with intact guinea pig gastric epithelium in vitro. Differentiation was assessed by incorporation of [3H]glucosamine into glycoprotein and by activity of NADPH oxidase, which produces superoxide. Both of these measures were greater for cells grown on filters coated with collagen I than for cells grown on plastic culture plates, but no major difference was found between cells grown on collagens I and IV. The proportion of cells that stained positively for mucin with periodic acid Schiff reagent was greater than 95% for all culture conditions. Monolayers grown on membranes coated with collagen IV exhibited apically polarized secretion of mucin and superoxide, and were resistant to acidification of the apical medium to pH 3.0 for 30 min. A screen of nonsteroidal anti-inflammatory drugs revealed a novel effect of diclofenac and niflumic acid in reversibly reducing permeability by the paracellular route. In conclusion, the mucous cell preparation grown on collagen IV represents a good model of the gastric surface epithelium suitable for screening procedures. © 2005 The Society for Biomolecular Screening.
Abstract:
We evaluated inter-individual variability in optimal current direction for biphasic transcranial magnetic stimulation (TMS) of the motor cortex. Motor threshold for first dorsal interosseus was detected visually at eight coil orientations in 45° increments. Each participant (n = 13) completed two experimental sessions. One participant with low test–retest correlation (Pearson's r < 0.5) was excluded. In four subjects, visual detection of motor threshold was compared to EMG detection; motor thresholds were very similar and highly correlated (0.94–0.99). Consistent with previous studies, stimulation in the majority of participants was most effective when the first current pulse flowed in a postero-lateral direction in the brain. However, in four participants, the optimal coil orientation deviated from this pattern. A principal component analysis using all eight orientations suggests that in our sample the optimal current direction was normally distributed around the postero-lateral orientation with a range of 63° (S.D. = 13.70°). Whenever the intensity of stimulation at the target site is calculated as a percentage of the motor threshold, in order to minimize intensity and side-effects it may be worthwhile to check whether rotating the coil 45° from the traditional postero-lateral orientation decreases the motor threshold.