416 results for "Central cost"
Abstract:
The cost effectiveness of antimicrobial stewardship (AMS) programmes was reviewed in hospital settings of Organisation for Economic Co-operation and Development (OECD) countries, and limited to adult patient populations. In each of the 36 studies, the type of AMS strategy and the clinical and cost outcomes were evaluated. The main AMS strategy implemented was prospective audit with intervention and feedback (PAIF), followed by the use of rapid technology, including rapid polymerase chain reaction (PCR)-based methods and matrix-assisted laser desorption/ionisation time-of-flight (MALDI-TOF) technology, for the treatment of bloodstream infections. All but one of the 36 studies reported that AMS resulted in a reduction in pharmacy expenditure. Among 27 studies measuring changes to health outcomes, either no change was reported post-AMS, or the additional benefits achieved from these outcomes were not quantified. Only two studies performed a full economic evaluation: one on a PAIF-based AMS intervention; and the other on use of rapid technology for the selection of appropriate treatment for serious Staphylococcus aureus infections. Both studies found the interventions to be cost effective. AMS programmes achieved a reduction in pharmacy expenditure, but there was a lack of consistency in the reported cost outcomes making it difficult to compare between interventions. A failure to capture complete costs in terms of resource use makes it difficult to determine the true cost of these interventions. There is an urgent need for full economic evaluations that compare relative changes both in clinical and cost outcomes to enable identification of the most cost-effective AMS strategies in hospitals.
Abstract:
- Background Exercise referral schemes (ERS) aim to identify inactive adults in the primary-care setting. The GP or health-care professional then refers the patient to a third-party service, with this service taking responsibility for prescribing and monitoring an exercise programme tailored to the needs of the individual. - Objective To assess the clinical effectiveness and cost-effectiveness of ERS for people with a diagnosed medical condition known to benefit from physical activity (PA). The scope of this report was broadened to consider individuals without a diagnosed condition who are sedentary. - Data sources MEDLINE; EMBASE; PsycINFO; The Cochrane Library; ISI Web of Science; SPORTDiscus and ongoing trial registries were searched (from 1990 to October 2009) and included study references were checked. - Methods Systematic reviews: the effectiveness of ERS, predictors of ERS uptake and adherence, and the cost-effectiveness of ERS; and the development of a decision-analytic economic model to assess the cost-effectiveness of ERS. - Results Seven randomised controlled trials (UK, n = 5; non-UK, n = 2) met the effectiveness inclusion criteria: five compared ERS with usual care, two compared ERS with an alternative PA intervention, and one compared ERS with ERS plus a self-determination theory (SDT) intervention. In intention-to-treat analysis, compared with usual care, there was weak evidence of an increase in the number of ERS participants who achieved a self-reported 90-150 minutes of at least moderate-intensity PA per week at 6-12 months' follow-up [pooled relative risk (RR) 1.11, 95% confidence interval 0.99 to 1.25]. There was no consistent evidence of a difference between ERS and usual care in the duration of moderate/vigorous-intensity and total PA or in other outcomes, for example physical fitness, serum lipids, and health-related quality of life (HRQoL). 
There was no between-group difference in outcomes between ERS and alternative PA interventions or ERS plus a SDT intervention. None of the included trials separately reported outcomes in individuals with medical diagnoses. Fourteen observational studies and five randomised controlled trials provided a numerical assessment of ERS uptake and adherence (UK, n = 16; non-UK, n = 3). Women and older people were more likely to take up ERS but women, when compared with men, were less likely to adhere. The four previous economic evaluations identified suggest ERS to be a cost-effective intervention. Indicative incremental cost per quality-adjusted life-year (QALY) estimates for ERS for various scenarios were based on a de novo model-based economic evaluation. Compared with usual care, the mean incremental cost for ERS was £169 and the mean incremental QALY was 0.008, with the base-case incremental cost-effectiveness ratio at £20,876 per QALY in sedentary people without a medical condition and a cost per QALY of £14,618 in sedentary obese individuals, £12,834 in sedentary hypertensive patients, and £8414 for sedentary individuals with depression. Estimates of cost-effectiveness were highly sensitive to plausible variations in the RR for change in PA and cost of ERS. - Limitations We found very limited evidence of the effectiveness of ERS. The estimates of the cost-effectiveness of ERS are based on a simple analytical framework. The economic evaluation reports small differences in costs and effects, and findings highlight the wide range of uncertainty associated with the estimates of effectiveness and the impact of effectiveness on HRQoL. No data were identified as part of the effectiveness review to allow for adjustment of the effect of ERS in different populations. 
- Conclusions There remains considerable uncertainty as to the effectiveness of ERS for increasing activity, fitness or health indicators, or whether they are an efficient use of resources in sedentary people without a medical diagnosis. We failed to identify any trial-based evidence of the effectiveness of ERS in those with a medical diagnosis. Future work should include randomised controlled trials assessing the clinical effectiveness and cost-effectiveness of ERS in disease groups that may benefit from PA. - Funding The National Institute for Health Research Health Technology Assessment programme.
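The cost-per-QALY figures above follow from a simple ratio of incremental cost to incremental benefit. A minimal sketch using the rounded figures reported in the abstract (£169 and 0.008 QALYs); note the report's base-case £20,876 per QALY derives from unrounded model inputs, so these rounded inputs reproduce it only approximately:

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY gained."""
    if delta_qaly == 0:
        raise ValueError("incremental QALYs are zero; the ICER is undefined")
    return delta_cost / delta_qaly

# Rounded figures from the ERS evaluation: +£169 cost, +0.008 QALYs vs usual care.
print(round(icer(169, 0.008)))  # 21125 with rounded inputs, vs the reported £20,876
```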
Abstract:
Background People admitted to intensive care units and those with chronic health care problems often require long-term vascular access. Central venous access devices (CVADs) are used for administering intravenous medications and blood sampling. CVADs are covered with a dressing and secured with an adhesive or adhesive tape to protect them from infection and reduce movement. Dressings are changed when they become soiled with blood or start to come away from the skin. Repeated removal and application of dressings can cause damage to the skin. The skin is an important barrier that protects the body against infection. Less frequent dressing changes may reduce skin damage, but it is unclear whether this practice affects the frequency of catheter-related infections. Objectives To assess the effect of the frequency of CVAD dressing changes on the incidence of catheter-related infections and other outcomes including pain and skin damage. Search methods In June 2015 we searched: The Cochrane Wounds Specialised Register; The Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library); Ovid MEDLINE; Ovid MEDLINE (In-Process & Other Non-Indexed Citations); Ovid EMBASE and EBSCO CINAHL. We also searched clinical trials registries for registered trials. There were no restrictions with respect to language, date of publication or study setting. Selection criteria All randomised controlled trials (RCTs) evaluating the effect of the frequency of CVAD dressing changes on the incidence of catheter-related infections on all patients in any healthcare setting. Data collection and analysis We used standard Cochrane review methodology. Two review authors independently assessed studies for inclusion, performed risk of bias assessment and data extraction. We undertook meta-analysis where appropriate or otherwise synthesised data descriptively when heterogeneous. 
Main results We included five RCTs (2277 participants) that compared different frequencies of CVAD dressing changes. The studies were all conducted in Europe and published between 1995 and 2009. Participants were recruited from the intensive care and cancer care departments of one children's and four adult hospitals. The studies used a variety of transparent dressings and compared a longer interval between dressing changes (5 to 15 days; intervention) with a shorter interval between changes (2 to 5 days; control). In each study participants were followed up until the CVAD was removed or until discharge from ICU or hospital. - Confirmed catheter-related bloodstream infection (CRBSI) One trial randomised 995 people receiving central venous catheters to a longer or shorter interval between dressing changes and measured CRBSI. It is unclear whether there is a difference in the risk of CRBSI between people having long or short intervals between dressing changes (RR 1.42, 95% confidence interval (CI) 0.40 to 4.98) (low quality evidence). - Suspected catheter-related bloodstream infection Two trials randomised a total of 151 participants to longer or shorter dressing intervals and measured suspected CRBSI. It is unclear whether there is a difference in the risk of suspected CRBSI between people having long or short intervals between dressing changes (RR 0.70, 95% CI 0.23 to 2.10) (low quality evidence). - All cause mortality Three trials randomised a total of 896 participants to longer or shorter dressing intervals and measured all cause mortality. It is unclear whether there is a difference in the risk of death from any cause between people having long or short intervals between dressing changes (RR 1.06, 95% CI 0.90 to 1.25) (low quality evidence). - Catheter-site infection Two trials randomised a total of 371 participants to longer or shorter dressing intervals and measured catheter-site infection. 
It is unclear whether there is a difference in risk of catheter-site infection between people having long or short intervals between dressing changes (RR 1.07, 95% CI 0.71 to 1.63) (low quality evidence). - Skin damage One small trial (112 children) and three trials (1475 adults) measured skin damage. There was very low quality evidence for the effect of long intervals between dressing changes on skin damage compared with short intervals (children: RR of scoring ≥ 2 on the skin damage scale 0.33, 95% CI 0.16 to 0.68; data for adults not pooled). - Pain Two studies involving 193 participants measured pain. It is unclear if there is a difference between long and short interval dressing changes on pain during dressing removal (RR 0.80, 95% CI 0.46 to 1.38) (low quality evidence). Authors' conclusions The best available evidence is currently inconclusive regarding whether longer intervals between CVAD dressing changes are associated with more or less catheter-related infection, mortality or pain than shorter intervals.
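The risk ratios and confidence intervals quoted throughout this review are conventionally computed on the log scale. A sketch with made-up 2×2 counts (hypothetical, not the trial data), showing why a wide CI that spans 1 leads to the "unclear whether there is a difference" conclusions above:

```python
import math

def risk_ratio_ci(a, n1, b, n2, z=1.96):
    """Risk ratio of events a/n1 vs b/n2, with a 95% CI from the log-RR standard error."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)  # standard error of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: 8/500 infections with long intervals vs 6/495 with short.
rr, lo, hi = risk_ratio_ci(8, 500, 6, 495)
print(f"RR {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")  # the CI spans 1: inconclusive
```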
Abstract:
Background There has been considerable publicity regarding population ageing and hospital emergency department (ED) overcrowding. Our study aims to investigate the impact of one intervention piloted in Queensland, Australia, the Hospital in the Nursing Home (HiNH) program, on reducing ED and hospital attendances from residential aged care facilities (RACFs). Methods A quasi-experimental study was conducted at an intervention hospital undertaking the program and a control hospital continuing normal practice. Routine Queensland health information system data were extracted for analysis. Results Significant reductions in the number of ED presentations per 1000 RACF beds (rate ratio (95% CI): 0.78 (0.67–0.92); p = 0.002), the number of hospital admissions per 1000 RACF beds (0.62 (0.50–0.76); p < 0.0001), and the number of hospital admissions per 100 ED presentations (0.61 (0.43–0.85); p = 0.004) were observed in the intervention hospital after the intervention, while there were no significant differences between the intervention and control hospitals before the intervention. Pre-test and post-test comparison in the intervention hospital also showed significant decreases in the ED presentation rate (0.75 (0.65–0.86); p < 0.0001) and the hospital admission rate per RACF bed (0.66 (0.54–0.79); p < 0.0001), and a non-significant reduction in the hospital admission rate per ED presentation (0.82 (0.61–1.11); p = 0.196). Conclusions The Hospital in the Nursing Home program could be effective in reducing ED presentations and hospital admissions from RACF residents. Implementation of the program across a variety of settings is needed to fully assess its ongoing benefits for patients and any possible cost savings.
Abstract:
Background A cancer diagnosis elicits greater distress than any other medical diagnosis, and yet very few studies have evaluated the efficacy of structured online self-help therapeutic programs to alleviate this distress. This study aims to assess the efficacy over time of an internet Cognitive Behaviour Therapy (iCBT) intervention (‘Finding My Way’) in improving distress, coping and quality of life for individuals with a recent diagnosis of early stage cancer of any type. Methods/Design The study is a multi-site Randomised Controlled Trial (RCT) seeking to enrol 188 participants who will be randomised to either the Finding My Way Intervention or an attention-control condition. Both conditions are delivered online; with 6 modules released once per week, and an additional booster module released one month after program-completion. Participants complete online questionnaires on 4 occasions: at baseline (immediately prior to accessing the modules); post-treatment (immediately after program-completion); then three and six months later. Primary outcomes are general distress and cancer-specific distress, with secondary outcomes including Health-Related Quality of Life (HRQoL), coping, health service utilisation, intervention adherence, and user satisfaction. A range of baseline measures will be assessed as potential moderators of outcomes. Eligible participants are individuals recently diagnosed with any type of cancer, being treated with curative intent, aged over 18 years with sufficient English language literacy, internet access and an active email account and phone number. Participants are blinded to treatment group allocation. Randomisation is computer generated and stratified by gender. Discussion Compared to the few prior published studies, Finding My Way will be the first adequately powered trial to offer an iCBT intervention to curatively treated patients of heterogeneous cancer types in the immediate post-diagnosis/treatment period. 
If found efficacious, Finding My Way will assist with overcoming common barriers to face-to-face therapy in a cost-effective and accessible way, thus helping to reduce distress after cancer diagnosis and consequently decrease the cancer burden for individuals and the health system. Trial registration Australian New Zealand Clinical Trials Registry ACTRN12613000001796 16.10.13
Abstract:
Over the last decade, many countries have used performance-based contracting (PBC) to manage and maintain roads. The implementation of PBC provides additional benefits for government and the public, such as cost savings and improved conditions of contracted road assets. In Australia, PBC is already being implemented on all categories of roads: national, state, urban and rural. The Australian PBC arrangement is designed to turn over control of, and responsibility for, roadway system maintenance, rehabilitation, and capital improvement projects to private contractors. Contractors' responsibilities include determining treatment types and the design, programming and undertaking of works needed to maintain road networks at predetermined performance levels. Indonesia initiated two PBC pilot projects in 2011: the Pantura Section Demak–Trengguli (7.68 kilometers) in Central Java Province and Section Ciasem–Pamanukan (18.5 kilometers) in West Java Province. Both sections are categorized as national roads, and the contract duration for both projects is four years. To facilitate a possible way forward, a study is proposed to understand Australia's experience of advancing from pilot projects to nation-wide programs using PBC. The study focuses on the scope of contracts, bidding processes, risk allocation, and key drivers, using relevant PBC case studies from Australia. Recommendations for future nation-wide PBC deployment should be based on further research into risk allocation, including investigation of standard conditions of contract and the implications of contract clauses for the risk management strategies adopted by contractors. Based on the nature of the risks, some are best managed by the project owner. It is very important that all parties involved be open to the new rules of contract and convince themselves of the potential increased benefits of using PBC. The most recent challenging issues were explored and described.
Abstract:
The built environment is a major contributor to the world’s carbon dioxide emissions, with a considerable amount of energy being consumed in buildings due to heating, ventilation and air-conditioning, space illumination, use of electrical appliances, etc., to facilitate various anthropogenic activities. The development of sustainable buildings seeks to ameliorate this situation mainly by reducing energy consumption. Sustainable building design, however, is a complicated process involving a large number of design variables, each with a range of feasible values. There are also multiple, often conflicting, objectives involved, such as the life cycle costs and occupant satisfaction. One approach to dealing with this is through the use of optimization models. In this paper, a new multi-objective optimization model is developed for sustainable building design by considering the design objectives of cost and energy consumption minimization and occupant comfort level maximization. In a case study demonstration, it is shown that the model can derive a set of suitable design solutions in terms of life cycle cost, energy consumption and indoor environmental quality so as to help the client and design team gain a better understanding of the design space and trade-off patterns between different design objectives. The model can be very useful in the conceptual design stages to determine appropriate operational settings to achieve the optimal building performance in terms of minimizing energy consumption and maximizing occupant comfort level.
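The trade-off patterns such a multi-objective model explores can be made concrete with a Pareto filter: a design is kept only if no other design is at least as good on every objective and strictly better on at least one. A minimal sketch with hypothetical (life cycle cost, energy use, occupant discomfort) triples, all to be minimised; the figures are illustrative only:

```python
def dominates(a, b):
    """True if design a is no worse than b on every objective and better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(designs):
    """Keep only the non-dominated designs (all objectives minimised)."""
    return [d for d in designs if not any(dominates(o, d) for o in designs if o != d)]

# Hypothetical (life cycle cost, energy use, occupant discomfort) per candidate design.
designs = [(100, 50, 3), (90, 60, 2), (110, 45, 4), (120, 70, 5)]
print(pareto_front(designs))  # (120, 70, 5) is dominated by (100, 50, 3) and drops out
```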
Abstract:
Background The objective is to estimate the incremental cost-effectiveness of the Australian National Hand Hygiene Initiative implemented between 2009 and 2012, using healthcare associated Staphylococcus aureus bacteraemia as the outcome. Baseline comparators are the eight existing state and territory hand hygiene programmes. The setting is the Australian public healthcare system, and 1,294,656 admissions from the 50 largest Australian hospitals are included. Methods The design is a cost-effectiveness modelling study using a before and after quasi-experimental design. The primary outcome is cost per life year saved from reduced cases of healthcare associated Staphylococcus aureus bacteraemia, with cost estimated by the annual on-going maintenance costs less the costs saved from fewer infections. Data were harvested from existing sources or were collected prospectively, and the time horizon for the model was 12 months, 2011–2012. Findings No useable pre-implementation Staphylococcus aureus bacteraemia data were made available from the 11 study hospitals in Victoria or the single hospital in the Northern Territory, leaving 38 hospitals among six states and territories available for cost-effectiveness analyses. Total annual costs increased by $2,851,475 for a return of 96 years of life, giving an incremental cost-effectiveness ratio (ICER) of $29,700 per life year gained. Probabilistic sensitivity analysis revealed a 100% chance the initiative was cost effective in the Australian Capital Territory and Queensland, with ICERs of $1,030 and $8,988 respectively. There was an 81% chance it was cost effective in New South Wales with an ICER of $33,353, a 26% chance for South Australia with an ICER of $64,729 and a 1% chance for Tasmania and Western Australia. The 12 hospitals in Victoria and the Northern Territory incur annual on-going maintenance costs of $1.51M; no information was available to describe cost savings or health benefits. 
Conclusions The Australian National Hand Hygiene Initiative was cost-effective against an Australian threshold of $42,000 per life year gained. The return on investment varied among the states and territories of Australia.
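The "chance cost effective" figures above come from probabilistic sensitivity analysis: costs and life years are sampled from their uncertainty distributions and the share of draws with positive net monetary benefit at the threshold is counted. A sketch with assumed normal distributions and assumed standard deviations (not the study's actual inputs) around the reported national totals:

```python
import random

def prob_cost_effective(cost_mean, cost_sd, ly_mean, ly_sd, threshold, n=10000, seed=42):
    """Share of simulated (cost, life-year) pairs with positive net monetary benefit."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        cost = rng.gauss(cost_mean, cost_sd)
        ly = rng.gauss(ly_mean, ly_sd)
        if ly * threshold - cost > 0:  # net monetary benefit at the threshold
            hits += 1
    return hits / n

# Assumed uncertainty around the reported $2,851,475 cost and 96 life years,
# evaluated against the $42,000-per-life-year threshold.
p = prob_cost_effective(2_851_475, 500_000, 96, 20, 42_000)
print(f"probability cost-effective: {p:.2f}")
```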
Abstract:
The INFORMAS food prices module proposes a step-wise framework to measure the cost and affordability of population diets. The price differential and the tax component of healthy and less healthy foods, food groups, meals and diets will be benchmarked and monitored over time. Results can be used to model or assess the impact of fiscal policies, such as ‘fat taxes’ or subsidies. Key methodological challenges include: defining healthy and less healthy foods, meals, diets and commonly consumed items; including costs of alcohol, takeaways, convenience foods and time; selecting the price metric; sampling frameworks; and standardizing collection and analysis protocols. The minimal approach uses three complementary methods to measure the price differential between pairs of healthy and less healthy foods. Specific challenges include choosing policy relevant pairs and defining an anchor for the lists. The expanded approach measures the cost of a healthy diet compared to the current (less healthy) diet for a reference household. It requires dietary principles to guide the development of the healthy diet pricing instrument and sufficient information about the population’s current intake to inform the current (less healthy) diet tool. The optimal approach includes measures of affordability and requires a standardised measure of household income that can be used for different countries. The feasibility of implementing the protocol in different countries is being tested in New Zealand, Australia and Fiji. The impact of different decision points to address challenges will be investigated in a systematic manner. We will present early insights and results from this work.
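The module's two headline metrics reduce to simple ratios: the price differential between a healthy and a current (less healthy) diet, and affordability as diet cost relative to household income. A sketch with hypothetical weekly figures for a reference household (the numbers are illustrative, not survey results):

```python
def affordability(diet_cost, household_income):
    """Diet cost as a percentage of household income over the same period."""
    return 100 * diet_cost / household_income

def price_differential(healthy_cost, current_cost):
    """Percentage by which the healthy diet costs more (+) or less (-) than the current diet."""
    return 100 * (healthy_cost - current_cost) / current_cost

# Hypothetical weekly costs for a reference household with $1500/week income.
print(f"healthy diet: {affordability(410, 1500):.1f}% of income")  # 27.3% of income
print(f"price gap: {price_differential(410, 380):+.1f}% vs current diet")  # +7.9%
```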
Developing standardized methods to assess cost of healthy and unhealthy (current) diets in Australia
Abstract:
Unhealthy diets contribute at least 14% to Australia's disease burden and are driven by ‘obesogenic’ food environments. Compliance with dietary recommendations is particularly poor amongst disadvantaged populations, including low socioeconomic groups, those living in rural/remote areas, and Aboriginal and Torres Strait Islanders. The perception that healthy foods are expensive is a key barrier to healthy choices and a major determinant of diet-related health inequities. Available state/regional/local data (limited and non-comparable) suggest that, despite basic healthy foods not incurring GST, the cost of healthy food is higher and has increased more rapidly than that of unhealthy food over the last 15 years in Australia. However, there were no nationally standardised tools or protocols to benchmark, compare or monitor food prices and affordability in Australia. Globally, we are leading work to develop and test approaches to assess the price differential of healthy and less-healthy (current) diets under the food price module of the International Network for Food and Obesity/non-communicable diseases (NCDs) Research, Monitoring and Action Support (INFORMAS). This presentation describes contextualization of the INFORMAS approach to develop standardised Australian tools, survey protocols, and data collection and analysis systems. The ‘healthy diet basket’ was based on the Australian Foundation Diet.1 The ‘current diet basket’, and the specific items included in each basket, were based on recent national dietary survey data.2 Data collection methods were piloted. The final tools and protocols were then applied to measure the price and affordability of healthy and less healthy (current) diets of different household groups in diverse communities across the nation. We have compared results for different geographical locations/population subgroups in Australia and assessed these against international INFORMAS benchmarks. 
The results inform the development of policy and practice, including those relevant to mooted changes to the GST base, to promote nutrition and healthy weight and prevent chronic disease in Australia.
Abstract:
OBJECTIVE To report the cost-effectiveness of a tailored handheld computerized procedural preparation and distraction intervention (Ditto) used during pediatric burn wound care in comparison to standard practice. METHODS An economic evaluation was performed alongside a randomized controlled trial of 75 children aged 4 to 13 years who presented with a burn to the Royal Children's Hospital, Brisbane, Australia. Participants were randomized to either the Ditto intervention (n = 35) or standard practice (n = 40) to measure the effect of the intervention on days taken for burns to re-epithelialize. Direct medical, direct nonmedical, and indirect cost data during burn re-epithelialization were extracted from the randomized controlled trial data and combined with scar management cost data obtained retrospectively from medical charts. Nonparametric bootstrapping was used to estimate statistical uncertainty in cost and effect differences and cost-effectiveness ratios. RESULTS On average, the Ditto intervention reduced the time to re-epithelialize by 3 days at AU$194 less cost for each patient compared with standard practice. The incremental cost-effectiveness plane showed that 78% of the simulated results were within the more effective and less costly quadrant and 22% were in the more effective and more costly quadrant, suggesting a 78% probability that the Ditto intervention dominates standard practice (i.e., cost-saving). At a willingness-to-pay threshold of AU$120, there is a 95% probability that the Ditto intervention is cost-effective (or cost-saving) against standard care. CONCLUSIONS This economic evaluation showed the Ditto intervention to be highly cost-effective against standard practice at a minimal cost for the significant benefits gained, supporting the implementation of the Ditto intervention during burn wound care.
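The cost-effectiveness plane percentages in the Ditto evaluation come from nonparametric bootstrapping: patients are resampled with replacement, the cost and effect differences are recomputed for each replicate, and each replicate is located in a quadrant. A sketch on small synthetic per-patient data (not the trial's), chosen so that the intervention is clearly faster and cheaper and every replicate lands in the dominant quadrant:

```python
import random

def bootstrap_ce_plane(costs_tx, days_tx, costs_ctl, days_ctl, n_boot=2000, seed=1):
    """Bootstrap (effect difference, cost difference) pairs and return the share of
    replicates in the dominant quadrant (faster re-epithelialization, lower cost)."""
    rng = random.Random(seed)
    dominant = 0
    for _ in range(n_boot):
        tx = [rng.randrange(len(days_tx)) for _ in days_tx]      # resample indices
        ctl = [rng.randrange(len(days_ctl)) for _ in days_ctl]
        d_days = (sum(days_ctl[i] for i in ctl) / len(ctl)
                  - sum(days_tx[i] for i in tx) / len(tx))       # days of healing saved
        d_cost = (sum(costs_tx[i] for i in tx) / len(tx)
                  - sum(costs_ctl[i] for i in ctl) / len(ctl))   # extra cost of intervention
        if d_days > 0 and d_cost < 0:
            dominant += 1
    return dominant / n_boot

# Synthetic per-patient data: intervention heals faster and costs less in every patient.
days_tx, days_ctl = [9, 10, 11, 12, 10], [13, 14, 15, 13, 14]
costs_tx, costs_ctl = [1700, 1800, 1850, 1750, 1900], [1950, 2000, 2050, 1990, 1960]
print(bootstrap_ce_plane(costs_tx, days_tx, costs_ctl, days_ctl))  # 1.0
```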
Abstract:
Background Around the world, guidelines and clinical practice for the prevention of complications associated with central venous catheters (CVC) vary greatly. To prevent occlusion, most institutions recommend the use of heparin when the CVC is not in use. However, there is debate regarding the need for heparin and evidence to suggest normal saline may be as effective. The use of heparin is not without risk, may be unnecessary and is also associated with increased costs. Objectives To assess the clinical effects (benefits and harms) of heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants, children and adolescents. Design A Cochrane systematic review of randomised controlled trials was undertaken. - Data sources: The Cochrane Vascular Group Specialised Register (including MEDLINE, CINAHL, EMBASE and AMED) and the Cochrane Register of Studies were searched. Hand searching of relevant journals and reference lists of retrieved articles was also undertaken. - Review Methods: Data were extracted and appraisal undertaken. We included studies that compared the efficacy of normal saline with heparin to prevent occlusion. We excluded temporary CVCs and peripherally inserted central catheters. Rate ratios per 1000 catheter days were calculated for two outcomes, occlusion of the CVC, and CVC-associated blood stream infection. Results Three trials with a total of 245 participants were included in this review. The three trials directly compared the use of normal saline and heparin. However, between studies, all used different protocols with various concentrations of heparin and frequency of flushes. The quality of the evidence ranged from low to very low. The estimated rate ratio for CVC occlusion per 1000 catheter days between the normal saline and heparin group was 0.75 (95% CI 0.10 to 5.51, two studies, 229 participants, very low quality evidence). 
The estimated rate ratio for CVC-associated blood stream infection was 1.48 (95% CI 0.24 to 9.37, two studies, 231 participants; low quality evidence). Conclusions It remains unclear whether heparin is necessary for CVC maintenance. More well-designed studies are required to understand this relatively simple, but clinically important question. Ultimately, if this evidence were available, the development of evidenced-based clinical practice guidelines and consistency of practice would be facilitated.
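The review's outcomes are expressed as rate ratios per 1000 catheter-days: events divided by exposure time in each arm, then the ratio of the two rates. A sketch with hypothetical counts (not the included trials' data):

```python
def rate_per_1000(events, catheter_days):
    """Event rate per 1000 catheter-days."""
    return 1000 * events / catheter_days

def rate_ratio(events_a, days_a, events_b, days_b):
    """Ratio of event rates between two arms (e.g. saline vs heparin)."""
    return (events_a / days_a) / (events_b / days_b)

# Hypothetical: 6 occlusions over 4000 catheter-days (saline) vs 8 over 4200 (heparin).
print(f"saline: {rate_per_1000(6, 4000):.2f} per 1000 days")   # 1.50
print(f"heparin: {rate_per_1000(8, 4200):.2f} per 1000 days")  # 1.90
print(f"rate ratio: {rate_ratio(6, 4000, 8, 4200):.2f}")       # 0.79
```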
Abstract:
The Western European house mouse, Mus musculus domesticus, is well known for the high frequency of Robertsonian fusions that have rapidly produced more than 50 karyotypic races, making it an ideal model for studying the mechanisms of chromosomal speciation. The mouse mandible is one of the traits studied most intensively to investigate the effect of Robertsonian fusions on phenotypic variation within and between populations. This complex bone structure has also been widely used to study the level of integration between different morphogenetic units. Here, with the aim of testing the effect of different karyotypic configurations on the morphology of the mouse mandible and on its level of modularity, we performed morphometric analyses of mice from a contact area between two highly metacentric races in Central Italy. We found no difference in size, while the mandible shape was found to differ between the two Robertsonian races, even after accounting for the genetic relationships among individuals and geographic proximity. Our results support the existence of two modules that indicate a certain degree of evolutionary independence, but no difference in the strength of modularity between chromosomal races. Moreover, the ascending ramus showed more pronounced interpopulation/race phenotypic differences than the alveolar region, an effect that could be associated with their different polygenic architecture. This study suggests that chromosomal rearrangements play a role in house mouse phenotypic divergence, and that the two modules of the mouse mandible are differentially affected by environmental factors and genetic makeup.
A combination of local inflammation and central memory T cells potentiates immunotherapy in the skin
Abstract:
Adoptive T cell therapy uses the specificity of the adaptive immune system to target cancer and virally infected cells. Yet the mechanisms and means by which to enhance T cell function are incompletely described, especially in the skin. In this study, we use a murine model of immunotherapy to optimize cell-mediated immunity in the skin. We show that in vitro-derived central, but not effector, memory-like T cells bring about rapid regression of skin expressing cognate Ag as a transgene in keratinocytes. Local inflammation induced by the TLR7 receptor agonist imiquimod subtly yet reproducibly decreases time to skin graft rejection elicited by central, but not effector, memory T cells in an immunodeficient mouse model. Local CCL4, a chemokine liberated by TLR7 agonism, similarly enhances central memory T cell function. In this model, IL-2 facilitates the development in vivo of effector function from central memory, but not effector memory, T cells. In a model of T cell tolerogenesis, we further show that adoptively transferred central, but not effector, memory T cells can give rise to successful cutaneous immunity, which is dependent on a local inflammatory cue in the target tissue at the time of adoptive T cell transfer. Thus, adoptive T cell therapy efficacy can be enhanced if CD8+ T cells with a central memory T cell phenotype are transferred and IL-2 is present with contemporaneous local inflammation. Copyright © 2012 by The American Association of Immunologists, Inc.
Abstract:
This research develops a design support system that can estimate the life cycle cost of different product families at the early stage of product development. By implementing the system, a designer is able to develop various cost-effective product families in a shorter lead time and minimise the destructive impact of the product family on the environment.
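Early-stage life cycle cost estimates of the kind such a system produces typically discount each year's operating cost back to present value and add it to the initial cost. A minimal sketch, with made-up figures and an assumed discount rate (not the system's actual cost model):

```python
def life_cycle_cost(initial, annual_costs, rate):
    """Initial cost plus the present value of each year's operating cost."""
    pv = sum(c / (1 + rate) ** t for t, c in enumerate(annual_costs, start=1))
    return initial + pv

# Hypothetical: $1000 up front, $100/year operating cost for 3 years, 5% discount rate.
print(round(life_cycle_cost(1000, [100, 100, 100], 0.05), 2))  # 1272.32
```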