400 results for Exergetic Manufacturing Cost (EMC)
Abstract:
A well-engineered scaffold for regenerative medicine, which is suitable to be translated from the bench to the bedside, combines inspired design, technical innovation and precise craftsmanship. Electrospinning and additive manufacturing are separate approaches to manufacturing scaffolds for a variety of tissue engineering applications. A need to accurately control the spatial distribution of pores within scaffolds has recently resulted in combining the two processing methods, to overcome shortfalls in each technology. This review describes where electrospinning and additive manufacturing are used together to generate new porous structures for biological applications.
Abstract:
The generation of a correlation matrix from a large set of long gene sequences is a common requirement in many bioinformatics problems such as phylogenetic analysis. The generation is not only computationally intensive but also requires significant memory resources because, typically, only a few gene sequences can be stored in primary memory at a time. The standard practice in such computation is therefore to rely on frequent input/output (I/O) operations, so minimizing the number of these operations yields much faster run-times. This paper develops an approach for faster and scalable computation of large correlation matrices through full use of available memory and a reduced number of I/O operations. The approach is scalable in the sense that the same algorithms can be executed on computing platforms with different amounts of memory and applied to problems with different correlation matrix sizes. The significant performance improvement of the approach over existing approaches is demonstrated through benchmark examples.
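The abstract does not detail the algorithm itself; as a rough illustration of the idea of trading frequent I/O for in-memory block reuse, the following sketch (assuming row-wise data in a binary file and hypothetical parameters, not the paper's implementation) computes a correlation matrix block by block so that only two blocks of rows reside in primary memory at any time.

```python
# Illustrative blocked computation only; not the paper's algorithm.
import numpy as np

def blocked_correlation(path, n_rows, n_cols, block=512, dtype=np.float64):
    """Correlation matrix of n_rows sequences stored row-wise in a binary file."""
    data = np.memmap(path, dtype=dtype, mode="r", shape=(n_rows, n_cols))
    corr = np.empty((n_rows, n_rows), dtype=dtype)
    for i in range(0, n_rows, block):
        bi = np.asarray(data[i:i + block])            # one read per row block
        for j in range(i, n_rows, block):
            bj = np.asarray(data[j:j + block])
            # Correlate every row of bi with every row of bj while both are in memory.
            c = np.corrcoef(bi, bj)[:len(bi), len(bi):]
            corr[i:i + block, j:j + block] = c
            corr[j:j + block, i:i + block] = c.T      # symmetry fills the mirror block
    return corr
```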
Abstract:
Dear Editor, We thank Dr Klek for his interest in our article and for giving us the opportunity to clarify our study and share our thoughts. Our study examined the prevalence of malnutrition in an acute tertiary hospital and tracked the outcomes prospectively.1 There are a number of reasons why we chose Subjective Global Assessment (SGA) to determine the nutritional status of patients. Firstly, we took the view that nutrition assessment tools should be used to determine nutrition status and to diagnose the presence and severity of malnutrition, whereas the purpose of nutrition screening tools is to identify individuals who are at risk of malnutrition. Nutritional assessment rather than screening should be used as the basis for planning and evaluating nutrition interventions for those diagnosed with malnutrition. Secondly, SGA has been well accepted and validated as an assessment tool to diagnose the presence and severity of malnutrition in clinical practice.2, 3 It has been used in many studies as a valid prognostic indicator of a range of nutritional and clinical outcomes.4, 5, 6 On the other hand, the Malnutrition Universal Screening Tool (MUST)7 and Nutrition Risk Screening 2002 (NRS 2002)8 have been established as screening rather than assessment tools.
Abstract:
Learning capability (LC) is a special dynamic capability that a firm purposefully builds to develop a cognitive focus, so as to enable the configuration and improvement of other capabilities (both dynamic and operational) to create and respond to market changes. Empirical evidence regarding the essential role of LC in leveraging operational manufacturing capabilities is, however, limited in the literature. This study takes a routine-based approach to understanding capability and focuses on demonstrating the leveraging power of LC over two essential operational capabilities within the manufacturing context, i.e., operational new product development capability (ONPDC) and operational supplier integration capability (OSIC). A mixed-methods research framework was used, combining evidence from a survey study and a multiple case study. The study identified high-level routines of LC that can be designed and controlled by managers and practitioners to reconfigure the underlying routines of ONPDC and OSIC and achieve superior performance in a turbulent environment. Hence, the study advances the notion of knowledge-based dynamic capabilities, such as LC, as routine bundles. It also provides an impetus for managing manufacturing operations from a capability-based perspective in the fast-changing knowledge era.
Abstract:
This study seeks insights into the economic consequences of accounting conservatism by examining the relation between conservatism and the cost of equity capital. Appealing to the analytical and empirical literatures, we posit an inverse relation. Importantly, we also posit that the strength of the relation is conditional on the firm's information environment, being strongest for firms with high information asymmetry and weakest (potentially negligible) for firms with low information asymmetry. Based on a sample of US-listed entities, we find, as predicted, an inverse relation between conservatism and the cost of equity capital, and further, that this relation is diminished for firms in low information asymmetry environments. This evidence indicates that there are economic benefits associated with the adoption of conservative reporting practices and leads us to conclude that conservatism has a positive role in accounting principles and practices, despite its increasing rejection by accounting standard setters.
Abstract:
Deterministic computer simulation of physical experiments is now a common technique in science and engineering. Physical experiments are often too time-consuming, expensive or impossible to conduct, so complex computer models, or codes, are used in their place, giving rise to the study of computer experiments as a means of investigating many scientific phenomena. A computer experiment consists of a number of runs of the computer code with different input choices. The Design and Analysis of Computer Experiments is a rapidly growing area of statistical experimental design. This paper discusses some practical issues that arise when designing computer simulations and experiments for manufacturing systems, and a case study approach is reviewed and presented.
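The abstract does not specify a particular design; a common starting point in the design of computer experiments is a space-filling design such as a Latin hypercube, sketched below with purely hypothetical factor names and ranges.

```python
# Minimal Latin hypercube design sketch; factor names and ranges are hypothetical.
import numpy as np

def latin_hypercube(n_runs, n_factors, seed=None):
    """One random Latin hypercube design on the unit cube [0, 1]^n_factors."""
    rng = np.random.default_rng(seed)
    # Stratify each factor into n_runs equal bins and place one point per bin.
    u = (rng.random((n_runs, n_factors)) + np.arange(n_runs)[:, None]) / n_runs
    for k in range(n_factors):
        u[:, k] = rng.permutation(u[:, k])        # decouple the factors
    return u

# Example: 8 simulation runs over 3 inputs (e.g. cycle time, buffer size, demand rate).
design01 = latin_hypercube(8, 3, seed=42)
lower = np.array([10.0, 1.0, 50.0])               # hypothetical lower bounds
upper = np.array([60.0, 20.0, 200.0])             # hypothetical upper bounds
design = lower + design01 * (upper - lower)
print(design)
```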
Abstract:
Mortality and cost outcomes of elderly intensive care unit (ICU) trauma patients were characterised in a retrospective cohort study from an Australian tertiary ICU. Trauma patients admitted between January 2000 and December 2005 were grouped into three major categories: aged ≥65 years admitted to ICU (n=272); aged ≥65 years admitted to a general ward (n=610); and aged <65 years admitted to ICU (n=1617). Hospital mortality predictors were characterised as odds ratios (OR) using logistic regression. The impact of predictor variables on (log) total hospital-stay costs was determined using least squares regression. An alternative treatment-effects regression model estimated the mortality cost-effect as an endogenous variable. Mortality predictors (P ≤0.0001, comparator: ICU ≥65 years, ventilated) were: ICU <65 not ventilated (OR 0.014); ICU <65 ventilated (OR 0.090); ICU ≥65 not ventilated (OR 0.061); ward ≥65 (OR 0.086); increasing injury severity score; and Charlson comorbidity index of 1 and 2, compared with zero (OR 2.21 [1.40 to 3.48] and OR 2.57 [1.45 to 4.55]). The raw mean daily ICU and hospital costs in 2005 A$ (US$) for age <65 and ≥65 admitted to ICU, and ≥65 admitted to the ward, were as follows: for year 2000: ICU, $2717 (1462) and $2777 (1494); hospital, $1837 (988) and $1590 (855); ward, $933 (502); for year 2005: ICU, $3202 (2393) and $3086 (2307); hospital, $1938 (1449) and $1914 (1431); ward, $1180 (882). Cost increments were predicted by age ≥65 with ICU admission, increasing injury severity score, mechanical ventilation, Charlson comorbidity index increments and hospital survival. The mortality cost-effect was estimated at -63% by least squares regression and -82% by the treatment-effects regression model. Patient demographic factors, injury severity and its consequences predict both cost and survival in trauma. The cost mortality effect was biased upwards by conventional least squares regression estimation.
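As a purely illustrative companion to the conventional methods named in the abstract (logistic regression for mortality, least squares on log costs), the sketch below uses a hypothetical data file and column names rather than the study's data; the endogenous treatment-effects model is not reproduced here.

```python
# Illustrative only: hypothetical data file and column names, not the study's data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("icu_trauma.csv")   # assumed: one row per patient

# Hospital mortality modelled by logistic regression; exponentiated
# coefficients are the odds ratios reported in abstracts such as this one.
mortality = smf.logit("died ~ C(group) + ventilated + iss + C(charlson)", data=df).fit()
print(np.exp(mortality.params))

# Least squares on log total hospital-stay cost; the coefficient on `died`
# is the conventional (and, per the abstract, upwardly biased) mortality cost-effect.
cost = smf.ols("np.log(total_cost) ~ C(group) + ventilated + iss + C(charlson) + died",
               data=df).fit()
print(cost.params)
```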
Abstract:
The ability to steer business operations in alignment with the true origins of costs, and to be informed about this in real time, allows businesses to increase profitability. In most organisations, however, high-level cost-based managerial decisions are still being made separately from process-related operational decisions. In this paper, we describe how process-related decisions at the operational level can be guided by cost considerations and how these cost-informed decision rules can be supported by a workflow management system. The paper presents the conceptual framework together with the data requirements and technical challenges that need to be addressed to realise cost-informed workflow execution. The feasibility of our approach is demonstrated using a prototype implementation in the YAWL workflow environment.
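The abstract does not give the decision rules themselves; the toy sketch below (hypothetical activity names and cost rates, not the YAWL prototype) illustrates the general idea of a workflow engine consulting cost information when choosing between enabled branches.

```python
# Toy cost-informed branch selection; all names and rates are hypothetical.
from dataclasses import dataclass

# Hypothetical cost rate (A$ per hour) for each activity.
COST_RATES = {"auto_approve": 5.0, "assess_claim": 40.0, "senior_review": 120.0}

@dataclass
class Branch:
    name: str
    durations: dict        # expected hours per activity on this branch

    def expected_cost(self) -> float:
        return sum(COST_RATES[a] * h for a, h in self.durations.items())

def choose_branch(branches):
    """Cost-informed rule: pick the enabled branch with the lowest expected cost."""
    return min(branches, key=Branch.expected_cost)

fast = Branch("fast-track", {"auto_approve": 0.5})
full = Branch("full-review", {"assess_claim": 2.0, "senior_review": 1.0})
print(choose_branch([fast, full]).name)   # -> fast-track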
Abstract:
The Action Lecture program is an innovative teaching method run in some nursery and primary schools in Paris and designed to improve pupils' literacy. We report the results of an evaluation of this program. We describe the experimental protocol that was built to estimate the program's impact on several types of indicators. Data were processed using a Differences-in-Differences (DID) method. We then use the estimated impact on academic achievement to conduct a cost-effectiveness analysis, taking a class-size reduction program as a benchmark. The results are positive for the Action Lecture program.
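As a rough illustration of the DID estimation named in the abstract (hypothetical file and column names, not the evaluation's data), the interaction coefficient of treatment and time recovers the change in treated schools' scores net of the change in control schools.

```python
# Illustrative DID regression; the data file and columns are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("action_lecture_scores.csv")   # assumed: score, treated (0/1), post (0/1)

# score ~ treated + post + treated:post; the interaction term is the DID estimate.
did = smf.ols("score ~ treated * post", data=df).fit()
print(did.params["treated:post"])
```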
Abstract:
The opening of the Australian economy in a globalised world has led Australian garment and retail corporations to move their manufacturing overseas and acquire goods from overseas providers. This is usually better for the corporations' bottom line, as they can purchase goods overseas at a fraction of their local cost, partly due to cheap labour. Australia is one of the many OECD countries not to have a well-regulated environment for workplace human rights. This study examines 18 major Australian retail and garment manufacturing corporations and finds that workplace human rights reporting is poor, based on content analysis of their annual reports, corporate social responsibility reports and websites. This is probably due to the failure of the Australian Government to provide adequate oversight by promulgating mandatory reporting standards for both the local and overseas operations of Australian companies. This permits corporations to avoid reporting their workplace human rights standards and breaches.
Abstract:
Background: Falls are one of the most frequently occurring adverse events that impact upon the recovery of older hospital inpatients. Falls can threaten both immediate and longer-term health and independence, and there is a need to identify cost-effective means of preventing falls in hospitals. Hospital-based falls prevention interventions tested in randomized trials have not yet been subjected to economic evaluation. Methods: Incremental cost-effectiveness analysis was undertaken from the health service provider perspective, over the period of hospitalization (time horizon), using the Australian Dollar (A$) at 2008 values. Analyses were based on data from a randomized trial among n = 1,206 acute and rehabilitation inpatients. Decision tree modeling with three-way sensitivity analyses was conducted using burden of disease estimates developed from trial data and previous research. The intervention was a multimedia patient education program provided with trained health professional follow-up, shown to reduce falls among cognitively intact hospital patients. Results: The short-term cost to a health service of one cognitively intact patient being a faller could be as high as A$14,591 (2008). The education program cost A$526 (2008) to prevent one cognitively intact patient from becoming a faller and A$294 (2008) to prevent one fall, based on primary trial data. These estimates were unstable due to high variability in the hospital costs accrued by individual patients involved in the trial. There was a 52% probability that the complete program was both more effective and less costly (from the health service perspective) than providing usual care alone. Decision tree modeling sensitivity analyses identified that, when provided in real-life contexts, the program would be both more effective in preventing falls among cognitively intact inpatients and cost saving where the proportion of these patients who would otherwise fall under usual care conditions is at least 4.0%. Conclusions: This economic evaluation was designed to assist health care providers to decide in what circumstances this intervention should be provided. If the proportion of cognitively intact patients falling on a ward under usual care conditions is 4% or greater, then provision of the complete program in addition to usual care will likely both prevent falls and reduce costs for a health service.
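For illustration of the incremental cost-effectiveness logic behind figures such as "A$526 to prevent one faller", the sketch below uses placeholder per-patient values, not the trial's data.

```python
# Placeholder numbers only; not the trial's data.
def cost_per_faller_prevented(mean_cost_intervention, mean_cost_usual_care,
                              p_faller_intervention, p_faller_usual_care):
    """Incremental cost per patient divided by the absolute reduction in the
    probability of becoming a faller (an incremental cost-effectiveness ratio)."""
    return (mean_cost_intervention - mean_cost_usual_care) / \
           (p_faller_usual_care - p_faller_intervention)

# Hypothetical: the program adds A$30 per patient and halves an 8% faller rate.
print(cost_per_faller_prevented(1210.0, 1180.0, 0.04, 0.08))   # -> 750.0 per faller prevented
```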
Abstract:
In this paper, we present a monocular vision based autonomous navigation system for Micro Aerial Vehicles (MAVs) in GPS-denied environments. The major drawback of monocular systems is that the depth scale of the scene cannot be determined without prior knowledge or other sensors. To address this problem, we minimize a cost function consisting of a drift-free altitude measurement and an up-to-scale position estimate obtained using the visual sensor. We evaluate the scale estimator, state estimator and controller performance by comparing with ground truth data acquired using a motion capture system. All resources, including source code, tutorial documentation and system models, are available online.
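The abstract does not state the exact form of the cost function; one common way to recover metric scale from a drift-free altitude sensor, sketched below with hypothetical sample values, is a least-squares fit of the scale factor mapping the up-to-scale visual altitude onto the measured altitude.

```python
# Hedged sketch, not the authors' exact formulation: choose the scale lambda
# minimising sum_i (h_i - lambda * z_vis_i)^2, where h_i are drift-free
# altimeter readings and z_vis_i the up-to-scale visual altitude estimates.
import numpy as np

def estimate_scale(h, z_vis):
    h, z_vis = np.asarray(h, float), np.asarray(z_vis, float)
    return float(h @ z_vis / (z_vis @ z_vis))     # closed-form least squares solution

h = [1.02, 1.49, 2.05, 2.51]       # altimeter readings in metres (hypothetical)
z = [0.51, 0.76, 1.01, 1.27]       # up-to-scale visual altitudes at the same times
lam = estimate_scale(h, z)
print(lam, [round(lam * zi, 2) for zi in z])      # metric-scaled visual estimates
```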
Abstract:
This paper describes a risk model for estimating the likelihood of collisions at low-exposure railway level crossings, demonstrating the effect that differences in safety integrity can have on the likelihood of a collision. The model facilitates the comparison of safety benefits between level crossings with passive controls (stop or give-way signs) and level crossings that have been hypothetically upgraded with conventional or low-cost warning devices. The scenario presented illustrates how treating a cross-section of level crossings with low-cost devices can provide a greater safety benefit than treatment with conventional warning devices for the same budget.
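The abstract does not disclose the model's structure; the sketch below is a generic exposure-based stand-in (all rates and factors are hypothetical) showing how a device's risk reduction and its probability of failure, reflecting safety integrity, can enter an expected-collision estimate.

```python
# Generic stand-in only; base_rate, risk_reduction and failure probabilities are hypothetical.
def expected_collisions_per_year(trains_per_day, vehicles_per_day,
                                 base_rate=1e-8, risk_reduction=1.0,
                                 p_device_failure=0.0):
    """Expected collisions per year at one crossing.

    risk_reduction is the residual risk while the warning device works
    (1.0 = passive control); p_device_failure reflects its safety integrity.
    """
    exposure = trains_per_day * vehicles_per_day * 365
    residual = (1.0 - p_device_failure) * risk_reduction + p_device_failure * 1.0
    return base_rate * exposure * residual

passive      = expected_collisions_per_year(4, 120)
low_cost     = expected_collisions_per_year(4, 120, risk_reduction=0.4, p_device_failure=1e-3)
conventional = expected_collisions_per_year(4, 120, risk_reduction=0.3, p_device_failure=1e-5)
print(passive, low_cost, conventional)
```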
Abstract:
Genomic DNA obtained from patient whole blood samples is a key element for genomic research. The advantages and disadvantages, in terms of time-efficiency, cost-effectiveness and laboratory requirements, of the procedures available to isolate nucleic acids need to be considered before choosing any particular method. These characteristics have not been fully evaluated for some laboratory techniques, such as the salting out method for DNA extraction, which has been excluded from comparisons in studies published to date. We compared three different protocols (a traditional salting out method, a modified salting out method and a commercially available kit) to determine the most cost-effective and time-efficient method of extracting DNA. We extracted genomic DNA from whole blood samples obtained from breast cancer patient volunteers and compared the yields in terms of quantity (concentration of DNA extracted and DNA obtained per ml of blood used) and quality (260/280 ratio and polymerase chain reaction product amplification). On average, the three methods showed no statistically significant differences in the final yield, but when the time and cost associated with each method were taken into account, the differences were very significant. The modified salting out method resulted in a seven-fold and two-fold reduction in cost compared to the commercial kit and the traditional salting out method, respectively, and reduced the extraction time from 3 days to 1 hour compared to the traditional salting out method. This highlights the modified salting out method as a suitable choice for laboratories and research centres, particularly when dealing with a large number of samples.