922 results for Homogeneous cost pool method
Abstract:
Platelet-rich plasma (PRP) is a plasma fraction of autologous blood with platelet concentrations above baseline whole-blood values, obtained by processing and concentration. PRP is used in various surgical fields to enhance soft-tissue and bone healing by delivering supra-physiological concentrations of autologous platelets to the site of tissue damage. These preparations may provide a good cellular source of various growth factors and cytokines and modulate the tissue response to injury. Commonly available clinical blood-preparation materials, combined with a two-step centrifugation protocol at 280 g per step to preserve cellular integrity, yielded platelet preparations concentrated 2-3 fold over whole-blood values. Costs were shown to be lower than those of other methods, which require specific equipment and high-cost disposables, while safety and traceability can be increased. PRP can be used to treat wounds of all types, including burns, as well as split-thickness skin graft donor sites, which are frequently used in burn management. The procedure can be standardized and is easy to adopt in clinical settings with minimal infrastructure, enabling large numbers of patients to benefit from a form of cellular therapy.
Abstract:
We offer an axiomatization of the serial cost-sharing method of Friedman and Moulin (1999). The key property in our axiom system is Group Demand Monotonicity, asking that when a group of agents raise their demands, not all of them should pay less.
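The serial mechanism that this axiomatization concerns can be made concrete. Below is a minimal sketch of serial cost sharing in its classical Moulin-Shenker form for a single homogeneous good (the Friedman-Moulin extension to general problems is more involved); the demo cost function and demands are invented for illustration.

```python
def serial_cost_shares(demands, cost):
    """Serial cost shares (Moulin-Shenker form): process agents in
    ascending order of demand; at each step, everyone still "active"
    splits equally the incremental cost of raising all active agents'
    consumption to the current agent's demand level."""
    n = len(demands)
    order = sorted(range(n), key=lambda i: demands[i])
    q = [demands[i] for i in order]
    shares = [0.0] * n
    prev_level, running_share = 0.0, 0.0
    for k, i in enumerate(order):
        # Level where agents with smaller demands are fully served and
        # all remaining agents are capped at q[k]
        level = sum(q[:k + 1]) + (n - k - 1) * q[k]
        running_share += (cost(level) - cost(prev_level)) / (n - k)
        shares[i] = running_share
        prev_level = level
    return shares

# Convex cost C(x) = x^2, demands 1, 2, 3: shares sum to C(6) = 36
print(serial_cost_shares([1, 2, 3], lambda x: x ** 2))  # [3.0, 11.0, 22.0]
```

Note the budget-balance property: the shares always sum to the cost of the total demand, and an agent's share depends only on the demands smaller than its own.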
Abstract:
Implementing precise techniques in the routine diagnosis of chronic granulomatous disease (CGD) that expedite the screening of molecular defects may be critical for rapid assessment of patient prognosis. This study compared the efficacy of single-strand conformation polymorphism analysis (SSCP) and high-performance liquid chromatography under partially denaturing conditions (dHPLC) for screening mutations in CGD patients. We selected 10 male CGD patients with a clinical history of severe recurrent infections and abnormal respiratory burst function. gDNA, mRNA and cDNA samples were prepared by standard methods. CYBB exons were amplified by PCR and screened by SSCP or dHPLC. Abnormal DNA fragments were sequenced to reveal the nature of the mutations. The SSCP and dHPLC methods showed DNA abnormalities in 55% and 100% of the cases, respectively. Sequencing of the abnormal DNA samples confirmed mutations in all cases. Four novel mutations in CYBB were identified that were picked up only by dHPLC screening (c.904 insC, c.141+5 g>t, c.553 T>C, and c.665 A>T). This work highlights the relevance of dHPLC, a sensitive, fast, reliable and cost-effective method for screening mutations in CGD, which, in combination with functional assays of the phagocyte respiratory burst, will help expedite the definitive diagnosis of X-linked CGD, guide treatment and genetic counselling, and give a clear assessment of prognosis. This strategy is especially suitable for developing countries.
Abstract:
A homogeneous DNA diagnostic assay based on template-directed primer extension detected by fluorescence resonance energy transfer, named the template-directed dye-terminator incorporation (TDI) assay, has been developed for mutation detection and high-throughput genome analysis. Here, we report the successful application of the TDI assay to detect mutations in the cystic fibrosis transmembrane conductance regulator (CFTR) gene, the human leukocyte antigen H (HLA-H) gene, and the receptor tyrosine kinase (RET) protooncogene, which are associated with cystic fibrosis, hemochromatosis, and multiple endocrine neoplasia type 2, respectively. Starting with total human DNA, the samples are amplified by PCR, followed by enzymatic degradation of excess primers and deoxyribonucleoside triphosphates, before the primer extension reaction is performed. All these standardized steps are performed in the same tube, and the fluorescence changes are monitored in real time, making it a useful clinical DNA diagnostic method.
Abstract:
The aim of this work was to develop the project cost-estimation process of the engineering unit under study, so that in the future the unit's management would have more accurate cost information at its disposal. To make this possible, the unit's working practices, project cost structures and cost attributes first had to be identified. This was achieved by examining historical project cost data and by interviewing experts. The result of the work was a cost-estimation process and model compatible with the unit's other processes. The cost-estimation method and model are based on cost attributes, which are defined separately for the environment under study. The cost attributes are found by studying historical data, i.e. by analysing completed projects, their cost structures and the factors that drove the costs. After this, weights and weight ranges must be defined for the cost attributes. The accuracy of the estimation model can be improved by calibrating the model. I used the Goal - Question - Metric (GQM) method as the framework of the study.
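A weighted cost-attribute model of the kind the thesis describes can be sketched very simply. Everything below (attribute names, scores, weights, base cost, and the nominal-score normalization) is a hypothetical illustration, not data from the study; in practice the attributes and weights would come from the historical-data analysis and calibration described above.

```python
def attribute_estimate(base_cost, attributes, nominal_score=2.0):
    """Scale a base cost by weighted attribute scores.
    attributes maps name -> (score, weight); weights should sum to 1,
    and a score equal to nominal_score means "no adjustment"."""
    multiplier = sum(score * weight for score, weight in attributes.values())
    return base_cost * multiplier / nominal_score

# Hypothetical project: larger-than-nominal scope, nominal novelty,
# experienced team (low score lowers the estimate)
attrs = {
    "scope": (3, 0.40),
    "novelty": (2, 0.35),
    "team_experience": (1, 0.25),
}
print(attribute_estimate(100_000, attrs))  # 107500.0
```

Calibration, in this sketch, would mean adjusting the weights until estimates for completed projects best match their known actual costs.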
Abstract:
Quality is not only free, it can be a profit maker: every dollar that is not spent on doing things wrong becomes a dollar on the bottom line. The main objective of this thesis is to answer how the cost of poor quality can be measured in a theoretically correct way. Different calculation methods for the cost of poor quality are presented and discussed to give a comprehensive picture of the measurement process. The second objective is to apply the knowledge from the literature review in creating a method for measuring the cost of poor quality in supplier performance rating. The literature review indicates that the P-A-F (prevention-appraisal-failure) model together with the ABC (activity-based costing) methodology provides a means for quality cost calculations. These models answer what should be measured and how the measurement should be carried out. However, when quality costs are incurred because a product or service quality characteristic deviates from its target value, the quality loss function (QLF) seems to be the most appropriate methodology for quality cost calculation. These methodologies were applied in creating a quality cost calculation method for supplier performance ratings.
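The QLF mentioned above has a standard closed form, Taguchi's quadratic loss L(y) = k(y - T)^2, whose expectation over a process is k(sigma^2 + (mean - T)^2). A minimal sketch (the target, loss coefficient and process figures are invented for illustration):

```python
def taguchi_loss(y, target, k):
    """Taguchi quality loss for a single unit: L(y) = k * (y - T)^2."""
    return k * (y - target) ** 2

def average_loss(mean, std, target, k):
    """Expected loss per unit for a process:
    E[L] = k * (sigma^2 + (mean - T)^2)."""
    return k * (std ** 2 + (mean - target) ** 2)

# Hypothetical: target dimension 10 mm, loss coefficient k = 2 $/mm^2
print(taguchi_loss(12.0, 10.0, 2.0))       # 8.0  (one unit, 2 mm off target)
print(average_loss(10.5, 0.5, 10.0, 2.0))  # 1.0  (process: bias + spread)
```

The point of the quadratic form is that cost accrues continuously with deviation from target, not only when a specification limit is crossed.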
Abstract:
We ask how the three known mechanisms for solving cost-sharing problems with homogeneous cost functions - the value, the proportional, and the serial mechanisms - should be extended to arbitrary problems. We propose the Ordinality axiom, which requires that cost shares be invariant under all transformations preserving the nature of a cost-sharing problem.
Abstract:
Several colorimetric and chromatographic methods have been used for the identification and quantification of methyldopa (MA) in pharmaceutical formulations and clinical samples. However, these methods are time- and reagent-consuming, which stimulated our efforts to develop a simple, fast, and low-cost alternative. We developed an electroanalytical method for the determination of MA in pharmaceutical formulations using the crude enzymatic extract of laccase from Pycnoporus sanguineus as the oxidizing agent. The method is based on the biochemical oxidation of MA by laccase (LAC), both in solution, followed by electrochemical reduction on a glassy carbon electrode surface. It was employed for the determination of MA in pure form and in pharmaceutical formulations, and the results were compared with those obtained using the official method. A wide linear range from 2.3 × 10⁻⁵ to 1 × 10⁻⁴ mol L⁻¹ was found, with a calculated detection limit of 4.3 × 10⁻⁶ mol L⁻¹.
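The calibration and detection-limit figures reported by electroanalytical studies like this one are conventionally obtained from a least-squares calibration line and the 3-sigma criterion (LOD = 3 x blank standard deviation / slope). A sketch with entirely hypothetical data, not the paper's measurements:

```python
import numpy as np

# Hypothetical calibration points: concentration (mol/L) vs. signal (arbitrary units)
conc = np.array([2.3e-5, 4.0e-5, 6.0e-5, 8.0e-5, 1.0e-4])
signal = np.array([0.035, 0.061, 0.091, 0.121, 0.151])  # synthetic, near-linear

# Least-squares calibration line: signal = slope * conc + intercept
slope, intercept = np.polyfit(conc, signal, 1)

# Conventional 3-sigma detection limit, with a hypothetical blank noise level
sd_blank = 0.002
lod = 3 * sd_blank / slope
print(f"slope = {slope:.3g}, LOD = {lod:.3g} mol/L")
```

With these made-up numbers the LOD lands around 4 x 10^-6 mol/L, i.e. somewhat below the lowest calibration point, which is the usual sanity check for a reported linear range.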
Abstract:
The meccano method is a novel and promising mesh generation method for simultaneously creating adaptive tetrahedral meshes and volume parametrizations of a complex solid. We highlight the fact that the method requires minimum user intervention and has a low computational cost. The method builds a 3-D triangulation of the solid as a deformation of an appropriate tetrahedral mesh of the meccano. The new mesh generator combines an automatic parametrization of surface triangulations, a local refinement algorithm for 3-D nested triangulations and a simultaneous untangling and smoothing procedure. At present, the procedure is fully automatic for a genus-zero solid, in which case the meccano can be a single cube. The efficiency of the proposed technique is shown with several applications...
Abstract:
Supermarket nutrient movement, a community food-consumption measure, aggregated 1,023 high-fat foods, representing 100% of visible fats and approximately 44% of hidden fats in the food supply (FAO, 1980). The fatty acid and cholesterol content of foods shipped from the warehouse to 47 supermarkets in the Houston area was calculated over a 6-month period. These stores were located in census tracts with over 50% of a given ethnicity: Hispanic, black non-Hispanic, or white non-Hispanic. Categorizing the supermarket census tracts by predominant ethnicity, significant differences were found by ANOVA in the proportion of specific fatty acids and the cholesterol content of the foods examined. Using ecological regression, ethnicity, income, and median age predicted supermarket lipid movements while residential stability did not. No associations were found between lipid movements and cardiovascular disease mortality, making further validation necessary for epidemiological application of this method. However, it has been shown to be a non-reactive and cost-effective method appropriate for tracking target foods in population groups, and for assessing the impact of mass-media nutrition education, legislation, and fortification on community food and nutrient purchase patterns.
Abstract:
Stochastic model updating must be considered for quantifying the uncertainties inherent in real-world engineering structures. By this means the statistical properties, instead of deterministic values, of structural parameters can be sought, indicating parameter variability. However, the implementation of stochastic model updating is much more complicated than that of deterministic methods, particularly in its theoretical complexity and low computational efficiency. This study proposes a simple and cost-efficient method that decomposes a stochastic updating process into a series of deterministic ones with the aid of response surface models and Monte Carlo simulation. The response surface models are used as surrogates for the original FE models in the interest of programming simplification, fast response computation and easy inverse optimization. Monte Carlo simulation is adopted for generating samples from the assumed or measured probability distributions of responses. Each sample corresponds to an individual deterministic inverse process predicting deterministic parameter values. The parameter means and variances can then be statistically estimated from the parameter predictions over all samples. Meanwhile, the analysis-of-variance approach is employed to evaluate the significance of parameter variability. The proposed method is demonstrated first on a numerical beam and then on a set of nominally identical steel plates tested in the laboratory. Compared with existing stochastic model updating methods, the proposed method presents similar accuracy, while its primary merits are its simple implementation and cost efficiency in response computation and inverse optimization.
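The decomposition idea can be sketched in one parameter: fit a polynomial response surface to a few runs of the "expensive" model, draw Monte Carlo samples of the measured response, and solve one deterministic inverse problem per sample on the cheap surrogate. The model, noise level and parameter values below are invented for illustration; they are not the paper's beam or plates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for an expensive FE model: response as a function of one parameter
def fe_model(theta):
    return 2.0 + 0.5 * theta + 0.1 * theta ** 2

# 1) Fit a quadratic response surface from a small design of model runs
theta_design = np.linspace(0.0, 2.0, 9)
coeffs = np.polyfit(theta_design, fe_model(theta_design), 2)

# 2) Monte Carlo samples of the "measured" response distribution
true_theta = 1.2
r_samples = rng.normal(fe_model(true_theta), 0.02, size=1000)

# 3) One deterministic inversion per sample, on the surrogate
a, b, c = coeffs
estimates = []
for r in r_samples:
    roots = np.roots([a, b, c - r])          # solve surrogate(theta) = r
    roots = roots[np.isreal(roots)].real
    estimates.append(roots[(roots >= 0) & (roots <= 2)][0])
estimates = np.asarray(estimates)

# 4) Parameter statistics over all samples
print(f"mean = {estimates.mean():.3f}, std = {estimates.std():.3f}")
```

The recovered mean clusters around the true parameter, and the spread of the estimates reflects the measurement uncertainty propagated back through the surrogate.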
Abstract:
Nowadays, an increasing number of robotic applications need to act in real three-dimensional (3D) scenarios. In this paper we present a new mobile-robotics-oriented 3D registration method that improves on previous Iterative Closest Point (ICP)-based solutions in both speed and accuracy. As an initial step, we apply a computationally low-cost method to obtain descriptions of the planar surfaces in a 3D scene. From these descriptions we then apply a force system to compute a six-degrees-of-freedom egomotion estimate accurately and efficiently. We describe the basis of our approach and demonstrate its validity with several experiments using different kinds of 3D sensors and different real 3D environments.
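The force-system egomotion computation is the paper's own contribution; for context, the classical closed-form step that ICP-style pipelines use to recover a six-degrees-of-freedom rigid motion from matched point sets is the SVD (Kabsch) solution, sketched below on a synthetic scene.

```python
import numpy as np

def rigid_align(P, Q):
    """Least-squares rigid motion (R, t) with R @ P[i] + t ~= Q[i],
    via the Kabsch/SVD closed form over matched 3D point sets."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                 # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # guard against reflection
    t = cq - R @ cp
    return R, t

# Recover a known yaw rotation and translation from a synthetic point cloud
rng = np.random.default_rng(1)
P = rng.random((30, 3))
angle = 0.3
Rz = np.array([[np.cos(angle), -np.sin(angle), 0.0],
               [np.sin(angle),  np.cos(angle), 0.0],
               [0.0, 0.0, 1.0]])
Q = P @ Rz.T + np.array([0.5, -0.2, 0.1])
R, t = rigid_align(P, Q)
print(np.allclose(R, Rz))  # True
```

In a full ICP loop this solve alternates with re-matching closest points; the paper's planar-surface descriptions replace that per-point matching stage.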
Abstract:
This economic evaluation was part of the Australian National Evaluation of Pharmacotherapies for Opioid Dependence (NEPOD) project. Data from four trials of heroin detoxification methods, involving 365 participants, were pooled to enable a comprehensive comparison of the cost-effectiveness of five inpatient and outpatient detoxification methods. The study took the perspective of the treatment provider in assessing resource use and costs. Two short-term outcome measures were used: achievement of an initial 7-day period of abstinence, and entry into ongoing post-detoxification treatment. The mean costs per episode of the various detoxification methods ranged widely: AUD $491 for buprenorphine-based outpatient detoxification; AUD $605 for conventional outpatient; AUD $1404 for conventional inpatient; AUD $1990 for rapid detoxification under sedation; and AUD $2689 for anaesthesia-based detoxification. An incremental cost-effectiveness analysis was carried out using conventional outpatient detoxification as the base comparator. The buprenorphine-based outpatient method was found to be the most cost-effective overall, and rapid opioid detoxification under sedation was the most cost-effective inpatient method.
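The incremental comparison can be made concrete with the standard ICER formula, (incremental cost) / (incremental effect), against the base comparator. The per-episode costs below come from the abstract; the 7-day-abstinence rates are hypothetical placeholders, NOT the trial results.

```python
# method -> (mean cost per episode in AUD, 7-day abstinence rate)
# Costs are from the abstract; effectiveness figures are hypothetical.
methods = {
    "conventional outpatient":  (605.0, 0.25),
    "buprenorphine outpatient": (491.0, 0.30),
}

base_cost, base_eff = methods["conventional outpatient"]
cost, eff = methods["buprenorphine outpatient"]

# ICER = incremental cost / incremental effect, vs. the base comparator
icer = (cost - base_cost) / (eff - base_eff)
print(icer)  # -2280.0
```

A negative ICER with lower cost and higher effectiveness means the method dominates the comparator, which is consistent with the abstract's conclusion that buprenorphine-based outpatient detoxification was the most cost-effective option.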