956 results for Translog cost function
Abstract:
An interferometric technique was used to determine the temperature coefficient of the optical path length (dS/dT) as a function of temperature in several optical glasses. The temperature range was between 25 °C and 180 °C. The studied samples included undoped and doped oxide glasses, such as low-silica calcium aluminosilicate, phosphates and borates, and also chalcogenides. The oxide glasses had dS/dT between 10 × 10⁻⁶ K⁻¹ and 20 × 10⁻⁶ K⁻¹, while for the chalcogenides the values were around 70 × 10⁻⁶ K⁻¹. The results showed that dS/dT increased with temperature in all samples. For samples doped with Nd, the dS/dT values were found to be independent of concentration. On the other hand, for the phosphate glass doped with Cr, dS/dT increased by about 5% compared with the Nd-doped one. In conclusion, the interferometric method used, which is considerably simpler and lower in cost, is a useful tool to measure dS/dT in semi-transparent glasses as a function of composition and temperature. (C) 2004 Elsevier B.V. All rights reserved.
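As background for the quantity reported above (the abstract itself does not spell it out), the normalized temperature coefficient of the optical path S = nL is commonly written as below; the fringe-counting form assumes a simple two-beam interference between reflections at the two sample faces, which may differ in detail from the arrangement actually used in the paper:

\[
\frac{dS}{dT} \equiv \frac{1}{L}\frac{d(nL)}{dT} = \frac{dn}{dT} + n\,\alpha,
\qquad
\frac{dS}{dT} = \frac{\lambda}{2L}\,\frac{dm}{dT},
\]

where L is the sample thickness, n the refractive index, α the linear thermal-expansion coefficient, λ the probe wavelength, and m the interference fringe order.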
Abstract:
X-ray computed tomography (CT) refers to the cross-sectional imaging of an object by measuring the transmitted radiation from different directions. In this work, we describe the development of a low-cost micro-CT X-ray scanner for nondestructive testing. The tomograph operates with a microfocus X-ray source and uses silicon photodiodes as detectors. The performance of the system, in terms of spatial resolution, was estimated through its Modulation Transfer Function (MTF); the value obtained at 10% of the MTF is 661 μm. It was built as a general-purpose nondestructive testing device. © 2009 American Institute of Physics.
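As a hedged illustration of the resolution estimate described above: the MTF can be computed from an edge profile and the 10% point read off. The sketch below uses a synthetic edge and an assumed pixel size, neither taken from the paper, and the cycles/mm-to-μm conversion uses the common half-period convention.

```python
import numpy as np

# A hypothetical edge-spread function across a sharp edge in a reconstructed
# slice; the 0.05 mm pixel size and the tanh edge are invented for illustration.
pixel_mm = 0.05
x = np.arange(-5.0, 5.0, pixel_mm)
esf = 0.5 * (1.0 + np.tanh(x / 0.4))

lsf = np.gradient(esf, pixel_mm)                 # line-spread function
mtf = np.abs(np.fft.rfft(lsf))
mtf /= mtf[0]                                    # normalize: MTF(0) = 1
freqs = np.fft.rfftfreq(lsf.size, d=pixel_mm)    # spatial frequency, cycles/mm

f10 = freqs[np.argmax(mtf <= 0.10)]              # first frequency with MTF <= 10%
print(f"MTF10 = {f10:.2f} cycles/mm -> ~{1000.0 / (2.0 * f10):.0f} um resolution")
```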
Abstract:
Different solid composites made by mechanical dispersion of graphite particles into heated paraffin (from 65 to 80% graphite, by mass) were prepared and assessed in order to optimize their use in electrochemical and electroanalytical procedures for bioanalysis. The composites were also evaluated by thermoanalytical techniques, aiming to study their conservation and long-term stability (over eight months without special care), among other properties. The best results were found at 80% m/m graphite in paraffin. Such an electrode combines low cost, stability, sensitivity, and ease of maintenance and cleaning, besides the possibility of manufacture in many different forms and shapes (with or without modifications) and applicability over a wide pH range. Electrochemical studies by different voltammetric techniques involving B-complex vitamins (riboflavin and pyridoxine) led to a better understanding of their electrooxidative processes on carbon-composite electrodes, especially regarding reversibility and pH dependence. Data were also acquired and optimized for analytical purposes, square-wave voltammetry at pH 4.2 being chosen for its many advantages. Under the best conditions studied, good linearity between peak response and concentration was reached from 5 to 43 μmol L⁻¹ for riboflavin (peak at −0.257 V) and up to 8.5 × 10⁻⁴ mol L⁻¹ for pyridoxine (peak at +1.04 V); limits of detection (at an S/N of 3) for both analytes were circa 1.0 μmol L⁻¹. Different commercial samples were analyzed for riboflavin (EMS® complex B syrup) and pyridoxine (Citoneurin 5000 Merck® ampoules), providing 96.6% and 98.7% recoveries, respectively.
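For readers who want the arithmetic behind the figures of merit quoted above, a minimal calibration/LOD sketch follows; the concentrations and currents are invented, and only the 5-43 μmol L⁻¹ range and the S/N = 3 criterion come from the abstract.

```python
import numpy as np

# Invented calibration points for riboflavin (concentration in umol/L,
# peak current in uA); values are illustrative, not taken from the paper.
conc = np.array([5.0, 10.0, 20.0, 30.0, 43.0])
peak = np.array([0.21, 0.40, 0.83, 1.21, 1.75])

slope, intercept = np.polyfit(conc, peak, 1)
resid = peak - (slope * conc + intercept)
s_y = resid.std(ddof=2)          # residual SD, used as a stand-in for noise

lod = 3.0 * s_y / slope          # detection limit at S/N = 3
print(f"slope = {slope:.4f} uA L/umol, intercept = {intercept:.3f} uA")
print(f"LOD (3*sigma/slope) ~ {lod:.2f} umol/L")
```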
Abstract:
Implementing precise techniques in the routine diagnosis of chronic granulomatous disease (CGD), which expedite the screening of molecular defects, may be critical for a quick assessment of patient prognosis. This study compared the efficacy of single-strand conformation polymorphism analysis (SSCP) and high-performance liquid chromatography under partially denaturing conditions (dHPLC) for screening mutations in CGD patients. We selected 10 male CGD patients with a clinical history of severe recurrent infections and abnormal respiratory burst function. gDNA, mRNA and cDNA samples were prepared by standard methods. CYBB exons were amplified by PCR and screened by SSCP or dHPLC. Abnormal DNA fragments were sequenced to reveal the nature of the mutations. The SSCP and dHPLC methods showed DNA abnormalities in 55% and 100% of the cases, respectively. Sequencing of the abnormal DNA samples confirmed mutations in all cases. Four novel mutations in CYBB were identified that were picked up only by dHPLC screening (c.904 insC, c.141+5 g>t, c.553 T>C, and c.665 A>T). This work highlights the relevance of dHPLC, a sensitive, fast, reliable and cost-effective method for screening mutations in CGD, which, in combination with functional assays assessing the phagocyte respiratory burst, will help expedite the definitive diagnosis of X-linked CGD, direct treatment and genetic counselling, and provide a clear assessment of prognosis. This strategy is especially suitable for developing countries.
Abstract:
Background: The Second Medicine, Angioplasty, or Surgery Study (MASS II) included patients with multivessel coronary artery disease and normal systolic ventricular function. Patients underwent coronary artery bypass graft surgery (CABG, n = 203), percutaneous coronary intervention (PCI, n = 205), or medical treatment alone (MT, n = 203). This investigation compares the economic outcomes of the 3 therapeutic strategies at 5-year follow-up. Methods and Results: We analyzed cumulative costs during a 5-year follow-up period. To analyze cost-effectiveness, the cumulative costs were adjusted for average event-free time and angina-free proportion. For event-free survival and event-plus-angina-free survival, respectively, MT yielded 3.79 and 2.07 quality-adjusted life-years; PCI, 3.59 and 2.77; and CABG, 4.40 and 2.81. The event-free costs were $9,071.00 for MT, $19,967.00 for PCI, and $18,263.00 for CABG. The paired comparison of the event-free costs showed a significant difference favoring MT versus PCI (P<0.01) and versus CABG (P<0.01), and CABG versus PCI (P<0.01). The event-free plus angina-free costs were $16,553.00, $25,831.00, and $24,614.00, respectively. The paired comparison of the event-free plus angina-free costs showed a significant difference favoring MT versus PCI (P=0.04) and versus CABG (P<0.001); there was no difference between CABG and PCI (P>0.05). Conclusions: In the long-term economic analysis, for the prevention of a composite primary end point, MT was more cost-effective than CABG, and CABG was more cost-effective than PCI.
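As an illustration of the comparison being made (our arithmetic, not a figure reported in the abstract), an incremental cost-effectiveness ratio can be formed from the event-free costs and quality-adjusted life-years above, e.g. for CABG versus MT:

\[
\mathrm{ICER} = \frac{C_{\text{CABG}} - C_{\text{MT}}}{E_{\text{CABG}} - E_{\text{MT}}}
 = \frac{\$18{,}263 - \$9{,}071}{4.40 - 3.79\ \text{QALY}}
 \approx \$15{,}069\ \text{per QALY gained.}
\]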
Abstract:
Using the directional distance function, we study a cross-section of 110 countries to examine the efficiency of management of the tradeoffs between pollution and income. The DEA model is reformulated to permit 'reverse disposability' of the bad output. Further, we interpret the optimal solution of the multiplier form of the DEA model as an iso-inefficiency line. This permits us to measure the shadow cost of the bad output for a country that is in the interior, rather than on the frontier, of the production possibilities set. We also compare the relative environmental performance of countries in terms of emission intensity adjusted for technical efficiency. Only 10% of the countries are found to be on the frontier. Also, there is considerable inter-country variation in the imputed opportunity cost of CO2 reduction. Further, differences in technical efficiency contribute substantially to differences in the observed levels of CO2 intensity.
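A minimal sketch of the kind of directional distance function LP underlying such a DEA study is shown below, assuming constant returns to scale, a single input, one good and one bad output, and a simplified disposability treatment (the paper's 'reverse disposability' reformulation is not reproduced); all data are invented.

```python
import numpy as np
from scipy.optimize import linprog

# Invented data (not from the paper): one input x, one good output y (income)
# and one bad output b (pollution) for four observed countries.
x = np.array([4.0, 6.0, 5.0, 8.0])
y = np.array([3.0, 7.0, 4.0, 9.0])
b = np.array([2.0, 3.0, 4.0, 5.0])

def directional_inefficiency(k, gy=1.0, gb=1.0):
    """Solve max beta s.t. sum(l*y) >= y_k + beta*gy,
    sum(l*b) <= b_k - beta*gb, sum(l*x) <= x_k, l >= 0
    (constant returns to scale; disposability simplified)."""
    n = y.size
    c = np.zeros(n + 1)
    c[0] = -1.0                                # linprog minimizes, so min -beta
    A_ub = np.vstack([
        np.concatenate([[gy], -y]),            # beta*gy - sum(l*y) <= -y_k
        np.concatenate([[gb], b]),             # beta*gb + sum(l*b) <=  b_k
        np.concatenate([[0.0], x]),            # sum(l*x)           <=  x_k
    ])
    b_ub = np.array([-y[k], b[k], x[k]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return -res.fun                            # beta = 0 means "on the frontier"

for k in range(y.size):
    print(f"country {k}: directional inefficiency = {directional_inefficiency(k):.3f}")
```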
Abstract:
This paper contributes a unified formulation that merges previous analyses of the prediction of the performance (value function) of a given sequence of actions (policy) when an agent operates a Markov decision process with a large state space. When the states are represented by features and the value function is linearly approximated, our analysis reveals a new relationship between two common cost functions used to obtain the optimal approximation. In addition, this analysis allows us to propose an efficient adaptive algorithm that provides an unbiased linear estimate. The performance of the proposed algorithm is illustrated by simulation, showing competitive results when compared with state-of-the-art solutions.
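For concreteness, a minimal sketch of value-function prediction with linear function approximation follows; it uses plain TD(0) on an invented Markov reward process, not the paper's adaptive unbiased estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented Markov reward process: 5 states, uniform transitions, fixed rewards.
n_states, n_feats, gamma = 5, 3, 0.9
P = np.full((n_states, n_states), 1.0 / n_states)
r = np.linspace(0.0, 1.0, n_states)
Phi = rng.normal(size=(n_states, n_feats))   # feature representation phi(s)

w = np.zeros(n_feats)                        # V(s) approximated by phi(s) @ w
s, alpha = 0, 0.05
for _ in range(50_000):
    s_next = rng.choice(n_states, p=P[s])
    td_error = r[s] + gamma * Phi[s_next] @ w - Phi[s] @ w
    w += alpha * td_error * Phi[s]           # TD(0) semi-gradient update
    s = s_next

print("learned weights:", w)
print("approximate state values:", Phi @ w)
```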
Abstract:
Many computer vision and human-computer interaction applications developed in recent years require the evaluation of complex and continuous mathematical functions as an essential step toward proper operation. However, rigorous evaluation of such functions often implies a very high computational cost, unacceptable in real-time applications. To alleviate this problem, functions are commonly approximated by simpler piecewise-polynomial representations. Following this idea, we propose a novel, efficient, and practical technique to evaluate complex and continuous functions using a nearly optimal design of two types of piecewise linear approximations in the case of a large budget of evaluation subintervals. To this end, we develop a thorough error analysis that yields asymptotically tight bounds to accurately quantify the approximation performance of both representations. It improves upon previous error estimates and allows the user to control the trade-off between the approximation error and the number of evaluation subintervals. To guarantee real-time operation, the method is suitable for, but not limited to, an efficient implementation in modern Graphics Processing Units (GPUs), where it outperforms previous alternative approaches by exploiting the fixed-function interpolation routines present in their texture units. The proposed technique is a perfect match for any application requiring the evaluation of continuous functions. We have measured in detail its quality and efficiency on several functions, in particular the Gaussian function, because it is extensively used in many areas of computer vision and cybernetics and is expensive to evaluate.
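A minimal sketch of the basic idea, an interpolating piecewise-linear approximation of the Gaussian on uniform subintervals with its maximum error measured empirically, is shown below; the paper's two optimized designs and GPU texture-unit implementation are not reproduced.

```python
import numpy as np

def gaussian(x):
    return np.exp(-0.5 * x ** 2)

# Interpolating piecewise-linear approximation on N uniform subintervals.
N, lo, hi = 64, -4.0, 4.0
knots = np.linspace(lo, hi, N + 1)
values = gaussian(knots)

xs = np.linspace(lo, hi, 100_001)
approx = np.interp(xs, knots, values)            # linear between the knots
max_err = np.abs(approx - gaussian(xs)).max()
print(f"N = {N} subintervals: max abs error = {max_err:.2e}")
# Doubling N cuts the error roughly 4x: interpolation error is O(h^2).
```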
Abstract:
Despite the benefits of resistance, susceptibility to infectious disease is commonplace. Although specific susceptibility may be considered an inevitable consequence of the co-evolutionary arms race between parasite and host, a more general constraint may arise from the cost of an immune response. This “cost” hypothesis predicts a tradeoff between immune defense and other components of fitness. In particular, a tradeoff between immunity and sexually selected male behavior has been proposed. Here we provide experimental support for the direct phenotypic tradeoff between sexual activity and immunity by studying the antibacterial immune response in Drosophila melanogaster. Males exposed to more females showed a reduced ability to clear a bacterial infection, an effect that we experimentally link to changes in sexual activity. Our results suggest immunosuppression is an important cost of reproduction and that immune function and levels of disease susceptibility will be influenced by sexual selection.
Abstract:
The purpose of this research was to estimate the cost-effectiveness of two rehabilitation interventions for breast cancer survivors, each compared to a population-based, non-intervention group (n = 208). The two services were an early home-based physiotherapy intervention (DAART, n = 36) and a group-based exercise and psychosocial intervention (STRETCH, n = 31). A societal perspective was taken, and costs included those incurred by the health care system, the survivors and the community. Health outcomes included: (a) 'rehabilitated cases', based on changes in health-related quality of life between 6 and 12 months post-diagnosis using the Functional Assessment of Cancer Therapy - Breast Cancer plus Arm Morbidity (FACT-B+4) questionnaire, and (b) quality-adjusted life years (QALYs), using utility scores from the Subjective Health Estimation (SHE) scale. Data were collected using self-reported questionnaires, medical records and program budgets. A Monte Carlo modelling approach was used to test for uncertainty in cost and outcome estimates. The proportion of rehabilitated cases was similar across the three groups. From a societal perspective, compared with the non-intervention group, the DAART intervention appeared to be the most efficient option, with an incremental cost of $1,344 per QALY gained, whereas the incremental cost per QALY gained from the STRETCH program was $14,478. Both DAART and STRETCH are low-cost, low-technology health-promoting programs representing excellent public health investments.
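As a hedged illustration of the Monte Carlo uncertainty analysis mentioned above: incremental costs and QALYs can be sampled and summarized as a probability of cost-effectiveness via net monetary benefit, which avoids the sign ambiguities of ratio-based ICERs. The distributions, spreads and willingness-to-pay threshold below are invented; only the rough DAART point estimate is echoed.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical sampling distributions for the incremental cost and the
# incremental QALYs of a rehabilitation program versus no intervention.
# The means roughly echo the DAART point estimate; the spreads and the
# willingness-to-pay threshold are invented.
d_cost = rng.normal(loc=900.0, scale=300.0, size=n)
d_qaly = rng.normal(loc=0.67, scale=0.25, size=n)

wtp = 50_000.0                                   # $ per QALY threshold
net_benefit = wtp * d_qaly - d_cost              # net monetary benefit
prob_ce = (net_benefit > 0).mean()
print(f"P(cost-effective at ${wtp:,.0f}/QALY) = {prob_ce:.2f}")
```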
Abstract:
The estimated parameters of output distance functions frequently violate the monotonicity, quasi-convexity and convexity constraints implied by economic theory, leading to estimated elasticities and shadow prices that are incorrectly signed, and ultimately to perverse conclusions concerning the effects of input and output changes on productivity growth and relative efficiency levels. We show how a Bayesian approach can be used to impose these constraints on the parameters of a translog output distance function. Implementing the approach involves the use of a Gibbs sampler with data augmentation. A Metropolis-Hastings algorithm is also used within the Gibbs sampler to simulate observations from truncated pdfs. Our methods are developed for the case where panel data are available and technical inefficiency effects are assumed to be time-invariant. Two models, a fixed effects model and a random effects model, are developed and applied to panel data on 17 European railways. We observe significant changes in estimated elasticities and shadow price ratios when regularity restrictions are imposed. (c) 2004 Elsevier B.V. All rights reserved.
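For reference, a translog output distance function for K inputs x and M outputs y is conventionally specified as follows (a standard textbook form, not copied from the paper); the monotonicity and curvature constraints discussed above are inequality restrictions on functions of these parameters:

\[
\ln D_O(x,y) = \alpha_0 + \sum_{m=1}^{M}\alpha_m \ln y_m
 + \tfrac{1}{2}\sum_{m=1}^{M}\sum_{n=1}^{M}\alpha_{mn}\ln y_m \ln y_n
 + \sum_{k=1}^{K}\beta_k \ln x_k
 + \tfrac{1}{2}\sum_{k=1}^{K}\sum_{l=1}^{K}\beta_{kl}\ln x_k \ln x_l
 + \sum_{k=1}^{K}\sum_{m=1}^{M}\delta_{km}\ln x_k \ln y_m,
\]

with symmetry (\(\alpha_{mn}=\alpha_{nm}\), \(\beta_{kl}=\beta_{lk}\)) and linear homogeneity of degree +1 in outputs (\(\sum_m \alpha_m = 1\), \(\sum_n \alpha_{mn} = 0\), \(\sum_m \delta_{km} = 0\)).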
Abstract:
We discuss recent progress towards the establishment of important structure-property-function relationships in eumelanins, key functional bio-macromolecular systems responsible for photoprotection and immune response in humans and implicated in the development of melanoma skin cancer. We focus on the link between eumelanin's secondary structure and optical properties such as broadband UV-visible absorption and strong non-radiative relaxation, both key features of the photoprotective function. We emphasise the insights gained through a holistic approach combining optical spectroscopy with first-principles quantum chemical calculations, and advance the hypothesis that the robust functionality characteristic of eumelanin is related to extreme chemical and structural disorder at the secondary level. This inherent disorder is a low-cost natural resource, and it is interesting to speculate whether it may play a role in other functional bio-macromolecular systems.
Abstract:
We report the realization of a low-cost in-fiber WDM device function utilizing efficient side-detection of strong radiation-mode out-coupling from tilted FBGs. A spatial-to-spectral conversion efficiency as high as 0.32 mm/nm is demonstrated.
Abstract:
Computer-Based Learning systems of one sort or another have been in existence for almost 20 years, but they have yet to achieve real credibility within Commerce, Industry or Education. A variety of reasons could be postulated for this, typically:
- cost
- complexity
- inefficiency
- inflexibility
- tedium
Obviously different systems deserve different levels and types of criticism, but it still remains true that Computer-Based Learning (CBL) is falling significantly short of its potential. Experience of a small but highly successful CBL system within a large, geographically distributed industry (the National Coal Board) prompted an investigation into currently available packages, the original intention being to purchase the most suitable software and run it on existing computer hardware, alongside existing software systems. It became apparent that none of the available CBL packages were suitable, and a decision was taken to develop an in-house Computer-Assisted Instruction system according to the following criteria:
- cheap to run;
- easy to author course material;
- easy to use;
- requires no computing knowledge to use (as either an author or student);
- efficient in the use of computer resources;
- has a comprehensive range of facilities at all levels.
This thesis describes the initial investigation, resultant observations and the design, development and implementation of the SCHOOL system. One of the principal characteristics of SCHOOL is that it uses a hierarchical database structure for the storage of course material, thereby inherently providing a great deal of the power, flexibility and efficiency originally required. Trials using the SCHOOL system on IBM 303X series equipment are also detailed, along with proposed and current development work on what is essentially an operational CBL system within a large-scale industrial environment.
Abstract:
Most parametric software cost estimation models used today evolved in the late 1970s and early 1980s. At that time, the dominant software development techniques were the early 'structured methods'. Since then, several new systems development paradigms and methods have emerged, one being Jackson System Development (JSD). As current cost estimating methods do not take account of these developments, they cannot provide adequate estimates of effort and hence cost. In order to address these shortcomings, two new estimation methods have been developed for JSD projects. One of these methods, JSD-FPA, is a top-down estimating method based on the existing MkII function point method. The other, JSD-COCOMO, is a sizing technique which sizes a project in terms of lines of code from the process structure diagrams, and thus provides an input to the traditional COCOMO method. The JSD-FPA method allows JSD projects in both the real-time and scientific application areas to be costed, as well as the commercial information systems applications to which FPA is usually applied. The method is based upon a three-dimensional view of a system specification, as opposed to the largely data-oriented view traditionally used by FPA. It uses counts of various attributes of a JSD specification to develop a metric which indicates the size of the system to be developed. This size metric is then transformed into an estimate of effort by calculating past project productivity and using this figure to predict the effort, and hence cost, of a future project. The effort estimates produced were validated by comparing them against the actual effort figures for six projects. The JSD-COCOMO method uses counts of the levels in a process structure chart as the input to an empirically derived model which transforms them into an estimate of delivered source code instructions.
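Since the abstract leans on COCOMO, a sketch of the basic COCOMO equations that such a size estimate would feed is given below; the 'organic'-mode coefficients are Boehm's published 1981 values, the 32 KDSI input is purely illustrative, and the JSD sizing step itself is not reproduced.

```python
# Basic COCOMO (Boehm, 1981), "organic"-mode coefficients. JSD-COCOMO, per
# the abstract, supplies the size estimate (delivered source instructions)
# from process structure diagrams; that sizing step is not shown here.

def cocomo_basic(kdsi, a=2.4, b=1.05, c=2.5, d=0.38):
    """Return (effort in person-months, schedule in months) for kdsi
    thousand delivered source instructions."""
    effort = a * kdsi ** b
    schedule = c * effort ** d
    return effort, schedule

effort, schedule = cocomo_basic(32.0)   # 32 KDSI is a purely illustrative size
print(f"effort ~ {effort:.0f} person-months over ~ {schedule:.1f} months")
```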