890 results for phi value analysis
Abstract:
Tuberculous meningitis (TBM) is a severe infection of the central nervous system, particularly prevalent in developing countries. Prompt diagnosis and treatment are necessary to decrease the high rates of disability and death associated with TBM. The diagnosis is often time- and labour-intensive; thus, a simple, accurate and rapid diagnostic test is needed. The adenosine deaminase (ADA) activity test is a rapid test that has been used for the diagnosis of the pleural, peritoneal and pericardial forms of tuberculosis. However, the usefulness of ADA in TBM is uncertain. The aim of this study was to evaluate ADA as a diagnostic test for TBM in a systematic review. A systematic search was performed of the medical literature (MEDLINE, LILACS, Web of Science and EMBASE). ADA values from TBM cases and controls (diagnosed with other types of meningitis) were required to calculate sensitivity and specificity. Out of a total of 522 studies, 13 were included in the meta-analysis (380 patients with TBM). Sensitivity, specificity and diagnostic odds ratios (DOR) were calculated for arbitrary ADA cut-off values from 1 to 10 U/l. ADA values from 1 to 4 U/l (sensitivity > 93% and specificity < 80%) helped to exclude TBM; values between 4 and 8 U/l were insufficient to confirm or exclude the diagnosis of TBM (p = 0.07), and values > 8 U/l (sensitivity < 59% and specificity > 96%) improved the diagnosis of TBM (p < 0.001). None of the cut-off values could be used to discriminate between TBM and bacterial meningitis. In conclusion, ADA cannot distinguish between bacterial meningitis and TBM, but using ranges of ADA values could help improve TBM diagnosis, particularly after bacterial meningitis has been ruled out. The different methods used to measure ADA and the heterogeneity of the data do not allow standardization of this test as a routine procedure.
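The cut-off evaluation described above can be sketched as follows; the 2x2 counts and the `diagnostic_metrics` helper are illustrative, not data from the review.

```python
# Sketch: sensitivity, specificity and diagnostic odds ratio (DOR) for a
# single hypothetical ADA cut-off, from an illustrative 2x2 table.
def diagnostic_metrics(tp, fn, fp, tn):
    sensitivity = tp / (tp + fn)          # true positive rate
    specificity = tn / (tn + fp)          # true negative rate
    dor = (tp / fn) / (fp / tn)           # diagnostic odds ratio
    return sensitivity, specificity, dor

# illustrative counts for a hypothetical cut-off of 8 U/l
sens, spec, dor = diagnostic_metrics(tp=55, fn=5, fp=12, tn=48)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} DOR={dor:.1f}")
```

In a meta-analysis this calculation is repeated per study and per candidate cut-off, and the resulting estimates are pooled.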
Abstract:
Background and Objectives: Some authors state that removal of lymph nodes would only contribute towards assessing lymph node status and regional disease control, without any benefit for patients' survival. The aim of this paper was to assess the influence of the number of surgically dissected pelvic lymph nodes (PLN) on disease-free survival. Methods: Retrospective cohort study of 42 women presenting with squamous cell carcinoma (SCC) of the uterine cervix, with metastases in PLN, treated by radical surgery. The Cox model was used to identify risk factors for recurrence. The model variables were adjusted for treatment-related factors (year of treatment, surgical margins and postoperative radiotherapy). The cutoff value for classifying the lymphadenectomy as comprehensive (15 PLN or more) or non-comprehensive (<15 PLN) was determined from analysis of the ROC curve. Results: Fourteen recurrences (32.6%) were recorded: three pelvic, eight distant, two both pelvic and distant, and one at an unknown location. The following risk factors for recurrence were identified: invasion of the deep third of the cervix and number of dissected lymph nodes <15. Conclusions: Deep invasion and non-comprehensive pelvic lymphadenectomy are possible risk factors for recurrence of SCC of the uterine cervix with metastases in PLN. J. Surg. Oncol. 2009;100:252-257. (C) 2009 Wiley-Liss, Inc.
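The ROC-based choice of cutoff can be sketched by maximising the Youden index (sensitivity + specificity - 1) over candidate node counts. The node counts, recurrence labels and the `best_cutoff` helper below are illustrative, not the study's data.

```python
# Sketch: pick the lymph-node-count cutoff that maximises the Youden index,
# treating "fewer than c dissected nodes" as predicting recurrence.
def best_cutoff(node_counts, recurred):
    best, best_j = None, -1.0
    for c in sorted(set(node_counts)):
        tp = sum(1 for n, r in zip(node_counts, recurred) if n < c and r)
        fn = sum(1 for n, r in zip(node_counts, recurred) if n >= c and r)
        fp = sum(1 for n, r in zip(node_counts, recurred) if n < c and not r)
        tn = sum(1 for n, r in zip(node_counts, recurred) if n >= c and not r)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0                # Youden index
        if j > best_j:
            best, best_j = c, j
    return best

nodes = [8, 10, 12, 14, 15, 16, 18, 20, 22, 25]   # dissected PLN per patient
rec   = [1,  1,  1,  0,  0,  0,  1,  0,  0,  0]   # 1 = recurrence
print(best_cutoff(nodes, rec))
```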
Abstract:
This paper reviews the literature to provide an overview of the historical significance of the elephant in Sri Lankan society, an association which dates back more than 4,000 years. The present status of this relationship is assessed on the basis of the findings of a recent study on the total economic value of elephants in Sri Lanka. The paper first briefly outlines the history, evolution, nature and distribution of the Asian elephant, providing some insights into the status of the elephant (Elephas maximus maximus) in Sri Lanka. Next, it reviews the literature in order to assess the historical affiliation that the elephant has maintained with Sri Lankan society, its culture, history, mythology and religion. The empirical evidence on the economic value of conserving the remaining elephant population in Sri Lanka is then reviewed, as are Sri Lankan people's attitudes towards conserving this species. The literature reviewed and the analysis undertaken indicate that the elephant in Sri Lanka still has, as in the past, a special place in Sri Lankan society, particularly in its culture, religion and value system. Thus, there is a strong case for ensuring the survival of the wild elephant population in Sri Lanka. Furthermore, the evidence suggests that the community as a whole will experience a net benefit from ensuring the survival of wild elephants in Sri Lanka.
Abstract:
Using a species’ population to measure its conservation status, this note explores how an increase in public knowledge about this status would change the public’s willingness to donate funds for its conservation. This is done on the basis that the relationship between the level of donations and a species’ conservation status satisfies stated general mathematical properties. The level of donations increases, on average, with greater knowledge of a species’ conservation status if the species is endangered, but falls if it is secure. Game theory and other theoretical tools are used to show how exaggerating the degree of endangerment of a species can be counterproductive for conservation.
Abstract:
Understanding the genetic architecture of quantitative traits can greatly assist the design of strategies for their manipulation in plant-breeding programs. For a number of traits, genetic variation can be the result of segregation of a few major genes and many polygenes (minor genes). The joint segregation analysis (JSA) is a maximum-likelihood approach for fitting segregation models through the simultaneous use of phenotypic information from multiple generations. Our objective in this paper was to use computer simulation to quantify the power of the JSA method for testing the mixed-inheritance model for quantitative traits when applied to the six basic generations: both parents (P1 and P2), the F1, the F2, and both backcross generations (B1 and B2) derived from crossing the F1 to each parent. A total of 1968 genetic model-experiment scenarios were considered in the simulation study. Factors that interacted to influence the power of the JSA method to correctly detect genetic models were: (1) whether there were one or two major genes in combination with polygenes, (2) the heritability of the major genes and polygenes, (3) the level of dispersion of the major genes and polygenes between the two parents, and (4) the number of individuals examined in each generation (population size). The greatest levels of power were observed for genetic models with simple inheritance; e.g., the power was greater than 90% for the one-major-gene model, regardless of population size and major-gene heritability. Lower levels of power were observed for genetic models with complex inheritance (major genes plus polygenes), low heritability, small population sizes and a large dispersion of favourable genes between the two parents; e.g., the power was less than 5% for the two-major-gene model with a heritability of 0.3 and population sizes of 100 individuals.
The JSA methodology was then applied to a previously studied sorghum data set to investigate the genetic control of the putative drought-resistance trait osmotic adjustment in three crosses. The previous study concluded that there were two major genes segregating for osmotic adjustment in the three crosses. Application of the JSA method resulted in a change in the proposed genetic model: the presence of the two major genes was confirmed, with the addition of an unspecified number of polygenes.
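The Monte-Carlo logic of a power study like this can be sketched as follows. This is not the JSA likelihood itself: as a stand-in test it uses the F2/F1 variance ratio (the F2 segregates for a major gene, the F1 does not), with a rejection threshold calibrated empirically under the no-major-gene null; all effect sizes and sample sizes are illustrative.

```python
# Monte-Carlo power sketch: power = fraction of simulated experiments in
# which the test detects the major gene.
import random
import statistics

random.seed(1)

def simulate_ratio(n, major_effect):
    # F1: genetically uniform, polygenic + environmental noise only
    f1 = [random.gauss(0.0, 1.0) for _ in range(n)]
    # F2: additive major gene segregating 1:2:1 (aa, Aa, AA)
    geno = random.choices([-1, 0, 1], weights=[1, 2, 1], k=n)
    f2 = [g * major_effect + random.gauss(0.0, 1.0) for g in geno]
    return statistics.variance(f2) / statistics.variance(f1)

def power(n=100, major_effect=1.0, reps=400):
    # empirical 5% critical value under the null (no major gene)
    null = sorted(simulate_ratio(n, 0.0) for _ in range(reps))
    threshold = null[int(0.95 * reps)]
    hits = sum(simulate_ratio(n, major_effect) > threshold
               for _ in range(reps))
    return hits / reps

print(f"estimated power: {power():.2f}")
```

Larger major-gene effects or population sizes raise the estimated power, mirroring the trends reported above.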
Abstract:
Using examples from contemporary policy and business discourses, and exemplary historical texts dealing with the notion of value, I put forward an argument as to why a critical scholarship that draws on media history, language analysis, philosophy and political economy is necessary to understand the dynamics of what is being called 'the global knowledge economy'. I argue that the social changes associated with new modes of value determination are closely associated with new media forms.
Abstract:
Background: Diagnosis of the HIV-associated lipodystrophy syndrome is based on clinical assessment, in the absence of a consensus on case definition and reference methods. Three bedside methods were compared with respect to their diagnostic value for lipodystrophy. Patients and Methods: Consecutive HIV-infected outpatients (n = 278) were investigated, 128 of whom also had data from 1997 available. Segmental bioelectrical impedance analysis (BIA) and waist, hip and thigh circumference measurements were performed. Changes in seven body regions were rated by physicians and patients using linear analogue scale assessment (LASA). Diagnostic cut-off values were sought by receiver operating characteristic (ROC) analysis. Results: Lipodystrophy was diagnosed in 85 patients (31%). BIA demonstrated higher fat-free mass in patients with lipodystrophy, but not after controlling for body mass index and sex. Segmental BIA was not superior to whole-body BIA in detecting lipodystrophy. Fat-free mass increased from 1997 to 1999 independently of lipodystrophy. Waist-hip and waist-thigh ratios were higher in patients with lipodystrophy. BIA, anthropometry and LASA did not provide sufficient diagnostic cut-off values for lipodystrophy. Agreement between methods, and between patient and physician ratings, was poor. Conclusion: These methods do not fulfil the urgent need for quantitative diagnostic tools for lipodystrophy. BIA estimates of fat-free mass may be biased by lipodystrophy, indicating a need for re-calibration in HIV-infected populations. (C) 2001 Harcourt Publishers Ltd.
Abstract:
The effect of gamma-radiation on a perfluoroalkoxy (PFA) resin was examined using solid-state high-speed magic angle spinning (MAS) 19F NMR spectroscopy. Samples were prepared for analysis by subjecting them to gamma-radiation in the dose range 0.5-3 MGy at either 303, 473, or 573 K. New structures identified include new saturated chain ends, short and long branches, and unsaturated groups. The formation of branched structures was found to increase with increasing irradiation temperature; however, at all temperatures the radiation chemical yield (G value) of new chain ends was greater than the G value of long branch points, suggesting that chain scission is the net process.
Abstract:
In the literature on firm strategy and product differentiation, consumer price-quality trade-offs are sometimes represented using consumer 'value maps'. These involve the geometric representation of indifferent price and quality combinations as points along curves that are concave to the 'quality' axis. In this paper, it is shown that the value map for price-quality trade-offs may be derived from a Hicksian compensated demand curve for product quality. The paper provides the theoretical link between the analytical methods employed in the existing literature on firm strategy and competitive advantage and the broader body of economic analysis.
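A minimal formalisation of the indifference construction (notation mine, assuming quasi-linear utility; the paper's own derivation uses the Hicksian compensated demand for quality): with utility U(q) + m over quality q and money m, income y and price p, a consumer holding utility level u-bar is indifferent among all (q, p) pairs satisfying

```latex
U(q) + y - p = \bar{u}
\quad\Longrightarrow\quad
p(q) = U(q) + y - \bar{u},
\qquad
p'(q) = U'(q) > 0,
\quad
p''(q) = U''(q) < 0 ,
```

so under diminishing marginal utility of quality each iso-utility curve in the (quality, price) plane inherits the concavity of U, which is exactly the concave 'value map' curve described above.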
Abstract:
The synthesis of helium in the early Universe depends on many input parameters, including the value of the gravitational coupling during the period when nucleosynthesis takes place. We compute the primordial abundance of helium as a function of the gravitational coupling, using a semi-analytical method, in order to track the influence of G on primordial nucleosynthesis. To be specific, we construct a cosmological model with varying G, using the Brans-Dicke theory. The greater the value of G during the nucleosynthesis period, the greater the predicted abundance of helium. Using the observational data for the abundance of primordial helium, constraints on the time variation of G are established.
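The qualitative G-dependence can be illustrated with a textbook-style freeze-out estimate (not the paper's Brans-Dicke computation): a larger G speeds up the expansion (H scales as the square root of G), raising the weak freeze-out temperature and hence the neutron-to-proton ratio and the helium fraction. The fiducial freeze-out temperature below is illustrative, and post-freeze-out neutron decay is neglected.

```python
# Semi-analytic sketch: helium mass fraction Y_p versus G/G_today.
import math

DELTA_M = 1.293   # neutron-proton mass difference, MeV
T_F0 = 0.8        # fiducial weak freeze-out temperature, MeV (illustrative)

def helium_fraction(g_ratio):
    """Y_p for gravitational coupling G = g_ratio * G_today."""
    # freeze-out when weak rate ~ H ~ sqrt(G)  =>  T_f ~ G**(1/6)
    t_f = T_F0 * g_ratio ** (1.0 / 6.0)
    n_over_p = math.exp(-DELTA_M / t_f)       # equilibrium n/p at freeze-out
    return 2.0 * n_over_p / (1.0 + n_over_p)  # all neutrons end up in helium

for g in (0.5, 1.0, 2.0):
    print(f"G/G0 = {g:3.1f}  ->  Y_p ~ {helium_fraction(g):.3f}")
```

The monotone increase of Y_p with G reproduces the trend stated in the abstract; absolute values are rough because neutron decay is ignored.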
Abstract:
The increasing availability of mobility data, and awareness of its importance and value, have motivated many researchers to develop models and tools for analyzing movement data. This paper presents a brief survey of significant research on the modeling, processing and visualization of data about moving objects. We identify some key research fields that will provide better features for online analysis of movement data. As a result of the literature review, we suggest a generic multi-layer architecture for the development of an online analysis processing software tool, which will be used to define the future work of our team.
Abstract:
Within the development of motor vehicles, crash safety (e.g. occupant protection, pedestrian protection, low-speed damageability) is one of the most important attributes. In order to fulfil increasing requirements under shorter cycle times and rising pressure to reduce costs, car manufacturers keep intensifying the use of virtual development tools such as those in the domain of Computer Aided Engineering (CAE). For crash simulations, the explicit finite element method (FEM) is applied. The accuracy of the simulation process is highly dependent on the accuracy of the simulation model, including the midplane mesh. One of the roughest approximations typically made concerns the actual part thickness which, in reality, can vary locally. However, a constant thickness value is almost always defined throughout the entire part for reasons of complexity. On the other hand, for precise fracture analysis within FEM, correct consideration of thickness is one key enabler. Thus, the availability of per-element thickness information, which does not exist explicitly in the FEM model, can significantly contribute to improved crash simulation quality, especially regarding fracture prediction. Even though the thickness is not explicitly available from the FEM model, it can be inferred from the original CAD geometric model through geometric calculations. This paper proposes and compares two thickness estimation algorithms based on ray tracing and nearest-neighbour 3D range searches. A systematic quantitative analysis of the accuracy of both algorithms is presented, as well as a thorough identification of the particular geometric arrangements under which their accuracy can be compared. These results enable the identification of each technique's weaknesses and hint towards a new, integrated approach to the problem that linearly combines the estimates produced by each algorithm.
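The nearest-neighbour idea can be sketched as follows, assuming the outer CAD surfaces are sampled as a point cloud: the thickness at a mid-surface point is then roughly twice the distance to the nearest sampled surface point. The toy plate geometry and helper names are illustrative; a real implementation would use a spatial index (e.g. a k-d tree) instead of a linear scan, and the ray-tracing variant would instead intersect a surface normal with both offset surfaces.

```python
# Sketch: per-point thickness estimate by nearest-neighbour distance from
# the mid-surface to a sampled outer surface.
import math

def nearest_distance(p, cloud):
    # brute-force nearest neighbour; fine for a sketch
    return min(math.dist(p, q) for q in cloud)

def estimate_thickness(mid_point, surface_cloud):
    # mid-surface lies halfway between the two outer surfaces
    return 2.0 * nearest_distance(mid_point, surface_cloud)

# toy plate of thickness 2: outer surfaces at z = -1 and z = +1, midplane z = 0
surface = [(x * 0.5, y * 0.5, z)
           for x in range(-4, 5) for y in range(-4, 5) for z in (-1.0, 1.0)]
print(estimate_thickness((0.0, 0.0, 0.0), surface))
```

Near edges, fillets and T-junctions the nearest surface point may not lie along the local normal, which is where the two algorithms' accuracies diverge, motivating the combined approach mentioned above.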
Abstract:
The aim of this paper is to analyze the determining factors in the pricing of handsets sold with service plans, using the hedonic price method. This was undertaken by building a database comprising 48 handset models, under nine different service plans, over a period of 53 weeks in 2008, resulting in 27 different attributes and a total of nearly 300,000 data records. The results suggest that the value of monthly subscriptions and calling minutes is important in explaining the prices of handsets. Furthermore, both the physical volume of the handset and the number of megapixels of its camera had an effect on prices: the bigger the handset, the cheaper it becomes, and the more megapixels a camera phone has, the more expensive it becomes. Additionally, it was found that in 2008 Brazilian phone companies were subsidizing data-connection-enabled handsets.
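A hedonic regression of this kind can be sketched with ordinary least squares: the estimated coefficients are the implicit prices of the attributes. The attribute matrix and prices below are constructed toy data (price = 100 + 40*megapixels - 1*volume + 0.5*minutes), not the study's registers.

```python
# Sketch: hedonic price regression via least squares.
import numpy as np

# columns: intercept, camera megapixels, handset volume (cm^3), plan minutes
X = np.array([
    [1, 2.0, 80, 100],
    [1, 3.0, 75, 200],
    [1, 5.0, 70, 200],
    [1, 8.0, 65, 400],
    [1, 3.0, 90, 100],
    [1, 5.0, 85, 400],
], dtype=float)
price = np.array([150, 245, 330, 555, 180, 415], dtype=float)

beta, *_ = np.linalg.lstsq(X, price, rcond=None)
for name, b in zip(["intercept", "megapixels", "volume", "minutes"], beta):
    print(f"{name:>10}: {b:8.2f}")
```

The recovered signs (positive for megapixels and minutes, negative for volume) mirror the qualitative findings reported above.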
Abstract:
Earlier studies of cross-national differences in consumer behavior in different consumption sectors have verified that cultural differences have a strong influence on consumers. Despite the importance of cross-national analysis, no studies in the literature examine the moderating effects of nationality on the construction of behavioral intentions and their antecedents among cruise line passengers. This study investigates the moderating effects of nationality on the relationships between perceived value, satisfaction, trust and behavioral intentions among Spanish and American (U.S.) passengers of cruise lines that use Barcelona as home port and port-of-call. A theoretical model was tested with a total of 968 surveys. Structural equation models (SEMs) were used, by means of a multigroup analysis. The results indicate that Spaniards showed stronger relationships between trust and behavioral intentions, and between emotional value and satisfaction, while Americans presented stronger relationships between service quality and satisfaction, and between service quality and behavioral intentions.
Abstract:
Value has been defined in different theoretical contexts as need, desire, interest, standard/criteria, beliefs, attitudes, and preferences. The creation of value is key to any business, and any business activity is about exchanging some tangible and/or intangible good or service and having its value accepted and rewarded by customers or clients, whether inside the enterprise or collaborative network or outside it. “Perhaps surprising then is that firms often do not know how to define value, or how to measure it” (Anderson and Narus, 1998, cited by [1]). Woodruff echoed that we need a “richer customer value theory” to provide an “important tool for locking onto the critical things that managers need to know”. In addition, he emphasized that “we need customer value theory that delves deeply into customer’s world of product use in their situations” [2]. In this sense, we proposed and validated a novel “Conceptual Model for Decomposing the Value for the Customer”. To this end, we were aware that time has a direct impact on customer perceived value, and that suppliers’ and customers’ perceptions change from the pre-purchase to the post-purchase phase, causing some uncertainty and doubt. We wanted to break value down into all its components, as well as every asset built and used (from both endogenous and exogenous perspectives). This component analysis was then transposed into a mathematical formulation using the Fuzzy Analytic Hierarchy Process (AHP), so that the uncertainty and vagueness of value perceptions could be embedded in a model that relates used and built assets, in the exchange of tangible and intangible deliverables among the involved parties, to their actual value perceptions.
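The AHP step can be sketched in its crisp (non-fuzzy) form using the geometric-mean method for deriving priority weights; the fuzzy variant applies the same pattern to triangular fuzzy numbers before defuzzification. The value-component names and the pairwise-comparison matrix below are illustrative, not the model's actual decomposition.

```python
# Sketch: crisp AHP priority weights from a pairwise-comparison matrix
# (geometric-mean method).
import math

components = ["product benefits", "service benefits", "relationship benefits"]
# a[i][j] = relative importance of component i over component j (Saaty scale)
a = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]

geo = [math.prod(row) ** (1.0 / len(row)) for row in a]   # row geometric means
total = sum(geo)
weights = [g / total for g in geo]                        # normalised priorities

for name, w in zip(components, weights):
    print(f"{name:>21}: {w:.3f}")
```

The weights sum to one and rank the components by judged importance; in the fuzzy version each matrix entry is a fuzzy number capturing the perception uncertainty the abstract mentions.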