969 results for 350107 Other Accounting
Abstract:
OBJECTIVE To assess the effectiveness of glatiramer acetate (GA) compared to other multiple sclerosis (MS) therapies in routine clinical practice. MATERIALS AND METHODS Observational cohort study carried out in MS patients treated with GA (GA cohort) or with other MS therapies after switching from GA (non-GA cohort). Study data were obtained through a review of our MS patient database. The primary endpoint was the Expanded Disability Status Scale (EDSS) score reached at the end of treatment/last check-up. RESULTS A total of 180 patients were included: GA cohort, n = 120; non-GA cohort, n = 60. Patients in the GA cohort showed better EDSS scores at the end of treatment/last check-up (mean ± SD, 2.8 ± 1.8 vs. 3.9 ± 2.2; P = 0.001) and were 1.65 times more likely to show better EDSS scores than the non-GA cohort (odds ratio for the non-GA cohort, 0.606; 95% CI, 0.436-0.843; P = 0.003). Patients in the GA cohort showed a longer mean time to reach an EDSS score of 6 (209.1 [95% CI, 187.6-230.6] vs. 164.3 [95% CI, 137.0-191.6] months; P = 0.004) and slower disability progression (hazard ratio, 0.415 [95% CI, 0.286-0.603]; P < 0.001). The annualized relapse rate was lower in the GA cohort (mean ± SD, 0.5 ± 0.5 vs. 0.8 ± 0.5; P = 0.001), and quality of life was better in the GA cohort than in the non-GA cohort (mean ± SD, 0.7 ± 0.1 vs. 0.6 ± 0.2; P = 0.01). CONCLUSIONS GA may slow the progression of EDSS scores to a greater extent than other MS therapies, while achieving a greater reduction in relapses and a greater improvement in patients' quality of life. Switching from GA to other MS therapies did not result in a better response to treatment.
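The two effect sizes quoted for the EDSS endpoint are, up to rounding, the same quantity seen from opposite directions: an odds ratio of 0.606 with the non-GA cohort in the numerator is the reciprocal of the reported 1.65-fold advantage for GA. A quick arithmetic check (the direction of the odds ratio is inferred from the abstract, not stated explicitly):

```python
# OR reported in the abstract for the non-GA vs. GA comparison
or_non_ga = 0.606

# Inverting the odds ratio flips the reference group: this recovers
# the "1.65 times more likely" figure quoted for the GA cohort.
or_ga = 1 / or_non_ga
print(round(or_ga, 2))  # 1.65
```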
Abstract:
There is hardly a case in exploration geology where the studied data do not include below-detection-limit and/or zero values, and since most geological data follow lognormal distributions, these "zero data" represent a mathematical challenge for interpretation. We need to start by recognizing that there are true zero values in geology. For example, the amount of quartz in a foyaite (nepheline syenite) is zero, since quartz cannot coexist with nepheline. Another common essential zero is a North azimuth; however, we can always replace that zero with the value 360°. These are known as "essential zeros", but what can we do with "rounded zeros", which result from values below the detection limit of the equipment? Amalgamation, e.g. adding Na2O and K2O as total alkalis, is one solution, but sometimes we need to differentiate between a sodic and a potassic alteration. Pre-classification into groups requires a good knowledge of the distribution of the data and of the geochemical characteristics of the groups, which is not always available. Setting the zero values equal to the detection limit of the equipment used will generate spurious distributions, especially in ternary diagrams. The same will occur if we replace the zero values with a small amount using non-parametric or parametric techniques (imputation). The method we propose takes into consideration the well-known relationships between certain elements. For example, in copper porphyry deposits there is always a good direct correlation between copper and molybdenum values, but while copper will always be above the detection limit, many of the molybdenum values will be "rounded zeros".
So, we take the lower quartile of the real molybdenum values and establish a regression equation with copper, and then we estimate the "rounded" zero values of molybdenum from their corresponding copper values. The method can be applied to any type of data, provided we first establish their correlation dependency. One of the main advantages of this method is that we do not obtain a fixed value for the "rounded zeros", but one that depends on the value of the other variable.
Key words: compositional data analysis, treatment of zeros, essential zeros, rounded zeros, correlation dependency
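The procedure described above (fit a regression on the lower quartile of the quantified molybdenum values, then predict the censored values from copper) can be sketched as follows. The array names, the simple linear model, and the synthetic detection limit are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def impute_rounded_zeros(cu, mo, detection_limit):
    """Estimate below-detection-limit ('rounded zero') Mo values from Cu.

    Fits a least-squares line to the lower quartile of the *measured*
    Mo values (paired with their Cu values), then predicts Mo for
    samples reported below the detection limit. Each imputed value
    therefore depends on its own Cu value rather than being a constant.
    """
    cu = np.asarray(cu, dtype=float)
    mo = np.asarray(mo, dtype=float)
    measured = mo >= detection_limit           # real (quantified) Mo values
    q1 = np.quantile(mo[measured], 0.25)       # lower-quartile cutoff
    low = measured & (mo <= q1)                # lower quartile of real values
    slope, intercept = np.polyfit(cu[low], mo[low], 1)
    out = mo.copy()
    out[~measured] = slope * cu[~measured] + intercept
    return out

# Hypothetical assays: the last two Mo values fell below a 1.0 ppm
# detection limit and were reported as zero.
cu = [100, 200, 300, 400, 500, 600, 50, 60]
mo = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 0.0, 0.0]
print(impute_rounded_zeros(cu, mo, 1.0))
```

Because the synthetic data are perfectly linear (Mo = 0.01 Cu), the two censored samples come back as 0.5 and 0.6; on real data the fit would only approximate the censored values.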
Abstract:
This article examines the interplay of text and image in The Fairy Tales of Charles Perrault (1977), translated by Angela Carter and illustrated by Martin Ware, as a form of intersemiotic dialogue that sheds new light on Carter's work. It argues that Ware's highly original artwork based on the translation not only calls into question the association of fairy tales with children's literature (which still characterizes Carter's translation), but also captures an essential if heretofore neglected aspect of Carter's creative process, namely the dynamics between translating, illustrating and rewriting classic tales. Several elements from Ware's illustrations are indeed taken up and elaborated on in The Bloody Chamber and Other Stories (1979), the collection of "stories about fairy stories" that made Carter famous. These include visual details and strategies that she transposed to the realm of writing, giving rise to reflections on the relation between visuality and textuality.
Abstract:
BACKGROUND Nucleic acid amplification tests are increasingly used for the rapid diagnosis of tuberculosis. We undertook a comparative study of the efficiency and diagnostic yield of a real-time PCR assay based on senX3-regX3 versus the classical IS6110 target and new commercial methods. METHODS This single-blind prospective comparative study included 145 consecutive samples: 76 from patients with culture-confirmed tuberculosis (86.8% pulmonary and 13.2% extrapulmonary tuberculosis; 48.7% smear-positive and 51.3% smear-negative) and 69 control samples (24 from patients diagnosed with non-tuberculous mycobacterial infections and 45 from patients with suspected tuberculosis that was eventually ruled out). All samples were tested by two CE-marked assays (Xpert® MTB/RIF and Anyplex™ plus MTB/NTM) and two in-house assays targeting senX3-regX3 and IS6110. RESULTS The detection limit ranged from 10 fg for Anyplex, senX3-regX3 and IS6110 to 10^4 fg for Xpert. The Xpert, senX3-regX3 and IS6110 assays all detected the 37 smear-positive cases, whereas Anyplex was positive in 34 (91.9%) of them. In patients with smear-negative tuberculosis, differences were observed between the assays: of the 39 smear-negative samples, Xpert detected 22 (56.4%), Anyplex 24 (61.5%), senX3-regX3 28 (71.8%) and IS6110 35 (89.7%). Xpert and senX3-regX3 were negative in all control samples; the false positive rate was 8.7% for Anyplex and 13% for IS6110. The overall sensitivity was 77.6%, 85.7%, 77.3% and 94.7%, and the specificity 100%, 100%, 90.8% and 87.0%, for the Xpert, senX3-regX3, Anyplex and IS6110 assays, respectively. CONCLUSION Real-time PCR assays targeting IS6110 lack the desired specificity. The Xpert MTB/RIF and in-house senX3-regX3 assays are both sensitive and specific for the detection of MTBC in both pulmonary and extrapulmonary samples. The real-time PCR senX3-regX3 assay could therefore be a useful and complementary tool in the diagnosis of tuberculosis.
Abstract:
BACKGROUND: Studies about beverage preferences in a country in which wine drinking is relatively widespread (like Switzerland) are scarce. Therefore, the main aims of the present study were to examine the associations between beverage preferences and drinking patterns, alcohol-related consequences and the use of other substances among Swiss young men. METHODS: The analytical sample consisted of 5399 Swiss men who participated in the Cohort Study on Substance Use Risk Factors (C-SURF) and had been drinking alcohol over the preceding 12 months. Logistic regression analyses were conducted to study the associations between preference for a particular beverage and (i) drinking patterns, (ii) negative alcohol-related consequences and (iii) the (at-risk) use of cigarettes, cannabis and other illicit drugs. RESULTS: Preference for beer was associated with risky drinking patterns and, like a preference for strong alcohol, with the use of illicit substances (cannabis and other illicit drugs). In contrast, a preference for wine was associated with low-risk alcohol consumption and a reduced likelihood of experiencing at least four negative alcohol-related consequences or of daily cigarette smoking. Furthermore, the likelihood of negative outcomes (alcohol-related consequences; use of other substances) increased among people with risky drinking behaviours, independent of beverage preference. CONCLUSIONS: In our survey, beer preference was associated with risky drinking patterns and illicit drug use. Alcohol policies aimed at preventing the consumption of large quantities of alcohol, especially of cheaper beverages such as beer, should be considered to reduce total alcohol consumption and the negative consequences associated with these beverages.
Abstract:
Recently, the surprising result that ab initio calculations on benzene and other planar arenes at correlated MP2, MP3, configuration interaction with singles and doubles (CISD), and coupled cluster with singles and doubles levels of theory using standard Pople basis sets yield nonplanar minima has been reported. The planar optimized structures turn out to be transition states presenting one or more large imaginary frequencies, whereas single-determinant-based methods lead to the expected planar minima and no imaginary frequencies. It has been suggested that this anomalous behavior may originate from the two-electron basis set incompleteness error. In this work, we show that the reported pitfalls can be interpreted in terms of intramolecular basis set superposition error (BSSE) effects, mostly between the C–H moieties constituting the arenes. We have carried out counterpoise-corrected optimizations and frequency calculations at the Hartree–Fock, B3LYP, MP2, and CISD levels of theory with several basis sets for a number of arenes. In all cases, correcting for intramolecular BSSE fixes the anomalous behavior of the correlated methods, whereas no significant differences are observed in the single-determinant case. Consequently, all systems studied are planar at all levels of theory. The effect of different intramolecular fragment definitions and the particular cases of charged species, namely the cyclopentadienyl and indenyl anions, are also discussed.
Abstract:
AIM: Atomic force microscopy nanoindentation of myofibers was used to assess and quantitatively diagnose muscular dystrophies in human patients. MATERIALS & METHODS: Myofibers from fresh or frozen muscle biopsies from human dystrophic patients and healthy volunteers, as well as from mouse models, were probed, and Young's modulus stiffness values were determined. RESULTS: Fibers displaying abnormally low mechanical stability were detected in biopsies from patients affected by 11 distinct muscle diseases, and Young's modulus values were commensurate with the severity of the disease. Abnormal myofiber resistance was also observed in biopsies from consulting patients whose muscle condition could not otherwise be detected or unambiguously diagnosed. DISCUSSION & CONCLUSION: This study provides a proof-of-concept that atomic force microscopy yields a quantitative read-out of human muscle function from clinical biopsies, and that it may thereby complement current muscular dystrophy diagnosis.
Abstract:
Managing fisheries resources to maintain healthy ecosystems is one of the main goals of the ecosystem approach to fisheries (EAF). While a number of international treaties call for the implementation of EAF, there are still gaps in the underlying methodology. One aspect that has received substantial scientific attention recently is fisheries-induced evolution (FIE). Increasing evidence indicates that intensive fishing has the potential to exert strong directional selection on life-history traits, behaviour, physiology, and morphology of exploited fish. Of particular concern is that reversing evolutionary responses to fishing can be much more difficult than reversing demographic or phenotypically plastic responses. Furthermore, like climate change, multiple agents cause FIE, with effects accumulating over time. Consequently, FIE may alter the utility derived from fish stocks, which in turn can modify the monetary value living aquatic resources provide to society. Quantifying and predicting the evolutionary effects of fishing is therefore important for both ecological and economic reasons. An important reason this is not happening is the lack of an appropriate assessment framework. We therefore describe the evolutionary impact assessment (EvoIA) as a structured approach for assessing the evolutionary consequences of fishing and evaluating the predicted evolutionary outcomes of alternative management options. EvoIA can contribute to EAF by clarifying how evolution may alter stock properties and ecological relations, support the precautionary approach to fisheries management by addressing a previously overlooked source of uncertainty and risk, and thus contribute to sustainable fisheries.
Abstract:
This paper relaxes the standard I(0) and I(1) assumptions typically stated in the monetary VAR literature by considering a richer framework that encompasses the previous two processes as well as other fractionally integrated possibilities. First, a time-varying multivariate spectrum is estimated for post-WWII US data. Then, a structural fractionally integrated VAR (VARFIMA) is fitted to each of the resulting time-dependent spectra. In this way, both the coefficients of the VAR and the innovation variances are allowed to evolve freely. The model is employed to analyze inflation persistence and to evaluate the stance of US monetary policy. Our findings indicate a strong decline in the innovation variances during the great disinflation, consistent with the view that the good performance of the economy during the 1980s and 1990s is in part a tale of good luck. However, we also find evidence of a decline in inflation persistence together with a stronger monetary response to inflation during the same period. This last result suggests that the Fed may still play a role in accounting for the observed differences in the US inflation history. Finally, we conclude that previous evidence against drifting coefficients could be an artifact of parameter restriction towards the stationary region. Keywords: monetary policy, inflation persistence, fractional integration, time-varying coefficients, VARFIMA. JEL Classification: E52, C32
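The fractional integration that a VARFIMA specification relaxes can be made concrete with the scalar filter (1 - L)^d, whose binomial-expansion weights interpolate between the I(0) case (d = 0, identity) and the I(1) case (d = 1, first differencing). A minimal sketch of those weights, not the paper's estimation code:

```python
import numpy as np

def fracdiff_weights(d, n):
    """Binomial expansion weights of (1 - L)^d, truncated at lag n.

    Recursion: w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k.
    d = 0 gives the identity filter; d = 1 gives first differencing;
    0 < d < 1 gives the slowly decaying weights of a fractionally
    integrated ("long memory") process.
    """
    w = np.empty(n + 1)
    w[0] = 1.0
    for k in range(1, n + 1):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def fracdiff(x, d):
    """Apply (1 - L)^d to a series by expanding-window convolution."""
    x = np.asarray(x, dtype=float)
    w = fracdiff_weights(d, len(x) - 1)
    return np.array([np.dot(w[: t + 1], x[t::-1]) for t in range(len(x))])
```

For d = 1 the weights collapse to (1, -1, 0, ...), so `fracdiff` reproduces ordinary first differences after the initial observation, while intermediate d values yield the long, hyperbolically decaying lag profile associated with persistent inflation.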
Abstract:
Alterations of the p53 pathway are among the most frequent aberrations observed in human cancers. We have performed an exhaustive analysis of TP53, p14, p15, and p16 status in a large series of 143 soft tissue sarcomas, rare tumors with complex genetics that account for around 1% of all adult cancers. For this purpose, we performed genomic studies combining sequencing, copy number assessment, and expression analyses. TP53 mutations and deletions are more frequent in leiomyosarcomas than in undifferentiated pleomorphic sarcomas. Moreover, 50% of leiomyosarcomas present TP53 biallelic inactivation, whereas most undifferentiated pleomorphic sarcomas retain one wild-type TP53 allele (87.2%). The spectrum of mutations differs between these two groups of sarcomas, particularly with a higher rate of complex mutations in undifferentiated pleomorphic sarcomas. Most tumors without TP53 alteration exhibit a deletion of p14 and/or a lack of mRNA expression, suggesting that p14 loss could be an alternative genotype to direct TP53 inactivation. Nevertheless, the fact that we could not detect p14 protein even in tumors with altered TP53 suggests that other p14 functions, independent of p53, could be implicated in sarcoma oncogenesis. In addition, both p15 and p16 are frequently codeleted or transcriptionally co-inhibited with p14, essentially in tumors with two wild-type TP53 alleles. Conversely, in TP53-altered tumors, p15 and p16 are well expressed, a feature not incompatible with an oncogenic process.