Abstract:
We systematically reviewed the safety and efficacy of perineural dexamethasone as an adjunct for peripheral nerve blockade in 29 controlled trials involving 1695 participants. We grouped trials by the duration of local anaesthetic action (short- or medium- vs. long-term). Dexamethasone increased the mean (95% CI) duration of analgesia by 233 (172-295) min when injected with short- or medium-term action local anaesthetics and by 488 (419-557) min when injected with long-term action local anaesthetics, p < 0.00001 for both. However, these results should be interpreted with caution given the extreme heterogeneity of results, with I² exceeding 90% for both analyses. Meta-regression did not show an interaction between the dose of perineural dexamethasone (4-10 mg) and the duration of analgesia (r² = 0.02, p = 0.54). There were no differences between 4 and 8 mg of dexamethasone on subgroup analysis.
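The pooled estimates and the I² statistic quoted above come from standard meta-analytic machinery. A minimal sketch of DerSimonian-Laird random-effects pooling with I², using invented trial data rather than the review's, might look like this:

```python
import numpy as np

def dersimonian_laird(effects, ses):
    """Random-effects pooled estimate, 95% CI, and I^2 heterogeneity.

    effects : per-trial mean differences (e.g. minutes of analgesia)
    ses     : their standard errors
    """
    effects, ses = np.asarray(effects, float), np.asarray(ses, float)
    w = 1.0 / ses**2                          # fixed-effect weights
    theta_f = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - theta_f) ** 2)  # Cochran's Q
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    tau2 = max(0.0, (q - df) / (w.sum() - (w**2).sum() / w.sum()))
    w_re = 1.0 / (ses**2 + tau2)              # random-effects weights
    theta = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return theta, (theta - 1.96 * se, theta + 1.96 * se), i2

# Illustrative data only: per-trial mean prolongation of analgesia (min) and SE
est, ci, i2 = dersimonian_laird([180, 320, 150, 410, 260], [40, 55, 35, 60, 45])
print(f"pooled = {est:.0f} min, 95% CI = ({ci[0]:.0f}, {ci[1]:.0f}), I2 = {i2:.0f}%")
```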
Abstract:
This study tests the theory of rationing, examining changes in household consumption behavior during the transition to a market economy in Poland, 1987–92. A model of consumption under rationing is developed and fitted to pre-reform quarterly data from the Polish Household Budget Survey. Virtual prices, the prices at which consumers would have voluntarily chosen the rationed levels of goods, are derived for food and housing. The pre-reform Almost Ideal Demand System (AIDS) model with rationing is estimated. Estimates from the virtual AIDS yield plausible values for price and income elasticities. The AIDS model (without rationing) is also fitted to post-reform quarterly household survey data for comparison and evaluation. When the two sets of results are compared, the impacts of rationing are consistent with the theory: own-price elasticities for non-rationed goods are larger after the reform, and there is increased complementarity and decreased substitutability among the non-rationed goods. The results for Poland show a 75 percent decline in real household welfare over the transition; this welfare loss is one-third of the value obtained using reported prices.
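For orientation, the budget-share equation of the standard AIDS specification that the study estimates, together with the virtual-price condition used to handle rationing, can be written as follows (textbook notation, not the paper's own symbols):

```latex
% AIDS budget share of good i, prices p, total expenditure x:
w_i = \alpha_i + \sum_j \gamma_{ij} \ln p_j + \beta_i \ln\!\left(\frac{x}{P}\right),
\qquad
\ln P = \alpha_0 + \sum_k \alpha_k \ln p_k
      + \tfrac{1}{2}\sum_{k}\sum_{j} \gamma_{kj} \ln p_k \ln p_j .

% Virtual price \tilde{p}_r of a rationed good r: the price at which
% unconstrained demand would equal the ration level \bar{q}_r,
q_r(\tilde{p}_r, x) = \bar{q}_r .
```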
Total knee arthroplasty - a clinical and numerical study of the micromovements of the tibial implant
Abstract:
Introduction: The importance of micromovements in the mechanism of aseptic loosening is difficult to evaluate clinically. To complete the analysis of a series of total knee arthroplasties (TKA), we used a three-dimensional numerical model to study the micromovements of the tibial implant.
Material and Methods: Fifty-one patients (with 57 cemented Porous Coated Anatomic TKAs) were reviewed (mean follow-up 4.5 years). Radiolucency at the tibial bone-cement interface was sought on the AP radiographs and divided into 7 areas. The distribution of the radiolucency was then correlated with the axis of the lower limb as measured on the orthoradiograms. The three-dimensional numerical model is based on the finite element method. It allowed the measurement of the cemented tibial implant's displacements and the micromovements generated at the bone-cement interface. A total load (2000 Newton) was first applied vertically and asymmetrically on the tibial plateau, thereby simulating an axial deviation of the lower limbs. The vector's posterior inclination then permitted the addition of a tangential component to the axial load; this type of effort is generated by complex biomechanical phenomena such as knee flexion.
Results: 81 per cent of the 57 knees had a radiolucent line of at least 1 mm at one or more of the tibial cement-epiphysis junctional areas. The distribution of these lucent lines showed that they occurred more frequently at the periphery of the implant. The lucent lines appeared most often under the unloaded margin of the tibial plateau when axial deviation of the lower limbs was present. Numerical simulations showed that asymmetrical loading of the tibial plateau induced subsidence of the loaded margin (0-100 microns) and lift-off at the opposite border (0-70 microns). The postero-anterior tangential component induced an anterior displacement of the tibial implant (160-220 microns) and horizontal micromovements with a non-homogeneous distribution at the bone-cement interface (28-54 microns).
Discussion: Comparison of clinical and numerical results showed a relation between the development of radiolucent lines and the unloading of the tibial implant's margin, demonstrating the deleterious effect of axial deviation of the lower limbs. The irregular distribution of lucent lines under the tibial plateau was similar to the distribution of micromovements at the bone-cement interface when tangential forces were present; a causal relation between the two phenomena could not, however, be established. Numerical simulation is a truly useful method of study, since it permits the calculation of micromovements that are relative, non-homogeneous and of very low amplitude. However, comparative clinical studies remain essential to ensure the credibility of results.
Abstract:
Valganciclovir (VGC) is an oral prodrug of ganciclovir (GCV) recently introduced for prophylaxis and treatment of cytomegalovirus infection. Optimal concentration exposure for effective and safe VGC therapy would require either reproducible VGC absorption and GCV disposition or dosage adjustment based on therapeutic drug monitoring (TDM). We examined GCV population pharmacokinetics in solid organ transplant recipients receiving oral VGC, including the influence of clinical factors, the magnitude of variability, and its impact on efficacy and tolerability. Nonlinear mixed effect model (NONMEM) analysis was performed on plasma samples from 65 transplant recipients under VGC prophylaxis or treatment. A two-compartment model with first-order absorption appropriately described the data. Systemic clearance was markedly influenced by the glomerular filtration rate (GFR), patient gender, and graft type (clearance/GFR = 1.7 in kidney, 0.9 in heart, and 1.2 in lung and liver recipients) with interpatient and interoccasion variabilities of 26 and 12%, respectively. Body weight and sex influenced central volume of distribution (V(1) = 0.34 liter/kg in males and 0.27 liter/kg in females [20% interpatient variability]). No significant drug interaction was detected. The good prophylactic efficacy and tolerability of VGC precluded the demonstration of any relationship with GCV concentrations. In conclusion, this analysis highlights the importance of thorough adjustment of VGC dosage to renal function and body weight. Considering the good predictability and reproducibility of the GCV profile after treatment with oral VGC, routine TDM does not appear to be clinically indicated in solid-organ transplant recipients. However, GCV plasma measurement may still be helpful in specific clinical situations.
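The model described, two compartments with first-order absorption and clearance scaled to GFR, maps onto three ODEs. The sketch below simulates a single oral dose with scipy; all parameter values, the unit conversion, and the function names are illustrative assumptions, not the study's NONMEM estimates.

```python
import numpy as np
from scipy.integrate import solve_ivp

def two_cmt_oral(t, a, ka, cl, v1, q, v2):
    """Amounts: a[0] = gut depot, a[1] = central, a[2] = peripheral."""
    gut, cen, per = a
    return [-ka * gut,
            ka * gut - (cl / v1) * cen - (q / v1) * cen + (q / v2) * per,
            (q / v1) * cen - (q / v2) * per]

# Illustrative parameters only (not the paper's estimates):
gfr = 60.0                       # ml/min
cl = 1.2 * gfr * 0.06            # CL proportional to GFR, converted to L/h (hypothetical scaling)
ka, v1, q, v2, dose = 0.9, 25.0, 5.0, 40.0, 450.0   # 1/h, L, L/h, L, mg (illustrative)

sol = solve_ivp(two_cmt_oral, (0, 24), [dose, 0.0, 0.0],
                args=(ka, cl, v1, q, v2), dense_output=True)
t = np.linspace(0, 24, 7)
print("C(t) mg/L:", np.round(sol.sol(t)[1] / v1, 2))  # central concentration
```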
Abstract:
Objective: To analyze the methodological aspects used in the preparation of terminology subsets of the International Classification for Nursing Practice (ICNP®) in Brazilian nursing dissertations and theses. Method: This is an integrative review of Brazilian dissertations and theses defended in the period from 2007 to 2013, of which seven dissertations were included. Results: The growing production of studies on this theme by Brazilian nurses shows a concern for a unified language for the profession. However, the results demonstrate a lack of uniformity in the conduct of the studies, especially in relation to the stages of content validation. The initiatives of some authors to systematize alternative methods for creating these subsets also stood out. Conclusion: We suggest the development of new terminology subsets following standards of methodological rigor, as well as their application and validation with the selected clientele, to ensure greater reliability of results and the desired changes for the profession.
Abstract:
Purpose of the study: To investigate the impact of ART, HIV viremia and immunosuppression on triglyceride (TG), total cholesterol (TC) and high-density lipoprotein cholesterol (HDL-C) levels. Methods: We considered the cross-sectional associations between TG, TC and HDL-C (mmol/l; first available measurement on/after enrolment in the D:A:D study) and use of ART, HIV viral load (VL; copies/ml) and CD4 count (cells/mm3) measured at the same time. TG was log10-transformed to ensure normality. Analyses were performed using linear regression and adjusted for other factors known to impact lipid levels (table footnote). ART and VL status were combined (off ART & VL ≥100,000; off ART & VL <100,000; on ART & VL <500; on ART & VL ≥500); current and nadir CD4 count were categorised as <200, 200-349, 350-499 and ≥500. Summary of results: 44,322/49,734 participants in the D:A:D Study (89.1%) contributed a TG measurement (median; IQR 1.52; 1.00-2.45), 45,169 (90.8%) a TC measurement (4.80; 4.00-5.70) and 38,604 (77.6%) an HDL-C measurement (1.12; 0.90-1.40). Most participants were male (74%), of white ethnicity (51%), without AIDS (78%) and ART experienced (61%), with 47% previously exposed to PIs, 61% to NRTIs and 29% to NNRTIs; few were receiving lipid-lowering drugs (4%). The median (IQR) age, current CD4 count and CD4 nadir were 38 (36-45) years, 400 (242-590) cells/mm3 and 240 (100-410) cells/mm3, respectively. Compared to those on ART with a suppressed VL, all lipids were lower for those off ART (Table); non-suppressive ART was also associated with lower TC and HDL-C levels (no impact on TG). A low current CD4 count was associated with lower lipid levels, whereas a low nadir CD4 count was associated with higher TC and TG levels. A prior AIDS diagnosis was associated with higher TG and TC, but lower HDL-C levels. Conclusion: Although specific drug classes were not considered, lipid levels are considerably higher in those on a suppressive ART regimen. The higher TC/TG and lower HDL-C levels seen among those with a low nadir CD4 count and a prior AIDS diagnosis suggest that severe immunosuppression may be associated with dyslipidaemia over the long term.
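Structurally, the analysis described is a linear regression of log10(TG) on combined ART/VL status, CD4 stratum and other covariates. A minimal statsmodels sketch on simulated stand-in data (the variable names and levels are ours, and the covariate list is abbreviated) might look like:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
# Simulated stand-in data; the real analysis used D:A:D study measurements.
df = pd.DataFrame({
    "tg": rng.lognormal(mean=0.4, sigma=0.5, size=n),           # mmol/l
    "art_vl": rng.choice(["offART_VLhi", "offART_VLlo",
                          "onART_VLlo", "onART_VLhi"], size=n),
    "cd4": rng.choice(["<200", "200-349", "350-499", ">=500"], size=n),
    "age": rng.normal(40, 8, size=n),
})
df["log10_tg"] = np.log10(df["tg"])   # log10-transform to normalise TG

# Linear regression of log10(TG) on ART/VL status, CD4 stratum and age;
# reference level set to on-ART with suppressed VL, as in the comparison above.
model = smf.ols("log10_tg ~ C(art_vl, Treatment('onART_VLlo')) + C(cd4) + age",
                data=df).fit()
print(model.params.round(3))
```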
Abstract:
PURPOSE: To suppress the noise in uniform T1-weighted (T1w) images obtained with the magnetization-prepared 2 rapid gradient echoes (MP2RAGE) sequence, by sacrificing some signal homogeneity for numerical stability, and to compare the clinical utility of these robust T1w images against the uniform T1w images. MATERIALS AND METHODS: 8 healthy subjects (29.0±4.1 years; 6 male), who provided written consent, underwent two scan sessions within a 24-hour period on a 7T head-only scanner. The uniform and robust T1w image volumes were calculated inline on the scanner. Two experienced radiologists qualitatively rated the images for general image quality, 7T-specific artefacts and local structure definition. Voxel-based and volume-based morphometry packages were used to compare the segmentation quality between the uniform and robust images. Statistical differences were evaluated using a one-sided Wilcoxon signed-rank test. RESULTS: The robust image suppresses background noise inside and outside the skull. The inhomogeneity introduced was rated as mild. The robust image was ranked significantly higher than the uniform image by both observers (observer 1/2, p-value = 0.0006/0.0004). In particular, improved delineation of the pituitary gland and cerebellar lobes was observed in the robust versus the uniform T1w image. The reproducibility of the segmentation results between repeat scans improved (p-value = 0.0004), from an average volumetric difference across structures of ≈6.6% for the uniform image to ≈2.4% for the robust T1w image. CONCLUSIONS: The robust T1w image enables MP2RAGE to produce clinically familiar T1w images, in addition to T1 maps, which can be readily used in standard morphometry packages.
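For context, the uniform MP2RAGE T1w image is a bias-cancelling combination of the two gradient-echo readouts, and the robust variant studied here adds a noise term β to the numerator and denominator, trading mild inhomogeneity for background suppression. A minimal sketch, assuming the combination published for MP2RAGE (the β value here is arbitrary):

```python
import numpy as np

def mp2rage_combine(gre1, gre2, beta=0.0):
    """Combine the two complex MP2RAGE readouts into a T1w image in [-0.5, 0.5].

    beta = 0 gives the standard 'uniform' image; beta > 0 gives the 'robust'
    image, suppressing noise where both readouts are close to zero.
    """
    num = np.real(np.conj(gre1) * gre2) - beta
    den = np.abs(gre1) ** 2 + np.abs(gre2) ** 2 + 2.0 * beta
    return num / den

# Toy complex 'volumes' standing in for the two readouts:
rng = np.random.default_rng(1)
gre1 = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
gre2 = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
uniform = mp2rage_combine(gre1, gre2)            # noisy background
robust = mp2rage_combine(gre1, gre2, beta=5.0)   # background pushed toward -0.5
```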
Abstract:
Much of empirical economics involves regression analysis. However, does the presentation of results affect economists' ability to make inferences for decision-making purposes? In a survey, 257 academic economists were asked to make probabilistic inferences on the basis of the outputs of a regression analysis presented in a standard format. Questions concerned the distribution of the dependent variable conditional on known values of the independent variable. However, many respondents underestimated uncertainty by failing to take into account the standard deviation of the estimated residuals. The addition of graphs did not substantially improve inferences. On the other hand, when only graphs were provided (i.e., with no statistics), respondents were substantially more accurate. We discuss implications for improving practice in reporting results of regression analyses.
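The mistake the survey uncovered can be stated compactly: an interval for a new observation y at a known x must widen by the residual standard deviation s, not just by the standard errors of the fitted coefficients. In large samples (ignoring parameter-estimation uncertainty), the relevant conditional distribution and interval are approximately:

```latex
\hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x, \qquad
y \mid x \;\sim\; \mathcal{N}\!\left(\beta_0 + \beta_1 x,\ \sigma^2\right),
\qquad
\text{95\% interval for } y:\ \hat{y} \pm 1.96\, s,
\quad
s^2 = \frac{1}{n-2}\sum_{i=1}^{n} \hat{e}_i^{\,2}.
```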
Abstract:
Purpose - There has been much research on manufacturing flexibility, but supply chain flexibility is still an under-investigated area. This paper focuses on supply flexibility, the aspects of flexibility related to the upstream supply chain. Our purpose is to investigate why and how firms increase supply flexibility. Methodology/Approach - An exploratory multiple case study was conducted. We analyzed seven Spanish manufacturers from different sectors (automotive, apparel, electronics and electrical equipment). Findings - The results show that there are some major reasons why firms need supply flexibility (manufacturing schedule fluctuations, JIT purchasing, manufacturing slack capacity, a low level of parts commonality, demand volatility, demand seasonality and forecast accuracy), and that companies increase this type of flexibility by implementing two main strategies: increasing suppliers' responsiveness capability and flexible sourcing. The results also suggest that the supply flexibility strategy selected depends on two factors: supplier searching and switching costs, and the type of uncertainty (mix, volume or delivery). Research limitations - This paper has some limitations common to all case studies, such as the subjectivity of the analysis and the questionable generalizability of results (since the sample of firms is not statistically representative). Implications - Our study contributes to the existing literature by empirically investigating the main reasons why companies need to increase supply flexibility and how they increase it, and by suggesting some factors that could influence the selection of a particular supply flexibility strategy.
Abstract:
Revenue management (RM) is a complicated business process that can best be described as control of sales (using prices, restrictions, or capacity), usually using software as a tool to aid decisions. RM software can play a merely informative role, supplying analysts with formatted and summarized data which they use to make control decisions (setting a price or allocating capacity for a price point), or, at the other extreme, play a deeper role, automating the decision process completely. The RM models and algorithms in the academic literature by and large concentrate on the latter, completely automated, level of functionality.

A firm considering using a new RM model or RM system needs to evaluate its performance. Academic papers justify the performance of their models using simulations, where customer booking requests are simulated according to some process and model, and the revenue performance of the algorithm is compared to an alternate set of algorithms. Such simulations, while an accepted part of the academic literature, and indeed providing research insight, often lack credibility with management. Even methodologically, they are usually flawed, as the simulations only test "within-model" performance and say nothing as to the appropriateness of the model in the first place. Even simulations that test against alternate models or competition are limited by their inherent reliance on fixing some model as the universe for their testing. These problems are exacerbated with RM models that attempt to model customer purchase behavior or competition, as the right models for competitive actions or customer purchases remain somewhat of a mystery, or at least there is no consensus on their validity.

How then to validate a model? Putting it another way, we want to show that a particular model or algorithm is the cause of a certain improvement to the RM process compared to the existing process. We take care to emphasize that we want to prove the said model to be the cause of performance, and to compare against an (incumbent) process rather than against an alternate model.

In this paper we describe a "live" testing experiment that we conducted at Iberia Airlines on a set of flights. A set of competing algorithms controlled a set of flights during adjacent weeks, and their behavior and results were observed over a relatively long period of time (9 months). In parallel, a group of control flights was managed using the traditional mix of manual and algorithmic control (the incumbent system). Such "sandbox" testing, while common at many large internet search and e-commerce companies, is relatively rare in the revenue management area. Sandbox testing has an indisputable model of customer behavior, but the experimental design and analysis of results are less clear. In this paper we describe the philosophy behind the experiment, the organizational challenges, and the design and setup of the experiment, and outline the analysis of the results. This paper is a complement to a (more technical) related paper that describes the econometrics and statistical analysis of the results.
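To make the design concrete, the allocation described, candidate algorithms rotating over adjacent weeks on test flights while a parallel control group stays on the incumbent process, can be sketched in a few lines. This is a schematic illustration only; the flight labels, algorithm names and control share are invented, not Iberia's actual setup.

```python
import itertools

def assign_controllers(flights, algorithms, n_weeks, control_share=0.3):
    """Interleave candidate RM algorithms over adjacent weeks on test flights,
    keeping a parallel control group on the incumbent process throughout."""
    n_control = int(len(flights) * control_share)
    control, test = flights[:n_control], flights[n_control:]
    cycle = itertools.cycle(algorithms)
    schedule = {}
    for week in range(1, n_weeks + 1):
        alg = next(cycle)                      # rotate algorithm week by week
        schedule[week] = (
            {f: alg for f in test} |           # test flights: candidate system
            {f: "incumbent" for f in control}  # control flights: manual + legacy mix
        )
    return schedule

weeks = assign_controllers([f"MAD-LHR-{i}" for i in range(10)],
                           ["algoA", "algoB"], n_weeks=4)
print(weeks[1]["MAD-LHR-9"], weeks[2]["MAD-LHR-9"])  # algoA algoB
```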
Abstract:
OBJECTIVES: To identify factors associated with discrepant outcome reporting in randomized drug trials. STUDY DESIGN AND SETTING: Cohort study of protocols submitted to a Swiss ethics committee 1988-1998: 227 protocols and amendments were compared with 333 matching articles published during 1990-2008. Discrepant reporting was defined as addition, omission, or reclassification of outcomes. RESULTS: Overall, 870 of 2,966 unique outcomes were reported discrepantly (29.3%). Among protocol-defined primary outcomes, 6.9% were not reported (19 of 274), whereas 10.4% of reported outcomes (30 of 288) were not defined in the protocol. Corresponding percentages for secondary outcomes were 19.0% (284 of 1,495) and 14.1% (334 of 2,375). Discrepant reporting was more likely if P values were <0.05 compared with P ≥ 0.05 [adjusted odds ratio (aOR): 1.38; 95% confidence interval (CI): 1.07, 1.78], more likely for efficacy compared with harm outcomes (aOR: 2.99; 95% CI: 2.08, 4.30) and more likely for composite than for single outcomes (aOR: 1.48; 95% CI: 1.00, 2.20). Cardiology (aOR: 2.34; 95% CI: 1.44, 3.79) and infectious diseases (aOR: 1.77; 95% CI: 1.01, 3.13) had more discrepancies compared with all specialties combined. CONCLUSION: Discrepant reporting was associated with statistical significance of results, type of outcome, and specialty area. Trial protocols should be made freely available, and the publications should describe and justify any changes made to protocol-defined outcomes.
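The odds ratios above are adjusted estimates from regression models; for readers who want the mechanics, a crude (unadjusted) odds ratio with a Woolf 95% CI can be computed from a 2x2 table as below. The counts and labels are invented for illustration and omit the covariate adjustment the study performed.

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Crude OR and Woolf 95% CI for a 2x2 table:
       a = discrepant & P<0.05, b = concordant & P<0.05,
       c = discrepant & P>=0.05, d = concordant & P>=0.05 (illustrative labels)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)     # SE of log(OR)
    lo, hi = (math.exp(math.log(or_) + z * se) for z in (-1.96, 1.96))
    return or_, lo, hi

print(odds_ratio_ci(200, 400, 150, 410))  # made-up counts
```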
Abstract:
Several ink dating methods based on solvent analysis using gas chromatography/mass spectrometry (GC/MS) were proposed in the last decades. These methods follow the drying of solvents from ballpoint pen inks on paper and seem very promising. However, several questions have arisen over the last few years among questioned document examiners regarding the transparency and reproducibility of the proposed techniques. These questions should be carefully studied to ensure an accurate and ethical application of this methodology in casework. Inspired by a real investigation involving ink dating, the present paper discusses this issue through four main topics: aging processes, dating methods, validation procedures and data interpretation. This work presents a wide picture of the ink dating field, warns about potential shortcomings and proposes some solutions to avoid reporting errors in court.
Abstract:
The interest in solar ultraviolet (UV) radiation from the scientific community and the general population has risen significantly in recent years because of the link between increased UV levels at the Earth's surface and depletion of ozone in the stratosphere. As a consequence of recent research, UV radiation climatologies have been developed, and the effects of some atmospheric constituents (such as ozone or aerosols) have been studied broadly. Correspondingly, there are well-established relationships between, for example, total ozone column and UV radiation levels at the Earth's surface. The effects of clouds, however, are not so well described, given the intrinsic difficulties in properly describing cloud characteristics. Nevertheless, the effect of clouds cannot be neglected, and the variability that clouds induce in UV radiation is particularly significant when short timescales are involved. In this review we present, summarize, and compare several works that deal with the effect of clouds on UV radiation. Specifically, the works reviewed here approach the issue from the empirical point of view: each gives some relationship between measured UV radiation in cloudy conditions and cloud-related information. Basically, there are two groups of methods: techniques based on observations of cloudiness (either from human observers or by using devices such as sky cameras) and techniques that use measurements of broadband solar radiation as a surrogate for cloud observations. Some techniques combine both types of information. Comparison of results from different works is addressed using the cloud modification factor (CMF), defined as the ratio between measured UV radiation in a cloudy sky and calculated radiation for a cloudless sky. Typical CMF values for overcast skies range from 0.3 to 0.7, depending on both cloud type and characteristics. Despite this large dispersion of values corresponding to the same cloud cover, it is clear that the cloud effect on UV radiation is 15–45% lower than the cloud effect on total solar radiation. The cloud effect is usually a reducing effect, but a significant number of works report an enhancement effect (that is, increased UV radiation levels at the surface) due to the presence of clouds. The review concludes with some recommendations for future studies aimed at further analyzing the cloud effects on UV radiation.
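The metric used for comparison across works is worth stating explicitly. With E denoting a UV irradiance (or dose), the cloud modification factor is:

```latex
\mathrm{CMF} \;=\;
\frac{E_{\mathrm{UV}}^{\text{cloudy}}\ \text{(measured)}}
     {E_{\mathrm{UV}}^{\text{clear}}\ \text{(modelled for a cloudless sky)}},
\qquad
\mathrm{CMF} \approx 0.3\text{--}0.7 \ \text{for overcast skies.}
```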
Abstract:
Glioma cell lines are an important tool for research in basic and translational neuro-oncology. Documentation of their genetic identity has become a requirement for scientific journals and grant applications to exclude cross-contamination and misidentification that lead to misinterpretation of results. Here, we report the standard 16 marker short tandem repeat (STR) DNA fingerprints for a panel of 39 widely used glioma cell lines as reference. Comparison of the fingerprints among themselves and with the large DSMZ database comprising 9 marker STRs for 2278 cell lines uncovered 3 misidentified cell lines and confirmed previously known cross-contaminations. Furthermore, 2 glioma cell lines exhibited identity scores of 0.8, which is proposed as the cutoff for detecting cross-contamination. Additional characteristics, comprising lack of a B-raf mutation in one line and a similarity score of 1 with the original tumor tissue in the other, excluded a cross-contamination. Subsequent simulation procedures suggested that, when using DNA fingerprints comprising only 9 STR markers, the commonly used similarity score of 0.8 is not sufficiently stringent to unambiguously differentiate the origin. DNA fingerprints are confounded by frequent genetic alterations in cancer cell lines, particularly loss of heterozygosity, that reduce the informativeness of STR markers and, thereby, the overall power for distinction. The similarity score depends on the number of markers measured; thus, more markers or additional cell line characteristics, such as information on specific mutations, may be necessary to clarify the origin.
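The similarity scores discussed above are typically computed from shared STR alleles. A minimal sketch, assuming the Tanabe variant (twice the shared allele count over the alleles counted in both profiles), with toy three-marker profiles:

```python
def tanabe_similarity(profile_a, profile_b):
    """Tanabe STR similarity: 2 * shared alleles / (alleles in A + alleles in B).

    Profiles map STR marker name -> set of observed alleles; loss of
    heterozygosity reduces a marker to a single allele, which is one reason
    scores drift in cancer cell lines, as noted above.
    """
    markers = set(profile_a) & set(profile_b)
    shared = sum(len(profile_a[m] & profile_b[m]) for m in markers)
    total = sum(len(profile_a[m]) + len(profile_b[m]) for m in markers)
    return 2.0 * shared / total if total else 0.0

# Toy 3-marker profiles (real panels use 9-16 markers, as discussed above):
a = {"TH01": {6, 9}, "D5S818": {11, 12}, "vWA": {17, 18}}
b = {"TH01": {6, 9}, "D5S818": {11},     "vWA": {17, 19}}
print(round(tanabe_similarity(a, b), 2))  # 0.73 on this toy example
```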
Abstract:
The diagnosis of synovial amyloidosis is based upon synovial biopsy. The synovial fluid (SF) of seven patients with amyloid arthropathy associated with chronic renal failure undergoing haemodialysis was studied. The SF and synovial samples of 10 consecutive patients with seronegative mono- or oligoarthritis served as controls. Six of the seven patients with amyloid-positive synovial biopsy specimens showed amyloid in their SF, giving a sensitivity of 85.7%. No amyloid was found in the synovial tissue or fluid of the 10 patients in the control group. The finding of amyloid in SF was highly reproducible, its presence being shown in the same joint on several occasions. The deposits showed Congophilia resistant to potassium permanganate pretreatment, and immunohistochemical analysis proved that they contained beta 2 microglobulin. The high sensitivity and good reproducibility of the method show that the finding of amyloid in SF is sufficient for the diagnosis of synovial amyloidosis. It is possible to perform immunohistochemical analysis on the SF deposits as well.