60 results for "New statistics for monitoring"
Abstract:
Exercise intolerance may be reported by parents of young children with respiratory diseases. There is, however, a lack of standardized exercise protocols that allow verification of these reports, especially in younger children. Consequently, the aim of this pilot study was to develop a standardized treadmill walking test for children aged 4-10 years that demands low sensorimotor skills while achieving high physical exhaustion. In a prospective experimental cross-sectional pilot study, 33 healthy Caucasian children were separated into three groups: G1 (4-6 years, n = 10), G2 (7-8 years, n = 12), and G3 (9-10 years, n = 11). Children performed the treadmill walking test with increasing exercise levels up to peak condition with maximal exhaustion. Gas exchange, heart rate, and lactate were measured during the test, and spirometry was performed before and after. Parameters were statistically compared at all exercise levels as well as at the 2 and 4 mmol/L lactate levels for group differences (Kruskal-Wallis H-test, alpha = 0.05; post hoc: Mann-Whitney U-test with Bonferroni correction, alpha = 0.05/n) and test-retest differences (Wilcoxon rank-sum test) using SPSS. The treadmill walking test proved feasible, with good repeatability within groups for most parameters. All children achieved a high exhaustion level. At peak level under exhaustion conditions, only absolute VO2 and VCO2 differed significantly between age groups. In conclusion, this newly designed treadmill walking test shows good feasibility, safety, and repeatability. It suggests the potential usefulness of exercise capacity monitoring for children aged 4 to 10 years. Various applications and test modifications will be investigated in further studies.
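The group comparison described above (omnibus Kruskal-Wallis test, then pairwise Mann-Whitney U tests with Bonferroni-corrected alpha) can be sketched as follows. The abstract reports using SPSS; this is an equivalent sketch in Python with SciPy, and the data values are invented for illustration only.

```python
from itertools import combinations
from scipy.stats import kruskal, mannwhitneyu

# Illustrative peak-VO2 values (ml/min) for the three age groups; the numbers
# are invented and do not come from the study.
groups = {
    "G1": [820, 900, 870, 950, 880],
    "G2": [1100, 1180, 1050, 1220, 1150],
    "G3": [1400, 1350, 1500, 1450, 1380],
}

# Omnibus test across the three groups (alpha = 0.05).
h, p = kruskal(*groups.values())
print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.4f}")

# Post hoc pairwise Mann-Whitney U tests with Bonferroni correction:
# alpha is divided by the number of comparisons (here n = 3).
pairs = list(combinations(groups, 2))
alpha_corr = 0.05 / len(pairs)
for a, b in pairs:
    u, p_pair = mannwhitneyu(groups[a], groups[b], alternative="two-sided")
    print(f"{a} vs {b}: U = {u:.1f}, p = {p_pair:.4f}, "
          f"significant at alpha = {alpha_corr:.4f}: {p_pair < alpha_corr}")
```

With well-separated groups like these, the omnibus test rejects and each pairwise comparison survives the corrected threshold.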
Abstract:
Various treatment options for deep cartilage defects are presently available. The efficacy of bone marrow stimulation with microfracture, of mosaicplasty and of various autologous chondrocyte implantation (ACI) techniques has been subject to numerous studies recently. Magnetic resonance imaging (MRI) has gained a major role in the assessment of cartilage repair. The introduction of high-field MRI to clinical routine makes high resolution and three-dimensional imaging readily available. New quantitative MRI techniques that directly visualize the molecular structure of cartilage may further advance our understanding of cartilage repair. The clinical evaluation of cartilage repair tissue is a complex issue, and MR imaging will become increasingly important both in research and in clinical routine. This article reviews the clinical aspects of microfracture, mosaicplasty, and ACI and reports the recent technical advances that have improved MRI of cartilage. Morphological evaluation methods are recommended for each of the respective techniques. Finally, an overview of T2 mapping and delayed gadolinium-enhanced MR imaging of cartilage in cartilage repair is provided.
Abstract:
Research in autophagy continues to accelerate,(1) and as a result many new scientists are entering the field. Accordingly, it is important to establish a standard set of criteria for monitoring macroautophagy in different organisms. Recent reviews have described the range of assays that have been used for this purpose.(2,3) There are many useful and convenient methods that can be used to monitor macroautophagy in yeast, but relatively few in other model systems, and there is much confusion regarding acceptable methods to measure macroautophagy in higher eukaryotes. A key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers of autophagosomes versus those that measure flux through the autophagy pathway; thus, a block in macroautophagy that results in autophagosome accumulation needs to be differentiated from fully functional autophagy that includes delivery to, and degradation within, lysosomes (in most higher eukaryotes) or the vacuole (in plants and fungi). Here, we present a set of guidelines for the selection and interpretation of the methods that can be used by investigators who are attempting to examine macroautophagy and related processes, as well as by reviewers who need to provide realistic and reasonable critiques of papers that investigate these processes. This set of guidelines is not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to verify an autophagic response.
Abstract:
Questionnaire data may contain missing values because certain questions do not apply to all respondents. For instance, questions addressing particular attributes of a symptom, such as frequency, triggers or seasonality, are only applicable to those who have experienced the symptom, while for those who have not, responses to these items will be missing. This missing information does not fall into the category 'missing by design', rather the features of interest do not exist and cannot be measured regardless of survey design. Analysis of responses to such conditional items is therefore typically restricted to the subpopulation in which they apply. This article is concerned with joint multivariate modelling of responses to both unconditional and conditional items without restricting the analysis to this subpopulation. Such an approach is of interest when the distributions of both types of responses are thought to be determined by common parameters affecting the whole population. By integrating the conditional item structure into the model, inference can be based both on unconditional data from the entire population and on conditional data from subjects for whom they exist. This approach opens new possibilities for multivariate analysis of such data. We apply this approach to latent class modelling and provide an example using data on respiratory symptoms (wheeze and cough) in children. Conditional data structures such as that considered here are common in medical research settings and, although our focus is on latent class models, the approach can be applied to other multivariate models.
Abstract:
BACKGROUND: Complete investigation of thrombophilic or hemorrhagic clinical presentations is a time-, apparatus-, and cost-intensive process. Sensitive screening tests for characterizing the overall function of the hemostatic system, or defined parts of it, would be very useful. For this purpose, we are developing an electrochemical biosensor system that allows measurement of thrombin generation in whole blood as well as in plasma. METHODS: The measuring system consists of a single-use electrochemical sensor in the shape of a strip and a measuring unit connected to a personal computer, recording the electrical signal. Blood is added to a specific reagent mixture immobilized in dry form on the strip, including a coagulation activator (e.g., tissue factor or silica) and an electrogenic substrate specific to thrombin. RESULTS: Increasing thrombin concentrations gave standard curves with progressively increasing maximal current and decreasing time to reach the peak. Because the measurement was unaffected by color or turbidity, any type of blood sample could be analyzed: platelet-poor plasma, platelet-rich plasma, and whole blood. The test strips with the predried reagents were stable when stored for several months before testing. Analysis of the combined results obtained with different activators allowed discrimination between defects of the extrinsic, intrinsic, and common coagulation pathways. Activated protein C (APC) predried on the strips allowed identification of APC-resistance in plasma and whole blood samples. CONCLUSIONS: The biosensor system provides a new method for assessing thrombin generation in plasma or whole blood samples as small as 10 microL. The assay is easy to use, thus allowing it to be performed in a point-of-care setting.
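The two readouts extracted from the thrombin standard curves above are the maximal current and the time to reach that peak. A minimal sketch of that extraction, on an invented current trace (not data from the biosensor study):

```python
def peak_and_time(times, currents):
    """Return (peak current, time to peak) from a recorded current trace."""
    i_max = max(range(len(currents)), key=currents.__getitem__)
    return currents[i_max], times[i_max]

# Illustrative trace: time in seconds, current in microamperes (invented values).
times = [0, 30, 60, 90, 120, 150, 180]
currents = [0.0, 0.4, 1.1, 2.3, 1.9, 1.2, 0.8]

peak, t_peak = peak_and_time(times, currents)
print(f"peak current = {peak} uA at t = {t_peak} s")
```

Per the abstract, higher thrombin concentrations shift these two values in opposite directions: the peak rises while the time to reach it shortens.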
Gastroesophageal reflux and pulmonary fibrosis in scleroderma: a study using pH-impedance monitoring
Abstract:
RATIONALE: Interstitial lung disease (ILD) in patients with systemic sclerosis (SSc) is associated with increased morbidity and mortality. Gastroesophageal reflux (GER) is considered a contributing factor in the pathogenesis of ILD. OBJECTIVES: To characterize GER (acid and nonacid) in patients with SSc with and without ILD. METHODS: Patients with SSc underwent pulmonary high-resolution computed tomography (HRCT) scanning and 24-hour impedance-pH monitoring off proton pump inhibitor therapy. The presence of pulmonary fibrosis was assessed using validated HRCT scores. Reflux monitoring parameters included number of acid and nonacid reflux episodes, proximal migration of the refluxate, and distal esophageal acid exposure. Unless otherwise specified, data are presented as median (25th-75th percentile). MEASUREMENTS AND MAIN RESULTS: Forty consecutive patients with SSc (35 female; mean age, 53 yr; range, 24-71; 15 patients with diffuse and 25 with limited SSc) were investigated; 18 (45%) patients with SSc had pulmonary fibrosis (HRCT score ≥ 7). Patients with SSc with ILD had higher (P < 0.01) esophageal acid exposure (10.3 [7.5-15] vs. 5.2 [1.5-11]), higher (P < 0.01) numbers of acid (41 [31-58] vs. 19 [10-23]) and nonacid (25 [20-35] vs. 17 [11-19]) reflux episodes, and a higher (P < 0.01) number of reflux episodes reaching the proximal esophagus (42.5 [31-54] vs. 15 [8-22]) compared with patients with SSc with normal HRCT scores. Pulmonary fibrosis scores (HRCT score) correlated well with the number of reflux episodes in the distal (r(2) = 0.637) and proximal (r(2) = 0.644) esophagus. CONCLUSIONS: Patients with SSc with ILD have more severe reflux (i.e., more reflux episodes and more reflux reaching the proximal esophagus). Whether or not the development of ILD in patients with SSc can be prevented by reflux-reducing treatments needs to be investigated.
Abstract:
The rise of evidence-based medicine as well as important progress in statistical methods and computational power have led to a second birth of the >200-year-old Bayesian framework. The use of Bayesian techniques, in particular in the design and interpretation of clinical trials, offers several substantial advantages over the classical statistical approach. First, in contrast to classical statistics, Bayesian analysis allows a direct statement regarding the probability that a treatment was beneficial. Second, Bayesian statistics allow the researcher to incorporate any prior information in the analysis of the experimental results. Third, Bayesian methods can efficiently handle complex statistical models, which are suited for advanced clinical trial designs. Finally, Bayesian statistics encourage a thorough consideration and presentation of the assumptions underlying an analysis, which enables the reader to fully appraise the authors' conclusions. Both Bayesian and classical statistics have their respective strengths and limitations and should be viewed as being complementary to each other; we do not attempt to make a head-to-head comparison, as this is beyond the scope of the present review. Rather, the objective of the present article is to provide a nonmathematical, reader-friendly overview of the current practice of Bayesian statistics coupled with numerous intuitive examples from the field of oncology. It is hoped that this educational review will be a useful resource to the oncologist and result in a better understanding of the scope, strengths, and limitations of the Bayesian approach.
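The first advantage noted above, a direct probability statement that a treatment was beneficial, can be illustrated with a minimal conjugate Beta-Binomial sketch. The trial counts below are invented for illustration and are not from any study cited in the review.

```python
import random

random.seed(42)

# Hypothetical two-arm trial: 18/30 responders on treatment, 11/30 on control.
t_resp, t_n = 18, 30
c_resp, c_n = 11, 30

# With a uniform Beta(1, 1) prior, each arm's posterior response rate is
# Beta(successes + 1, failures + 1). Estimate P(p_treatment > p_control)
# by Monte Carlo sampling from the two posteriors.
draws = 20000
wins = sum(
    random.betavariate(t_resp + 1, t_n - t_resp + 1)
    > random.betavariate(c_resp + 1, c_n - c_resp + 1)
    for _ in range(draws)
)
print(f"P(treatment better than control) = {wins / draws:.3f}")
```

The resulting number is exactly the kind of statement classical p-values cannot make: the posterior probability, given the data and the prior, that the treatment arm has the higher response rate.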
Abstract:
The physiology and current knowledge about gestational diabetes that led the Swiss Society for Endocrinology and Diabetes to adopt new diagnostic criteria and blood glucose target levels during pregnancy are reviewed. The 6th International Workshop Conference on Gestational Diabetes Mellitus in Pasadena (2008) defined new diagnostic criteria based on the results of the HAPO trial. These criteria were presented at the ADA congress in New Orleans in 2009. According to the new criteria there is no need for screening; instead, all pregnant women have to be tested with a 75 g oral glucose tolerance test between the 24th and 28th week of pregnancy. The new diagnostic values are very similar to those previously adopted by the ADA, with the exception that only one out of three values has to be elevated in order to make the diagnosis of gestational diabetes. Due to this important difference, it is very likely that gestational diabetes will be diagnosed more frequently in the future. The diagnostic criteria are: fasting plasma glucose ≥ 5.1 mmol/l, 1-hour value ≥ 10.0 mmol/l, or 2-hour value ≥ 8.5 mmol/l. Based on current knowledge and randomized trials, it is much more difficult to define glucose target levels during pregnancy. This difficulty has led to many different recommendations issued by diabetes societies. The Swiss Society for Endocrinology and Diabetes follows the arguments of the International Diabetes Federation (IDF) that blood glucose self-monitoring itself lacks precision and that there are very few randomized trials. Therefore, the target levels have to be easy to remember and might be slightly different in mmol/l or mg/dl. The Swiss Society for Endocrinology and Diabetes adopts the tentative target values of the IDF: fasting plasma glucose < 5.3 mmol/l and 1- and 2-hour postprandial values (after the end of the meal) of < 8.0 and < 7.0 mmol/l, respectively.
The last part of these recommendations deals with the therapeutic options during pregnancy (nutrition, physical exercise, and pharmacological treatment). If the target values are not met despite lifestyle changes, approximately 25% of patients have to be treated pharmacologically. Insulin therapy is still the preferred treatment option, but metformin (and, as an exception, glibenclamide) can be used if there are major hurdles to the initiation of insulin therapy.
Abstract:
BACKGROUND Monitoring of HIV viral load in patients on combination antiretroviral therapy (ART) is not generally available in resource-limited settings. We examined the cost-effectiveness of qualitative point-of-care viral load tests (POC-VL) in sub-Saharan Africa. DESIGN Mathematical model based on longitudinal data from the Gugulethu and Khayelitsha township ART programmes in Cape Town, South Africa. METHODS Cohorts of patients on ART monitored by POC-VL, CD4 cell count, or clinically were simulated. Scenario A considered the more accurate detection of treatment failure with POC-VL only, and Scenario B also considered the effect on HIV transmission. Scenario C further assumed that the risk of virologic failure is halved with POC-VL due to improved adherence. We estimated the change in costs per quality-adjusted life-year gained (incremental cost-effectiveness ratios, ICERs) of POC-VL compared with CD4 and clinical monitoring. RESULTS POC-VL tests with detection limits less than 1000 copies/ml increased costs due to unnecessary switches to second-line ART, without improving survival. Assuming POC-VL unit costs between US$5 and US$20 and detection limits between 1000 and 10,000 copies/ml, the ICER of POC-VL was US$4010-US$9230 compared with clinical monitoring and US$5960-US$25540 compared with CD4 cell count monitoring. In Scenario B, the corresponding ICERs were US$2450-US$5830 and US$2230-US$10380. In Scenario C, the ICER ranged between US$960 and US$2500 compared with clinical monitoring and between cost-saving and US$2460 compared with CD4 monitoring. CONCLUSION The cost-effectiveness of POC-VL for monitoring ART is improved by a higher detection limit, by taking the reduction in new HIV infections into account, and by assuming that failure of first-line ART is reduced due to targeted adherence counselling.
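The ICER reported above is defined as the difference in costs between two strategies divided by the difference in quality-adjusted life-years (QALYs). A minimal sketch of that computation; the per-patient numbers in the example are invented, not taken from the modelled Cape Town cohorts:

```python
def icer(cost_new: float, cost_old: float, qaly_new: float, qaly_old: float) -> float:
    """Incremental cost-effectiveness ratio: incremental cost per QALY gained."""
    delta_qaly = qaly_new - qaly_old
    if delta_qaly <= 0:
        # The new strategy gains no QALYs; an ICER is not meaningful here.
        raise ValueError("comparator dominates or QALYs are unchanged")
    return (cost_new - cost_old) / delta_qaly

# Hypothetical example: POC-VL adds US$450 in lifetime cost and 0.05 QALYs
# per patient versus clinical monitoring -> US$9000 per QALY gained.
print(f"ICER = US${icer(5450.0, 5000.0, 4.25, 4.20):,.0f} per QALY gained")
```

"Cost-saving" in the abstract's Scenario C corresponds to a negative cost difference with positive QALY gain, i.e. the new strategy dominates and no ratio needs to be reported.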
Abstract:
OBJECTIVES Application of the recently developed optical method, based on monitoring of the specular reflection intensity, to study the protective potential of the salivary pellicle layer against early enamel erosion. METHODS Erosion progression was compared between two treatment groups: enamel samples coated with a salivary pellicle layer formed in vitro over 15 h (group P, n=90) and non-coated enamel surfaces (control group C, n=90). Different severities of erosive impact were modelled by incubating the enamel in 1% citric acid (pH=3.6) for 2, 4, 8, 10 or 15 min. Erosion was quantified by the optical method as well as by microhardness and calcium release analyses. RESULTS Optical assessment of erosion progression showed erosion inhibition by the in vitro salivary pellicle during short-term acidic treatments (≤ 4 min), which was also confirmed by microhardness measurements showing significantly less (p<0.05) enamel softening in group P at 2 and 4 min of erosion compared with group C. SEM images likewise demonstrated less etched enamel interfaces in group P at short erosion durations. CONCLUSIONS Monitoring of the specular reflection intensity can be successfully applied to quantify early erosion progression in comparative studies. The in vitro salivary pellicle (2 h) provides erosion inhibition, but only during short-term acidic exposures. CLINICAL SIGNIFICANCE The proposed optical technique is a promising tool for fast and non-invasive erosion quantification in clinical studies.
Abstract:
Recently divergent species that can hybridize are ideal models for investigating the genetic exchanges that can occur while preserving the species boundaries. Petunia exserta is an endemic species from a very limited and specific area that grows exclusively in rocky shelters. These shaded spots are an inhospitable habitat for all other Petunia species, including the closely related and widely distributed species P. axillaris. Individuals with intermediate morphologic characteristics have been found near the rocky shelters and were believed to be putative hybrids between P. exserta and P. axillaris, suggesting a situation where Petunia exserta is losing its genetic identity. In the current study, we analyzed the plastid intergenic spacers trnS/trnG and trnH/psbA and six nuclear CAPS markers in a large sampling design of both species to understand the evolutionary process occurring in this biological system. Bayesian clustering methods, cpDNA haplotype networks, genetic diversity statistics, and coalescence-based analyses support a scenario where hybridization occurs while two genetic clusters corresponding to two species are maintained. Our results reinforce the importance of coupling differentially inherited markers with an extensive geographic sample to assess the evolutionary dynamics of recently diverged species that can hybridize.
Abstract:
We investigated how processing fluency and defamiliarization (the art of rendering familiar notions unfamiliar) contribute to the affective and esthetic processing of reading in an event-related functional magnetic resonance imaging experiment. We compared the neural correlates of processing (a) familiar German proverbs, (b) unfamiliar proverbs, (c) defamiliarized variations with altered content relative to the original proverb (proverb-variants), (d) defamiliarized versions with unexpected wording but the same content as the original proverb (proverb-substitutions), and (e) non-rhetorical sentences. Here, we demonstrate that defamiliarization is an effective way of guiding attention, but that the degree of affective involvement depends on the type of defamiliarization: enhanced activation in affect-related regions (orbitofrontal cortex, medPFC) was found only if defamiliarization altered the content of the original proverb. Defamiliarization on the level of wording was associated with attention processes and error monitoring. Although proverb-variants evoked activation in affect-related regions, familiar proverbs received the highest beauty ratings.
Abstract:
New directly acting antivirals (DAAs) that inhibit hepatitis C virus (HCV) replication are increasingly used for the treatment of chronic hepatitis C. A marked pharmacokinetic variability and a high potential for drug-drug interactions between DAAs and numerous drug classes have been identified. In addition, ribavirin (RBV), commonly associated with hemolytic anemia, often requires dose adjustment, advocating for therapeutic drug monitoring (TDM) in patients under combined antiviral therapy. However, an assay for the simultaneous analysis of RBV and DAAs constitutes an analytical challenge because of the large differences in polarity among these drugs, ranging from hydrophilic (RBV) to highly lipophilic (telaprevir [TVR]). Moreover, TVR is characterized by erratic behavior on standard octadecyl-based reversed-phase column chromatography and must be separated from VRT-127394, its inactive C-21 epimer metabolite. We have developed a convenient assay employing simple plasma protein precipitation, followed by high-performance liquid chromatography coupled to tandem mass spectrometry (HPLC-MS/MS) for the simultaneous determination of levels of RBV, boceprevir, and TVR, as well as its metabolite VRT-127394, in plasma. This new, simple, rapid, and robust HPLC-MS/MS assay offers an efficient method of real-time TDM aimed at maximizing efficacy while minimizing the toxicity of antiviral therapy.
Abstract:
Dendrogeomorphology uses information sources recorded in the roots, trunks and branches of trees and bushes located in the fluvial system to complement (or sometimes even replace) systematic and palaeohydrological records of past floods. The application of dendrogeomorphic data sources and methods to palaeoflood analysis over nearly 40 years has allowed improvements to be made in frequency and magnitude estimations of past floods. Nevertheless, research carried out so far has shown that the dendrogeomorphic indicators traditionally used (mainly scar evidence), and their use to infer frequency and magnitude, have been restricted to a small, limited set of applications. New possibilities with enormous potential remain unexplored. New insights in future research of palaeoflood frequency and magnitude using dendrogeomorphic data sources should: (1) test the application of isotopic indicators (16O/18O ratio) to discover the meteorological origin of past floods; (2) use different dendrogeomorphic indicators to estimate peak flows with 2D (and 3D) hydraulic models and study how they relate to other palaeostage indicators; (3) investigate improved calibration of 2D hydraulic model parameters (roughness); and (4) apply statistics-based cost–benefit analysis to select optimal mitigation measures. This paper presents an overview of these innovative methodologies, with a focus on their capabilities and limitations in the reconstruction of recent floods and palaeofloods.
Abstract:
This article contributes to an ongoing debate about how to measure sensitive topics in population surveys. We propose a novel technique that can be applied to the measurement of quantitative sensitive variables: the item sum technique (IST). This method is closely related to the item count technique, which was developed for the measurement of dichotomous sensitive items. First, we provide a description of our new technique and discuss how data collected by the IST can be analyzed. Second, we present the results of a CATI survey on undeclared work in Germany, in which the IST has been applied. Using an experimental design, we compare the IST to direct questioning. Our empirical results indicate that the IST is a promising data-collection technique for sensitive questions. We conclude by discussing the limitations of the new technique and outlining possible improvements for future studies.
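The core idea of the item sum technique described above admits a very short analytical sketch: respondents are randomized to a long list (non-sensitive items plus the sensitive quantitative item) or a short list (non-sensitive items only) and report only the sum, so the mean of the sensitive item is estimated as the difference between the two group means. The response data below are invented for illustration.

```python
def ist_estimate(long_list_sums, short_list_sums):
    """Estimated mean of the sensitive quantitative item (e.g. hours of
    undeclared work per week): mean(long-list sums) - mean(short-list sums).
    Assumes random assignment of respondents to the two list conditions."""
    mean_long = sum(long_list_sums) / len(long_list_sums)
    mean_short = sum(short_list_sums) / len(short_list_sums)
    return mean_long - mean_short

# Invented reported sums from the two randomized groups.
long_sums = [12, 9, 15, 11, 14, 10]   # non-sensitive items + sensitive item
short_sums = [8, 7, 10, 9, 8, 6]      # non-sensitive items only
print(f"estimated mean of sensitive item: {ist_estimate(long_sums, short_sums):.2f}")
```

Because no individual ever reveals the sensitive value directly, the estimator trades respondent privacy for added variance, which is why the article compares the IST against direct questioning in an experimental design.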