963 results for INTERVAL METHOD
Abstract:
To elucidate the mechanisms of antischistosomal resistance, drug-resistant Schistosoma mansoni laboratory isolates are essential. We developed a new method for inducing resistance to praziquantel (PZQ) by successive drug treatments of Biomphalaria glabrata snails infected with S. mansoni. Infected B. glabrata were treated three times with 100 mg/kg PZQ for five consecutive days, with a one-week interval between treatments. After treatment, the cercariae (LE-PZQ) produced by these snails and those of the susceptible LE strain were used to infect mice. Forty-five days after infection, the mice were treated with 200, 400 or 800 mg/kg PZQ. Thirty days post-treatment, the mean number of worms recovered by perfusion was significantly higher in mice infected with the LE-PZQ isolate and treated with 200 or 400 mg/kg than in mice infected with the LE strain and given the same treatment. Moreover, there was a significant difference between the ED50 (effective dose required to kill 50% of the worms) of the LE-PZQ isolate (362 mg/kg) and that of the LE strain (68 mg/kg). In the in vitro assays, worms of the LE-PZQ isolate were also less susceptible to PZQ. Thus, the use of infected snails as an experimental model for developing PZQ resistance in S. mansoni is effective, fast, simple and cheap.
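As an illustration of how an ED50 of this kind can be obtained from dose-response data, the following minimal Python sketch fits a log-logistic (Hill) curve to worm-kill fractions; the doses and kill fractions are hypothetical and are not the study's data.

```python
# Minimal sketch (not the authors' code): estimating an ED50 from
# dose-response data with a log-logistic (Hill) fit. Doses and kill
# fractions are hypothetical illustration values only.
import numpy as np
from scipy.optimize import curve_fit

def hill(dose, ed50, slope):
    """Fraction of worms killed as a function of PZQ dose (mg/kg)."""
    return dose**slope / (ed50**slope + dose**slope)

doses = np.array([100.0, 200.0, 400.0, 800.0])        # mg/kg (hypothetical)
killed_fraction = np.array([0.10, 0.35, 0.55, 0.80])  # hypothetical

params, _ = curve_fit(hill, doses, killed_fraction, p0=[300.0, 1.0])
ed50, slope = params
print(f"Estimated ED50: {ed50:.0f} mg/kg (Hill slope {slope:.2f})")
```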
Abstract:
The recommended treatment for latent tuberculosis (TB) infection in adults is a daily dose of isoniazid (INH) 300 mg for six months. In Brazil, INH was formulated as 100 mg tablets, and the treatment duration and high pill burden compromised patient adherence. The Brazilian National Programme for Tuberculosis therefore requested a new 300 mg INH formulation. The aim of our study was to compare the bioavailability of the new INH 300 mg formulation with that of three 100 mg tablets of the reference formulation. We conducted a randomised, single-dose, open-label, two-phase crossover bioequivalence study in 28 healthy volunteers. To determine the level of INH in human plasma, we developed and validated a sensitive, simple and rapid high-performance liquid chromatography-tandem mass spectrometry method. The 90% confidence intervals for the maximum plasma concentration of INH and for the area under the plasma concentration vs. time curve from time zero to the last measurable concentration (time t) were 89.61-115.92 and 94.82-119.44, respectively. The main limitation of our study was that neither adherence nor the safety profile of multiple doses was evaluated. Our results showed that the new formulation was bioequivalent to the 100 mg reference product. This finding supports a single 300 mg tablet daily strategy to treat latent TB, which may increase patients' adherence to the treatment and their quality of life.
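A minimal sketch of the bioequivalence criterion applied above: a 90% confidence interval for the test/reference ratio of geometric means, computed here from paired log-scale differences. The values are hypothetical, and a full analysis would also model the period and sequence effects of the crossover design.

```python
# Minimal, simplified sketch: 90% CI for the ratio of geometric means of a
# pharmacokinetic metric (e.g. AUC0-t), from paired log-scale differences.
# Values are hypothetical; period/sequence effects are ignored here.
import numpy as np
from scipy import stats

auc_test = np.array([21.3, 18.7, 25.1, 19.9, 22.8, 20.4])  # hypothetical
auc_ref  = np.array([20.1, 19.5, 23.8, 21.2, 21.9, 19.7])  # hypothetical

diff = np.log(auc_test) - np.log(auc_ref)        # within-subject log differences
mean, sem = diff.mean(), stats.sem(diff)
lo, hi = stats.t.interval(0.90, df=len(diff) - 1, loc=mean, scale=sem)

ratio_ci = np.exp([lo, hi]) * 100                # back-transform to percent
print(f"90% CI for the ratio: {ratio_ci[0]:.1f}%-{ratio_ci[1]:.1f}%")
print("Bioequivalent (80-125% rule):", 80 <= ratio_ci[0] and ratio_ci[1] <= 125)
```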
Abstract:
Objectives. The goal of this study was to evaluate a T2-mapping sequence by: (i) measuring the intra- and inter-observer variability and reproducibility in healthy volunteers in two separate scanning sessions with a T2 reference phantom; and (ii) measuring the mean T2 relaxation times by T2-mapping in the infarcted myocardium of patients with subacute MI and comparing them with the gold-standard X-ray coronary angiography and with the results in healthy volunteers. Background. Myocardial edema is a consequence of tissue inflammation, as seen in myocardial infarction (MI). It can be visualized by cardiovascular magnetic resonance (CMR) imaging using the T2 relaxation time. T2-mapping is a quantitative methodology that has the potential to address the limitations of conventional T2-weighted (T2W) imaging. Methods. The T2-mapping protocol used for all MRI scans consisted of a radial gradient echo acquisition with a lung-liver navigator for free-breathing acquisition and affine image registration. Mid-basal short axis slices were acquired. For the T2-map analyses, two observers semi-automatically segmented the left ventricle into 6 segments according to the AHA standards. Eight healthy volunteers (age: 27 ± 4 years; 62.5% male) were scanned in two separate sessions. Seventeen patients (age: 61.9 ± 13.9 years; 82.4% male) with subacute STEMI (70.6%) or NSTEMI underwent a T2-mapping scanning session. Results. In healthy volunteers, the mean inter- and intra-observer variability over the entire short axis slice (segments 1 to 6) was 0.1 ms (95% confidence interval (CI): -0.4 to 0.5, p = 0.62) and 0.2 ms (95% CI: -2.8 to 3.2, p = 0.94), respectively. T2 relaxation time measurements with and without the phantom correction yielded average differences of 3.0 ± 1.1% and 3.1 ± 2.1% (p = 0.828), respectively. In patients, the inter-observer variability over the entire short axis slice (S1-S6) was 0.3 ms (95% CI: -1.8 to 2.4, p = 0.85). The edema location determined by T2-mapping and the coronary artery occlusion determined by X-ray coronary angiography agreed in 78.6% of cases, but in only 60% of apical infarcts. All except one of the maximal T2 values in infarct patients were greater than the upper limit of the 95% confidence interval for normal myocardium. Conclusions. The T2-mapping methodology is accurate in detecting infarcted, i.e. edematous, tissue in patients with subacute infarcts. This study further demonstrated that this T2-mapping technique is reproducible and robust enough to be used on a segmental basis for edema detection without the need for a phantom-derived T2 correction factor. This new quantitative T2-mapping technique is promising and is likely to allow serial follow-up studies in patients to improve our knowledge of infarct pathophysiology and infarct healing, and to assess novel treatment strategies for acute infarctions.
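A minimal sketch of how the inter-observer variability reported above can be expressed (mean paired difference, its 95% confidence interval and a paired t-test); the segmental T2 values below are hypothetical.

```python
# Minimal sketch (not the study's analysis code): inter-observer variability
# for segmental T2 values as mean paired difference with 95% CI and a paired
# t-test p-value. T2 values (ms) are hypothetical.
import numpy as np
from scipy import stats

t2_observer1 = np.array([52.1, 53.4, 51.8, 54.0, 52.7, 53.1])  # ms, hypothetical
t2_observer2 = np.array([52.3, 53.0, 52.2, 53.8, 52.9, 53.4])  # ms, hypothetical

diff = t2_observer1 - t2_observer2
mean, sem = diff.mean(), stats.sem(diff)
ci_lo, ci_hi = stats.t.interval(0.95, df=len(diff) - 1, loc=mean, scale=sem)
t_stat, p_value = stats.ttest_rel(t2_observer1, t2_observer2)

print(f"Mean difference: {mean:.2f} ms "
      f"(95% CI: {ci_lo:.2f} to {ci_hi:.2f}), p = {p_value:.2f}")
```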
Abstract:
Uncertainties that are not considered in the analytical model of a plant dramatically decrease the performance of fault detection in practice. To cope with this prevalent problem, in this paper we develop a methodology using Modal Interval Analysis that takes those uncertainties in the plant model into account. A fault detection method is developed on the basis of this model; it is robust to uncertainty and produces no false alarms. As soon as a fault is detected, an ANFIS model is trained online to capture the main behavior of the fault that has occurred, which can then be used for fault accommodation. The simulation results clearly demonstrate the capability of the proposed method to accomplish both tasks appropriately.
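A minimal sketch of the underlying idea, not the paper's Modal Interval Analysis implementation: an uncertain (interval-parameter) model yields an output envelope, and a fault is flagged only when the measurement leaves that envelope, so model uncertainty alone cannot raise a false alarm. All numbers are hypothetical.

```python
# Minimal sketch: interval-envelope fault detection for a first-order
# process x[k+1] = a*x[k] + b*u[k] with interval parameters a and b.
# A fault is flagged only when the measurement leaves the envelope.
def envelope_step(x_lo, x_hi, u, a=(0.90, 0.95), b=(0.45, 0.55)):
    """Propagate the interval state one step through the uncertain model."""
    candidates_a = [a[0] * x_lo, a[0] * x_hi, a[1] * x_lo, a[1] * x_hi]
    candidates_b = [b[0] * u, b[1] * u]
    return min(candidates_a) + min(candidates_b), max(candidates_a) + max(candidates_b)

x_lo = x_hi = 0.0
u = 1.0
measurements = [0.5, 0.95, 1.3, 1.6, 3.5]   # last sample simulates a fault (hypothetical)
for k, y in enumerate(measurements):
    x_lo, x_hi = envelope_step(x_lo, x_hi, u)
    fault = not (x_lo <= y <= x_hi)
    print(f"k={k}: envelope=[{x_lo:.2f}, {x_hi:.2f}], y={y:.2f}, fault={fault}")
```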
Abstract:
A model-based approach for fault diagnosis is proposed, in which fault detection is based on checking the consistency of the Analytical Redundancy Relations (ARRs) using an interval tool. The tool accounts for uncertainty in the parameters and the measurements by means of intervals. Faults are explicitly included in the model, which allows additional information to be exploited. This information is obtained from partial derivatives computed from the ARRs. The signs of the residuals are used to prune the candidate space when performing the fault diagnosis task. The method is illustrated with a two-tank example, in which these aspects are shown to have an impact on diagnosis and fault discrimination, since the proposed method goes beyond purely structural methods.
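A minimal sketch of the two ingredients described above, with made-up numbers: an interval-valued residual from one ARR of a two-tank system, and a sign-based check against a hypothetical fault-signature table used to prune candidate faults.

```python
# Minimal sketch (illustrative only, not the paper's tool): an interval
# residual from one ARR of a tank, r = A*dh/dt - (qin - qout), with the
# tank area A known only as an interval. Residual sign prunes candidates.
def arr_residual(qin, qout, dh_dt, area=(0.95, 1.05)):
    """Interval of r = A*dh/dt - (qin - qout) over the interval area A."""
    r_candidates = [area[0] * dh_dt - (qin - qout), area[1] * dh_dt - (qin - qout)]
    return min(r_candidates), max(r_candidates)

# Hypothetical signatures: expected residual sign for each fault candidate.
signature = {"leak_tank1": -1, "clogged_outlet": +1}

r_lo, r_hi = arr_residual(qin=2.0, qout=1.0, dh_dt=1.0)   # consistent: interval contains 0
print("residual interval (no fault):", (round(r_lo, 3), round(r_hi, 3)))

r_lo, r_hi = arr_residual(qin=2.0, qout=1.0, dh_dt=-0.2)  # inconsistent, negative residual
if r_hi < 0:
    candidates = [f for f, s in signature.items() if s == -1]
elif r_lo > 0:
    candidates = [f for f, s in signature.items() if s == +1]
else:
    candidates = []
print("remaining candidates:", candidates)
```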
Abstract:
In forensic science, there is a strong interest in determining the post-mortem interval (PMI) of human skeletal remains up to 50 years after death. Currently, there are no reliable methods to resolve PMI, the determination of which relies almost exclusively on the experience of the investigating expert. Here we measured (90)Sr and (210)Pb ((210)Po) incorporated into bones through a biogenic process as indicators of the time elapsed since death. We hypothesised that the activity of radionuclides incorporated into trabecular bone will more accurately match the activity in the environment and the food chain at the time of death than the activity in cortical bone because of a higher remodelling rate. We found that determining (90)Sr can yield reliable PMI estimates as long as a calibration curve exists for (90)Sr covering the studied area and the last 50 years. We also found that adding the activity of (210)Po, a proxy for naturally occurring (210)Pb incorporated through ingestion, to the (90)Sr dating increases the reliability of the PMI value. Our results also show that trabecular bone is subject to both (90)Sr and (210)Po diagenesis. Accordingly, we used a solubility profile method to determine the biogenic radionuclide only, and we are proposing a new method of bone decontamination to be used prior to (90)Sr and (210)Pb dating.
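A minimal sketch of the calibration-curve idea, with entirely hypothetical values: the biogenic 90Sr activity measured in bone is matched against a regional curve of 90Sr activity versus calendar year of death to estimate the PMI.

```python
# Minimal sketch (hypothetical values, not the study's calibration):
# matching a measured bone 90Sr activity to a calibration curve of
# 90Sr activity versus calendar year of death.
import numpy as np

# Hypothetical calibration curve: 90Sr activity (mBq/g Ca) by year of death.
years = np.array([1965, 1975, 1985, 1995, 2005, 2015])
sr90_activity = np.array([220.0, 130.0, 80.0, 45.0, 25.0, 14.0])

measured = 60.0   # mBq/g Ca measured in the bone (hypothetical)
# Activity decreases with year, so interpolate on the reversed arrays.
year_of_death = np.interp(measured, sr90_activity[::-1], years[::-1])
pmi_years = 2024 - year_of_death
print(f"Estimated year of death: {year_of_death:.0f}, PMI ~ {pmi_years:.0f} years")
```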
Abstract:
Background: Publications from the International Breast Screening Network (IBSN) have shown that varying definitions create hurdles for comparisons of screening performance. Interval breast cancer rates are particularly affected. Objective: To test whether variations in the definition of the interval cancer rate (ICR) affect international comparisons, specifically a comparison of ICR in Norway and North Carolina (NC). Methods: An interval cancer (IC) was defined as a cancer diagnosed after a negative screening mammogram within a defined follow-up period. ICR was calculated for women aged 50-69 at subsequent screening in Norway and NC during 1996-2002. ICR was defined using three different denominators (negative screens, negative final assessments and all screens) and three different numerators (DCIS, invasive cancer and all cancers). ICR was then calculated with two methods: 1) the number of ICs divided by the number of screens, and 2) the number of ICs divided by the number of woman-years at risk for IC. Results: There were no differences in ICR depending on the definition used. In the 1-12 month follow-up period, ICRs (based on number of screens) were 0.53, 0.54 and 0.54 for Norway, and 1.20, 1.25 and 1.17 for NC, for negative screens, negative final assessments and all screens, respectively. The same trend was seen for the 13-24 and 1-24 month follow-up periods. Using woman-years for the analysis did not change the trend. ICR was higher in NC than in Norway under all definitions and in all follow-up periods, regardless of calculation method. Conclusion: The ICR within or between Norway and NC did not differ by the definition used. ICRs were higher in NC than in Norway. There are many potential explanations for this difference.
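A minimal sketch of the two calculation methods described above, using made-up counts: the interval cancer rate computed per screens and per woman-years at risk.

```python
# Minimal sketch with made-up counts: interval cancer rate (ICR) per
# 10,000 negative screens and per 10,000 woman-years at risk.
interval_cancers = 54          # hypothetical count in the 1-24 month window
negative_screens = 100_000     # hypothetical denominator (negative screens)
woman_years_at_risk = 190_000  # hypothetical woman-years accumulated

icr_per_screens = interval_cancers / negative_screens * 10_000
icr_per_woman_years = interval_cancers / woman_years_at_risk * 10_000
print(f"ICR: {icr_per_screens:.1f} per 10,000 screens; "
      f"{icr_per_woman_years:.1f} per 10,000 woman-years")
```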
Abstract:
Ambulatory blood pressure monitoring (ABPM) has become indispensable for the diagnosis and control of hypertension. However, no consensus exists on how daytime and nighttime periods should be defined. OBJECTIVE: To compare daytime and nighttime blood pressure (BP) defined by an actigraph and by body position with BP resulting from arbitrary daytime and nighttime periods. PATIENTS AND METHOD: ABPM, sleeping periods and body position were recorded simultaneously using an actigraph (SenseWear Armband®) in patients referred for ABPM. BP results obtained with the actigraph (sleep and position) were compared to the results obtained with fixed daytime (7 a.m.-10 p.m.) and nighttime (10 p.m.-7 a.m.) periods. RESULTS: Data from 103 participants were available. More than half of them were taking antihypertensive drugs. Nocturnal BP was lower (systolic BP: 2.08 ± 4.50 mmHg; diastolic BP: 1.84 ± 2.99 mmHg, P<0.05) and dipping was more marked (systolic BP: 1.54 ± 3.76%; diastolic BP: 2.27 ± 3.48%, P<0.05) when nighttime was defined with the actigraph. Standing BP was higher (systolic BP: 1.07 ± 2.81 mmHg; diastolic BP: 1.34 ± 2.50 mmHg) than daytime BP defined by a fixed period. CONCLUSION: Diurnal BP, nocturnal BP and dipping are influenced by the definition of daytime and nighttime periods. Studies evaluating the prognostic value of each method are needed to clarify which definition should be used.
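A minimal sketch of the dipping calculation underlying these comparisons, with hypothetical readings: the relative day-night fall in blood pressure under an actigraph-defined night versus a fixed 10 p.m.-7 a.m. night.

```python
# Minimal sketch (hypothetical readings): nocturnal dipping as the relative
# day-night fall in systolic BP, under two definitions of the night period.
def dipping_percent(day_bp, night_bp):
    """Relative nocturnal BP fall, in percent of the daytime value."""
    return (day_bp - night_bp) / day_bp * 100

day_sbp_actigraph, night_sbp_actigraph = 131.0, 114.0   # mmHg, hypothetical
day_sbp_fixed, night_sbp_fixed = 130.0, 116.0           # mmHg, hypothetical

print(f"Dipping (actigraph night): {dipping_percent(day_sbp_actigraph, night_sbp_actigraph):.1f}%")
print(f"Dipping (fixed night):     {dipping_percent(day_sbp_fixed, night_sbp_fixed):.1f}%")
```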
Abstract:
The kernel of the cutia nut (castanha-de-cutia, Couepia edulis (Prance) Prance) of the western Amazon, which is consumed by the local population, has traditionally been extracted from the nut with a machete, a dangerous procedure that only produces kernels cut in half. A shelling machine prototype, which produces whole kernels without serious risk to its operator, is described and tested. The machine makes a circular cut in the central part of the fruit shell, perpendicular to its main axis. Three ways of conditioning the fruits before cutting were compared: (1) control; (2) oven drying immediately prior to cutting; (3) oven drying followed by a 24-hour interval before cutting. The time needed to extract and separate the kernel from the endocarp and testa was measured. Treatment 3 produced the highest output (63 kernels per hour), the highest percentage of whole kernels (90%) and the best kernel taste. Kernel extraction with treatment 3 required 50% less time than treatment 1, while treatment 2 needed 38% less time than treatment 1. The proportion of kernels attached to the testa was 93%, 47% and 8% for treatments 1, 2 and 3, respectively, and was the main reason for the differences in extraction time.
Abstract:
Fifty-five bursae of Fabricius (BF) were evaluated by optical microscopy by three different avian histopathologists (H1, H3 and H4) to determine the degree of lymphoid depletion. One histopathologist evaluated the same slides at two different times (H1 and H2), with a four-month interval between the observations. The same BFs were evaluated with the Digital Lymphocyte Depletion Evaluation (ADDL) system, operated by three different operators who were not histopathologists. The results showed a significant difference between the histopathologists and between the scores established by the same expert (H1 and H2). However, there were no significant differences between the scores obtained with the ADDL system. These results make clear the fragility of the subjective lymphocyte depletion scoring used in the traditional histologic method, whereas the ADDL system proves to be more appropriate for assessing lymphoid loss in the BF.
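One way such rater agreement can be quantified, shown here as an illustration only (this is not the statistical test used in the study), is Cohen's kappa between two sets of depletion scores; the scores below are hypothetical.

```python
# Minimal sketch (hypothetical scores): agreement between two readings of
# lymphoid depletion scores (1-4) quantified with unweighted Cohen's kappa.
import numpy as np

def cohen_kappa(a, b, categories=(1, 2, 3, 4)):
    """Unweighted Cohen's kappa for two raters over ordinal categories."""
    a, b = np.asarray(a), np.asarray(b)
    observed = np.mean(a == b)
    expected = sum(np.mean(a == c) * np.mean(b == c) for c in categories)
    return (observed - expected) / (1 - expected)

h1 = [1, 2, 2, 3, 4, 3, 2, 1, 4, 3]   # hypothetical depletion scores, first reading
h2 = [1, 3, 2, 3, 3, 3, 2, 2, 4, 2]   # same slides, second reading (hypothetical)
print(f"Cohen's kappa (H1 vs H2): {cohen_kappa(h1, h2):.2f}")
```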
Abstract:
Permanent bilateral occlusion of the common carotid arteries (2VO) in the rat has been established as a valid experimental model to investigate the effects of chronic cerebral hypoperfusion on cognitive function and neurodegenerative processes. Our aim was to compare the cognitive and morphological outcomes of the standard 2VO procedure, in which the arteries are ligated concomitantly, with those of a modified protocol with a 1-week interval between artery occlusions to avoid an abrupt reduction of cerebral blood flow, as assessed by performance in the water maze and by the extent of damage to the hippocampus and striatum. Male Wistar rats (N = 47) aged 3 months were subjected to chronic hypoperfusion by permanent bilateral ligation of the common carotid arteries using either the standard or the modified protocol, with the right carotid being the first to be occluded. Three months after the surgical procedure, performance in the water maze was assessed to investigate long-term effects on spatial learning and memory, and the brains were processed to estimate hippocampal volume and striatal area. Both groups of hypoperfused rats showed deficits in reference memory (F(8,172) = 7.0951, P < 0.00001) and working spatial memory [2nd (F(2,44) = 7.6884, P < 0.001), 3rd (F(2,44) = 21.481, P < 0.00001) and 4th trials (F(2,44) = 28.620, P < 0.0001)]; however, no evidence of tissue atrophy was found in the brain structures studied. Despite similar behavioral and morphological outcomes, the rats submitted to the modified protocol showed a significantly higher survival rate during the 3 months of the experiment (P < 0.02).
Abstract:
Energy drinks are becoming popular in Brazil and worldwide because of their stimulant properties. Caffeine is present in energy drinks to stimulate the central nervous system and intensify brain activity. On the other hand, the ingestion of high doses of caffeine can cause undesirable symptoms such as anxiety and tachycardia. Therefore, it is necessary to monitor the caffeine content added to energy drinks to guarantee that the levels in the final product agree with the labeling and are within the legal limits. The goal of this work was to validate a fast, efficient and low-cost method for the determination of caffeine in energy drinks by micellar electrokinetic chromatography (MEKC). A total of seven brands were analyzed, each in three lots. The electrolyte was prepared with 50 mmol.L-1 of sodium dodecyl sulfate (SDS) and 10 mmol.L-1 of sodium carbonate (pH 11.0). The mean concentration of caffeine ranged from 122.8 to 318.6 mg.L-1. None of the brands had caffeine levels above the maximum limit. Considering the 95% confidence interval, 72% of the samples had less caffeine than the amount stated on the product label.
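A minimal sketch of the label-claim check described above, with hypothetical replicate measurements: the 95% confidence interval for the mean caffeine concentration of one lot compared with the declared value.

```python
# Minimal sketch (hypothetical replicates): 95% CI for the mean caffeine
# concentration of one lot, compared with the value declared on the label.
import numpy as np
from scipy import stats

caffeine = np.array([246.1, 251.3, 243.8])   # mg/L, hypothetical triplicate by MEKC
label_claim = 260.0                          # mg/L declared on the label (hypothetical)

mean, sem = caffeine.mean(), stats.sem(caffeine)
ci_lo, ci_hi = stats.t.interval(0.95, df=len(caffeine) - 1, loc=mean, scale=sem)
print(f"Mean: {mean:.1f} mg/L, 95% CI: {ci_lo:.1f}-{ci_hi:.1f} mg/L")
print("Less caffeine than declared (label above CI):", label_claim > ci_hi)
```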
Abstract:
The time-of-detection method for aural avian point counts is a new method of estimating abundance that allows for uncertain probability of detection. The method has been specifically designed to allow for variation in the singing rates of birds. It involves dividing the time interval of the point count into several subintervals and recording, for each bird, the detection history of the subintervals in which it sings. The method can be viewed as generating data equivalent to closed capture-recapture information. The method differs from the distance and multiple-observer methods in that it does not require all the birds to sing during the point count. As this method is new and there is some concern as to how well individual birds can be followed, we carried out a field test of the method using simulated known populations of singing birds, using a laptop computer to send signals to audio stations distributed around a point. The system mimics actual aural avian point counts, but also allows us to know the size and spatial distribution of the populations we are sampling. Fifty 8-min point counts (broken into four 2-min intervals) using eight species of birds were simulated. The singing rate of an individual bird of a species was simulated as a Markovian process (singing bouts followed by periods of silence), which we felt was more realistic than a truly random process. The main emphasis of our paper is to compare results from species singing at (high and low) homogeneous rates per interval with those singing at (high and low) heterogeneous rates. Population size was estimated accurately for the species simulated with a high, homogeneous probability of singing. Populations of simulated species with lower but homogeneous singing probabilities were somewhat underestimated. Populations of species simulated with heterogeneous singing probabilities were substantially underestimated. Underestimation was caused both by the very low detection probabilities of all distant individuals and by individuals with low singing rates also having very low detection probabilities.
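A minimal sketch of the simulation idea, not the authors' code: a two-state Markov chain generates singing bouts and silences for one bird across four 2-min subintervals, producing the capture-history-like detection records that the time-of-detection method treats as closed capture-recapture data. Parameters are hypothetical.

```python
# Minimal sketch: Markovian singing (bouts and silences) for one bird over
# four point-count subintervals, yielding a 0/1 detection history.
import random

def detection_history(p_start_singing=0.3, p_keep_singing=0.7, n_intervals=4):
    """Return a 0/1 detection history over the point-count subintervals."""
    singing = False
    history = []
    for _ in range(n_intervals):
        if singing:
            singing = random.random() < p_keep_singing   # stay in a singing bout
        else:
            singing = random.random() < p_start_singing  # start a new bout
        history.append(1 if singing else 0)
    return history

random.seed(1)
histories = [detection_history() for _ in range(200)]   # 200 simulated birds
detected = [h for h in histories if any(h)]
print(f"Birds detected at least once: {len(detected)} of {len(histories)}")
```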
Abstract:
It has been generally accepted that the method of moments (MoM) variogram, which has been widely applied in soil science, requires about 100 sites spaced at an appropriate interval to describe the variation adequately. This sample size is often larger than can be afforded for soil surveys of agricultural fields or contaminated sites. Furthermore, it might be a much larger sample size than is needed where the scale of variation is large. A possible alternative in such situations is the residual maximum likelihood (REML) variogram, because fewer data appear to be required. The REML method is parametric and is considered reliable where there is trend in the data because it is based on generalized increments that filter trend out, and only the covariance parameters are estimated. Previous research has suggested that fewer data are needed to compute a reliable variogram using a maximum likelihood approach such as REML; however, the results can vary according to the nature of the spatial variation. There remain issues to examine: how many fewer data can be used, how the sampling sites should be distributed over the site of interest, and how different degrees of spatial variation affect the data requirements. The soil of four field sites of different size, physiography, parent material and soil type was sampled intensively, and MoM and REML variograms were calculated for clay content. The data were then sub-sampled to give different sample sizes and distributions of sites, and the variograms were computed again. The model parameters for the sets of variograms for each site were used for cross-validation. Predictions based on REML variograms were generally more accurate than those from MoM variograms with fewer than 100 sampling sites. A sample size of around 50 sites at an appropriate distance apart, possibly determined from variograms of ancillary data, appears adequate to compute REML variograms for kriging soil properties for precision agriculture and contaminated sites.
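A minimal sketch of the method-of-moments (Matheron) variogram estimator discussed above, applied to hypothetical clay-content data sampled along a one-dimensional transect.

```python
# Minimal sketch (hypothetical transect data): the method-of-moments
# (Matheron) variogram estimator for clay content, binned by lag distance.
import numpy as np

rng = np.random.default_rng(0)
x = np.arange(0, 500, 10.0)                                     # sampling positions (m), hypothetical
clay = 30 + 5 * np.sin(x / 80.0) + rng.normal(0, 1.5, x.size)   # % clay, hypothetical

def mom_variogram(coords, values, lag_width=20.0, n_lags=10):
    """Semivariance gamma(h) = mean of squared differences / 2, per lag bin."""
    d = np.abs(coords[:, None] - coords[None, :])           # pairwise distances
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2     # semivariance terms
    lags, gammas = [], []
    for k in range(1, n_lags + 1):
        mask = (d > (k - 1) * lag_width) & (d <= k * lag_width)
        if mask.any():
            lags.append(d[mask].mean())
            gammas.append(sq[mask].mean())
    return np.array(lags), np.array(gammas)

lags, gammas = mom_variogram(x, clay)
for h, g in zip(lags, gammas):
    print(f"lag {h:5.1f} m: gamma = {g:.2f}")
```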