916 results for "Intensity and percentage of infestation"
Abstract:
The clinical validity of at-risk criteria of psychosis has been questioned on the basis of epidemiological studies that have reported much higher prevalence and annual incidence rates of psychotic-like experiences (PLEs; assessed by either self-rating questionnaires or lay interviews) in the general population than of the clinical phenotype of psychotic disorders (van Os et al., 2009). Thus, it is unclear whether “current at-risk criteria reflect behaviors so common among adolescents and young adults that a valid distinction between ill and non-ill persons is difficult” (Carpenter, 2009). We therefore assessed the 3-month prevalence of at-risk criteria by means of telephone interviews in a randomly drawn general population sample from the at-risk age segment (16–35 years) in the Canton of Bern, Switzerland. Eighty-five of 102 subjects had valid phone numbers; 21 of these refused (although 6 signaled willingness to participate at a later time) and 4 could not be contacted. Sixty subjects (71% of the enrollment fraction) participated. Two participants met exclusion criteria (one for being psychotic, one for lack of language skills). Twenty-two at-risk symptoms were assessed for their prevalence and severity within the 3 months prior to the interview by trained clinical raters using (i) the Structured Interview for Prodromal Syndromes (SIPS; Miller et al., 2002) for the evaluation of 5 attenuated psychotic and 3 brief limited intermittent psychotic symptoms (APS, BLIPS) as well as the state-trait criteria of the ultra-high-risk (UHR) approach, and (ii) the Schizophrenia Proneness Instrument, Adult version (SPI-A; Schultze-Lutter et al., 2007) for the evaluation of the 14 basic symptoms included in COPER and COGDIS (Schultze-Lutter et al., 2008). Further, psychiatric axis I diagnoses were assessed by means of the Mini-International Neuropsychiatric Interview (M.I.N.I.; Sheehan et al., 1998), and psychosocial functioning by the Social and Occupational Functioning Assessment Scale (SOFAS; APA, 1994). All interviewees felt ‘rather’ or ‘very’ comfortable with the interview. Of the 58 included subjects, only 1 (2%) fulfilled APS criteria, reporting the attenuated, non-delusional idea of his mind being literally read by others at a frequency of 2–3 times a week that had newly occurred 6 weeks earlier. No BLIPS, COPER, COGDIS or state-trait UHR criteria were reported. Yet, twelve subjects (21%) described sub-threshold at-risk symptoms: 7 (12%) reported APS-relevant symptoms but did not meet the time/frequency criteria of APS, and 9 (16%) reported COPER- and/or COGDIS-relevant basic symptoms but at an insufficient frequency or as a trait lacking an increase in severity; 4 of these 12 subjects reported both sub-threshold APS and sub-threshold basic symptoms. Table 1 displays type and frequency of the sub-threshold at-risk symptoms.
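To put single-digit case counts such as the 1-of-58 APS finding in perspective, here is a minimal sketch (not part of the study) of how a point prevalence and a Wilson 95% confidence interval can be computed from such counts; the function name is illustrative.

```python
from math import sqrt

def wilson_ci(k: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# 1 of 58 included subjects fulfilled APS criteria (reported 3-month prevalence of 2%)
low, high = wilson_ci(1, 58)
print(f"prevalence = {1/58:.1%}, 95% CI {low:.1%}-{high:.1%}")
```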
Abstract:
Background Serologic testing algorithms for recent HIV seroconversion (STARHS) provide important information for HIV surveillance. We have shown that a patient's antibody reaction in a confirmatory line immunoassay (INNO-LIA™ HIV I/II Score, Innogenetics) provides information on the duration of infection. Here, we sought to further investigate the diagnostic specificity of various Inno-Lia algorithms and to identify factors affecting it. Methods Plasma samples of 714 selected patients of the Swiss HIV Cohort Study, infected for longer than 12 months and representing all viral clades and stages of chronic HIV-1 infection, were tested blindly by Inno-Lia and classified as either incident (up to 12 months) or older infection by 24 different algorithms. Of the total, 524 patients received HAART, 308 had HIV-1 RNA below 50 copies/mL, and 620 were infected by an HIV-1 non-B clade. Using logistic regression analysis, we evaluated factors that might affect the specificity of these algorithms. Results HIV-1 RNA <50 copies/mL was associated with significantly lower reactivity to all five HIV-1 antigens of the Inno-Lia and with impaired specificity of most algorithms. Among 412 patients either untreated or with HIV-1 RNA ≥50 copies/mL despite HAART, the median specificity of the algorithms was 96.5% (range 92.0-100%). The only factor that significantly promoted false-incident results in this group was age, with false-incident results increasing by a few percent per additional year. HIV-1 clade, HIV-1 RNA, CD4 percentage, sex, disease stage, and testing modalities showed no significant effect. Results were similar among the 190 untreated patients. Conclusions The specificity of most Inno-Lia algorithms was high and not affected by HIV-1 variability, advanced disease, or other factors that promote false-recent results in other STARHS. Specificity should be good in any group of untreated HIV-1 patients.
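Because every patient in this panel is known to have been infected for longer than 12 months, any "incident" call by an algorithm is by definition a false-incident result, and specificity is simply the fraction of patients not called incident. A minimal sketch of that tally follows, with illustrative counts chosen only to land near the reported median; the data and helper name are not from the study.

```python
def specificity(calls: list[str]) -> float:
    """Specificity among long-standing (>12-month) infections: the fraction
    of patients an algorithm does NOT misclassify as 'incident'."""
    false_incident = sum(c == "incident" for c in calls)
    return 1 - false_incident / len(calls)

# illustrative classifications for one algorithm in a group of 412 patients
calls = ["older"] * 398 + ["incident"] * 14
print(f"specificity = {specificity(calls):.1%}")   # ~96.6%
```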
Abstract:
Background Although CD4 cell count monitoring is used to decide when to start antiretroviral therapy in patients with HIV-1 infection, there are no evidence-based recommendations regarding its optimal frequency. It is common practice to monitor every 3 to 6 months, often coupled with viral load monitoring. We developed rules to guide the frequency of CD4 cell count monitoring in HIV infection before starting antiretroviral therapy, which we validated retrospectively in patients from the Swiss HIV Cohort Study. Methodology/Principal Findings We built two prediction rules (“Snap-shot rule” for a single sample and “Track-shot rule” for multiple determinations) based on a systematic review of published longitudinal analyses of CD4 cell count trajectories. We applied the rules in 2608 untreated patients to classify their 18 061 CD4 counts as either justifiable or superfluous, according to their prior ≥5% or <5% chance of meeting predetermined thresholds for starting treatment. The percentage of measurements that both rules falsely deemed superfluous never exceeded 5%. Superfluous CD4 determinations represented 4%, 11%, and 39% of all actual determinations for treatment thresholds of 500, 350, and 200×10⁶/L, respectively. The Track-shot rule was only marginally superior to the Snap-shot rule. Both rules lose usefulness as CD4 counts approach the treatment threshold. Conclusions/Significance Frequent CD4 count monitoring of patients with CD4 counts well above the threshold for initiating therapy is unlikely to identify patients who require therapy. It appears sufficient to measure the CD4 cell count 1 year after a count >650 for a threshold of 200, >900 for 350, or >1150 for 500×10⁶/L, respectively. When CD4 counts fall below these limits, increased monitoring frequency becomes advisable. These rules offer guidance for efficient CD4 monitoring, particularly in resource-limited settings.
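A hedged sketch of the kind of decision logic the Snap-shot rule's published cut-offs imply (re-measure in 1 year only when the current count lies above the cut-off for the chosen treatment threshold); the shorter fallback interval is an assumption standing in for "increased monitoring frequency", not a value from the paper.

```python
# Cut-offs above which a 1-year measurement interval appears sufficient,
# taken from the abstract's conclusions (cells x 10^6/L).
SNAPSHOT_CUTOFF = {200: 650, 350: 900, 500: 1150}

def next_cd4_interval_months(current_cd4: int, treatment_threshold: int) -> int:
    """Suggest a CD4 re-measurement interval based on the Snap-shot-style rule.

    Returns 12 months when the count is above the published cut-off for the
    chosen treatment threshold; otherwise a shorter interval (an illustrative
    4 months, standing in for common 3-6 month practice).
    """
    cutoff = SNAPSHOT_CUTOFF[treatment_threshold]
    if current_cd4 > cutoff:
        return 12
    # below the cut-off the chance of crossing the threshold rises,
    # so fall back to more frequent monitoring
    return 4

print(next_cd4_interval_months(720, 350))  # -> 4   (720 <= 900)
print(next_cd4_interval_months(950, 350))  # -> 12
```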
Abstract:
Helicopter emergency medical services (HEMS) have become a standard element of modern prehospital emergency medicine. This study determined the percentage of injured HEMS patients whose injuries were correctly recognized by HEMS physicians.
Abstract:
Objectives To assess the proportion of patients lost to programme (died, lost to follow-up, transferred out) between HIV diagnosis and the start of antiretroviral therapy (ART) in sub-Saharan Africa, and to determine factors associated with loss to programme. Methods Systematic review and meta-analysis. We searched the PubMed and EMBASE databases for studies in adults. Outcomes were the percentage of patients dying before starting ART, the percentage lost to follow-up, the percentage with a CD4 cell count, the distribution of first CD4 counts, and the percentage of eligible patients starting ART. Data were combined using random-effects meta-analysis. Results Twenty-nine studies from sub-Saharan Africa including 148 912 patients were analysed. Six studies covered the whole period from HIV diagnosis to ART start. Meta-analysis of these studies showed that of 100 patients with a positive HIV test, 72 (95% CI 60-84) had a CD4 cell count measured, 40 (95% CI 26-55) were eligible for ART and 25 (95% CI 13-37) started ART. There was substantial heterogeneity between studies (P < 0.0001). Median CD4 cell count at presentation ranged from 154 to 274 cells/μl. Patients eligible for ART were less likely to become lost to programme (25% vs. 54%, P < 0.0001), but eligible patients were more likely to die (11% vs. 5%, P < 0.0001) than ineligible patients. Loss to programme was higher in men, in patients with low CD4 cell counts and low socio-economic status, and in recent time periods. Conclusions Monitoring and care in the pre-ART period need improvement, with greater emphasis on patients not yet eligible for ART.
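The pooled cascade figures come from random-effects meta-analysis of study-level proportions. Below is a minimal DerSimonian-Laird sketch on the raw proportion scale (published analyses typically pool on a logit or arcsine scale, and the study proportions and sample sizes used here are hypothetical):

```python
import numpy as np

def dersimonian_laird(p, n):
    """Random-effects (DerSimonian-Laird) pooling of proportions.

    p, n: study-level proportions and sample sizes.
    Returns the pooled proportion and its 95% CI on the raw proportion scale.
    """
    p, n = np.asarray(p, float), np.asarray(n, float)
    v = p * (1 - p) / n                       # within-study variances
    w = 1 / v                                 # fixed-effect weights
    p_fe = np.sum(w * p) / np.sum(w)
    q = np.sum(w * (p - p_fe) ** 2)           # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(p) - 1)) / c)   # between-study variance
    w_re = 1 / (v + tau2)                     # random-effects weights
    p_re = np.sum(w_re * p) / np.sum(w_re)
    se = np.sqrt(1 / np.sum(w_re))
    return p_re, (p_re - 1.96 * se, p_re + 1.96 * se)

# hypothetical per-study proportions of diagnosed patients who started ART
pooled, ci = dersimonian_laird([0.18, 0.31, 0.22, 0.29], [900, 1500, 700, 2100])
print(pooled, ci)
```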
Abstract:
Data on antimicrobial use play a key role in the development of policies for the containment of antimicrobial resistance. On-farm data could provide a detailed overview of antimicrobial use, but technical and methodological aspects of data collection and interpretation, as well as data quality, need to be further assessed. The aims of this study were (1) to quantify antimicrobial use in the study population using different units of measurement and contrast the results obtained, (2) to evaluate the data quality of farm records on antimicrobial use, and (3) to compare the data quality of different recording systems. During 1 year, data on antimicrobial use were collected from 97 dairy farms. Antimicrobial consumption was quantified using: (1) the incidence density of antimicrobial treatments; (2) the weight of active substance; (3) the used daily dose and (4) the used course dose, for antimicrobials for intestinal, intrauterine and systemic use; and (5) the used unit dose, for antimicrobials for intramammary use. Data quality was evaluated by describing the completeness and accuracy of the recorded information, and by comparing farmers' and veterinarians' records. Relative consumption of antimicrobials depended on the unit of measurement: used doses reflected treatment intensity better than the weight of active substance. The use of antimicrobials classified as high priority was low, although under- and overdosing were frequently observed. Electronic recording systems allowed better traceability of the animals treated. Records of drug name or dosage were often incomplete or inaccurate. Veterinarians tended to record more drugs than farmers. Integrating veterinarian and farm data would improve data quality.
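The contrast between units of measurement can be made concrete with a toy treatment record: the same course of treatment looks very different expressed as milligrams of active substance versus as used daily doses. The record fields and numbers below are illustrative and are not the study's dose definitions:

```python
from dataclasses import dataclass

@dataclass
class Treatment:
    """A single on-farm antimicrobial treatment record (fields are illustrative)."""
    active_substance_mg: float   # total amount administered over the course
    animal_weight_kg: float
    dose_mg_per_kg_day: float    # dose actually applied per kg and day

def weight_based(t: Treatment) -> float:
    """Consumption expressed simply as mg of active substance."""
    return t.active_substance_mg

def used_daily_doses(t: Treatment) -> float:
    """Consumption as the number of used daily doses: total amount divided by
    the dose needed to treat this animal for one day."""
    return t.active_substance_mg / (t.dose_mg_per_kg_day * t.animal_weight_kg)

t = Treatment(active_substance_mg=15000, animal_weight_kg=600, dose_mg_per_kg_day=5)
print(weight_based(t), used_daily_doses(t))   # 15000 mg vs. 5 daily doses
```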
Abstract:
We examined the impact of physical activity (PA) on surrogate markers of cardiovascular health in adolescents. Fifty-two healthy students (28 females, mean age 14.5 ± 0.7 years) were investigated. Microvascular endothelial function was assessed by peripheral arterial tonometry to determine the reactive hyperemia index (RHI). Vagal activity was measured using 24-h analysis of heart rate variability [root mean square of successive normal-to-normal intervals (rMSSD)]. Exercise testing was performed to determine peak oxygen uptake (V̇O₂peak) and maximum power output. PA was assessed by accelerometry. Linear regression models were fitted and adjusted for age, sex, skinfolds, and pubertal status. The cohort was dichotomized into two equally sized activity groups (low vs. high) based on the daily time spent in moderate-to-vigorous PA (MVPA, 3,000-5,200 counts·min⁻¹, model 1) and vigorous PA (VPA, >5,200 counts·min⁻¹, model 2). MVPA was an independent predictor of rMSSD (β = 0.448, P = 0.010), and VPA was associated with maximum power output (β = 0.248, P = 0.016). In model 1, the high-MVPA group exhibited a higher vagal tone (rMSSD 49.2 ± 13.6 vs. 38.1 ± 11.7 ms, P = 0.006) and a lower systolic blood pressure (107.3 ± 9.9 vs. 112.9 ± 8.1 mmHg, P = 0.046). In model 2, the high-VPA group had higher maximum power output values (3.9 ± 0.5 vs. 3.4 ± 0.5 W·kg⁻¹, P = 0.012). In both models, no significant differences were observed for RHI or V̇O₂peak. In conclusion, in healthy adolescents, PA was associated with beneficial intensity-dependent effects on vagal tone, systolic blood pressure, and exercise capacity, but not on microvascular endothelial function.
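A hedged sketch of the count-based intensity classification underlying the two activity models; the cut-points are taken from the abstract, while the boundary handling, the epoch data, and the subsequent median split into low/high groups are assumptions:

```python
# Accelerometer cut-points from the abstract (counts per minute).
MVPA_RANGE = (3000, 5200)   # moderate-to-vigorous PA (model 1)
VPA_MIN = 5200              # vigorous PA (model 2)

def minutes_in_band(counts_per_min: list[int], low: int, high: float = float("inf")) -> int:
    """Number of recorded one-minute epochs whose counts fall in [low, high)."""
    return sum(low <= c < high for c in counts_per_min)

day = [120, 3400, 5600, 800, 4900, 7000, 2500]   # hypothetical one-minute epochs
mvpa_min = minutes_in_band(day, *MVPA_RANGE)
vpa_min = minutes_in_band(day, VPA_MIN)
print(mvpa_min, vpa_min)   # 2 minutes MVPA, 2 minutes VPA for this toy day
# Dichotomizing the cohort would then split subjects at the median daily time in band.
```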
Abstract:
A confocal imaging and image processing scheme is introduced to visualize and evaluate the spatial distribution of spectral information in tissue. The image data are recorded using a confocal laser-scanning microscope equipped with a detection unit that provides high spectral resolution. The processing scheme is based on spectral data, is less error-prone than intensity-based visualization and evaluation methods, and provides quantitative information on the composition of the sample. The method is tested and validated in the context of the development of dermal drug delivery systems, and a quantitative uptake indicator for comparing the performance of different delivery systems is introduced. A drug penetration study was performed in vitro. The results show that the method is able to detect, visualize and measure spectral information in tissue. In the penetration study, the uptake efficiencies of different experimental setups could be discriminated and quantitatively described. The developed uptake indicator is a step towards quantitative assessment and, beyond pharmaceutical research, provides valuable information on tissue composition. It can potentially be used for clinical in vitro and in vivo applications.
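As an illustration of why a spectrally resolved evaluation is less error-prone than a purely intensity-based one, here is a generic sketch that decomposes a pixel spectrum into known reference spectra by non-negative least squares and derives a simple uptake fraction. The reference spectra, pixel data, and "uptake fraction" definition are assumptions for illustration only; they are not the indicator defined in the paper.

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic reference emission spectra for a drug and for tissue autofluorescence.
wavelengths = np.linspace(500, 700, 32)
drug_ref = np.exp(-((wavelengths - 560) ** 2) / (2 * 15 ** 2))
tissue_ref = np.exp(-((wavelengths - 620) ** 2) / (2 * 40 ** 2))
A = np.column_stack([drug_ref, tissue_ref])

# A noisy pixel spectrum that mixes both components.
rng = np.random.default_rng(1)
pixel = 0.3 * drug_ref + 1.1 * tissue_ref + rng.normal(0, 0.01, wavelengths.size)

coeffs, _ = nnls(A, pixel)                   # non-negative abundance per component
uptake_fraction = coeffs[0] / coeffs.sum()   # illustrative drug share of the signal
print(coeffs, round(uptake_fraction, 3))
```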
Abstract:
This study evaluated the relationship between recalled parental treatment, attachment style, and coping with parental and romantic stressors. A group of 66 undergraduate students completed the Parental Bonding Instrument (PBI; Parker, Tupling, & Brown, 1979), a measure of attachment style (Simpson, 1990), general questions regarding the intensity and frequency of parental and romantic stressors, and a measure of their typical ways of coping with each type of stressor (Vitaliano, Russo, Carr, Maiuro, & Becker, 1985). Data analysis showed that attachment scores were significantly correlated with coping with both kinds of stress. The strongest correlations were found between attachment and coping with romantic stressors. Overall, high or low use of a specific coping approach was consistent across parental and romantic stressors. Further, exploratory analysis revealed that the habitual intensity of the experienced stressors may act as a moderator of coping techniques.
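A hedged sketch of how a moderation effect of stressor intensity on the attachment-coping link is commonly tested, namely as an interaction term in an ordinary least-squares regression; the simulated data and variable names are illustrative and are not the study's dataset or analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 66
attachment = rng.normal(size=n)
intensity = rng.normal(size=n)
# simulated coping scores with a built-in interaction, for illustration only
coping = 0.4 * attachment + 0.2 * intensity + 0.3 * attachment * intensity \
         + rng.normal(scale=0.5, size=n)
df = pd.DataFrame({"coping": coping, "attachment": attachment, "intensity": intensity})

# 'attachment * intensity' expands to both main effects plus their interaction;
# a significant interaction coefficient indicates moderation.
model = smf.ols("coping ~ attachment * intensity", data=df).fit()
print(model.params["attachment:intensity"], model.pvalues["attachment:intensity"])
```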
Abstract:
The aim of this study was to evaluate serum and peritoneal fluid (PF) glycodelin-A concentrations in women with ovarian endometriosis. Ninety-nine matched pairs of serum and PF samples were included in our study. The case group comprised 57 women with ovarian endometriosis, and the control group comprised 42 women, either healthy women undergoing sterilization or patients with benign ovarian cysts. Glycodelin-A concentrations were measured using ELISA. Endometriosis patients had significantly higher serum and PF glycodelin-A concentrations than controls, and this increase was observed in both the proliferative and secretory cycle phases. Glycodelin-A concentrations were more than 10-fold higher in PF than in serum, and the two were correlated with each other. The intensity and frequency of menstrual pain correlated positively with glycodelin-A concentrations. The sensitivity and specificity of glycodelin-A as a biomarker for ovarian endometriosis were 82.1% and 78.4% in serum, and 79.7% and 77.5% in PF, respectively. These results indicate that glycodelin-A has a potential role as a biomarker to be used in combination with other, independent marker molecules.
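Sensitivity and specificity of a concentration-based biomarker follow directly from a diagnostic cut-off applied to cases and controls. A minimal sketch with hypothetical concentrations and an arbitrary cut-off (the study's actual cut-off is not given in the abstract):

```python
def sens_spec(cases, controls, cutoff):
    """Sensitivity/specificity of 'concentration >= cutoff' as a positive test."""
    tp = sum(c >= cutoff for c in cases)       # cases correctly flagged
    tn = sum(c < cutoff for c in controls)     # controls correctly cleared
    return tp / len(cases), tn / len(controls)

cases = [95, 120, 60, 210, 88, 45, 130]        # hypothetical glycodelin-A values
controls = [20, 35, 55, 15, 70, 28, 40]
print(sens_spec(cases, controls, cutoff=50))   # -> (0.857..., 0.714...)
```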
Abstract:
Rates of protein synthesis (PS) and turnover are more rapid during the neonatal period than during any other stage of postnatal life. Vitamin A and lactoferrin (Lf) can stimulate PS in neonates. Newborn calves are vitamin A-deficient and have a low Lf status, but plasma vitamin A and Lf levels increase rapidly after the ingestion of colostrum. Neonatal calves (n = 6 per group) were fed colostrum or a milk-based formula without or with vitamin A, Lf, or vitamin A plus Lf to study PS in the jejunum and liver. L-[¹³C]valine was administered intravenously to determine the isotopic enrichment of free (non-protein-bound) Val (AP(Free)) in the protein precursor pool, the atom percent excess (APE) of protein-bound Val, the fractional protein synthesis rate (FSR) in the jejunum and liver, and the isotopic enrichment of Val in plasma (APE(Pla)) and in the CO₂ of exhaled air (APE(Ex)). The APE, AP(Free), and FSR in the jejunum and liver did not differ significantly among groups. APE(Ex) increased, whereas APE(Pla) decreased over time, with no group differences. Correlations were calculated between FSR(Jej) and histomorphometrical and histochemical data of the jejunum, and between FSR(Liv) and blood metabolites. There were negative correlations between FSR(Liv) and plasma albumin concentrations and between FSR(Jej) and the villus height:crypt depth ratio, and a positive correlation between FSR(Jej) and small intestinal cell proliferation in crypts. Hence, there were no effects of vitamin A or Lf, and no vitamin A × Lf interactions, on intestinal and hepatic PS. However, FSR(Jej) was correlated with histomorphometrical traits of the jejunum, and FSR(Liv) was correlated with plasma albumin concentrations.
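For orientation, a fractional synthesis rate from tracer data is commonly computed as the enrichment of protein-bound tracer divided by the precursor-pool enrichment and the incorporation time. Here is a hedged sketch of that standard form with illustrative numbers; it is not the paper's exact equation or data.

```python
def fractional_synthesis_rate(ape_bound: float, ape_precursor: float, hours: float) -> float:
    """Fractional protein synthesis rate (%/day) in the commonly used form:
    bound enrichment / precursor enrichment / incorporation time.

    ape_bound     : atom percent excess of protein-bound Val at the end of infusion
    ape_precursor : enrichment of free (non-protein-bound) Val in the precursor pool
    hours         : incorporation time in hours
    """
    return (ape_bound / ape_precursor) * (24 / hours) * 100

# illustrative numbers only
print(fractional_synthesis_rate(ape_bound=0.05, ape_precursor=4.0, hours=6))  # 5.0 %/day
```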
Abstract:
BACKGROUND AND PURPOSE: Daily use of conventional electronic portal imaging devices (EPIDs) for organ tracking is limited by the relatively high dose required for high-quality image acquisition. We studied the use of a novel dose-saving acquisition mode (RadMode), which allows images to be acquired with one monitor unit (MU) per image, in prostate cancer patients undergoing intensity-modulated radiotherapy (IMRT) with tracking of implanted fiducial gold markers. PATIENTS AND METHODS: Twenty-five patients underwent implantation of three fiducial gold markers prior to the planning CT. Before each treatment of a course of 37 fractions, orthogonal localization images were acquired from the antero-posterior and lateral directions. Portal images of both the setup procedure and the five IMRT treatment beams were analyzed. RESULTS: On average, four localization images were needed for a correct patient setup, resulting in four monitor units of extra dose per fraction. This increased the mean dose delivered to the patient by 1.2%. The procedure was precise enough to reduce the mean displacements prior to treatment to ≤0.3 mm. CONCLUSIONS: The new dose-saving acquisition mode enables daily EPID-based prostate tracking with a cumulative extra dose below 1 Gy. This concept is used efficiently in IMRT-treated patients, where separation of setup beams from treatment beams is mandatory.
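A back-of-the-envelope check of the cumulative imaging dose implied by the reported numbers; the dose-per-MU conversion is a hypothetical figure (the true value depends on beam energy, depth, and geometry), chosen only so the arithmetic lands in the reported "below 1 Gy" range.

```python
# Reported: ~4 extra MU per fraction for setup imaging, 37 fractions per course.
extra_mu_per_fraction = 4
fractions = 37

# Assumed conversion: roughly 0.6 cGy absorbed per MU at the relevant depth
# (hypothetical; machine- and geometry-dependent).
cgy_per_mu = 0.6

cumulative_extra_gy = extra_mu_per_fraction * fractions * cgy_per_mu / 100
print(f"{cumulative_extra_gy:.2f} Gy")   # ~0.89 Gy, consistent with 'below 1 Gy'
```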
Abstract:
BACKGROUND: Fever in neutropenia (FN) is a frequent complication in pediatric oncology. Deficiency of mannose-binding lectin (MBL), an important component of innate immunity, is common owing to genetic polymorphisms, but its impact on infections in oncology patients is controversial. This study investigated whether MBL serum levels at cancer diagnosis are associated with the development of FN in pediatric cancer patients. PROCEDURE: Serum MBL was measured using ELISA. The frequency, duration, and cause of FN were assessed retrospectively. The association with MBL level was analyzed using univariate and multivariate Poisson regression, taking into account both the intensity and the duration of chemotherapy. RESULTS: In 94 children, with a cumulative follow-up time of 81.7 years, 177 FN episodes were recorded. Patients with both very low MBL levels (<100 µg/L; risk ratio (RR), 1.93; 95% CI, 1.14-3.28; P = 0.014) and normal MBL levels (≥1,000 µg/L; RR, P = 0.011) had significantly more frequent FN episodes than patients with low MBL levels (100-999 µg/L). Patients with very low MBL levels had significantly more episodes of FN with severe bacterial infection (bacteremia or pneumonia; RR, 4.49; 95% CI, 1.69-11.8; P = 0.003), whereas those with normal MBL levels had more FN episodes with no microbial etiology identified (RR, 1.85; 95% CI, 1.14-3.03; P = 0.014). CONCLUSIONS: Very low MBL levels are associated with more frequent FN episodes, mainly due to severe bacterial infections. The surprising finding that children with normal MBL levels had more frequent FN episodes than those with low MBL levels needs testing in prospective studies.
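The comparison of FN episode frequencies rests on incidence rates per unit of follow-up time. A minimal sketch of an incidence rate ratio with a log-scale 95% CI, using hypothetical counts and person-years; the study's actual Poisson models additionally adjusted for chemotherapy intensity and duration.

```python
from math import exp, log, sqrt

def rate_ratio(events_a: int, py_a: float, events_b: int, py_b: float):
    """Incidence rate ratio (group A vs. B) with a 95% CI on the log scale."""
    rr = (events_a / py_a) / (events_b / py_b)
    se = sqrt(1 / events_a + 1 / events_b)
    return rr, (exp(log(rr) - 1.96 * se), exp(log(rr) + 1.96 * se))

# hypothetical FN episode counts and follow-up years for two MBL categories
print(rate_ratio(events_a=40, py_a=18.0, events_b=60, py_b=52.0))
```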
Abstract:
Gastrin-releasing peptide receptors (GRP-R) are upregulated in many cancers, including prostate, breast, and lung cancer. We describe a new radiolabeled bombesin (BBN) analog for imaging and systemic radiotherapy with improved pharmacokinetics (PK) and better retention of radioactivity in the tumor. METHODS: DO3A-CH2CO-G-4-aminobenzoyl-Q-W-A-V-G-H-L-M-NH2 (AMBA) was synthesized and radiolabeled. The human prostate cancer cell line PC-3 was used to determine the binding affinity (Kd), retention, and efflux of ¹⁷⁷Lu-AMBA. Receptor specificity was determined by in vitro autoradiography in human tissues. PK and radiotherapy studies were performed in PC-3 tumor-bearing male nude mice. RESULTS: ¹⁷⁷Lu-AMBA has a high affinity for the GRP-R (Kd, 1.02 nmol/L), with a maximum binding capacity (Bmax) of 414 fmol/10⁶ cells (2.5 × 10⁵ GRP-R/cell). Internalization was similar for ¹⁷⁷Lu-AMBA (76.8%), ¹⁷⁷Lu-BBN8 (72.9%), and ¹²⁵I-[Tyr⁴]-BBN (74.9%). Efflux was markedly lower for ¹⁷⁷Lu-AMBA (2.9%) compared with ¹⁷⁷Lu-BBN8 (15.9%) and ¹²⁵I-[Tyr⁴]-BBN (46.1%). By receptor autoradiography, Lu-AMBA binds specifically to the GRP-R (0.8 nmol/L) and to the neuromedin B receptor (NMB-R) (0.9 nmol/L), with no affinity for the bb3 receptor (>1,000 nmol/L). ¹⁷⁷Lu-AMBA was renally excreted (55 %ID at 1 h [percentage of injected dose at 1 h]); tumor uptake at 1 and 24 h was 6.35 %ID/g and 3.39 %ID/g, respectively. One or 2 doses of ¹⁷⁷Lu-AMBA (27.75 MBq/dose) significantly prolonged the life span of PC-3 tumor-bearing mice (P < 0.001 and P < 0.0001, respectively) and decreased the PC-3 tumor growth rate relative to controls. When compared using World Health Organization criteria, mice receiving 2 doses versus 1 dose of ¹⁷⁷Lu-AMBA demonstrated a shift away from stable/progressive disease toward complete/partial response; by RECIST (Response Evaluation Criteria in Solid Tumors), median survival increased by 36% and time to progression/progression-free survival increased by 65%. CONCLUSION: ¹⁷⁷Lu-AMBA binds with nanomolar affinity to GRP-R and NMB-R, has low retention of radioactivity in the kidney, demonstrates a very favorable risk-benefit profile, and is in phase I clinical trials.
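As a quick consistency check, the reported receptor density per cell follows from Bmax via Avogadro's number:

```python
# Convert Bmax from fmol per 10^6 cells to receptors per cell.
AVOGADRO = 6.022e23          # molecules per mole

bmax_fmol_per_1e6_cells = 414
receptors_per_cell = bmax_fmol_per_1e6_cells * 1e-15 * AVOGADRO / 1e6
print(f"{receptors_per_cell:.2e}")   # ~2.5e+05, matching the reported 2.5 x 10^5 GRP-R/cell
```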
Abstract:
The purpose of this study was to demonstrate the improvement in diagnostic quality and diagnostic accuracy of SonoVue microbubble contrast-enhanced ultrasound (CE-US) versus unenhanced ultrasound imaging in the investigation of extracranial carotid or peripheral arteries. Eighty-two patients with suspected extracranial carotid or peripheral arterial disease received four SonoVue doses (0.3 ml, 0.6 ml, 1.2 ml and 2.4 ml), with Doppler ultrasound performed before and after each dose. The diagnostic quality of the CE-US examinations was evaluated off-site for the duration of clinically useful contrast enhancement, artefact effects, and the percentage of examinations converted from non-diagnostic to diagnostic. Accuracy, sensitivity and specificity were assessed as the agreement of the CE-US diagnosis, evaluated by an independent panel of experts, with the reference standard modality. The median duration of clinically useful signal enhancement increased significantly with increasing SonoVue doses (p ≤ 0.002). At the 2.4 ml dose, diagnostic quality, evaluated as the number of inconclusive examinations, improved significantly, falling from 40.7% at baseline to 5.1%. Furthermore, SonoVue significantly (p < 0.01) increased the accuracy, sensitivity and specificity of disease assessment compared with baseline ultrasound. SonoVue increases the diagnostic quality of Doppler images and improves the accuracy of both spectral and colour Doppler examinations of extracranial carotid or peripheral arterial disease.
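The conversion of examinations from non-diagnostic to diagnostic is a paired before/after outcome, for which an exact McNemar-type test on the discordant pairs is one natural analysis. A hedged sketch with hypothetical pair counts roughly echoing the reported drop from 40.7% to 5.1% inconclusive examinations; this is an illustration, not the analysis reported in the paper.

```python
from math import comb

def mcnemar_exact(b: int, c: int) -> float:
    """Two-sided exact McNemar test on discordant pairs:
    b = non-diagnostic at baseline but diagnostic with contrast,
    c = diagnostic at baseline but non-diagnostic with contrast."""
    n = b + c
    k = min(b, c)
    # exact binomial tail probability under p = 0.5, doubled for a two-sided test
    p = sum(comb(n, i) for i in range(0, k + 1)) / 2 ** n
    return min(1.0, 2 * p)

# hypothetical paired outcomes: 21 examinations improved, none worsened
print(mcnemar_exact(b=21, c=0))
```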