837 results for Interval time-varying delay
Abstract:
Doctoral Thesis in Basic Psychology
Abstract:
When speech is degraded, word report is higher for semantically coherent sentences (e.g., her new skirt was made of denim) than for anomalous sentences (e.g., her good slope was done in carrot). Such increased intelligibility is often described as resulting from "top-down" processes, reflecting an assumption that higher-level (semantic) neural processes support lower-level (perceptual) mechanisms. We used time-resolved sparse fMRI to test for top-down neural mechanisms, measuring activity while participants heard coherent and anomalous sentences presented in speech envelope/spectrum noise at varying signal-to-noise ratios (SNR). The timing of BOLD responses to more intelligible speech provides evidence of hierarchical organization, with earlier responses in peri-auditory regions of the posterior superior temporal gyrus than in more distant temporal and frontal regions. Despite Sentence content × SNR interactions in the superior temporal gyrus, prefrontal regions respond after auditory/perceptual regions. Although we cannot rule out top-down effects, this pattern is more compatible with a purely feedforward or bottom-up account, in which the results of lower-level perceptual processing are passed to inferior frontal regions. Behavioral and neural evidence that sentence content influences perception of degraded speech does not necessarily imply "top-down" neural processes.
Abstract:
Hexaflumuron, an insect growth regulator (IGR), was found to greatly affect the development of immatures and the emergence of adults of three species of vector mosquitoes, Culex quinquefasciatus, Aedes aegypti and Anopheles stephensi, when larvae were subjected to short exposure times of ≤ 1 h. This IGR could completely prevent adult emergence even at a minimum exposure time of 10 min at 0.001, 0.01 and 0.1 mg/l. Treatment induced larval and pupal mortality as well as varying degrees of morphogenetic abnormalities in immatures and adults of the three species. Four weeks of control achieved in a slow-moving sullage canal breeding Culex quinquefasciatus indicates that this IGR can be of use in such breeding habitats.
Abstract:
The purpose of this study was to determine the prognostic accuracy of perfusion computed tomography (CT), performed at the time of emergency room admission, in acute stroke patients. Accuracy was determined by comparison of perfusion CT with delayed magnetic resonance (MR) and by monitoring the evolution of each patient's clinical condition. Twenty-two acute stroke patients underwent perfusion CT covering four contiguous 10 mm slices on admission, as well as delayed MR, performed a median of 3 days after emergency room admission. Eight were treated with thrombolytic agents. Infarct size on the admission perfusion CT was compared with that on the delayed diffusion-weighted (DWI)-MR, chosen as the gold standard. Delayed magnetic resonance angiography and perfusion-weighted MR were used to detect recanalization. A potential recuperation ratio, defined as PRR = penumbra size/(penumbra size + infarct size) on the admission perfusion CT, was compared with the evolution in each patient's clinical condition, defined by the National Institutes of Health Stroke Scale (NIHSS). In the 8 cases with arterial recanalization, the size of the cerebral infarct on the delayed DWI-MR was larger than or equal to that of the infarct on the admission perfusion CT, but smaller than or equal to that of the ischemic lesion on the admission perfusion CT; and the observed improvement in the NIHSS correlated with the PRR (correlation coefficient = 0.833). In the 14 cases with persistent arterial occlusion, infarct size on the delayed DWI-MR correlated with ischemic lesion size on the admission perfusion CT (r = 0.958). In all 22 patients, the admission NIHSS correlated with the size of the ischemic area on the admission perfusion CT (r = 0.627). Based on these findings, we conclude that perfusion CT allows the accurate prediction of the final infarct size and the evaluation of clinical prognosis for acute stroke patients at the time of emergency evaluation. It may also provide information about the extent of the penumbra. Perfusion CT could therefore be a valuable tool in the early management of acute stroke patients.
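For illustration only, the potential recuperation ratio defined above is simple arithmetic on two lesion measurements from the admission perfusion CT; a minimal Python sketch (not the authors' code, with hypothetical lesion volumes) follows:

    # Minimal sketch of the potential recuperation ratio (PRR) defined above:
    # PRR = penumbra size / (penumbra size + infarct size), both measured on the
    # admission perfusion CT. The lesion sizes below are hypothetical.
    def potential_recuperation_ratio(penumbra_size: float, infarct_size: float) -> float:
        """Return the PRR; sizes may be areas or volumes, as long as units match."""
        total = penumbra_size + infarct_size
        if total == 0:
            raise ValueError("penumbra and infarct sizes are both zero")
        return penumbra_size / total

    # A large penumbra relative to the infarct core yields a PRR close to 1,
    # i.e. more tissue that may recover if the artery recanalizes.
    print(potential_recuperation_ratio(penumbra_size=40.0, infarct_size=10.0))  # 0.8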
Abstract:
There has been a long debate, since the introduction of blood analysis prior to major sports events, as to whether blood samples should be analysed right away at the site of competition or whether they should be transported and analysed in an anti-doping laboratory. It was therefore necessary to measure blood samples and compare the results obtained right after blood withdrawal with those obtained after a few hours' delay. Furthermore, it was of interest to determine the effect of temperature on the possible deterioration of the red blood cell analytes used for testing recombinant erythropoietin abuse. Healthy volunteers were asked to give two blood samples; one was kept at room temperature whereas the second was put into a refrigerator. At regular intervals, the samples were rolled for homogenisation and temperature stabilisation and were analysed with the same haematological apparatus. The results confirmed that blood controls prior to competition should be performed as soon as possible under standardised pre-analytical conditions to avoid excessive variation, notably in the haematocrit and the reticulocyte count. These recommendations should ideally also be applied to all the blood controls compulsory for medical follow-up; otherwise, unexplainable values could be misinterpreted and could, for instance, lead to a period of incapacity.
Abstract:
Background and Aims: Two distinct endoscopic phenotypes of Eosinophilic Esophagitis (EoE) have been identified: the inflammatory (IP) and the stenosing (SP) phenotype. It is not known whether these EoE-associated phenotypes are reflective of different phases during the disease course. We aimed to assess the phenotype at initial EoE presentation and diagnosis and to evaluate whether the SP increases over time. Methods: Retrospective analysis of the Swiss EoE Database (SEED), extended by a review of patient charts, endoscopy and pathology records. Results: Forty-four EoE patients were analyzed (33 males, mean age at index visit 41 ± 14 years, all Caucasians). Median follow-up time was 3.1 years (IQR 1-4, range 1-18 years). Median diagnostic delay was 5 years (IQR 2-16, range 0-34 years). At first diagnosis, 32% (14/44) of EoE patients had already presented with a stenosis. The mean diameter of the stenoses was 10 ± 2 mm, and the mean length was 2.8 ± 2.9 cm. Peak eosinophil count did not change over time (48 ± 39 eos/HPF at index visit vs. 59 ± 41 eos/HPF at end of follow-up, n = 44). The risk of the presence of a stenosis at the index visit was 0% for a disease duration of 0-4 years, 37% for a disease duration of 5-10 years and 67% for a disease duration of >10 years (p = 0.0035, trend test). Conclusions: The frequency of esophageal stenoses is proportional to the disease duration, whereas the inflammatory activity does not significantly change over time. Our findings underscore the necessity to reduce diagnostic delay in EoE and to control the underlying inflammatory processes to prevent esophageal remodeling.
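The reported association between disease duration and stenosis risk rests on a trend test across ordered duration categories. The sketch below shows one common choice, a Cochran-Armitage-style test with equally spaced scores; the counts are invented for illustration and are not the SEED data.

    # Hypothetical Cochran-Armitage trend test for stenosis prevalence across
    # disease-duration categories (0-4, 5-10, >10 years); counts are invented.
    import numpy as np
    from scipy.stats import norm

    scores = np.array([0.0, 1.0, 2.0])   # equally spaced category scores
    n = np.array([14, 19, 11])           # hypothetical patients per category
    x = np.array([0, 7, 7])              # hypothetical patients with a stenosis

    N = n.sum()
    p_bar = x.sum() / N
    T = np.sum(scores * (x - n * p_bar))                                   # trend statistic
    var_T = p_bar * (1 - p_bar) * (np.sum(n * scores**2) - np.sum(n * scores)**2 / N)
    z = T / np.sqrt(var_T)
    p_value = 2 * norm.sf(abs(z))                                          # two-sided p-value
    print(f"z = {z:.2f}, p = {p_value:.4f}")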
Abstract:
A prospective cross-over study was performed in a general practice environment to assess and compare compliance data obtained by electronic monitoring on a BID or QD regimen in 113 patients with hypertension or angina pectoris. All patients were on a BID regimen (nifedipine SR) during the first month and switched to a QD regimen (amlodipine) for another month. Taking compliance (i.e. the proportion of days with correct dosing) improved in 30% of patients (95% confidence interval 19 to 41%, p < 0.001) when switching from a BID to a QD regimen, but at the same time there was a 15% increase (95% confidence interval 5 to 25%, p < 0.02) in the number of patients with one or more no-dosing days. About 8% of patients had a low compliance rate, irrespective of the dosage regimen. Actual dosage intervals were used to estimate the extent and timing of periods with unsatisfactory drug activity for various hypothetical drug durations of action, and it appears that the apparent advantage of the QD regimen in terms of compliance is clinically meaningful only when the duration of activity extends beyond the dosage interval in all patients.
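As a sketch of the "taking compliance" metric used here (proportion of days with correct dosing), assuming a simplified log of dose dates rather than the study's actual electronic-monitoring export format:

    # Sketch of "taking compliance": fraction of monitored days on which the
    # prescribed number of doses was recorded. The data layout is an assumption,
    # not the study's electronic-monitoring format.
    from collections import Counter
    from datetime import date

    def taking_compliance(dose_dates, monitored_days, doses_per_day):
        """Proportion of monitored days with exactly the prescribed dose count."""
        doses_by_day = Counter(dose_dates)
        correct = sum(1 for day in monitored_days if doses_by_day[day] == doses_per_day)
        return correct / len(monitored_days)

    # QD example with one missed day out of three (hypothetical records):
    days = [date(2024, 1, 1), date(2024, 1, 2), date(2024, 1, 3)]
    doses = [date(2024, 1, 1), date(2024, 1, 3)]
    print(taking_compliance(doses, days, doses_per_day=1))  # ~0.67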
Abstract:
The diagnosis of inflammatory bowel disease (IBD), comprising Crohn's disease (CD) and ulcerative colitis (UC), continues to present difficulties due to unspecific symptoms and limited test accuracies. We aimed to determine the diagnostic delay (time from first symptoms to IBD diagnosis) and to identify associated risk factors. A total of 1591 IBD patients (932 CD, 625 UC, 34 indeterminate colitis) from the Swiss IBD cohort study (SIBDCS) were evaluated. The SIBDCS collects data on a large sample of IBD patients from hospitals and private practice across Switzerland through physician and patient questionnaires. The primary outcome measure was diagnostic delay. Diagnostic delay in CD patients was significantly longer compared to UC patients (median 9 versus 4 months, P < 0.001). Seventy-five percent of CD patients were diagnosed within 24 months compared to 12 months for UC and 6 months for IC patients. Multivariate logistic regression identified age <40 years at diagnosis (odds ratio [OR] 2.15, P = 0.010) and ileal disease (OR 1.69, P = 0.025) as independent risk factors for long diagnostic delay in CD (>24 months). In UC patients, nonsteroidal anti-inflammatory drug (NSAID) intake (OR 1.75, P = 0.093) and male gender (OR 0.59, P = 0.079) were associated with long diagnostic delay (>12 months). Whereas the median delay for diagnosing CD, UC, and IC seems to be acceptable, there exists a long delay in a considerable proportion of CD patients. More public awareness work needs to be done in order to reduce patient and doctor delays in this target population.
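A minimal sketch of the kind of multivariate logistic regression reported (long diagnostic delay versus age <40 at diagnosis and ileal disease); the data frame and variable names below are hypothetical, not the SIBDCS data, and serve only to show how such odds ratios are estimated.

    # Hypothetical logistic regression for long diagnostic delay (>24 months) in CD;
    # predictors mirror those named in the abstract, the data are invented.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.DataFrame({
        "long_delay":    [1, 0, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0],
        "age_under_40":  [1, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1, 1],
        "ileal_disease": [1, 0, 0, 0, 1, 1, 0, 1, 1, 0, 1, 0],
    })

    model = smf.logit("long_delay ~ age_under_40 + ileal_disease", data=df).fit(disp=False)
    print(np.exp(model.params))   # exponentiated coefficients = odds ratios
    print(model.pvalues)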
Abstract:
Seven rhesus macaques were infected intradermally with 10^7 promastigotes of Leishmania (Leishmania) major. All monkeys developed a localized, ulcerative, self-healing nodular skin lesion at the site of inoculation of the parasite. Non-specific chronic inflammation and/or a tuberculoid-type granulomatous reaction were the main histopathological manifestations of the disease. Serum Leishmania-specific antibodies (IgG and IgG1) were detected by ELISA in all infected animals; immunoblot analyses indicated that numerous antigens were recognized. A very high degree of variability was observed in the parasite-specific cell-mediated immune responses [as detected by measuring delayed-type hypersensitivity (DTH) reaction, in vitro lymphocyte proliferation, and gamma interferon (IFN-gamma) production] for individuals over time post challenge. Of the recovered monkeys (which showed resolution of the lesions after 11 weeks of infection), 57.2% (4/7) and 28.6% (2/7) of the animals remained susceptible to secondary and tertiary infections, respectively, but the disease severity was altered (i.e. lesion size was smaller and lesions healed faster than in the primary infection). The remaining monkeys exhibited complete resistance (i.e. no lesion) to each rechallenge. Despite the inability to consistently detect correlates of cell-mediated immunity to Leishmania or a correlation between resistance to challenge and DTH, lymphocyte transformation or IFN-gamma production, partial or complete acquired resistance was conferred by experimental infection. This primate model should be useful for measuring vaccine effectiveness against the human disease.
Abstract:
Aim: The diagnosis of inflammatory bowel disease (IBD), comprising Crohn's disease (CD) and ulcerative colitis (UC), continues to present difficulties due to unspecific symptoms and limited test accuracies. We aimed to determine the diagnostic delay (time from first symptoms to IBD diagnosis) and to identify associated risk factors in a national cohort in Switzerland. Materials and Methods: A total of 1,591 IBD patients (932 CD, 625 UC, 34 indeterminate colitis) from the Swiss IBD cohort study (SIBDCS) were evaluated. The SIBDCS collects data on a large sample of IBD patients from hospitals and private practice across Switzerland through physician and patient questionnaires. The primary outcome measure was the diagnostic delay. Results: Diagnostic delay in CD patients was significantly longer compared to UC patients (median 9 vs. 4 months, P < 0.001). Seventy-five percent of CD patients were diagnosed within 24 months compared to 12 months for UC and 6 months for IC patients. Multivariate logistic regression identified age <40 years at diagnosis (OR 2.15, P = 0.010) and ileal disease (OR 1.69, P = 0.025) as independent risk factors for long diagnostic delay in CD (>24 months). A trend for long diagnostic delay (>12 months) was associated with NSAID intake (OR 1.75, P = 0.093) and male gender (OR 0.59, P = 0.079) in UC patients. Conclusions: Whereas the median delay for diagnosing CD, UC, and IC seems to be acceptable, there exists a long delay in a considerable proportion of CD patients. More public awareness work needs to be done in order to reduce patients' and doctors' delays in this target population.
Abstract:
Schistosoma mansoni is responsible for lesions that can alter the hemodynamics of the portal venous circulation and of the pulmonary arterial and systemic venous systems. Therefore, hemodynamic changes in the ocular circulation of mansonic schistosomiasis patients with portal hypertension and hepatofugal venous blood flow are also probable. The purpose of this study was to determine the fluorescein contrast arrival time at the retina of young patients with the hepatosplenic form of schistosomiasis, clinically and surgically treated. The control group included 36 non-schistosomotic patients, mean age 17.3 years, and the case group comprised 25 schistosomotic patients, mean age 18.2 years, who were cared for at the University Hospital (Federal University of Pernambuco, Brazil) from 1990 to 2001. They underwent digital angiofluoresceinography and were evaluated for the contrast arrival time at the early retinal venous phase of the exam. Both groups were ophthalmologically examined at the same hospital (Altino Ventura Foundation, Recife, Brazil), using the same technique. There was a delay of 70 s or more in the retinal contrast arrival time in the eyes of three schistosomotic patients (12%) and in none of the control group; however, the mean contrast arrival times of the two groups were not statistically different. These findings lend support to the hypothesis that there could be a delay in the venous blood flow drainage of the eye.
Abstract:
BACKGROUND: The ideal local anesthetic regime for femoral nerve block that balances analgesia with mobility after total knee arthroplasty (TKA) remains undefined. QUESTIONS/PURPOSES: We compared two volumes and concentrations of a fixed dose of ropivacaine for continuous femoral nerve block after TKA to a single injection femoral nerve block with ropivacaine to determine (1) time to discharge readiness; (2) early pain scores and analgesic consumption; and (3) functional outcomes, including range of motion and WOMAC scores at the time of recovery. METHODS: Ninety-nine patients were allocated to one of three continuous femoral nerve block groups for this randomized, placebo-controlled, double-blind trial: a high concentration group (ropivacaine 0.2% infusion), a low concentration group (ropivacaine 0.1% infusion), or a placebo infusion group (saline 0.9% infusion). Infusions were discontinued on postoperative Day (POD) 2. The primary outcome was time to discharge readiness. Secondary outcomes included opioid consumption, pain, and functional outcomes. Ninety-three patients completed the study protocol; the study was halted early because of unanticipated changes to pain protocols at the host institution, by which time only 61% of the required number of patients had been enrolled. RESULTS: With the numbers available, the mean time to discharge readiness was not different between groups (high concentration group, 62 hours [95% confidence interval [CI], 51-72 hours]; low concentration group, 73 hours [95% CI, 63-83 hours]; placebo infusion group 65 hours [95% CI, 56-75 hours]; p = 0.27). Patients in the low concentration group consumed significantly less morphine during the period of infusion (POD 1, high concentration group, 56 mg [95% CI, 42-70 mg]; low concentration group, 35 mg [95% CI, 27-43 mg]; placebo infusion group, 48 mg [95% CI, 38-59 mg], p = 0.02; POD 2, high concentration group, 50 mg [95% CI, 41-60 mg]; low concentration group, 33 mg [95% CI, 24-42 mg]; placebo infusion group, 39 mg [95% CI, 30-48 mg], p = 0.04); however, there were no important differences in pain scores or opioid-related side effects with the numbers available. Likewise, there were no important differences in functional outcomes between groups. CONCLUSIONS: Based on this study, which was terminated prematurely before the desired sample size could be achieved, we were unable to demonstrate that varying the concentration and volume of a fixed-dose ropivacaine infusion for continuous femoral nerve block influences time to discharge readiness when compared with a conventional single-injection femoral nerve block after TKA. A low concentration of ropivacaine infusion can reduce postoperative opioid consumption but without any important differences in pain scores, side effects, or functional outcomes. These pilot data may be used to inform the statistical power of future randomized trials. LEVEL OF EVIDENCE: Level II, therapeutic study. See Guidelines for Authors for a complete description of levels of evidence.
Abstract:
This paper studies the limits of discrete-time repeated games with public monitoring. We solve and characterize the Abreu, Milgrom and Pearce (1991) problem. We find that in the "bad" ("good") news model the lower (higher) magnitude events suggest cooperation, i.e., zero punishment probability, while the higher (lower) magnitude events suggest defection, i.e., punishment with probability one. Public correlation is used to connect these two sets of signals and to make the enforceability constraint bind. The dynamic and limit behavior of the punishment probabilities for variations in ... (the discount rate) and ... (the time interval) are characterized, as well as the limit payoffs for all these scenarios (we also introduce uncertainty in the time domain). The obtained ... limits are, to the best of our knowledge, new. The obtained ... limits coincide with Fudenberg and Levine (2007) and Fudenberg and Olszewski (2011), with the exception that we clearly state the precise informational conditions that cause the limit to converge from above, to converge from below, or to degenerate. JEL: C73, D82, D86. KEYWORDS: Repeated Games, Frequent Monitoring, Random Public Monitoring, Moral Hazard, Stochastic Processes.
Abstract:
Respiratory syncytial virus (RSV) infection is the leading cause of hospitalisation for respiratory diseases among children under 5 years old. The aim of this study was to analyse RSV seasonality in the five distinct regions of Brazil using time series analysis (wavelet and Fourier series) of the following indicators: monthly positivity of the immunofluorescence reaction for RSV identified by the virologic surveillance system, and the rate of hospitalisations for bronchiolitis and pneumonia due to RSV in children under 5 years old (ICD-10 codes J12.1, J20.5, J21.0 and J21.9). A total of 12,501 samples were analysed, with 11.6% positivity for RSV (95% confidence interval 11-12.2%), varying between 7.1% and 21.4% across the five Brazilian regions. A strong trend for annual cycles with a stable stationary pattern in the five regions was identified through wavelet analysis of the indicators. The timing of RSV activity estimated by Fourier analysis was similar between the two indicators and showed regional differences. This study reinforces the importance of adjusting the immunisation period for the high-risk population with the monoclonal antibody palivizumab, taking into account regional differences in the seasonality of RSV.
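As an illustration of the Fourier side of such a seasonality analysis, the sketch below estimates the dominant period of a monthly positivity series with a discrete Fourier transform; the series is simulated, not the Brazilian surveillance data.

    # Sketch of Fourier-based seasonality detection on a monthly time series.
    # The series is simulated (annual cycle plus noise) for illustration only.
    import numpy as np

    rng = np.random.default_rng(0)
    months = np.arange(120)                               # 10 years of monthly data
    positivity = 10 + 8 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 1, months.size)

    spectrum = np.fft.rfft(positivity - positivity.mean())
    freqs = np.fft.rfftfreq(months.size, d=1.0)           # cycles per month
    peak = np.argmax(np.abs(spectrum[1:])) + 1            # skip the zero frequency
    print(f"dominant period ~ {1 / freqs[peak]:.1f} months")   # ~12 for an annual cycle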
Abstract:
PURPOSE: The aim of this study was to examine whether lipid oxidation predominates during 3 h of postexercise recovery after high-intensity interval exercise as compared with moderate-intensity continuous exercise on a cycle ergometer in fit young men (n = 12; 24.6 ± 0.6 yr). METHODS: Energy substrate partitioning was evaluated during and after high-intensity submaximal interval exercise (INT, 1-min intervals at 80% of maximal aerobic power output [Wmax] with an intervening 1 min of active recovery at 40% Wmax) and 60 min of moderate-intensity continuous exercise at 45% of maximal oxygen uptake (C45%), as well as a time-matched resting control trial (CON). Exercise bouts were matched for mechanical work output. RESULTS: During exercise, a significantly greater contribution of CHO and a lower contribution of lipid to energy expenditure were found in INT (512.7 ± 26.6 and 41.0 ± 14.0 kcal, respectively) than in C45% (406.3 ± 21.2 and 170.3 ± 24.0 kcal, respectively; P < 0.001), despite similar overall energy expenditure in both exercise trials (P = 0.13). During recovery, there were no significant differences between INT and C45% in substrate turnover and oxidation (P > 0.05). On the other hand, the mean contribution of lipids to energy yield was significantly higher after the exercise trials (C45% = 61.3 ± 4.2 kcal; INT = 66.7 ± 4.7 kcal) than after CON (51.5 ± 3.4 kcal; P < 0.05). CONCLUSIONS: These findings show that lipid oxidation during postexercise recovery was increased by a similar amount after two isoenergetic exercise bouts of different form and intensity compared with the time-matched no-exercise control trial.
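For context, substrate partitioning of this kind is usually derived from indirect calorimetry. The sketch below uses Frayn's non-protein stoichiometric equations with approximate energy densities; this is a common approach but an assumption here, not necessarily the authors' exact calculation.

    # Sketch of non-protein substrate oxidation from indirect calorimetry using
    # Frayn's equations (VO2 and VCO2 in L/min). The energy densities (~4 kcal/g
    # CHO, ~9 kcal/g fat) are approximations; this is not the study's exact method.
    def substrate_oxidation(vo2: float, vco2: float) -> dict:
        cho_g_min = 4.55 * vco2 - 3.21 * vo2     # carbohydrate oxidation, g/min
        fat_g_min = 1.67 * vo2 - 1.67 * vco2     # fat oxidation, g/min
        return {
            "cho_g_min": cho_g_min,
            "fat_g_min": fat_g_min,
            "cho_kcal_min": cho_g_min * 4.0,
            "fat_kcal_min": fat_g_min * 9.0,
        }

    # Hypothetical steady-state values during moderate cycling (RER ~ 0.88):
    print(substrate_oxidation(vo2=2.0, vco2=1.76))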