Abstract:
Rationale: Focal onset epileptic seizures are due to abnormal interactions between distributed brain areas. By estimating the cross-correlation matrix of multi-site intra-cerebral EEG recordings (iEEG), one can quantify these interactions. To assess the topology of the underlying functional network, the binary connectivity matrix has to be derived from the cross-correlation matrix by applying a threshold. Classically, a single threshold is used, which constrains the topology [1]. Our method aims to set the threshold in a data-driven way by separating genuine from random cross-correlation. We compare our approach to the fixed threshold method and study the dynamics of the functional topology. Methods: We investigate the iEEG of patients suffering from focal onset seizures who underwent presurgical evaluation. The equal-time cross-correlation matrices are evaluated using a sliding time window. We then compare three approaches to deriving the corresponding binary networks. For each time window: * Our parameter-free method derives from the cross-correlation strength matrix (CCS) [2]. It aims at disentangling genuine from random correlations (due to the finite length and varying frequency content of the signals). In practice, a threshold is evaluated for each pair of channels independently, in a data-driven way. * The fixed mean degree (FMD) method uses a single threshold on the whole connectivity matrix so as to ensure a user-defined mean degree. * The varying mean degree (VMD) method uses the mean degree of the CCS network to set a single threshold for the entire connectivity matrix. * Finally, the connectivity (c), connectedness (given by k, the number of disconnected sub-networks), and mean global and local efficiencies (Eg and El, respectively) are computed from the FMD, CCS, and VMD networks and their corresponding random and lattice networks. Results: Compared to FMD and VMD, CCS networks present: * topologies that differ in terms of c, k, Eg, and El; * time courses of the topological features that are more stable within each period and more contrasted from one period to the next, across the pre-ictal, ictal, and post-ictal periods. For CCS, pre-ictal connectivity is low, increases to a high level during the seizure, then decreases at offset. k shows a "U-curve" underscoring the synchronization of all electrodes during the seizure. The Eg and El time courses fluctuate between the values of the corresponding random and lattice networks in a reproducible manner. Conclusions: The definition of a data-driven threshold provides new insights into the topology of epileptic functional networks.
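The fixed mean degree (FMD) step described above is straightforward to make concrete. Below is a minimal sketch, assuming a symmetric cross-correlation matrix: a single threshold is chosen so the binarized network reaches a user-defined mean degree, and the graph measures named in the abstract (k, Eg, El) are then computed. Data and function names are illustrative; the CCS per-pair, null-model-based threshold is not reproduced here.

```python
import numpy as np
import networkx as nx

def fmd_binarize(corr, target_mean_degree):
    """Binarize a symmetric correlation matrix with one global threshold
    chosen so that the network's mean degree matches the target."""
    n = corr.shape[0]
    iu = np.triu_indices(n, k=1)
    strengths = np.sort(np.abs(corr[iu]))[::-1]        # strongest first
    k_edges = int(round(target_mean_degree * n / 2))   # mean degree = 2E/n
    k_edges = min(max(k_edges, 1), strengths.size)
    threshold = strengths[k_edges - 1]
    adj = (np.abs(corr) >= threshold).astype(int)      # ties may add edges
    np.fill_diagonal(adj, 0)
    return adj, threshold

# Toy data: equal-time correlations of 16 channels over one time window.
rng = np.random.default_rng(0)
window = rng.standard_normal((16, 512))
corr = np.corrcoef(window)
adj, thr = fmd_binarize(corr, target_mean_degree=4.0)

G = nx.from_numpy_array(adj)
print(f"threshold = {thr:.3f}, mean degree = {adj.sum(axis=0).mean():.2f}")
print(f"k = {nx.number_connected_components(G)}, "
      f"Eg = {nx.global_efficiency(G):.3f}, El = {nx.local_efficiency(G):.3f}")
```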
Abstract:
Fine roots are the most dynamic portion of a plant's root system and a major source of soil organic matter. By altering plant species diversity and composition, soil conditions and nutrient availability, and consequently the belowground allocation and dynamics of root carbon (C) inputs, land-use and management changes may influence organic C storage in terrestrial ecosystems. In three German regions, we measured fine root radiocarbon (14C) content to estimate the mean time since the C in root tissues was fixed from the atmosphere in 54 grassland and forest plots with different management and soil conditions. Although root biomass was on average greater in grasslands (5.1 ± 0.8 g; mean ± SE, n = 27) than in forests (3.1 ± 0.5 g; n = 27) (p < 0.05), the mean age of C in fine roots in forests averaged 11.3 ± 1.8 yr and was older and more variable than in grasslands (1.7 ± 0.4 yr; p < 0.001). We further found that management affects the mean age of fine root C in temperate grasslands, mediated by changes in plant species diversity and composition. Fine root mean C age is positively correlated with plant diversity (r = 0.65) and with the number of perennial species (r = 0.77). Fine root mean C age in grasslands was also affected by study region, with averages of 0.7 ± 0.1 yr (n = 9) on mostly organic soils in northern Germany and of 1.8 ± 0.3 yr (n = 9) and 2.6 ± 0.3 yr (n = 9) in central and southern Germany (p < 0.05). This was probably due to differences in soil nutrient contents and soil moisture conditions between study regions, which affected plant species diversity and the presence of perennial species. Our results indicate more long-lived roots or internal redistribution of C in perennial species and suggest linkages between fine root C age and management in grasslands. These findings improve our ability to predict and model belowground C fluxes across broader spatial scales.
Abstract:
BACKGROUND & AIMS Development of strictures is a major concern for patients with eosinophilic esophagitis (EoE). At diagnosis, EoE can present with an inflammatory phenotype (characterized by whitish exudates, furrows, and edema), a stricturing phenotype (characterized by rings and stenosis), or a combination of these. Little is known about the progression of stricture formation; we evaluated stricture development over time in the absence of treatment and investigated risk factors for stricture formation. METHODS We performed a retrospective study using the Swiss EoE Database, collecting data on 200 patients with symptomatic EoE (153 men; mean age at diagnosis, 39 ± 15 years). Stricture severity was graded based on the degree of difficulty associated with passing the standard adult endoscope. RESULTS The median diagnostic delay was 6 years (interquartile range, 2-12 years). With increasing diagnostic delay, the prevalence of fibrotic features of EoE, based on endoscopy, increased from 46.5% (diagnostic delay, 0-2 years) to 87.5% (diagnostic delay, >20 years; P = .020). Similarly, the prevalence of esophageal strictures increased with the duration of diagnostic delay, from 17.2% (diagnostic delay, 0-2 years) to 70.8% (diagnostic delay, >20 years; P < .001). Diagnostic delay was the only risk factor for strictures at the time of EoE diagnosis (odds ratio, 1.08; 95% confidence interval, 1.040-1.122; P < .001). CONCLUSIONS The prevalence of esophageal strictures correlates with the duration of untreated disease. These findings indicate the need to minimize the delay in diagnosis of EoE.
Abstract:
BACKGROUND The use of combination antiretroviral therapy (cART), comprising three antiretroviral medications from at least two classes of drugs, is the current standard treatment for HIV infection in adults and children. Current World Health Organization (WHO) guidelines for antiretroviral therapy recommend early treatment, regardless of immunologic thresholds or clinical condition, for all infants (less than one year of age) and children under the age of two years. For children aged two to five years, current WHO guidelines recommend (based on low-quality evidence) that clinical and immunological thresholds be used to identify those who need to start cART (advanced clinical stage, or CD4 count ≤ 750 cells/mm³, or per cent CD4 ≤ 25%). This Cochrane review summarizes the currently available evidence on the optimal time for treatment initiation in children aged two to five years, with the goal of informing the revision of the WHO 2013 recommendations on when to initiate cART in children. OBJECTIVES To assess the evidence for the optimal time to initiate cART in treatment-naive, HIV-infected children aged 2 to 5 years. SEARCH METHODS We searched the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, EMBASE, the AEGIS conference database, specific relevant conferences, www.clinicaltrials.gov, the World Health Organization International Clinical Trials Registry Platform, and reference lists of articles. The date of the most recent search was 30 September 2012. SELECTION CRITERIA Randomised controlled trials (RCTs) that compared immediate with deferred initiation of cART, and prospective cohort studies that followed children from enrolment to the start of cART and on cART. DATA COLLECTION AND ANALYSIS Two review authors considered studies for inclusion in the review, assessed the risk of bias, and extracted data on the primary outcome of death from all causes and several secondary outcomes, including the incidence of CDC category C and B clinical events and per cent CD4 cells (CD4%) at study end. For RCTs we calculated relative risks (RR) or mean differences with 95% confidence intervals (95% CI). For cohort data, we extracted relative risks with 95% CI from adjusted analyses. We combined results from RCTs using a random effects model and examined statistical heterogeneity. MAIN RESULTS Two RCTs in HIV-positive children aged 1 to 12 years were identified. One trial was the pilot study for the larger second trial, and both compared initiation of cART regardless of clinical-immunological condition with initiation deferred until per cent CD4 dropped to <15%. The two trials were conducted in Thailand, and in Thailand and Cambodia, respectively. Unpublished analyses of the 122 children enrolled at ages 2 to 5 years were included in this review. There was one death in the immediate cART group and no deaths in the deferred group (RR 2.9; 95% CI 0.12 to 68.9). In the subgroup analysis of children aged 24 to 59 months, there was one CDC C event in each group (RR 0.96; 95% CI 0.06 to 14.87) and 8 and 11 CDC B events in the immediate and deferred groups, respectively (RR 0.95; 95% CI 0.24 to 3.73). In this subgroup, the mean difference in CD4 per cent at study end was 5.9% (95% CI 2.7 to 9.1). One cohort study from South Africa, which compared the effect of delaying cART for up to 60 days in 573 HIV-positive children starting tuberculosis treatment (median age 3.5 years), was also included. The adjusted hazard ratio for the effect on mortality of delaying ART for more than 60 days was 1.32 (95% CI 0.55 to 3.16). AUTHORS' CONCLUSIONS This systematic review shows that there is insufficient evidence from clinical trials to support either early or CD4-guided initiation of ART in HIV-infected children aged 2 to 5 years. Programmatic issues, such as retention in care of children in ART programmes in resource-limited settings, will need to be considered when formulating the WHO 2013 recommendations.
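For readers unfamiliar with the effect measure used throughout this review, the sketch below shows the standard relative-risk calculation with a Wald 95% CI on the log scale. The event counts are invented for illustration, not taken from the included trials.

```python
import math

def relative_risk(events_a, n_a, events_b, n_b, z=1.96):
    """RR of group A vs. group B with a Wald 95% CI on the log scale.
    Zero event counts would need a continuity correction (as in the
    1-vs-0 death comparison above) before this formula applies."""
    rr = (events_a / n_a) / (events_b / n_b)
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lower = math.exp(math.log(rr) - z * se)
    upper = math.exp(math.log(rr) + z * se)
    return rr, lower, upper

# Hypothetical counts: 12/100 events vs. 15/100 events.
rr, lower, upper = relative_risk(12, 100, 15, 100)
print(f"RR = {rr:.2f} (95% CI {lower:.2f} to {upper:.2f})")  # RR = 0.80
```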
Abstract:
Wireless Mesh Networks (WMNs) are increasingly deployed to enable thousands of users to share, create, and access live video streams with different characteristics and content, such as video surveillance and football matches. In this context, there is a need for new mechanisms for assessing the quality level of videos, because operators seek to control their delivery process and optimize their network resources while increasing user satisfaction. However, the development of in-service, non-intrusive Quality of Experience assessment schemes for real-time Internet videos with different complexity and motion levels, Group of Pictures (GOP) lengths, and characteristics remains a significant challenge. To address this issue, this article proposes a non-intrusive parametric real-time video quality estimator, called MultiQoE, that correlates wireless network impairments, video characteristics, and user perception into a predicted Mean Opinion Score. An instance of MultiQoE was implemented in WMNs, and performance evaluation results demonstrate the efficiency and accuracy of MultiQoE in predicting the user's perception of live video streaming services when compared to subjective, objective, and well-known parametric solutions.
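To make the idea of a parametric, non-intrusive estimator concrete, here is a minimal sketch that maps a few network and content parameters directly to a predicted MOS without inspecting decoded frames. The functional form and coefficients are invented placeholders; they are not MultiQoE's published model.

```python
def predict_mos(loss_rate, gop_length, motion_level):
    """Map impairment/content parameters to a 1-5 MOS estimate.

    loss_rate    -- packet loss ratio in [0, 1]
    gop_length   -- frames per Group of Pictures (longer GOPs let loss
                    artifacts propagate further before the next I-frame)
    motion_level -- content motion in [0, 1]; high motion masks loss less
    """
    base = 4.5                                   # quality with no impairment
    # Loss hurts more with long GOPs and high-motion content (assumed form).
    impairment = 25.0 * loss_rate * (gop_length / 30.0) * (0.5 + motion_level)
    return max(1.0, min(5.0, base - impairment))

print(predict_mos(loss_rate=0.02, gop_length=30, motion_level=0.8))  # ~3.85
```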
Abstract:
The COSMIC-2 mission is a follow-on to the Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) with an upgraded payload for improved radio occultation (RO) applications. The objective of this paper is to develop a near-real-time (NRT) orbit determination system, called the NRT National Chiao Tung University (NCTU) system, to support COSMIC-2 in atmospheric applications and to verify the orbit product of COSMIC. The system is capable of automatically determining the NRT GPS clocks and the LEO orbit and clock. To assess the NRT (NCTU) system, we use eight days of COSMIC data (March 24-31, 2011), which contain a total of 331 GPS observation sessions and 12,393 RO observable files. Parallel scheduling of the independent GPS and LEO estimations with automatic time matching improves computational efficiency by 64% compared to sequential scheduling. Orbit difference analyses suggest a 10-cm accuracy for the COSMIC orbits from the NRT (NCTU) system, consistent with the NRT University Corporation for Atmospheric Research (UCAR) system. The mean velocity accuracy from the NRT orbits of COSMIC is 0.168 mm/s, corresponding to an error of about 0.051 μrad in the bending angle. The rms differences in the NRT COSMIC clock and in the GPS clocks between the NRT (NCTU) and the post-processing products are 3.742 and 1.427 ns, respectively. The GPS clocks determined from a partial ground GPS network [NRT (NCTU)] and a full one [NRT (UCAR)] result in mean rms frequency stabilities of 6.1E-12 and 2.7E-12, respectively, corresponding to range fluctuations of 5.5 and 2.4 cm and bending angle errors of 3.75 and 1.66 μrad.
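The clock-stability-to-range conversion quoted above can be sanity-checked with simple arithmetic: a fractional frequency instability sigma_y accumulates a range error of roughly sigma_y * c * tau over an averaging interval tau. The sketch below uses tau = 30 s, an assumed value chosen here because it reproduces the quoted 5.5 cm and 2.4 cm figures; the paper's actual interval may differ.

```python
# Back-of-the-envelope check of the quoted range fluctuations.
C = 299_792_458.0   # speed of light, m/s
TAU = 30.0          # assumed averaging interval, s

for sigma_y in (6.1e-12, 2.7e-12):
    range_error_cm = sigma_y * C * TAU * 100.0
    print(f"sigma_y = {sigma_y:.1e} -> range fluctuation ~ {range_error_cm:.1f} cm")
# sigma_y = 6.1e-12 -> ~5.5 cm ; sigma_y = 2.7e-12 -> ~2.4 cm
```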
Abstract:
Geographic health planning analyses, such as service area calculations, are hampered by a lack of patient-specific geographic data. Using the limited patient address information in patient management systems, planners analyze patient origin based on home address. But activity-space research, done sparingly in public health and extensively in non-health-related arenas, uses multiple addresses per person when analyzing accessibility. Health care access research has also shown that many non-geographic factors influence choice of provider. Most planning methods, however, overlook non-geographic factors influencing choice of provider, and the limited data mean the analyses can only be related to home address. This research attempted to determine to what extent geography plays a part in patient choice of provider and whether activity space data can be used to calculate service areas for primary care providers. During Spring 2008, a convenience sample of 384 patients of a locally funded Community Health Center in Houston, Texas, completed a survey that asked which factors are important when selecting a health care provider. A subset of this group (336) also completed an activity space log that captured location and time data on the places where the patient regularly goes. Survey results indicate that for this patient population, geography plays a role in the choice of health care provider, but it is not the most important reason for choosing one. Other factors, such as the provider offering “free or low cost visits”, meeting “all of the patient’s health care needs”, and seeing “the patient quickly”, were all ranked higher than geographic reasons. Analysis of the patient activity locations shows that activity spaces can be used to create service areas for a single primary care provider. Weighted activity-space-based service areas have the potential to include more patients in the service area, since more than one location per patient is used. Further analysis of the logs shows that a reduced set of locations, filtered by time and type, could be used for this methodology, facilitating ongoing data collection for activity-space-based planning efforts.
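As a concrete illustration of the weighted activity-space idea described above, the sketch below counts a patient toward a provider's service area in proportion to the time spent at activity locations within reach of the provider, rather than using the home address alone. The coordinates, weights, and radius are invented placeholders, not data from the study.

```python
import math

def within(provider, point, radius_km):
    """Rough distance check using an equirectangular approximation."""
    klat = 110.574                                 # km per degree latitude
    klon = 111.320 * math.cos(math.radians(provider[0]))
    dlat = (point[0] - provider[0]) * klat
    dlon = (point[1] - provider[1]) * klon
    return math.hypot(dlat, dlon) <= radius_km

def service_weight(provider, activity_log, radius_km=5.0):
    """Fraction of a patient's weekly activity time spent within reach."""
    total = sum(hours for _, hours in activity_log)
    near = sum(hours for pt, hours in activity_log if within(provider, pt, radius_km))
    return near / total if total else 0.0

clinic = (29.76, -95.37)                               # hypothetical Houston site
log = [((29.75, -95.36), 60), ((29.90, -95.60), 40)]   # (lat, lon), hours/week
print(f"{service_weight(clinic, log):.2f}")            # 0.60: partly within reach
```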
Abstract:
Ecology and conservation require reliable data on the occurrence of animals and plants. A major source of bias is imperfect detection, which, however, can be corrected for by estimating detectability. In traditional occupancy models, this requires repeat or multi-observer surveys. Recently, time-to-detection models have been developed as a cost-effective alternative that requires no repeat surveys, so survey costs could be halved. We compared the efficiency and reliability of time-to-detection and traditional occupancy models under varying survey effort. Two observers independently searched for 17 plant species in 44 Swiss grassland quadrats of 100 m² and recorded the time to detection for each species, enabling detectability to be estimated with both time-to-detection and traditional occupancy models. In addition, we gauged the relative influence on detectability of species, observer, plant height, and two measures of abundance (cover and frequency). Estimates of detectability and occupancy under both models were very similar. Rare species were more likely to be overlooked; detectability was strongly affected by abundance. As a measure of abundance, frequency outperformed cover in its predictive power. The two observers differed significantly in their detection ability. Time-to-detection models were as accurate as traditional occupancy models, but their data are easier to obtain; thus they provide a cost-effective alternative to traditional occupancy models for detection-corrected estimation of occurrence.
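A minimal sketch of the likelihood behind such time-to-detection models follows, assuming a constant detection rate (exponential detection times); this is one common formulation, not necessarily the exact model fitted in the study. A detection at time t on an occupied site contributes psi * lam * exp(-lam * t); a survey with no detection by the cutoff T contributes (1 - psi) + psi * exp(-lam * T). The data below are simulated, not from the Swiss grassland surveys.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
PSI_TRUE, LAM_TRUE, T_MAX = 0.6, 0.5, 5.0        # occupancy, rate, survey cutoff
occupied = rng.random(200) < PSI_TRUE
times = rng.exponential(1 / LAM_TRUE, 200)
detected = occupied & (times <= T_MAX)
t_obs = times[detected]
n_miss = 200 - detected.sum()

def negloglik(params):
    # Logit/log transforms keep psi in (0, 1) and lam positive.
    psi, lam = 1 / (1 + np.exp(-params[0])), np.exp(params[1])
    ll_det = np.sum(np.log(psi * lam) - lam * t_obs)
    ll_miss = n_miss * np.log((1 - psi) + psi * np.exp(-lam * T_MAX))
    return -(ll_det + ll_miss)

fit = minimize(negloglik, x0=[0.0, 0.0], method="Nelder-Mead")
psi_hat = 1 / (1 + np.exp(-fit.x[0]))
lam_hat = np.exp(fit.x[1])
print(f"psi_hat = {psi_hat:.2f}, lambda_hat = {lam_hat:.2f}")
```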
Abstract:
Systematic differences in circadian rhythmicity are thought to be a substantial factor in inter-individual differences in fatigue and cognitive performance. The synchronicity effect (when the time of testing coincides with the respective circadian peak period) seems to play an important role. Eye movements have been shown to be a reliable indicator of fatigue due to sleep deprivation or time spent on cognitive tasks. However, eye movements have not so far been used to investigate the circadian synchronicity effect and the resulting differences in fatigue. The aim of the present study was to assess how different oculomotor parameters in a free visual exploration task are influenced by: a) fatigue due to chronotypical factors (being a 'morning type' or an 'evening type'); b) fatigue due to time spent on task. Eighteen healthy participants performed a free visual exploration task of naturalistic pictures while their eye movements were recorded. The task was performed twice, once at their optimal and once at their non-optimal time of day. Moreover, participants rated their subjective fatigue. The non-optimal time of day triggered a significant and stable increase in mean visual fixation duration during the free visual exploration task for both chronotypes. The increase in mean visual fixation duration correlated with the difference in subjectively perceived fatigue at optimal and non-optimal times of day. Conversely, mean saccadic speed significantly and progressively decreased throughout the task, but was not influenced by the optimal or non-optimal time of day for either chronotype. The results suggest that different oculomotor parameters are discriminative for fatigue from different sources. A decrease in saccadic speed seems to reflect fatigue due to time spent on task, whereas an increase in mean fixation duration reflects a lack of synchronicity between chronotype and time of day.
Abstract:
INTRODUCTION: The objective of this study was to evaluate the effects of two different mean arterial blood pressure (MAP) targets on the need for resuscitation, organ dysfunction, mitochondrial respiration, and the inflammatory response in a long-term model of fecal peritonitis. METHODS: Twenty-four anesthetized and mechanically ventilated pigs were randomly assigned (n = 8/group) to a septic control group (septic-CG) without resuscitation until death, or to one of two groups with resuscitation performed after 12 hours of untreated sepsis for 48 hours, targeting MAP 50-60 mmHg (low-MAP) or 75-85 mmHg (high-MAP). RESULTS: MAP at the end of resuscitation was 56 ± 13 mmHg (mean ± SD) and 76 ± 17 mmHg, respectively, for the low-MAP and high-MAP groups. One animal each in the high- and low-MAP groups died, as did all animals in septic-CG (median survival time: 21.8 hours, inter-quartile range: 16.3-27.5 hours). Norepinephrine was administered to all animals of the high-MAP group (0.38 (0.21-0.56) mcg/kg/min) and to three animals of the low-MAP group (0.00 (0.00-0.25) mcg/kg/min; P = 0.009). The high-MAP group had a more positive fluid balance (3.3 ± 1.0 mL/kg/h vs. 2.3 ± 0.7 mL/kg/h; P = 0.001). Inflammatory markers, skeletal muscle ATP content, and hemodynamics other than MAP did not differ between the low- and high-MAP groups. The incidence of acute kidney injury (AKI) after 12 hours of untreated sepsis was 50% (4/8) and 38% (3/8), respectively, for the low- and high-MAP groups, and at the end of the study 57% (4/7) and 0% (P = 0.026). In septic-CG, maximal isolated skeletal muscle mitochondrial Complex I, State 3 respiration increased from 1357 ± 149 pmol/s/mg to 1822 ± 385 pmol/s/mg (P = 0.020). In the high- and low-MAP groups, Complex IV, State 3 respiration of permeabilized skeletal muscle fibers increased during resuscitation (P = 0.003). CONCLUSIONS: The MAP targets during resuscitation did not alter the inflammatory response, nor did they affect skeletal muscle ATP content or mitochondrial respiration. While targeting a lower MAP was associated with an increased incidence of AKI, targeting a higher MAP resulted in an increased net positive fluid balance and vasopressor load during resuscitation. The long-term effects of different MAP targets need to be evaluated in further studies.
Abstract:
Low-frequency "off-line" repetitive transcranial magnetic stimulation (rTMS) applied over the course of several minutes has attracted considerable attention as a research tool in cognitive neuroscience due to its ability to induce functional disruptions of brain areas. This disruptive rTMS effect is highly valuable for revealing causal relationships between brain and behavior. However, its influence on remote interconnected areas and, more importantly, the duration of the induced neurophysiological effects remain unknown. These aspects are critical for study design in cognitive neuroscience. To investigate these issues, 12 healthy male subjects underwent 8 H₂¹⁵O positron emission tomography (PET) scans after application of a long train of low-frequency rTMS to the right dorsolateral prefrontal cortex (DLPFC). Immediately after the stimulation train, regional cerebral blood flow (rCBF) increases were present under the stimulation site as well as in other prefrontal cortical areas, including the ventrolateral prefrontal cortex (VLPFC) ipsilateral to the stimulation site. The mean increases in rCBF returned to baseline within 9 min. The duration of this unilateral prefrontal rTMS effect on rCBF is of particular interest to those who aim to influence behavior in cognitive paradigms that use an "off-line" approach.
Abstract:
We used real-time laser Doppler imaging (LDI) to study regional variations in microcirculatory perfusion in healthy volunteers, to establish a new methodology for global body perfusion mapping based on intra-individual perfusion index ratios. Our study included 74 (37 female) healthy volunteers aged between 22 and 30 years (mean 24.49). Imaging was performed using a recent microcirculation imaging camera (EasyLDI) for different body regions of each volunteer. Perfusion values were reported in Arbitrary Perfusion Units (APU). The relative perfusion index for each body region of each volunteer was then obtained by normalization with the perfusion value of the forehead. Basic parameters such as weight, height, and blood pressure were also measured and analyzed. The highest mean perfusion value was recorded in the forehead area (259.21 APU). Mean perfusion in the measured parts of the body correlated positively with the mean forehead value, while there was no significant correlation between forehead blood perfusion values and room temperature, BMI, systolic blood pressure, or diastolic blood pressure (p = 0.420, 0.623, 0.488, 0.099, respectively). Analysis of the data showed that perfusion indexes did not differ significantly between male and female volunteers except for the ventral upper arm area (p = .001). LDI is a non-invasive, fast technique that opens several avenues for clinical applications. The mean perfusion indexes are useful in clinical practice for monitoring patients before and after surgical interventions. Perfusion values can be predicted for different body parts simply by taking the forehead perfusion value and applying the perfusion index ratios to obtain expected normative values.
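The normalization described above amounts to a simple lookup-and-multiply: each region's index is its perfusion divided by the same subject's forehead value, and a measured forehead value then predicts the expected regional perfusion. The index values in this sketch are invented placeholders, not the study's measured ratios.

```python
REFERENCE_INDEX = {           # region -> perfusion index (region / forehead)
    "forehead": 1.00,
    "ventral_forearm": 0.45,  # hypothetical ratio
    "palm": 0.80,             # hypothetical ratio
}

def predict_perfusion(forehead_apu, region):
    """Expected perfusion (APU) for a region, given the forehead value."""
    return forehead_apu * REFERENCE_INDEX[region]

print(predict_perfusion(259.21, "palm"))  # expected ~207 APU
```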
Abstract:
Background. In the field of information technology (IT), time pressure is common. Working together on the same task under tight deadlines increases the risk of social stressors, that is, tensions and conflicts at work. Purpose. This field study tested the associations of both time pressure and social stressors with blood pressure during work. Method. Seven employees, staff of a small IT enterprise, participated in repeated ambulatory blood pressure measurements over the course of one week. Time pressure and social stressors at work were assessed by questionnaire at the beginning of the study. Results. Multilevel regression analyses of 138 samples revealed higher levels of time pressure to be related to marginally significant increases in mean arterial blood pressure at noon and in the afternoon. In addition, higher levels of social stressors at work were significantly associated with elevated mean arterial pressure in the afternoon. Conclusion. The findings support the view that threats to the social self play an important role in occupational health.
Territory Occupancy and Parental Quality as Proxies for Spatial Prioritization of Conservation Areas
Abstract:
In order to maximize their fitness, individuals aim to choose territories offering the most appropriate combination of resources. As population size fluctuates over time, the frequency of breeding territory occupancy reflects territory quality. We investigated the relationships between the frequency of territory occupancy (2002–2009) and habitat characteristics, prey abundance, reproductive success, and parental traits in hoopoes Upupa epops L., with the objective of defining proxies for the delineation of conservation priority areas. We predicted that the distribution of phenotypes is despotic and looked for phenotypic characteristics expressing dominance. Our findings support the hypothesis of a despotic distribution. Territory selection was non-random: frequently occupied territories were settled earlier in the season and yielded higher annual reproductive success, but the frequency of territory occupancy could not be related to any habitat characteristic. Males found in frequently occupied territories showed traits expressing dominance (i.e. larger body size and mass, and older age). In contrast, morphological traits of females were not related to the frequency of territory occupancy, suggesting that territory selection and maintenance were essentially a male's task. Settlement time in spring, reproductive success achieved in a given territory, as well as the phenotypic traits and age of male territory holders reflected territory quality, providing good proxies for assessing priority areas for conservation management.
Abstract:
PURPOSE To evaluate the accuracy, safety, and efficacy of cervical nerve root injection therapy using magnetic resonance guidance in an open 1.0 T MRI system. METHODS Between September 2009 and April 2012, a total of 21 patients (9 men, 12 women; mean age 47.1 ± 11.1 years) underwent MR-guided cervical periradicular injection for cervical radicular pain in an open 1.0 T system. An interactive proton density-weighted turbo spin echo (PDw TSE) sequence was used for real-time guidance of the MR-compatible 20-gauge injection needle. Clinical outcome was evaluated on a verbal numeric rating scale (VNRS) before injection therapy (baseline) and at 1 week and 1, 3, and 6 months of follow-up. RESULTS All procedures were technically successful and there were no major complications. The mean pre-interventional VNRS score was 7.42 and exhibited a statistically significant decrease (P < 0.001) at all follow-up time points: 3.86 ± 1.53 at 1 week, 3.21 ± 2.19 at 1 month, 2.58 ± 2.54 at 3 months, and 2.76 ± 2.63 at 6 months. At 6 months, 14.3% of the patients reported complete resolution of radicular pain, and 38.1% each had either significant (4-8 VNRS score points) or mild (1-3 VNRS score points) relief of pain; 9.5% experienced no pain relief. CONCLUSION Magnetic resonance fluoroscopy-guided periradicular cervical spine injection is an accurate, safe, and efficacious treatment option for patients with cervical radicular pain. The technique may be a promising alternative to fluoroscopy- or CT-guided injections of the cervical spine, especially in young patients and in patients requiring repeat injections.