910 results for complexity metrics


Relevance: 20.00%

Publisher:

Abstract:

Background: Schizophrenic symptoms are commonly thought to indicate a loosened coordination, i.e. a decreased connectivity, of brain processes. Methods: To address this hypothesis directly, global and regional multichannel electroencephalographic (EEG) complexities (omega complexity and dimensional complexity) and single-channel EEG dimensional complexities were calculated from 19-channel EEG data from 9 neuroleptic-naive, first-break, acute schizophrenics and 9 age- and sex-matched controls. Twenty artifact-free 2-second EEG epochs during resting with closed eyes were analyzed (2–30 Hz bandpass, average reference for global and regional complexities, local EEG gradient time series for single channels). Results: Anterior regional omega complexity was significantly increased in schizophrenics compared with controls (p < 0.001), and anterior regional dimensional complexity showed a trend toward an increase. Single-channel dimensional complexity of local gradient waveshapes was prominently increased in the schizophrenics at the right precentral location (p = 0.003). Conclusions: The results indicate a loosened cooperativity or coordination (conversely: an increased independence) of the active brain processes in the anterior brain regions of the schizophrenics.
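For readers unfamiliar with the measure, omega complexity is conventionally defined from the eigenvalue spectrum of the spatial covariance matrix of the multichannel EEG. A minimal sketch follows; it assumes already preprocessed (bandpassed, average-referenced) data and is not the study's actual pipeline:

```python
import numpy as np

def omega_complexity(eeg):
    """Omega complexity of one multichannel EEG epoch.

    eeg: array of shape (n_samples, n_channels), assumed already
    bandpassed and average-referenced. Returns a value between 1
    (all channels perfectly synchronized) and n_channels (all
    channels mutually independent).
    """
    x = eeg - eeg.mean(axis=0)                 # remove channel means
    cov = (x.T @ x) / x.shape[0]               # spatial covariance matrix
    lam = np.clip(np.linalg.eigvalsh(cov), 0.0, None)
    p = lam / lam.sum()                        # normalized eigenvalue spectrum
    p = p[p > 0]
    return float(np.exp(-np.sum(p * np.log(p))))   # exp of spectral entropy
```

Identical channels yield an omega of 1, while independent channels push it toward the channel count, which matches the interpretation of increased omega as increased independence of brain processes.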


Global complexity of spontaneous brain electric activity was studied before and after chewing gum without flavor and with 2 different flavors. One-minute, 19-channel, eyes-closed electroencephalograms (EEG) were recorded from 20 healthy males before and after using 3 types of chewing gum: regular gum containing sugar and aromatic additives, gum containing 200 mg theanine (a constituent of Japanese green tea), and gum base (no sugar, no aromatic additives); each was chewed for 5 min in randomized sequence. Brain electric activity was assessed through Global Omega (Ω)-Complexity and Global Dimensional Complexity (GDC), quantitative measures of the complexity of the trajectory of EEG map series in state space; their differences from pre-chewing data were compared across gum-chewing conditions. A Friedman ANOVA (p < 0.043) showed that effects on Ω-Complexity differed significantly between conditions, with maximal differences between gum base and theanine gum. No differences were found using GDC. Global Ω-Complexity appears to be a sensitive measure for subtle central effects of chewing gum with and without flavor.


Introduction: Nocturnal dreams can be considered a kind of simulation of the real world on a higher cognitive level (Erlacher & Schredl, 2008). Within lucid dreams, the dreamer is aware of the dream state and thus able to control the ongoing dream content. Previous studies demonstrated that it is possible to practice motor tasks during lucid dreams and that doing so improves performance while awake (Erlacher & Schredl, 2010). Even though lucid dream practice might be a promising kind of cognitive rehearsal in sports, little is known about the characteristics of actions in lucid dreams. The purpose of the present study was to explore the relationship between time in dreams and wakefulness, because in an earlier study (Erlacher & Schredl, 2004) we found that performing squats took lucid dreamers 44.5% more time than in the waking state, while for counting the same participants showed no differences between dreaming and wakefulness. To find out whether task modality, task length or task complexity requires longer times in lucid dreams than in wakefulness, three experiments were conducted. Methods: In the first experiment, five proficient lucid dreamers spent two to three non-consecutive nights in the sleep laboratory with polysomnographic recording to control for REM sleep and determine eye signals. Participants counted from 1-10, 1-20 and 1-30 in wakefulness and in their lucid dreams. While dreaming, they marked the onset of lucidity as well as the beginning and end of the counting task with a left-right-left-right eye movement and reported their dreams after being awakened. The same procedure was used for the second experiment with seven lucid dreamers, except that they had to walk 10, 20 or 30 steps. In the third experiment, nine participants performed an exercise involving gymnastics elements such as various jumps and a roll. To control for the length of the task, the gymnastic exercise in the waking state lasted about the same time as walking 10 steps.
Results: As a general result we found, as in the earlier study, that performing a task in a lucid dream requires more time than in wakefulness. This tendency was found for all three tasks. However, there was no difference for task modality (counting vs. motor task). The relative time for the different task lengths also showed no difference. And finally, the more complex motor task (the gymnastic routine) did not require more time in lucid dreams than the simple motor task. Discussion/Conclusion: The results show that there is a robust effect of time in lucid dreams compared to wakefulness. The three experiments could not attribute these differences to task modality, task length or task complexity. Therefore, further possible candidates need to be investigated, e.g. experience in lucid dreaming or psychological variables. References: Erlacher, D. & Schredl, M. (2010). Practicing a motor task in a lucid dream enhances subsequent performance: A pilot study. The Sport Psychologist, 24(2), 157-167. Erlacher, D. & Schredl, M. (2008). Do REM (lucid) dreamed and executed actions share the same neural substrate? International Journal of Dream Research, 1(1), 7-13. Erlacher, D. & Schredl, M. (2004). Time required for motor activity in lucid dreams. Perceptual and Motor Skills, 99, 1239-1242.


CONTEXT The necessity of specific intervention components for the successful treatment of patients with posttraumatic stress disorder is the subject of controversy. OBJECTIVE To investigate the complexity of clinical problems as a moderator of relative effects between specific and nonspecific psychological interventions. METHODS We included 18 randomized controlled trials directly comparing specific and nonspecific psychological interventions. We conducted moderator analyses with the complexity of clinical problems as predictor. RESULTS Our results confirmed the moderate superiority of specific over nonspecific psychological interventions; however, the superiority was small in studies with complex clinical problems and large in studies with noncomplex clinical problems. CONCLUSIONS For patients with complex clinical problems, our results suggest that particular nonspecific psychological interventions may be offered as an alternative to specific psychological interventions. In contrast, for patients with noncomplex clinical problems, specific psychological interventions are the best treatment option.
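The moderator analysis described can be illustrated with a toy fixed-effect meta-regression. All numbers below are invented for illustration; the actual study's data and model are not reproduced here:

```python
import numpy as np

# Invented per-study data: standardized effect sizes (specific minus
# nonspecific intervention), their sampling variances, and a 0/1 flag
# marking studies with complex clinical problems.
g = np.array([0.55, 0.48, 0.60, 0.12, 0.08, 0.15])
var_g = np.array([0.04, 0.05, 0.03, 0.04, 0.06, 0.05])
is_complex = np.array([0, 0, 0, 1, 1, 1])

# Fixed-effect meta-regression: inverse-variance weighted least squares
# of effect size on the complexity moderator (a full analysis would also
# estimate between-study heterogeneity, tau^2).
W = np.diag(1.0 / var_g)
X = np.column_stack([np.ones_like(g), is_complex])
intercept, slope = np.linalg.solve(X.T @ W @ X, X.T @ W @ g)
# intercept: pooled effect in noncomplex studies;
# intercept + slope: pooled effect in complex studies.
```

A negative slope reproduces the pattern reported above: the superiority of specific interventions shrinks in studies with complex clinical problems.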


The delineation of shifting cultivation landscapes using remote sensing in mountainous regions is challenging. On the one hand, there are difficulties related to the distinction of forest and fallow forest classes as occurring in a shifting cultivation landscape in mountainous regions. On the other hand, the dynamic nature of the shifting cultivation system poses problems to the delineation of landscapes where shifting cultivation occurs. We present a two-step approach based on an object-oriented classification of Advanced Land Observing Satellite, Advanced Visible and Near-Infrared Spectrometer (ALOS AVNIR) and Panchromatic Remote-sensing Instrument for Stereo Mapping (ALOS PRISM) data and landscape metrics. When including texture measures in the object-oriented classification, the accuracy of forest and fallow forest classes could be increased substantially. Based on such a classification, landscape metrics in the form of land cover class ratios enabled the identification of crop-fallow rotation characteristics of the shifting cultivation land use practice. By classifying and combining these landscape metrics, shifting cultivation landscapes could be delineated using a single land cover dataset.
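The landscape metrics used here are land cover class ratios. As an illustration, a moving-window fallow-to-woody-cover ratio can be sketched as follows; the class codes, window size, and the specific ratio are assumptions, not the authors' exact metric:

```python
import numpy as np

# Hypothetical class codes in the classified raster: 1 = forest,
# 2 = fallow forest; anything else is other land cover.
FOREST, FALLOW = 1, 2

def fallow_ratio(landcover, win=5):
    """Moving-window ratio fallow / (forest + fallow).

    High values indicate strong crop-fallow rotation, the signature
    used to delineate shifting cultivation; windows with no woody
    cover are returned as NaN.
    """
    pad = win // 2
    padded = np.pad(landcover, pad, mode="edge")
    h, w = landcover.shape
    out = np.full((h, w), np.nan)
    for i in range(h):
        for j in range(w):
            block = padded[i:i + win, j:j + win]
            n_forest = np.count_nonzero(block == FOREST)
            n_fallow = np.count_nonzero(block == FALLOW)
            if n_forest + n_fallow:
                out[i, j] = n_fallow / (n_forest + n_fallow)
    return out
```

Thresholding such a ratio surface then separates areas under active rotation from closed forest, which is the delineation step described in the abstract.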


Computational network analysis provides new methods to analyze the human connectome. Brain structural networks can be characterized by global and local metrics that have recently given promising insights for diagnosis and further understanding of neurological, psychiatric and neurodegenerative disorders. In order to ensure the validity of results in clinical settings, the precision and repeatability of the networks and the associated metrics must be evaluated. In the present study, nineteen healthy subjects underwent two consecutive measurements, enabling us to test the reproducibility of the brain network and its global and local metrics. As it is known that network topology depends on network density, the effects of setting a common density threshold for all networks were also assessed. Results showed good to excellent repeatability for global metrics, while for local metrics it was more variable, and some metrics were found to have locally poor repeatability. Moreover, between-subject differences were slightly inflated when the density was not fixed. At the global level, these findings confirm previous results on the validity of global network metrics as clinical biomarkers. However, the new results in our work indicate that the remaining variability at the local level, as well as the effect of methodological characteristics on network topology, should be considered in the analysis of brain structural networks and especially in network comparisons.
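The fixed-density condition mentioned above can be sketched as follows: each subject's connectivity matrix is binarized by keeping its strongest edges until a common edge density is reached. This is a generic sketch of the technique, not the study's implementation:

```python
import numpy as np

def threshold_to_density(conn, density):
    """Binarize a symmetric connectivity matrix at a fixed edge density.

    Keeps the strongest edges until the requested fraction of all
    possible edges is reached, so that networks from different subjects
    or sessions are compared at a common density. Ties at the cutoff
    weight may keep slightly more edges than requested.
    """
    n = conn.shape[0]
    iu = np.triu_indices(n, k=1)                  # upper-triangle edges
    weights = conn[iu]
    n_keep = int(round(density * weights.size))
    cutoff = np.sort(weights)[::-1][n_keep - 1]   # weakest kept edge
    adj = np.zeros_like(conn, dtype=int)
    adj[iu] = (weights >= cutoff).astype(int)
    return adj + adj.T                            # symmetric binary adjacency
```

Global and local graph metrics (path length, clustering, etc.) are then computed on the binary adjacency; comparing all networks at the same density removes the density dependence of topology noted in the abstract.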


Both historical and idealized climate model experiments are performed with a variety of Earth system models of intermediate complexity (EMICs) as part of a community contribution to the Intergovernmental Panel on Climate Change Fifth Assessment Report. Historical simulations start at 850 CE and continue through to 2005. The standard simulations include changes in forcing from solar luminosity, Earth's orbital configuration, CO2, additional greenhouse gases, land use, and sulphate and volcanic aerosols. In spite of very different modelled pre-industrial global surface air temperatures, overall 20th century trends in surface air temperature and carbon uptake are reasonably well simulated when compared to observed trends. Land carbon fluxes show much more variation between models than ocean carbon fluxes, and recent land fluxes appear to be slightly underestimated. It is possible that recent modelled climate trends or climate–carbon feedbacks are overestimated, resulting in too much land carbon loss, or that carbon uptake due to CO2 and/or nitrogen fertilization is underestimated. Several one-thousand-year-long, idealized 2× and 4× CO2 experiments are used to quantify standard model characteristics, including transient and equilibrium climate sensitivities and climate–carbon feedbacks. The values from EMICs generally fall within the range given by general circulation models. Seven additional historical simulations, each including a single specified forcing, are used to assess the contributions of different climate forcings to the overall climate and carbon cycle response. The surface air temperature response is the linear sum of the responses to the individual forcings, while the carbon cycle response shows a non-linear interaction between land-use change and CO2 forcings for some models. Finally, the preindustrial portions of the last millennium simulations are used to assess historical model carbon-climate feedbacks.
Given the specified forcing, there is a tendency for the EMICs to underestimate the drop in surface air temperature and CO2 between the Medieval Climate Anomaly and the Little Ice Age estimated from palaeoclimate reconstructions. This in turn could be a result of unforced variability within the climate system, uncertainty in the reconstructions of temperature and CO2, errors in the reconstructions of the forcing used to drive the models, or the incomplete representation of certain processes within the models. Given the forcing datasets used in this study, the models calculate significant land-use emissions over the pre-industrial period. This implies that land-use emissions might need to be taken into account when making estimates of climate–carbon feedbacks from palaeoclimate reconstructions.


The responses of carbon dioxide (CO2) and other climate variables to an emission pulse of CO2 into the atmosphere are often used to compute the Global Warming Potential (GWP) and Global Temperature change Potential (GTP), to characterize the response timescales of Earth System models, and to build reduced-form models. In this carbon cycle-climate model intercomparison project, which spans the full model hierarchy, we quantify responses to emission pulses of different magnitudes injected under different conditions. The CO2 response shows the known rapid decline in the first few decades followed by a millennium-scale tail. For a 100 Gt-C emission pulse added to a constant CO2 concentration of 389 ppm, 25 ± 9% is still found in the atmosphere after 1000 yr; the ocean has absorbed 59 ± 12% and the land the remainder (16 ± 14%). The response in global mean surface air temperature is an increase by 0.20 ± 0.12 °C within the first twenty years; thereafter and until year 1000, temperature decreases only slightly, whereas ocean heat content and sea level continue to rise. Our best estimate for the Absolute Global Warming Potential, given by the time-integrated response in CO2 at year 100 multiplied by its radiative efficiency, is 92.5 × 10^-15 yr W m^-2 per kg-CO2. This value very likely (5 to 95% confidence) lies within the range of (68 to 117) × 10^-15 yr W m^-2 per kg-CO2. Estimates for the time-integrated response in CO2 published in the IPCC First, Second, and Fourth Assessment Reports and our multi-model best estimate all agree within 15% during the first 100 yr. The integrated CO2 response, normalized by the pulse size, is lower for pre-industrial conditions compared to present day, and lower for smaller pulses than larger pulses. In contrast, the response in temperature, sea level and ocean heat content is less sensitive to these choices.
Although choices of pulse size, background concentration, and model lead to uncertainties, the most important, and subjective, choice determining the AGWP of CO2 and the GWP is the time horizon.
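The AGWP computation described above, the time-integrated CO2 impulse response multiplied by a radiative efficiency, can be sketched with a multi-exponential impulse response function. The coefficients and the radiative efficiency below are illustrative assumptions in the spirit of a multi-model mean, not the paper's fitted values:

```python
import numpy as np

# Assumed multi-exponential fit to the atmospheric CO2 impulse response:
# a permanent airborne fraction plus three decaying modes.
A = [0.217, 0.259, 0.338, 0.186]        # amplitudes (sum to 1)
TAU = [np.inf, 172.9, 18.51, 1.186]     # decay timescales in years

def irf_co2(t):
    """Fraction of an emission pulse remaining in the atmosphere at year t."""
    return sum(a * np.exp(-t / tau) for a, tau in zip(A, TAU))

RE = 1.76e-15        # assumed radiative efficiency, W m^-2 per kg CO2
H = 100              # time horizon in years
t = np.linspace(0.0, H, 10001)
y = irf_co2(t)
# AGWP(H) = RE * integral of the IRF over the horizon (trapezoidal rule).
agwp = RE * np.sum(0.5 * (y[:-1] + y[1:]) * np.diff(t))
# With these assumed inputs, agwp falls inside the (68 to 117) x 10^-15
# yr W m^-2 per kg-CO2 range quoted in the abstract.
```

The GWP of another gas would then be its own AGWP divided by this CO2 value at the same horizon, which is why the horizon choice dominates the result.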


Myxobacteria are single-celled, but social, eubacterial predators. Upon starvation they build multicellular fruiting bodies using a developmental program that progressively changes the pattern of cell movement and the repertoire of genes expressed. Development terminates with spore differentiation and is coordinated by both diffusible and cell-bound signals. The growth and development of Myxococcus xanthus is regulated by the integration of multiple signals from outside the cells with physiological signals from within. A collection of M. xanthus cells behaves, in many respects, like a multicellular organism. For these reasons M. xanthus offers unparalleled access to a regulatory network that controls development and that organizes cell movement on surfaces. The genome of M. xanthus is large (9.14 Mb), considerably larger than the other sequenced delta-proteobacteria. We suggest that gene duplication and divergence were major contributors to genomic expansion from its progenitor. More than 1,500 duplications specific to the myxobacterial lineage were identified, representing >15% of the total genes. Genes were not duplicated at random; rather, genes for cell-cell signaling, small molecule sensing, and integrative transcription control were amplified selectively. Families of genes encoding the production of secondary metabolites are overrepresented in the genome but may have been received by horizontal gene transfer and are likely to be important for predation.


Intensity modulated radiation therapy (IMRT) is a technique that delivers a highly conformal dose distribution to a target volume while attempting to maximally spare the surrounding normal tissues. IMRT is a common treatment modality used for treating head and neck (H&N) cancers, and the presence of many critical structures in this region requires accurate treatment delivery. The Radiological Physics Center (RPC) acts as both a remote and on-site quality assurance agency that credentials institutions participating in clinical trials. To date, about 30% of all IMRT participants have failed the RPC’s remote audit using the IMRT H&N phantom. The purpose of this project is to evaluate possible causes of H&N IMRT delivery errors observed by the RPC, specifically IMRT treatment plan complexity and the use of improper dosimetry data from machines that were thought to be matched but in reality were not. Eight H&N IMRT plans with a range of complexity defined by total MU (1460-3466), number of segments (54-225), and modulation complexity scores (MCS) (0.181-0.609) were created in Pinnacle v.8m. These plans were delivered to the RPC’s H&N phantom on a single Varian Clinac. One of the IMRT plans (1851 MU, 88 segments, and MCS=0.469) was equivalent to the median H&N plan from 130 previous RPC H&N phantom irradiations. This representative IMRT plan was also delivered on four matched Varian Clinac machines, and the dose distribution was calculated using a different 6MV beam model. Radiochromic film and TLD within the phantom were used to analyze the dose profiles and absolute doses, respectively. The measured and calculated dose distributions were compared to evaluate the dosimetric accuracy. All deliveries met the RPC acceptance criteria of ±7% absolute dose difference and 4 mm distance-to-agreement (DTA). Additionally, gamma index analysis was performed for all deliveries using ±7%/4 mm and ±5%/3 mm criteria.
Increasing the treatment plan complexity by varying the total MU, the number of segments, or the MCS resulted in no clear trend toward an increase in dosimetric error as determined by the absolute dose difference, DTA, or gamma index. Varying the delivery machine as well as the beam model (use of a Clinac 6EX 6MV beam model vs. a Clinac 21EX 6MV model) also did not show any clear trend toward an increased dosimetric error using the same criteria indicated above.
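The gamma index used above combines a dose-difference and a distance-to-agreement criterion into one pass/fail number per point. A minimal 1-D sketch follows (the study's analysis is on 2-D film data; the ±7%/4 mm criterion is passed in as parameters):

```python
import numpy as np

def gamma_index(dose_ref, dose_eval, x, dose_tol=0.07, dta_tol=4.0):
    """1-D gamma index of an evaluated dose profile against a reference.

    dose_tol is the dose-difference criterion as a fraction of the
    reference maximum (0.07 for the +/-7% criterion); dta_tol is the
    distance-to-agreement in the units of x (4.0 for 4 mm). A point
    passes when gamma <= 1.
    """
    d_norm = dose_tol * dose_ref.max()            # global dose normalization
    gamma = np.empty(len(dose_ref))
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        dist2 = ((x - xi) / dta_tol) ** 2         # distance term
        dd2 = ((dose_eval - di) / d_norm) ** 2    # dose-difference term
        gamma[i] = np.sqrt(np.min(dist2 + dd2))   # best match over eval points
    return gamma
```

An identical evaluated profile gives gamma = 0 everywhere, while a uniform 10% under-dose fails the ±7% criterion at the profile peak regardless of the spatial search, illustrating how the two tolerances interact.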


Application of biogeochemical models to the study of marine ecosystems is pervasive, yet objective quantification of these models' performance is rare. Here, 12 lower trophic level models of varying complexity are objectively assessed in two distinct regions (equatorial Pacific and Arabian Sea). Each model was run within an identical one-dimensional physical framework. A consistent variational adjoint implementation assimilating chlorophyll-a, nitrate, export, and primary productivity was applied and the same metrics were used to assess model skill. Experiments were performed in which data were assimilated from each site individually and from both sites simultaneously. A cross-validation experiment was also conducted whereby data were assimilated from one site and the resulting optimal parameters were used to generate a simulation for the second site. When a single pelagic regime is considered, the simplest models fit the data as well as those with multiple phytoplankton functional groups. However, those with multiple phytoplankton functional groups produced lower misfits when the models are required to simulate both regimes using identical parameter values. The cross-validation experiments revealed that as long as only a few key biogeochemical parameters were optimized, the models with greater phytoplankton complexity were generally more portable. Furthermore, models with multiple zooplankton compartments did not necessarily outperform models with single zooplankton compartments, even when zooplankton biomass data are assimilated. Finally, even when different models produced similar least squares model-data misfits, they often did so via very different element flow pathways, highlighting the need for more comprehensive data sets that uniquely constrain these pathways.
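The parameter optimization described above can be illustrated with a toy stand-in: a one-parameter "biogeochemical model" fit to synthetic chlorophyll observations by minimizing a weighted least-squares misfit. The real systems obtain the cost gradient from a variational adjoint; here a derivative-free scipy optimizer plays that role, and all data and parameter names are invented:

```python
import numpy as np
from scipy.optimize import minimize

t_obs = np.arange(0.0, 10.0)
chl_obs = 0.5 * np.exp(0.3 * t_obs)     # synthetic "observations"
sigma = 0.1 * chl_obs                    # assumed observation errors

def model(mu):
    """Trivial forward model: exponential biomass growth at rate mu."""
    return 0.5 * np.exp(mu * t_obs)

def misfit(p):
    """Weighted least-squares model-data misfit (the cost function)."""
    return float(np.sum(((model(p[0]) - chl_obs) / sigma) ** 2))

mu_opt = minimize(misfit, x0=[0.1], method="Nelder-Mead").x[0]
# mu_opt recovers the growth rate used to generate the observations.
```

Cross-validation in this framework amounts to optimizing the parameters against one site's observations and then evaluating the same misfit, with those parameters held fixed, against the other site's data.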