69 results for average complexity
Abstract:
Global complexity of the 47-channel resting electroencephalogram (EEG) was studied in 12 healthy young volunteers after intake of a single dose of the nootropic drug piracetam (Nootropil®, UCB Pharma). Four treatment levels were used: 2.4, 4.8 and 9.6 g piracetam, and placebo. Brain electric activity was assessed through Global Dimensional Complexity and Global Omega-Complexity, quantitative measures of the complexity of the trajectory of the multichannel EEG in state space. At 1–1.5 h after oral ingestion, both measures showed significant decreases from placebo to 2.4 g piracetam. In addition, Global Dimensional Complexity showed a significant return to placebo values at 9.6 g piracetam. The results indicate that a single dose of piracetam dose-dependently affects the spontaneous EEG in normal volunteers, with effects already at the lowest treatment level. The decreased EEG complexity is interpreted as increased cooperativity of brain functional processes.
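The abstract does not define Omega complexity, but the standard formulation in this literature (Wackermann's Ω) is the exponential of the entropy of the normalized eigenvalue spectrum of the spatial covariance matrix of the multichannel EEG. A minimal sketch, assuming an average-referenced `eeg` array of shape (channels, samples); the function name is ours, not the authors':

```python
import numpy as np

def omega_complexity(eeg: np.ndarray) -> float:
    """Global Omega complexity of a multichannel EEG epoch.

    eeg: array of shape (n_channels, n_samples), average-referenced.
    Returns a value between 1 (all variance on one spatial mode)
    and n_channels (variance spread evenly over all modes).
    """
    # Spatial covariance across channels, with time points as observations.
    centered = eeg - eeg.mean(axis=1, keepdims=True)
    cov = centered @ centered.T / centered.shape[1]
    # Eigenvalue spectrum, normalized to sum to 1.
    lam = np.linalg.eigvalsh(cov)
    lam = lam[lam > 0]
    lam = lam / lam.sum()
    # Omega = exp(entropy of the normalized eigenvalue distribution).
    return float(np.exp(-np.sum(lam * np.log(lam))))
```

Lower Ω then means the multichannel signal is dominated by fewer spatial modes, which is the sense in which the abstract equates decreased complexity with increased cooperativity.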
Abstract:
The neurocognitive processes underlying the formation and maintenance of paranormal beliefs are important for understanding schizotypal ideation. Behavioral studies have indicated that both schizotypal and paranormal ideation rest on an overreliance on the right hemisphere, whose coarse rather than focused semantic processing may favor the emergence of 'loose' and 'uncommon' associations. To elucidate the electrophysiological basis of these behavioral observations, 35-channel EEG was recorded in pre-screened female strong believers and strong disbelievers during a resting baseline. EEG data were subjected to FFT-Dipole-Approximation analysis, a reference-free frequency-domain dipole source modeling approach, and Regional (hemispheric) Omega Complexity analysis, a linear approach estimating the complexity of the trajectories of momentary EEG map series in state space. Compared to disbelievers, believers showed: more right-located sources of the beta2 band (18.5–21 Hz, excitatory activity); reduced interhemispheric differences in Omega complexity values; higher scores on the Magical Ideation scale; more general negative affect; and more hypnagogic-like reveries after a 4-min eyes-closed resting period. Thus, subjects differing in their declared paranormal belief displayed different active cerebral neural populations during resting, task-free conditions. As hypothesized, believers showed relatively higher right-hemispheric activation and reduced hemispheric asymmetry of functional complexity. These markers may constitute the neurophysiological basis of paranormal and schizotypal ideation.
Abstract:
Global complexity of spontaneous brain electric activity was studied before and after chewing gum without flavor and with two different flavors. One-minute, 19-channel, eyes-closed electroencephalograms (EEG) were recorded from 20 healthy males before and after use of three types of chewing gum: regular gum containing sugar and aromatic additives, gum containing 200 mg theanine (a constituent of Japanese green tea), and gum base (no sugar, no aromatic additives); each was chewed for 5 min in randomized sequence. Brain electric activity was assessed through Global Omega (Ω)-Complexity and Global Dimensional Complexity (GDC), quantitative measures of the complexity of the trajectory of EEG map series in state space; their differences from pre-chewing data were compared across gum-chewing conditions. A Friedman ANOVA (p < 0.043) showed that effects on Ω-Complexity differed significantly between conditions, with maximal differences between gum base and theanine gum. No differences were found using GDC. Global Ω-Complexity thus appears to be a sensitive measure for subtle central effects of chewing gum with and without flavor.
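The Friedman ANOVA reported here is the standard nonparametric test for repeated measures across matched conditions; a minimal sketch with SciPy, where the per-subject Ω-complexity changes are hypothetical placeholder data, not the study's values:

```python
import numpy as np
from scipy.stats import friedmanchisquare, wilcoxon

rng = np.random.default_rng(0)
# Hypothetical pre-to-post changes in Omega complexity, one value
# per subject (n = 20) under each gum condition.
delta_regular = rng.normal(-0.1, 0.3, size=20)
delta_theanine = rng.normal(-0.3, 0.3, size=20)
delta_base = rng.normal(0.1, 0.3, size=20)

# Omnibus test across the three matched conditions.
stat, p = friedmanchisquare(delta_regular, delta_theanine, delta_base)
print(f"Friedman chi2 = {stat:.2f}, p = {p:.3f}")

# Post-hoc paired contrast where the abstract reports the maximal difference.
stat_w, p_w = wilcoxon(delta_base, delta_theanine)
print(f"gum base vs. theanine gum: W = {stat_w:.1f}, p = {p_w:.3f}")
```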
Abstract:
Introduction: Nocturnal dreams can be considered a kind of simulation of the real world on a higher cognitive level (Erlacher & Schredl, 2008). Within lucid dreams, the dreamer is aware of the dream state and is thus able to control the ongoing dream content. Previous studies demonstrated that it is possible to practice motor tasks during lucid dreams and that doing so improves performance while awake (Erlacher & Schredl, 2010). Even though lucid dream practice might be a promising kind of cognitive rehearsal in sports, little is known about the characteristics of actions in lucid dreams. The purpose of the present study was to explore the relationship between time in dreams and time in wakefulness, because in an earlier study (Erlacher & Schredl, 2004) we found that performing squats took lucid dreamers 44.5% more time than in the waking state, while for counting the same participants showed no differences between dreaming and wakefulness. To find out whether task modality, task length or task complexity require longer times in lucid dreams than in wakefulness, three experiments were conducted. Methods: In the first experiment, five proficient lucid dreamers spent two to three non-consecutive nights in the sleep laboratory with polysomnographic recording to control for REM sleep and to determine eye signals. Participants counted from 1–10, 1–20 and 1–30 in wakefulness and in their lucid dreams. While dreaming, they marked the onset of lucidity as well as the beginning and end of the counting task with left-right-left-right eye movements and reported their dreams after being awakened. The same procedure was used in the second experiment with seven lucid dreamers, except that they had to walk 10, 20 or 30 steps. In the third experiment, nine participants performed an exercise involving gymnastics elements such as various jumps and a roll. To control for task length, the gymnastics exercise in the waking state lasted about the same time as walking 10 steps. Results: As a general result we found, as in the earlier study, that performing a task in a lucid dream requires more time than in wakefulness. This tendency was found for all three tasks. However, there was no difference for task modality (counting vs. motor task). The relative times for the different task lengths also showed no difference. Finally, the more complex motor task (the gymnastics routine) did not require more time in lucid dreams than the simple motor task. Discussion/Conclusion: The results show a robust effect: tasks require more time in lucid dreams than in wakefulness. The three experiments could not attribute these differences to task modality, task length or task complexity. Therefore, further possible candidates need to be investigated, e.g., experience in lucid dreaming or psychological variables. References: Erlacher, D. & Schredl, M. (2010). Practicing a motor task in a lucid dream enhances subsequent performance: A pilot study. The Sport Psychologist, 24(2), 157-167. Erlacher, D. & Schredl, M. (2008). Do REM (lucid) dreamed and executed actions share the same neural substrate? International Journal of Dream Research, 1(1), 7-13. Erlacher, D. & Schredl, M. (2004). Time required for motor activity in lucid dreams. Perceptual and Motor Skills, 99, 1239-1242.
Abstract:
CONTEXT The necessity of specific intervention components for the successful treatment of patients with posttraumatic stress disorder is a subject of controversy. OBJECTIVE To investigate the complexity of clinical problems as a moderator of the relative effects of specific versus nonspecific psychological interventions. METHODS We included 18 randomized controlled trials directly comparing specific and nonspecific psychological interventions, and conducted moderator analyses with the complexity of clinical problems as predictor. RESULTS Our results confirmed the moderate overall superiority of specific over nonspecific psychological interventions; however, the superiority was small in studies with complex clinical problems and large in studies with noncomplex clinical problems. CONCLUSIONS For patients with complex clinical problems, our results suggest that particular nonspecific psychological interventions may be offered as an alternative to specific psychological interventions. In contrast, for patients with noncomplex clinical problems, specific psychological interventions are the best treatment option.
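A moderator analysis of this kind is typically an inverse-variance weighted meta-regression of trial effect sizes on the moderator. A minimal sketch with statsmodels, with all inputs (`effect_sizes`, `variances`, `complex_problems`) hypothetical rather than taken from the 18 trials:

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical per-trial data: standardized mean differences (specific
# minus nonspecific intervention), their sampling variances, and a
# binary flag marking trials with complex clinical problems.
effect_sizes = np.array([0.6, 0.5, 0.7, 0.2, 0.1, 0.3])
variances = np.array([0.04, 0.05, 0.06, 0.04, 0.05, 0.03])
complex_problems = np.array([0, 0, 0, 1, 1, 1])

# Meta-regression: weight each trial by its inverse variance.
X = sm.add_constant(complex_problems)
fit = sm.WLS(effect_sizes, X, weights=1.0 / variances).fit()

# Intercept = pooled superiority in noncomplex-problem trials;
# slope = change in superiority when problems are complex.
print(fit.params, fit.pvalues)
```

A negative slope with a positive intercept would reproduce the reported pattern: large superiority for noncomplex problems, small superiority for complex ones.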
Abstract:
Microstructures and textures of calcite mylonites from the Morcles nappe large-scale shear zone in southwestern Switzerland develop principally as a function of (1) extrinsic physical parameters, including temperature, stress, strain and strain rate, and (2) intrinsic parameters, such as mineral composition. We collected rock samples at a single location in this shear zone and investigated and quantified their laboratory ultrasonic velocities, textures and microstructures. The samples had different concentrations of secondary mineral phases (< 5 up to 40 vol.%). Measured seismic P-wave anisotropy ranges from 6.5% for polyphase mylonites (~ 40 vol.% secondary phases) to 18.4% for mylonites with < 5 vol.% secondary phases. Texture strength of calcite is the main factor governing the seismic P-wave anisotropy. Measured S-wave splitting is generally highest in the foliation plane, but its origin is more difficult to explain by calcite texture alone. Additional texture measurements were made on calcite mylonites with low concentrations of secondary phases (≤ 10 vol.%) along the metamorphic gradient of the shear zone (15 km distance). A systematic increase in texture strength is observed moving from the frontal part of the shear zone (anchimetamorphism; 280 °C) to the higher-temperature basal part (greenschist facies; 350–400 °C). Calculated P-wave velocities become increasingly anisotropic towards the high-strain part of the nappe, from an average of 5.8% in the frontal part to 13.2% in the root of the basal part. Secondary phases introduce additional complexity and may act either to increase or to decrease the seismic anisotropy of shear-zone mylonites. In light of our findings we reinterpret the origin of some seismically reflective layers along the Grône–Zweisimmen line in southwestern Switzerland (PNR20 Swiss National Research Program). We hypothesize that the reflections originate in part from the lateral variation in the textural and microstructural arrangement of calcite mylonites in shear zones.
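The anisotropy percentages quoted here follow the common definition A = 200 (V_max − V_min) / (V_max + V_min); a minimal sketch, with the input velocities hypothetical rather than the study's measurements:

```python
def p_wave_anisotropy(v_max: float, v_min: float) -> float:
    """Percent P-wave anisotropy from maximum and minimum velocities (km/s)."""
    return 200.0 * (v_max - v_min) / (v_max + v_min)

# Hypothetical velocities illustrating the reported range.
print(p_wave_anisotropy(6.9, 5.7))  # ~19%, nearly pure calcite mylonite
print(p_wave_anisotropy(6.6, 6.2))  # ~6%, polyphase mylonite
```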
Abstract:
Both historical and idealized climate model experiments are performed with a variety of Earth system models of intermediate complexity (EMICs) as part of a community contribution to the Intergovernmental Panel on Climate Change Fifth Assessment Report. Historical simulations start at 850 CE and continue through to 2005. The standard simulations include changes in forcing from solar luminosity, Earth's orbital configuration, CO2, additional greenhouse gases, land use, and sulphate and volcanic aerosols. In spite of very different modelled pre-industrial global surface air temperatures, overall 20th-century trends in surface air temperature and carbon uptake are reasonably well simulated when compared to observed trends. Land carbon fluxes show much more variation between models than ocean carbon fluxes, and recent land fluxes appear to be slightly underestimated. It is possible that recent modelled climate trends or climate–carbon feedbacks are overestimated, resulting in too much land carbon loss, or that carbon uptake due to CO2 and/or nitrogen fertilization is underestimated. Several 1000-year-long, idealized 2× and 4× CO2 experiments are used to quantify standard model characteristics, including transient and equilibrium climate sensitivities and climate–carbon feedbacks. The values from EMICs generally fall within the range given by general circulation models. Seven additional historical simulations, each including a single specified forcing, are used to assess the contributions of different climate forcings to the overall climate and carbon cycle response. The response of surface air temperature is the linear sum of the responses to the individual forcings, while the carbon cycle response shows a non-linear interaction between land-use change and CO2 forcing for some models. Finally, the pre-industrial portions of the last-millennium simulations are used to assess historical model carbon–climate feedbacks. Given the specified forcing, the EMICs tend to underestimate the drop in surface air temperature and CO2 between the Medieval Climate Anomaly and the Little Ice Age estimated from palaeoclimate reconstructions. This could be a result of unforced variability within the climate system, uncertainty in the reconstructions of temperature and CO2, errors in the reconstructions of the forcing used to drive the models, or the incomplete representation of certain processes within the models. Given the forcing datasets used in this study, the models calculate significant land-use emissions over the pre-industrial period. This implies that land-use emissions might need to be taken into account when making estimates of climate–carbon feedbacks from palaeoclimate reconstructions.
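The additivity statement for surface air temperature can be checked by comparing the all-forcing response to the sum of the single-forcing responses. A minimal sketch of that diagnostic, with all time series synthetic placeholders rather than EMIC output:

```python
import numpy as np

# Hypothetical annual temperature anomalies (K) from single-forcing runs.
years = np.arange(850, 2006)
solar = 0.05 * np.sin(2 * np.pi * years / 210.0)
volcanic = -0.3 * (np.random.default_rng(1).random(years.size) < 0.02)
ghg = 0.7 / (1.0 + np.exp(-(years - 1950) / 30.0))

# All-forcing run: here built as exactly additive plus internal variability,
# so the residual below reflects only the noise term.
all_forcings = (solar + volcanic + ghg
                + np.random.default_rng(2).normal(0, 0.05, years.size))

# Residual between the full response and the linear sum of components.
residual = all_forcings - (solar + volcanic + ghg)
print(f"RMS departure from linearity: {np.sqrt(np.mean(residual**2)):.3f} K")
```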
Abstract:
Radiocarbon production, solar activity, total solar irradiance (TSI) and solar-induced climate change are reconstructed for the Holocene (10 to 0 kyr BP), and TSI is predicted for the next centuries. The IntCal09/SHCal04 radiocarbon and ice-core CO2 records, reconstructions of the geomagnetic dipole, and instrumental data on solar activity are applied in Bern3D-LPJ, a fully featured Earth system model of intermediate complexity including a 3-D dynamic ocean, ocean sediments and a dynamic vegetation model, and in formulations linking radiocarbon production, the solar modulation potential and TSI. Uncertainties are assessed using Monte Carlo simulations and bounding scenarios. Transient climate simulations span the past 21 thousand years, thereby accounting for the time lags and uncertainties associated with the last glacial termination. Our carbon-cycle-based modern estimate of radiocarbon production of 1.7 atoms cm⁻² s⁻¹ is lower than previously reported for the cosmogenic nuclide production model by Masarik and Beer (2009) and more in line with Kovaltsov et al. (2012). In contrast to earlier studies, periods of high solar activity were quite common not only in recent millennia but throughout the Holocene. Notable deviations from earlier reconstructions are also found on decadal to centennial timescales. We show that earlier Holocene reconstructions that do not account for the interhemispheric gradients in radiocarbon are biased low. Solar activity is higher than the modern average (650 MeV) during 28% of the time, but the absolute values remain weakly constrained due to uncertainties in the normalisation of the solar modulation to instrumental data. A recently published solar activity–TSI relationship yields small changes in Holocene TSI of the order of 1 W m⁻², with a Maunder Minimum irradiance reduction of 0.85 ± 0.16 W m⁻². Related solar-induced variations in global mean surface air temperature are simulated to be within 0.1 K. Autoregressive modelling suggests a declining trend of solar activity in the 21st century towards average Holocene conditions.
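The Monte Carlo uncertainty assessment mentioned here amounts to propagating sampled uncertain inputs through the reconstruction chain. In the sketch below, `tsi_from_phi` is a hypothetical stand-in mapping, not the published solar activity–TSI relationship, and the input distribution is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

def tsi_from_phi(phi: np.ndarray) -> np.ndarray:
    """Hypothetical monotonic mapping from solar modulation potential
    phi (MeV) to total solar irradiance (W m^-2); placeholder only."""
    return 1360.0 + 1.0e-3 * (phi - 650.0)

# Uncertain modulation potential for one epoch: mean 400 MeV, 1-sigma 80 MeV.
phi_samples = rng.normal(400.0, 80.0, size=10_000)
tsi_samples = tsi_from_phi(phi_samples)

# Percentiles of the ensemble summarize the propagated uncertainty.
print(np.percentile(tsi_samples, [5, 50, 95]))
```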
Abstract:
Previous studies have either exclusively used annual tree-ring data or have combined tree-ring series with other proxy series of lower temporal resolution. Both approaches can lead to significant uncertainties, as tree rings may underestimate the amplitude of past temperature variations and the validity of non-annual records cannot be clearly assessed. In this study, we assembled 45 published Northern Hemisphere (NH) temperature proxy records covering the past millennium, each of which satisfied three essential criteria: the series must be of annual resolution, span at least a thousand years, and represent an explicit temperature signal. Suitable climate archives included ice cores, varved lake sediments, tree rings and speleothems. We reconstructed the average annual land temperature series for the NH over the last millennium by applying three different reconstruction techniques: (1) principal components (PC) plus a second-order autoregressive model (AR2), (2) composite plus scale (CPS) and (3) a regularized errors-in-variables approach (EIV). Our reconstruction is in excellent agreement with six climate model simulations (the first five models derived from the fifth phase of the Coupled Model Intercomparison Project (CMIP5) and an Earth system model of intermediate complexity (LOVECLIM)), showing similar temperatures at multi-decadal timescales; however, all simulations appear to underestimate the temperature during the Medieval Warm Period (MWP). A comparison with other NH reconstructions shows that our results are consistent with earlier studies. These results indicate that well-validated annual proxy series should be used to minimize proxy-based artifacts, and that these proxy series contain sufficient information to reconstruct the low-frequency climate variability over the past millennium.
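Of the three techniques, composite plus scale (CPS) is the most direct to illustrate: standardize each proxy over the calibration interval, average the records into a composite, then rescale the composite to the mean and variance of the instrumental target. A minimal sketch, with all inputs hypothetical placeholders:

```python
import numpy as np

def cps_reconstruction(proxies: np.ndarray, target: np.ndarray,
                       calib: slice) -> np.ndarray:
    """Composite-plus-scale temperature reconstruction.

    proxies: array of shape (n_records, n_years), annual proxy series.
    target:  instrumental temperatures over the calibration years.
    calib:   slice selecting the calibration years on the proxy time axis.
    """
    # Standardize each record over the calibration interval, then composite.
    mean = proxies[:, calib].mean(axis=1, keepdims=True)
    std = proxies[:, calib].std(axis=1, keepdims=True)
    composite = ((proxies - mean) / std).mean(axis=0)
    # Rescale the composite to the target's calibration mean and variance.
    c_cal = composite[calib]
    return ((composite - c_cal.mean()) / c_cal.std()
            * target.std() + target.mean())

# Hypothetical usage: 45 records over 1000-2005, calibrated on 1850-2005.
rng = np.random.default_rng(3)
proxies = rng.normal(size=(45, 1006))
target = rng.normal(size=156)
recon = cps_reconstruction(proxies, target, slice(850, 1006))
print(recon.shape)  # (1006,)
```

The PC+AR2 and EIV variants named in the abstract add, respectively, dimension reduction of the proxy matrix and a regression that accounts for errors in both proxies and target; the calibrate-then-extend structure is the same.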