920 results for Frequency-time transformation
Abstract:
In this investigation, attention is directed to the phases present in a 28 per cent tin alloy. When the investigation was started, consideration was given to the possibility of constructing a time-temperature-transformation (TTT) curve for this particular alloy. As the work progressed and further research was carried out, this phase of the work was abandoned.
Abstract:
Radio frequency electromagnetic fields (RF-EMF) in our daily life are caused by numerous sources such as fixed-site transmitters (e.g. mobile phone base stations) or indoor devices (e.g. cordless phones). The objective of this study was to develop a prediction model that can be used to predict mean RF-EMF exposure from different sources for a large study population in epidemiological research. We collected personal RF-EMF exposure measurements from 166 volunteers in Basel, Switzerland, by means of portable exposure meters, which were carried for one week. For a validation study we repeated the exposure measurements of 31 study participants, on average 21 weeks after the first measurement week. These second measurements were not used for model development. We used two data sources as exposure predictors: 1) a questionnaire on potentially exposure-relevant characteristics and behaviors and 2) RF-EMF from fixed-site transmitters (mobile phone base stations, broadcast transmitters) modeled at the participants' place of residence using a geospatial propagation model. Relevant exposure predictors, identified by means of multiple regression analysis, were the modeled RF-EMF at the participants' home from the propagation model, housing characteristics, ownership of communication devices (wireless LAN, mobile and cordless phones) and behavioral aspects such as the amount of time spent on public transport. The proportion of variance explained (R2) by the final model was 0.52. The analysis of the agreement between calculated and measured RF-EMF showed a sensitivity of 0.56 and a specificity of 0.95 (cut-off: 90th percentile). In the validation study, the sensitivity and specificity of the model were 0.67 and 0.96, respectively. We demonstrated that it is feasible to model personal RF-EMF exposure. Most importantly, our validation study suggests that the model can be used to assess average exposure over several months.
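The agreement analysis above dichotomizes exposure at the 90th percentile. As a minimal sketch (the abstract does not state the exact procedure; here both measured and predicted exposures are assumed to be dichotomized at their respective 90th percentiles, and all names are invented for illustration):

```python
import numpy as np

def sensitivity_specificity(measured, predicted, percentile=90):
    """Classify subjects as 'highly exposed' if above the given percentile
    of each distribution, then compare predictions against measurements."""
    actual_high = measured > np.percentile(measured, percentile)
    predicted_high = predicted > np.percentile(predicted, percentile)
    tp = np.sum(actual_high & predicted_high)    # correctly flagged high
    fn = np.sum(actual_high & ~predicted_high)   # missed high exposures
    tn = np.sum(~actual_high & ~predicted_high)  # correctly flagged low
    fp = np.sum(~actual_high & predicted_high)   # falsely flagged high
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity
```

A model that ranks every subject correctly at the cut-off yields sensitivity and specificity of 1.0; the reported values (0.56 and 0.95) reflect the usual asymmetry when only a small fraction of subjects lies above the cut-off.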
Abstract:
BACKGROUND: Case series of patients with a diagnosis of thrombotic thrombocytopenic purpura (TTP) have reported different frequencies of human immunodeficiency virus (HIV) infection; some series suggest that HIV infection may cause TTP. METHODS: We systematically reviewed all reports of HIV infection in case series of patients with TTP. We analyzed data from the Oklahoma TTP-HUS (hemolytic uremic syndrome) Registry, an inception cohort of 362 consecutive patients, for 1989-2007. RESULTS: Nineteen case series reported the occurrence of HIV infection at the time of diagnosis of TTP in 0%-83% of patients; individual patient data were rarely described. The Oklahoma TTP-HUS Registry determined the HIV status at the time of diagnosis of TTP in 351 (97%) of 362 patients. HIV infection was documented in 6 (1.84%; 95% CI, 0.68%-4.01%) of 326 adult patients (age, 26-51 years); follow-up data were complete for all 6 patients. The period prevalence of HIV infection among all adults in the Oklahoma TTP-HUS Registry region for 1989-2007 was 0.30%. One patient had typical features of TTP with 5 relapses. Five patients had single episodes; in 4, the clinical features that had initially suggested the diagnosis of TTP were subsequently attributed to malignant hypertension (in 3 patients) and disseminated Kaposi sarcoma (in 1 patient). CONCLUSIONS: HIV infection, similar to other inflammatory conditions, may trigger acute episodes of TTP in susceptible patients. More commonly, acquired immunodeficiency syndrome-related disorders may mimic the clinical features of TTP. If the diagnosis of TTP is suggested in a patient with HIV infection, there should be careful evaluation for alternative diagnoses and cautious consideration of plasma exchange, the required treatment for TTP.
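The 95% CI reported for 6 of 326 patients (1.84%; 0.68%-4.01%) is consistent with an exact (Clopper-Pearson) binomial interval. A sketch of how such an interval can be reproduced (the registry's actual method is not stated in the abstract):

```python
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    """Exact (Clopper-Pearson) binomial confidence interval for
    k successes in n trials, via the beta-distribution quantiles."""
    lower = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    upper = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lower, upper
```

For k=6 and n=326 this gives roughly 0.7% to 4.0%, matching the interval quoted above.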
Abstract:
INTRODUCTION: Cystic fibrosis (CF) almost always leads to chronic airway infection with Pseudomonas aeruginosa. Despite advances in antibiotic therapy, after chronic infection rapid deterioration in lung function occurs, increasing morbidity and mortality. Prevention of infection by vaccination is desirable, but earlier trials produced disappointing results. The promising short-term immunogenicity and safety of a new P. aeruginosa vaccine prompted us to evaluate its long-term efficacy. We conducted a 10-year retrospective analysis of outcomes in a group of vaccinated patients. MATERIALS AND METHODS: In 1989-1990, 30 young children with CF, mean age 7 years, with no prior history of infection with P. aeruginosa, were vaccinated against P. aeruginosa with a polyvalent conjugate vaccine. We report the follow-up of 26 of these patients from 1989 to 2001. The patients were given yearly vaccine boosters. Comparisons were made with a CF patient control group matched for gender, age and, where possible, genetic mutation. Vaccinated patients and controls were attending a single CF clinic and received the same clinical management throughout the study period. Main outcomes were time to infection, proportion of patients infected, development of P. aeruginosa mucoid phenotype, lung function and body weight. RESULTS: The time to infection with P. aeruginosa was longer in the vaccination group than in the control group, and fewer vaccinated patients than controls became chronically infected (32% versus 72%; P < 0.001). The proportion of mucoid infections was higher in the control group (44%) than in the vaccinated group (25%). Patients ≥18 years of age at the end of the study had a lower mean forced expiratory volume in 1 s (FEV1) than did those 13-17 years of age, but this difference was small in the vaccinated group (73.6% versus 83.7%) compared with the controls (48.0% versus 78.7%).
In the ≥18 year age category the mean FEV1% at 10 years was 73.6% (vaccinated) and 48.0% (controls) (P < 0.05). In the vaccinated group only 11 (44%) of 25 patients were underweight at the 10-year follow-up compared with 18 (72%) of 25 at the beginning of the study. In the control group 17 (68%) of 25 patients were underweight at 10-year follow-up compared with 16 (64%) of 25 at the beginning of the study. CONCLUSION: Regular vaccination of young CF patients for a period of 10 years with a polyvalent conjugate vaccine reduced the frequency of chronic infection with P. aeruginosa. This was associated with better preservation of lung function. Vaccinated patients gained more weight during the study period, a possible indication of an improved overall health status.
Abstract:
A publication entitled “A default mode of brain function” initiated a new way of looking at functional imaging data. In this PET study the authors discussed the often-observed consistent decrease of brain activation in a variety of tasks as compared with the baseline. They suggested that this deactivation is due to a task-induced suspension of a default mode of brain function that is active during rest, i.e. that there exists intrinsic well-organized brain activity during rest in several distinct brain regions. This suggestion led to a large number of imaging studies on the resting state of the brain and to the conclusion that the study of this intrinsic activity is crucial for understanding how the brain works. The fact that the brain is active during rest has been well known from a variety of EEG recordings for a very long time. Different states of the brain in the sleep–wake continuum are characterized by typical patterns of spontaneous oscillations in different frequency ranges and in different brain regions. Best studied are the evolving states during the different sleep stages, but characteristic EEG oscillation patterns have also been well described during awake periods (see Chapter 1 for details). A highly recommended comprehensive review on the brain's default state defined by oscillatory electrical brain activities is provided in the recent book by György Buzsaki, showing how these states can be measured by electrophysiological procedures at the global brain level as well as at the local cellular level.
Abstract:
Rationale: Focal onset epileptic seizures are due to abnormal interactions between distributed brain areas. By estimating the cross-correlation matrix of multi-site intra-cerebral EEG recordings (iEEG), one can quantify these interactions. To assess the topology of the underlying functional network, the binary connectivity matrix has to be derived from the cross-correlation matrix by use of a threshold. Classically, a unique threshold is used that constrains the topology [1]. Our method aims to set the threshold in a data-driven way by separating genuine from random cross-correlation. We compare our approach to the fixed threshold method and study the dynamics of the functional topology. Methods: We investigate the iEEG of patients suffering from focal onset seizures who underwent evaluation for the possibility of surgery. The equal-time cross-correlation matrices are evaluated using a sliding time window. We then compare 3 approaches to assess the corresponding binary networks. For each time window:
* Our parameter-free method derives from the cross-correlation strength matrix (CCS) [2]. It aims at disentangling genuine from random correlations (due to the finite length and varying frequency content of the signals). In practice, a threshold is evaluated for each pair of channels independently, in a data-driven way.
* The fixed mean degree (FMD) method uses a unique threshold on the whole connectivity matrix so as to ensure a user-defined mean degree.
* The varying mean degree (VMD) method uses the mean degree of the CCS network to set a unique threshold for the entire connectivity matrix.
* Finally, the connectivity (c), connectedness (given by k, the number of disconnected sub-networks), and the mean global and local efficiencies (Eg and El, respectively) are computed from FMD, CCS and VMD networks, and from their corresponding random and lattice networks.
Results: Compared to FMD and VMD, CCS networks present:
* topologies that are different in terms of c, k, Eg and El;
* topological feature time courses that, from the pre-ictal to the ictal and then post-ictal period, are more stable within a period and more contrasted from one period to the next.
For CCS, pre-ictal connectivity is low, increases to a high level during the seizure, then decreases at offset. k shows a "U-curve" underlining the synchronization of all electrodes during the seizure. Eg and El time courses fluctuate between the corresponding random and lattice network values in a reproducible manner. Conclusions: The definition of a data-driven threshold provides new insights into the topology of epileptic functional networks.
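The fixed-mean-degree approach described above can be illustrated with a short sketch. This is an illustrative implementation under stated assumptions (absolute correlations, undirected network, ties broken by the sorted cut-off), not the authors' code, and the names are invented:

```python
import numpy as np

def binarize_fixed_mean_degree(corr, target_mean_degree):
    """Threshold an absolute cross-correlation matrix so that the
    resulting undirected binary network has a target mean degree."""
    n = corr.shape[0]
    c = np.abs(corr).astype(float)
    np.fill_diagonal(c, 0.0)
    # Mean degree k implies n*k/2 undirected edges.
    n_edges = int(round(n * target_mean_degree / 2))
    # Sort the upper-triangle correlation values in descending order
    # and place the threshold at the n_edges-th strongest value.
    iu = np.triu_indices(n, k=1)
    vals = np.sort(c[iu])[::-1]
    thr = vals[n_edges - 1] if n_edges >= 1 else np.inf
    adj = (c >= thr).astype(int)
    np.fill_diagonal(adj, 0)
    return adj, thr
```

The CCS method instead sets one threshold per channel pair in a data-driven way, so no single global cut-off constrains the resulting topology.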
Abstract:
In the Andean highlands, indigenous environmental knowledge is currently undergoing major changes as a result of various external and internal factors. As in other parts of the world, an overall process of erosion of local knowledge can be observed. In response to this trend, some initiatives that adopt a biocultural approach aim at actively strengthening local identities and revalorizing indigenous environmental knowledge and practices, assuming that such practices can contribute to more sustainable management of biodiversity. However, these initiatives usually lack a sound research basis, as few studies have focused on the dynamics of indigenous environmental knowledge in the Andes and on its links with biodiversity management. Against this background, the general objective of this research project was to contribute to the understanding of the dynamics of indigenous environmental knowledge in the Andean highlands of Peru and Bolivia by investigating how local medicinal knowledge is socially differentiated within rural communities, how it is transformed, and which external and internal factors influence these transformation processes. The project adopted an actor-oriented perspective and emphasized the concept of knowledge dialogue by analyzing the integration of traditional and formal medicinal systems within family therapeutic strategies. It also aimed at grasping some of the links between the dynamics of medicinal knowledge and the types of land use systems and biodiversity management. Research was conducted in two case study areas of the Andes, both Quechua-speaking and situated in comparable agro-ecological production belts - Pitumarca District, Department of Cusco (Southern Peruvian Highlands) and the Tunari National Park, Department of Cochabamba (Bolivian inner-Andean valleys). 
In each case study area, the land use systems and strategies of 18 families from two rural communities, their environmental knowledge related to medicine and to the local therapeutic flora, and an appreciation of the dynamics of this knowledge were assessed. Data were collected through a combination of disciplinary and participatory action-research methods, and were mostly analyzed using qualitative methods, though some quantitative ethnobotanical methods were also used. In both case studies, traditional medicine still constitutes the preferred option for the families interviewed, independently of their age, education level, economic status, religion, or migration status. Surprisingly, and contrary to general assertions among local NGOs and researchers, results show that there is a revival of Andean medicine within the younger generation, who have greater knowledge of medicinal plants than the previous generation, value this knowledge as an important element of their way of life and relationship with “Mother Earth” (Pachamama), and, at least in the Bolivian case, prefer to consult the traditional healer rather than go to the health post. Migration to the urban centres and the Amazon lowlands, commonly thought to be an important factor of local medicinal knowledge loss, only affects people’s knowledge in the case of families who migrate for over half of the year or permanently. Migration does not influence the knowledge of medicinal plants or the therapeutic strategies of families who migrate temporarily for shorter periods of time. Finally, economic status influences neither the status of people’s medicinal knowledge, nor families’ therapeutic strategies, even though the financial factor is often mentioned by practitioners and local people as the main reason for not using the formal health system. The influence of the formal health system on traditional medicinal knowledge varies in each case study area.
In the Bolivian case, where it was only introduced in the 1990s and access to it is still very limited, the main impact was to give local communities access to contraceptive methods and to vaccination. In the Peruvian case, the formal system had a much greater impact on families’ health practices, due to local and national policies that, for instance, practically prohibit some traditional practices such as home birth. But in both cases, biomedicine is not considered capable of responding to cultural illnesses such as “fear” (susto), “bad air” (malviento), or “anger” (colerina). As a consequence, Andean farmers integrate the traditional medicinal system and the formal one within their multiple therapeutic strategies, reflecting an inter-ontological dialogue between different conceptions of health and illness. These findings reflect a more general trend in the Andes, where indigenous communities are currently actively revalorizing their knowledge and taking up traditional practices, thus strengthening their indigenous collective identities in a process of cultural resistance.
Abstract:
Structural and functional connectivity are intrinsic properties of the human brain and underpin the cognitive capacities of individual subjects. These connections are modulated by development, learning, and disease. Momentary adaptations in functional connectivity alter the structural connections, which in turn affect the functional connectivity. Thus, structural and functional connectivity interact on a broad timescale. In this study, we aimed to explore distinct measures of connectivity assessed by functional magnetic resonance imaging and diffusion tensor imaging and their association with the dominant electroencephalogram oscillatory property at rest: the individual alpha frequency (IAF). We found that in 21 healthy young subjects, small intraindividual temporal IAF fluctuations were correlated with increased blood oxygenation level-dependent signal in brain areas associated with working memory functions and with the modulation of attention. These areas colocalized with functionally connected networks supporting the respective functions. Furthermore, subjects with higher IAF showed increased fractional anisotropy values in fascicles connecting the above-mentioned areas and networks. Hence, owing to this multimodal approach, a consistent functionally and structurally connected network related to the IAF was observed.
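The individual alpha frequency is typically estimated as the frequency of the spectral peak within the alpha band of resting EEG. A minimal sketch (the band limits, function name, and single-channel FFT approach are assumptions for illustration, not taken from the study):

```python
import numpy as np

def individual_alpha_frequency(signal, fs, band=(7.0, 13.0)):
    """Estimate IAF as the frequency of the largest power-spectrum
    peak within the alpha band of a resting-state EEG channel."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2   # simple periodogram
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return freqs[mask][np.argmax(psd[mask])]
```

In practice, spectra are usually averaged over epochs (e.g. via Welch's method) before the peak is located, which stabilizes the estimate against the short-term fluctuations the study relates to BOLD signal changes.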
Abstract:
The aim of this study was to evaluate state-dependent effects of diazepam on the frequency characteristics of 47-channel spontaneous EEG maps. A novel method, the FFT-Dipole-Approximation (Lehmann and Michel, 1990), was used to study effects on the strength and the topography of the maps in the different frequency bands. Map topography was characterized by the 3-dimensional location of the equivalent dipole source, and map strength was defined as the spatial standard deviation (the Global Field Power) of the map at each frequency point. The Global Field Power can be considered a measure of the amount of energy produced by the system, while the source location gives an estimate of the center of gravity of all sources in the brain that were active at a certain frequency. State-dependency was studied by evaluating the drug effects before and after a continuous performance task of 25 min duration. Clear interactions between drug (diazepam vs. placebo) and time after drug intake (before and after the task) were found, especially in the inferior-superior location of the dipole sources. This supports the hypothesis that diazepam, like other drugs, has different effects on brain functions depending on the momentary functional state of the brain. In addition to the drug effects, clearly different source locations and Global Field Power were found for the different frequency bands, replicating earlier reports (Michel et al., 1992).
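Global Field Power, defined above as the spatial standard deviation of a map, is straightforward to compute. A minimal sketch (the array layout and function name are assumptions for illustration):

```python
import numpy as np

def global_field_power(maps):
    """maps: array of shape (n_frequencies, n_channels) holding the map
    value at each channel for each frequency point. GFP at each frequency
    is the spatial standard deviation across channels, i.e. the RMS of
    the average-referenced map."""
    v = maps - maps.mean(axis=1, keepdims=True)  # average reference
    return np.sqrt((v ** 2).mean(axis=1))
```

A flat map (all channels equal) gives GFP of zero; maps with large potential differences across the scalp give high GFP, which is why it serves as a reference-free measure of momentary map strength.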
Abstract:
BACKGROUND In 2007, leading international experts in the field of inflammatory bowel disease (IBD) recommended intravenous (IV) iron supplements over oral (PO) ones because of superior effectiveness and better tolerance. We aimed to determine the percentage of patients with IBD undergoing iron therapy and to assess the dynamics of iron prescription habits (IV versus PO). METHODS We analyzed anonymized data on patients with Crohn's disease and ulcerative colitis extracted from the Helsana database. Helsana is a Swiss health insurance company providing coverage for 18% of the Swiss population (1.2 million individuals). RESULTS In total, 629 patients with Crohn's disease (61% female) and 398 patients with ulcerative colitis (57% female) were identified; mean observation time was 31.8 months for Crohn's disease and 31.0 months for ulcerative colitis patients. Of all patients with IBD, 27.1% were prescribed iron (21.1% in males; 31.1% in females). Patients treated with steroids, immunomodulators, and/or anti-tumor necrosis factor drugs were more frequently treated with iron supplements when compared with those not treated with any medications (35.0% versus 20.9%, odds ratio, 1.94; P < 0.001). The frequency of IV iron prescriptions increased significantly from 2006 to 2009 for both genders (males: from 2.6% to 10.1%, odds ratio = 3.84, P < 0.001; females: from 5.3% to 12.1%, odds ratio = 2.26, P = 0.002), whereas the percentage of PO iron prescriptions did not change. CONCLUSIONS Twenty-seven percent of patients with IBD were treated with iron supplements. Iron supplements administered IV were prescribed more frequently over time. These prescription habits are consistent with the implementation of guidelines on the management of iron deficiency in IBD.
Abstract:
Background: The lymphocyte transformation test (LTT) is used for in vitro diagnosis of drug hypersensitivity reactions. While its specificity is over 90%, its sensitivity is limited and depends on the type of reaction, the drug and possibly the time interval between the event and the analysis. Removal of regulatory T cells (Treg/CD25(hi)) from in vitro stimulated cell cultures was previously reported to be a promising method to increase the sensitivity of proliferation tests. Objective: The aim of this investigation was to evaluate the effect of removal of regulatory T cells on the sensitivity of the LTT. Methods: Patients with well-documented drug hypersensitivity were recruited. Peripheral blood mononuclear cells, isolated CD3(+) T cells and CD3(+) T cells depleted of the CD25(hi) fraction were used as effector cells in the LTT. Irrelevant drugs were also included to determine specificity. (3)H-thymidine incorporation was utilized as the detection system and results were expressed as a stimulation index (SI). Results: SIs of 7/11 LTTs were reduced after a mean time interval of 10.5 months (LTT 1 vs. LTT 2). Removal of the CD25(hi) fraction, which was FOXP3(+) and had a suppressive effect on drug-induced proliferation, resulted in an increased response to the relevant drugs. Sensitivity was increased from 25% to 82.35%, with a dramatically enhanced SI (from 2.05 to 6.02). Specificity was not affected. Conclusion: Removal of Treg/CD25(hi) cells can increase the frequency and strength of drug-specific proliferation without affecting specificity. This approach might be useful in certain drug hypersensitivity reactions with borderline responses or a long time interval since the hypersensitivity reaction.
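The stimulation index used in the LTT is the ratio of proliferation with the drug to proliferation without it. A minimal sketch (the positivity cut-off of 2 is a commonly used convention in the LTT literature, assumed here rather than taken from this study):

```python
def stimulation_index(cpm_drug, cpm_medium):
    """SI = (3)H-thymidine incorporation (counts per minute) in
    drug-stimulated cultures divided by medium-only control cultures."""
    return cpm_drug / cpm_medium

def is_positive(si, cutoff=2.0):
    # A cut-off around 2 is a common convention (assumption, not
    # stated in this abstract).
    return si >= cutoff
```

On this scale, the reported rise in SI from 2.05 to 6.02 after Treg removal moves borderline responses well clear of the positivity threshold.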
Abstract:
This study tests whether cognitive failures mediate the effects of work-related time pressure and time control on commuting accidents and near-accidents. Participants were 83 employees (56% female) who each commuted by vehicle between their regular place of residence and place of work. The Workplace Cognitive Failure Scale (WCFS) asked for the frequency of failures in memory function, attention regulation, and action execution. Time pressure and time control at work were assessed by the Instrument for Stress Oriented Task Analysis (ISTA). Commuting accidents in the last 12 months were reported by 10% of participants, and half of the sample reported commuting near-accidents in the last 4 weeks. Cognitive failure significantly mediated the influence of time pressure at work on near-accidents even when age, gender, neuroticism, conscientiousness, commuting duration, commuting distance, and time pressure during commuting were controlled for. Time control was negatively related to cognitive failure and neuroticism, but no association with commuting accidents or near-accidents was found. Time pressure at work is likely to increase cognitive load. Time pressure might, therefore, increase cognitive failures during work and also during commuting. Hence, time pressure at work can decrease commuting safety. The results suggest that a reduction of time pressure at work should improve commuting safety.
Abstract:
Repetitive transcranial magnetic stimulation (rTMS) is a novel research tool in neurology and psychiatry. It is currently being evaluated as a conceivable alternative to electroconvulsive therapy for the treatment of mood disorders. Eight healthy young (age range 21-25 years) right-handed men without sleep complaints participated in the study. Two sessions at a 1-week interval, each consisting of an adaptation night (sham stimulation) and an experimental night (rTMS in the left dorsolateral prefrontal cortex or sham stimulation; crossover design), were scheduled. In each subject, 40 trains of 2-s duration of rTMS (inter-train interval 28 s) were applied at a frequency of 20 Hz (i.e. 1600 pulses per session) and at an intensity of 90% of the motor threshold. Stimulations were scheduled 80 min before lights off. The waking EEG was recorded for 10-min intervals approximately 30 min prior to and after the 20-min stimulations, and polysomnographic recordings were obtained during the subsequent sleep episode (23.00-07.00 h). The power spectra of two referential derivations, as well as of bipolar derivations along the antero-posterior axis over the left and right hemispheres, were analyzed. rTMS induced a small reduction of sleep stage 1 (in min and percentage of total sleep time) over the whole night and a small enhancement of sleep stage 4 during the first non-REM sleep episode. Other sleep variables were not affected. rTMS of the left dorsolateral cortex did not alter the topography of EEG power spectra in waking following stimulation, in the all-night sleep EEG, or during the first non-REM sleep episode. Our results indicate that a single session of rTMS using parameters like those used in depression treatment protocols has no detectable side effects with respect to sleep in young healthy males.
Abstract:
Adding to the ongoing debate regarding vegetation recolonisation (more particularly its timing) in Europe and climate change since the Lateglacial, this study investigates a long sediment core (LL081) from Lake Ledro (652 m a.s.l., southern Alps, Italy). Environmental changes were reconstructed using multiproxy analyses (pollen-based vegetation and climate reconstruction, lake levels, magnetic susceptibility and X-ray fluorescence (XRF) measurements) that recorded climate and land-use changes during the Lateglacial and early-middle Holocene. The well-dated and high-resolution pollen record of Lake Ledro is compared with vegetation records from the southern and northern Alps to trace the history of tree species distribution. An altitude-dependent progressive time delay of the first continuous occurrence of Abies (fir) and of the Larix (larch) development has been observed since the Lateglacial in the southern Alps. This pattern suggests that the mid-altitude Lake Ledro area was not a refuge and that trees originated from lowlands or hilly areas (e.g. the Euganean Hills) in northern Italy. Preboreal oscillations (ca. 11 000 cal BP), Boreal oscillations (ca. 10 200 and 9300 cal BP) and the 8.2 kyr cold event suggest a centennial-scale climate forcing in the studied area. Picea (spruce) expansion occurred preferentially around 10 200 and 8200 cal BP in the south-eastern Alps, and therefore reflects the long-lasting cumulative effects of the successive Boreal oscillations and the 8.2 kyr cold event. The extension of Abies is contemporaneous with the 8.2 kyr event, but its development in the southern Alps benefited from the wettest interval, 8200-7300 cal BP, evidenced in high lake levels, flood activity and pollen-based climate reconstructions. Since ca. 7500 cal BP, a weak signal of pollen-based anthropogenic activities suggests weak human impact. The period between ca. 5700 and ca. 4100 cal BP is considered a transition period to colder and wetter conditions (particularly during summers) that favoured dense beech (Fagus) forest development, which in turn caused a distinctive yew (Taxus) decline. We conclude that climate was the dominant factor controlling vegetation changes and erosion processes during the early and middle Holocene (up to ca. 4100 cal BP).
Abstract:
The frequency of large-scale heavy precipitation events in the European Alps is expected to undergo substantial changes with current climate change. Hence, knowledge about the past natural variability of floods caused by heavy precipitation constitutes important input for climate projections. We present a comprehensive Holocene (10,000 years) reconstruction of the flood frequency in the Central European Alps combining 15 lacustrine sediment records. These records provide an extensive catalog of flood deposits, which were generated by flood-induced underflows delivering terrestrial material to the lake floors. The multi-archive approach makes it possible to suppress local weather patterns, such as thunderstorms, in the obtained climate signal. We reconstructed mainly late spring to fall events, since ice cover and precipitation in the form of snow in winter at the high-altitude study sites inhibit the generation of flood layers. We found that flood frequency was higher during cool periods, coinciding with lows in solar activity. In addition, flood occurrence shows periodicities that are also observed in reconstructions of solar activity from C-14 and Be-10 records (2500-3000, 900-1200, as well as about 710, 500, 350, 208 (Suess cycle), 150, 104 and 87 (Gleissberg cycle) years). As an atmospheric mechanism, we propose an expansion/shrinking of the Hadley cell with increasing/decreasing air temperature, causing dry/wet conditions in Central Europe during phases of high/low solar activity. Furthermore, differences between the flood patterns from the Northern Alps and the Southern Alps indicate changes in North Atlantic circulation. Enhanced flood occurrence in the South compared to the North suggests a pronounced southward position of the Westerlies and/or blocking over the northern North Atlantic, hence resembling a negative NAO state (most distinct from 4.2 to 2.4 kyr BP and during the Little Ice Age).
South-Alpine flood activity therefore provides a qualitative record of variations in a paleo-NAO pattern during the Holocene. Additionally, increased South-Alpine flood activity contrasts with low precipitation in tropical Central America (Cariaco Basin) on Holocene and centennial time scales. This observation is consistent with a Holocene southward migration of the Atlantic circulation system, and hence of the ITCZ, driven by decreasing summer insolation in the Northern hemisphere, as well as with shorter-term fluctuations probably driven by solar activity.