16 results for ORTHOGONAL FREQUENCY DIVISION MULTIPLEXING


Relevance: 20.00%

Abstract:

Activation of midbrain dopamine systems is thought to be critically involved in the addictive properties of abused substances. Drugs of abuse increase dopamine release in the nucleus accumbens and dorsal striatum, which are the target areas of mesolimbic and nigrostriatal dopamine pathways, respectively. Dopamine release in the nucleus accumbens is thought to mediate the attribution of incentive salience to rewards, and dorsal striatal dopamine release is involved in habit formation. In addition, changes in the function of prefrontal cortex (PFC), the target area of mesocortical dopamine pathway, may skew information processing and memory formation such that the addict pays an abnormal amount of attention to drug-related cues. In this study, we wanted to explore how long-term forced oral nicotine exposure or the lack of catechol-O-methyltransferase (COMT), one of the dopamine metabolizing enzymes, would affect the functioning of these pathways. We also wanted to find out how the forced nicotine exposure or the lack of COMT would affect the consumption of nicotine, alcohol, or cocaine. First, we studied the effect of forced chronic nicotine exposure on the sensitivity of dopamine D2-like autoreceptors in microdialysis and locomotor activity experiments. We found that the sensitivity of these receptors was unchanged after forced oral nicotine exposure, although an increase in the sensitivity was observed in mice treated with intermittent nicotine injections twice daily for 10 days. Thus, the effect of nicotine treatment on dopamine autoreceptor sensitivity depends on the route, frequency, and time course of drug administration. Second, we investigated whether the forced oral nicotine exposure would affect the reinforcing properties of nicotine injections. The chronic nicotine exposure did not significantly affect the development of conditioned place preference to nicotine. 
In the intravenous self-administration paradigm, however, the nicotine-exposed animals self-administered nicotine at a lower unit dose than the control animals, indicating that their sensitivity to the reinforcing effects of nicotine was enhanced. Next, we wanted to study whether the Comt gene knock-out animals would be a suitable model to study alcohol and cocaine consumption or addiction. Although previous work had shown male Comt knock-out mice to be less sensitive to the locomotor-activating effects of cocaine, the present study found that the lack of COMT did not affect the consumption of cocaine solutions or the development of cocaine-induced place preference. However, the present work did find that male Comt knock-out mice, but not female knock-out mice, consumed ethanol more avidly than their wild-type littermates. This finding suggests that COMT may be one of the factors, albeit not a primary one, contributing to the risk of alcoholism. Last, we explored the effect of COMT deficiency on dorsal striatal, accumbal, and prefrontal cortical dopamine metabolism under no-net-flux conditions and under levodopa load in freely-moving mice. The lack of COMT did not affect the extracellular dopamine concentrations under baseline conditions in any of the brain areas studied. In the prefrontal cortex, the dopamine levels remained high for a prolonged time after levodopa treatment in male, but not female, Comt knock-out mice. COMT deficiency induced accumulation of 3,4-dihydroxyphenylacetic acid, which increased further under levodopa load. Homovanillic acid was not detectable in Comt knock-out animals either under baseline conditions or after levodopa treatment. Taken together, the present results show that although forced chronic oral nicotine exposure affects the reinforcing properties of self-administered nicotine, it is not an addiction model itself. 
COMT seems to play a minor role in dopamine metabolism and in the development of addiction under baseline conditions, indicating that dopamine function in the brain is well-protected from perturbation. However, the role of COMT becomes more important when the dopaminergic system is challenged, such as by pharmacological manipulation.

Relevance: 20.00%

Abstract:

The synchronization of neuronal activity, especially in the beta- (14–30 Hz) and gamma- (30–80 Hz) frequency bands, is thought to provide a means for the integration of anatomically distributed processing and for the formation of transient neuronal assemblies. Thus, non-stimulus-locked (i.e., induced) gamma-band oscillations are believed to underlie feature binding and the formation of neuronal object representations. On the other hand, the functional roles of neuronal oscillations in the slower theta- (4–8 Hz) and alpha- (8–14 Hz) frequency bands remain controversial. In addition, early stimulus-locked activity has been largely ignored, as it is believed to reflect merely the physical properties of sensory stimuli. With human neuromagnetic recordings, both the functional roles of gamma- and alpha-band oscillations and the significance of early stimulus-locked activity in neuronal processing were examined in this thesis. Study I of this thesis shows that even the stimulus-locked (evoked) gamma oscillations were sensitive to high-level stimulus features for speech and non-speech sounds, suggesting that they may underlie the formation of early neuronal object representations for stimuli with a behavioural relevance. Study II shows that neuronal processing for consciously perceived and unperceived stimuli differed as early as 30 ms after stimulus onset. This study also showed that the alpha-band oscillations selectively correlated with conscious perception. Study III, in turn, shows that prestimulus alpha-band oscillations influence the subsequent detection and processing of sensory stimuli. Further, in Study IV, we asked whether phase synchronization between distinct frequency bands is present in cortical circuits. This study revealed prominent task-sensitive phase synchrony between alpha and beta/gamma oscillations. Finally, the implications of Studies II, III, and IV for the broader scientific context are analysed in the last study of this thesis (V). 
I suggest in this thesis that neuronal processing may be extremely fast and that the evoked response is important for cognitive processes. I also propose that alpha oscillations define the global neuronal workspace of perception, action, and consciousness and, further, that cross-frequency synchronization is required for the integration of neuronal object representations into the global neuronal workspace.
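
The cross-frequency phase synchrony reported in Study IV can be quantified with an n:m phase-locking value between the phases of two band-limited signals. The following is a minimal sketch, not the thesis's actual analysis pipeline; the band limits, filter order, and toy signal are illustrative assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    # zero-phase Butterworth band-pass filter (cutoffs normalized by Nyquist)
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def cross_freq_plv(x, fs, band1=(8, 14), band2=(16, 28), n=2, m=1):
    # n:m phase-locking value between two frequency bands of signal x;
    # 1 means perfect cross-frequency locking, 0 means no locking
    phi1 = np.angle(hilbert(bandpass(x, band1[0], band1[1], fs)))
    phi2 = np.angle(hilbert(bandpass(x, band2[0], band2[1], fs)))
    return float(np.abs(np.mean(np.exp(1j * (n * phi1 - m * phi2)))))

# toy signal: a 10 Hz "alpha" wave plus a phase-locked 20 Hz "beta" harmonic
fs = 1000.0
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)
print(cross_freq_plv(x, fs))  # close to 1 for perfectly phase-locked bands
```

For real MEG data the same statistic would be computed across trials or time windows and tested against surrogate data rather than read off a single value.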

Relevance: 20.00%

Abstract:

Class II division 1 malocclusion occurs in 3.5 to 13 percent of 7–12-year-old children. It is the most common reason for orthodontic treatment in Finland. Correction is most commonly performed using headgear treatment. The aim of this study was to investigate the effects of cervical headgear treatment on dentition, facial skeletal and soft tissue growth, and upper airway structure in children. 65 schoolchildren, 36 boys and 29 girls, were studied. At the onset of treatment the mean age was 9.3 (range 6.6–12.4) years. All the children had been consecutively referred to an orthodontist because of Class II division 1 malocclusion. The included children had a protrusive maxilla and an overjet of more than 2 mm (3 to 11 mm). The children were treated with a Kloehn-type cervical headgear as the only appliance until Class I first molar relationships were achieved. The essential features of the headgear were strong cervical pulling forces, a long upward-bent outer bow, and an expanded inner bow. Dental casts and lateral and posteroanterior cephalograms were taken before and after the treatment. The results were compared to a historical, cross-sectional Finnish cohort or to historical, age- and sex-matched normal Class I controls. Class I first molar relationships were achieved in all the treated children. The mean treatment time was 1.7 (range 0.3–3.1) years. Phase 2 treatments were needed in 52% of the children, most often because of excess overjet or overbite. The treatment decreased maxillary protrusion by inhibiting alveolar forward growth, while the rest of the maxilla and the mandible followed normal growth. The palate rotated anteriorly downward. The expansion of the inner bow of the headgear induced widening of the maxilla, the nasal cavity, and the upper and lower dental arches. Class II malocclusion was associated with a narrower oro- and hypopharyngeal space than in the Class I normal controls. 
The treatment increased the retropalatal airway space, while the rest of the airway remained unaffected. The facial profile improved esthetically, while the facial convexity decreased. Facial soft tissues masked the facial skeletal convexity, and the soft tissue changes were smaller than skeletal changes. In conclusion, the headgear treatment with the expanded inner bow may be used as an easy and simple method for Class II correction in growing children.

Relevance: 20.00%

Abstract:

Chronic myeloid leukemia (CML) is a malignant clonal blood disease that originates from a pluripotent hematopoietic stem cell. The cytogenetic hallmark of CML, the Philadelphia chromosome (Ph), is formed as a result of a reciprocal translocation between chromosomes 9 and 22, which leads to the formation of a chimeric BCR-ABL fusion gene. The BCR-ABL protein is a constitutively active tyrosine kinase that changes the adhesion properties of cells, constitutively activates mitogenic signaling, enhances cell proliferation, and reduces apoptosis. This results in leukemic growth and the clinical disease, CML. With the advent of targeted therapies against the BCR-ABL fusion protein, the treatment of CML has changed considerably during the past decade. In this thesis, the clinical significance of different diagnostic methods and new prognostic factors in CML was assessed. First, the association between two different methods for measuring CML disease burden (RQ-PCR and high mitotic index metaphase FISH) was assessed in bone marrow and peripheral blood samples. The correlation between positive RQ-PCR and metaphase FISH samples was high. However, RQ-PCR was more sensitive and yielded measurable transcripts in 40% of the samples that were negative by metaphase FISH. The study established a laboratory-specific conversion factor for setting up the International Scale when standardizing RQ-PCR measurements. Secondly, the amount of minimal residual disease (MRD) after allogeneic hematopoietic stem cell transplantation (alloHSCT) was determined. For this, metaphase FISH was done on the bone marrow samples of 102 CML patients. Most patients (68%) had no residual cells during the entire follow-up time. Some patients (12%) had minor (<1%) MRD which decreased even further with time, whereas 19% had a progressive rise in MRD that exceeded 1% or had more than 1% residual cells when first detected. 
Residual cells did not become eradicated spontaneously if the frequency of Ph+ cells exceeded 1% during follow-up. Next, the impact of deletions in the derivative chromosome 9 was examined. Deletions were observed in 15% of the CML patients who later received alloHSCT. After alloHSCT, there was no difference in the total relapse rate between patients with and without deletions. Nor did the estimates of overall survival, transplant-related mortality, leukemia-free survival, and relapse-free time show any difference between these groups. When conventional treatment regimens are used, the der(9) status could be an important criterion, in conjunction with other prognostic factors, when allogeneic transplantation is considered. The significance of der(9) deletions for patients treated with tyrosine kinase inhibitors is not clear and requires further investigation. In addition to the der(9) status of the patient, the significance of bone marrow lymphocytosis as a prognostic factor in CML was assessed. Bone marrow lymphocytosis during imatinib therapy was a positive predictive factor and heralded an optimal response. When combined with major cytogenetic response at three months of treatment, bone marrow lymphocytosis predicted a prognostically important major molecular response at 18 months of imatinib treatment. Although validation of these findings is warranted, the determination of the bone marrow lymphocyte count could already be included in the evaluation of early response to imatinib treatment. Finally, BCR-ABL kinase domain mutations were studied in CML patients resistant to imatinib treatment. The point mutations detected in the kinase domain were the same as previously reported, but other sequence variants, e.g. deletions or exon splicing, were also found. The clinical significance of these other variations remains to be determined.
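
The International Scale standardization mentioned above is, in essence, a multiplicative correction: a laboratory's raw BCR-ABL/control-gene transcript ratio is rescaled by its laboratory-specific conversion factor. A minimal sketch of the arithmetic; the copy numbers and conversion factor below are hypothetical, not values from the study:

```python
def bcr_abl_is(bcr_abl_copies, control_copies, conversion_factor):
    # raw ratio (%) of BCR-ABL to control-gene transcripts,
    # rescaled to the International Scale by the lab-specific factor
    raw_ratio = 100.0 * bcr_abl_copies / control_copies
    return raw_ratio * conversion_factor

# hypothetical sample: 50 BCR-ABL vs 40,000 control transcripts, CF = 0.8
is_value = bcr_abl_is(50, 40_000, 0.8)
print(f"BCR-ABL(IS) = {is_value:.3f} %")
print("major molecular response (IS <= 0.1 %):", is_value <= 0.1)
```

The conversion factor is what the study's comparison of RQ-PCR against metaphase FISH helps to anchor; major molecular response is conventionally defined as a BCR-ABL(IS) ratio of 0.1% or below.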

Relevance: 20.00%

Abstract:

Inadvertent climate modification has led to an increase in urban temperatures compared to the surrounding rural areas. The main reason for the temperature rise is the altered partitioning of input net radiation into heat storage and sensible and latent heat fluxes, in addition to the anthropogenic heat flux. The heat storage flux and the anthropogenic heat flux have not yet been determined for Helsinki, and they are not directly measurable. In contrast, the turbulent fluxes of sensible and latent heat, as well as net radiation, can be measured, and the anthropogenic heat flux together with the heat storage flux can be solved as a residual. As a result, all inaccuracies in the determination of the energy balance components propagate to the residual term, and special attention must be paid to the accurate determination of the components. One cause of error in the turbulent fluxes is the attenuation of fluctuations at high frequencies, which can be accounted for by high-frequency spectral corrections. The aim of this study is twofold: to assess the relevance of high-frequency corrections to water vapor fluxes and to assess the temporal variation of the energy fluxes. Turbulent fluxes of sensible and latent heat have been measured at the SMEAR III station, Helsinki, since December 2005 using the eddy covariance technique. In addition, net radiation measurements have been ongoing since July 2007. The calculation methods used in this study consist of widely accepted eddy covariance data post-processing methods in addition to Fourier and wavelet analysis. The high-frequency spectral correction using the traditional transfer function method is highly dependent on relative humidity and has an 11% effect on the latent heat flux. This method is based on an assumption of spectral similarity which is shown not to be valid. A new correction method using wavelet analysis is thus introduced, and it seems to account for the high-frequency variation deficit. 
However, the resulting wavelet correction remains minimal in contrast to the traditional transfer function correction. The energy fluxes exhibit a behavior characteristic of urban environments: the energy input is channeled into sensible heat, as the latent heat flux is restricted by water availability. The monthly mean residual of the energy balance ranges from 30 W m⁻² in summer to −35 W m⁻² in winter, implying heat storage in the ground during summer. Furthermore, the anthropogenic heat flux is estimated to be 50 W m⁻² during winter, when residential heating is important.
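
The residual term described above follows directly from the surface energy balance: whatever net radiation is not carried away by the measured turbulent fluxes is attributed to heat storage plus anthropogenic heat. A minimal sketch; the flux values are hypothetical, chosen only to mirror the order of magnitude reported above:

```python
def energy_balance_residual(net_radiation, sensible, latent):
    # residual (W m^-2) of the surface energy balance:
    # Rn - H - LE = heat storage flux + anthropogenic heat flux
    return net_radiation - sensible - latent

# hypothetical monthly mean fluxes in W m^-2
rn, h, le = 450.0, 280.0, 140.0
print(energy_balance_residual(rn, h, le))  # 30.0 -> storage + anthropogenic heat
```

Because the residual is a small difference of large measured terms, any systematic error in Rn, H, or LE (for example an uncorrected high-frequency flux loss) shows up in it at full strength, which is why the spectral corrections matter.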

Relevance: 20.00%

Abstract:

Comprehensive two-dimensional gas chromatography (GC×GC) offers enhanced separation efficiency, reliability in qualitative and quantitative analysis, capability to detect low quantities, and information on the whole sample and its components. These features are essential in the analysis of complex samples, in which the number of compounds may be large or the analytes of interest are present at trace level. This study involved the development of instrumentation, data analysis programs and methodologies for GC×GC and their application in studies on qualitative and quantitative aspects of GC×GC analysis. Environmental samples were used as model samples. Instrumental development comprised the construction of three versions of a semi-rotating cryogenic modulator in which modulation was based on two-step cryogenic trapping with continuously flowing carbon dioxide as coolant. Two-step trapping was achieved by rotating the nozzle spraying the carbon dioxide with a motor. The fastest rotation and highest modulation frequency were achieved with a permanent magnetic motor, and modulation was most accurate when the motor was controlled with a microcontroller containing a quartz crystal. Heated wire resistors were unnecessary for the desorption step when liquid carbon dioxide was used as coolant. With use of the modulators developed in this study, the narrowest peaks were 75 ms at base. Three data analysis programs were developed allowing basic, comparison and identification operations. Basic operations enabled the visualisation of two-dimensional plots and the determination of retention times, peak heights and volumes. The overlaying feature in the comparison program allowed easy comparison of 2D plots. An automated identification procedure based on mass spectra and retention parameters allowed the qualitative analysis of data obtained by GC×GC and time-of-flight mass spectrometry. 
In the methodological development, sample preparation (extraction and clean-up) and GC×GC methods were developed for the analysis of atmospheric aerosol and sediment samples. Dynamic sonication-assisted extraction was well suited for atmospheric aerosols collected on a filter. A clean-up procedure utilising normal phase liquid chromatography with ultraviolet detection worked well in the removal of aliphatic hydrocarbons from a sediment extract. GC×GC with flame ionisation detection or quadrupole mass spectrometry provided good reliability in the qualitative analysis of target analytes. However, GC×GC with time-of-flight mass spectrometry was needed in the analysis of unknowns. The automated identification procedure that was developed was efficient in the analysis of large data files, but manual searching and analyst knowledge are invaluable as well. Quantitative analysis was examined in terms of calibration procedures and the effect of matrix compounds on GC×GC separation. In addition to calibration in GC×GC with summed peak areas or peak volumes, a simplified area calibration based on the normal GC signal can be used to quantify compounds in samples analysed by GC×GC, so long as certain qualitative and quantitative prerequisites are met. In a study of the effect of matrix compounds on GC×GC separation, it was shown that the quality of the separation of PAHs is not significantly disturbed by the amount of matrix, and that quantitativeness suffers only slightly in the presence of matrix and when the amount of target compounds is low. The benefits of GC×GC in the analysis of complex samples easily overcome some minor drawbacks of the technique. The developed instrumentation and methodologies performed well for environmental samples, but they could also be applied to other complex samples.
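
Quantification with summed peak areas, as described above, rests on the fact that modulation slices one analyte into several second-dimension peaks whose areas add up to the corresponding one-dimensional signal, which can then be put through an ordinary linear calibration. A minimal sketch; the slice areas and calibration slope are hypothetical:

```python
def summed_peak_area(modulated_areas):
    # in GCxGC one analyte is sliced into several modulated peaks;
    # its quantity is proportional to the sum of their areas
    return sum(modulated_areas)

def quantify(summed_area, slope, intercept=0.0):
    # invert a linear calibration: area = slope * concentration + intercept
    return (summed_area - intercept) / slope

# hypothetical analyte split into four modulated slices by the modulator
areas = [120.0, 340.0, 310.0, 90.0]
concentration = quantify(summed_peak_area(areas), slope=172.0)
print(concentration)  # 5.0 (in the calibration's concentration units)
```

The "simplified area calibration" mentioned in the abstract corresponds to fitting the slope on the unmodulated GC signal and reusing it for the summed GC×GC areas, which only works when the stated prerequisites hold.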

Relevance: 20.00%

Abstract:

Volatility is central to options pricing and risk management. It reflects the uncertainty of investors and the inherent instability of the economy. Time series methods are among the most widely applied scientific methods to analyze and predict volatility. Very frequently sampled data contain much valuable information about the different elements of volatility and may ultimately reveal the reasons for time-varying volatility. The use of such ultra-high-frequency data is common to all three essays of the dissertation. The dissertation belongs to the field of financial econometrics. The first essay uses wavelet methods to study the time-varying behavior of scaling laws and long memory in the five-minute volatility series of Nokia on the Helsinki Stock Exchange around the burst of the IT bubble. The essay is motivated by earlier findings which suggest that different scaling laws may apply to intraday time-scales and to larger time-scales, implying that the so-called annualized volatility depends on the data sampling frequency. The empirical results confirm the appearance of time-varying long memory and different scaling laws that, for a significant part, can be attributed to investor irrationality and to an intraday volatility periodicity called the New York effect. The findings have potentially important consequences for options pricing and risk management, which commonly assume constant memory and scaling. The second essay investigates modelling the duration between trades in stock markets. Durations convey information about investor intentions and provide an alternative view of volatility. Generalizations of standard autoregressive conditional duration (ACD) models are developed to meet the needs observed in previous applications of the standard models. 
According to the empirical results, based on data on actively traded stocks on the New York Stock Exchange and the Helsinki Stock Exchange, the proposed generalization clearly outperforms the standard models and also performs well in comparison to another recently proposed alternative to the standard models. The distribution used to derive the generalization may also prove valuable in other areas of risk management. The third essay studies empirically the effect of decimalization on volatility and market microstructure noise. Decimalization refers to the change from fractional pricing to decimal pricing; it was carried out on the New York Stock Exchange in January 2001. The methods used here are more accurate than those in earlier studies and put more weight on market microstructure. The main result is that decimalization decreased observed volatility by reducing noise variance, especially for highly active stocks. The results are useful for risk management and market mechanism design.
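
The dependence of annualized volatility on sampling frequency, discussed in the first essay, can be illustrated with a power scaling law sigma(T) = sigma(dt) * (T/dt)**H, where H = 0.5 recovers the usual square-root-of-time rule and any other exponent makes the annual figure depend on the chosen sampling interval. A minimal sketch; the interval counts and volatility value are hypothetical assumptions, not figures from the essay:

```python
def annualized_vol(sigma_intraday, scaling_exponent=0.5,
                   intervals_per_day=78, trading_days=252):
    # scale an intraday (e.g. five-minute) return volatility to an
    # annual horizon under sigma(T) = sigma(dt) * (T/dt)**H
    n_intervals = intervals_per_day * trading_days
    return sigma_intraday * n_intervals ** scaling_exponent

sigma_5min = 0.002  # hypothetical five-minute return volatility
print(annualized_vol(sigma_5min))       # H = 0.5: sqrt-of-time annualization
print(annualized_vol(sigma_5min, 0.6))  # a larger exponent inflates the annual figure
```

When long memory makes the effective exponent differ between intraday and longer horizons, the two calls above disagree, which is exactly why a single "annualized volatility" number becomes sampling-frequency dependent.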

Relevance: 20.00%

Abstract:

Mediastinitis as a complication after cardiac surgery is rare but disastrous, increasing hospital stay, hospital costs, morbidity, and mortality. It occurs in 1–3% of patients after median sternotomy. The purpose of this study was to identify the risk factors and to investigate new ways to prevent mediastinitis. First, we assessed operating room air contamination monitoring by comparing the bacteriological technique with continuous particle counting at the low contamination levels achieved by ultra-clean garment options in 66 coronary artery bypass grafting operations. Second, we examined surgical glove perforations and the changes in the bacterial flora of surgeons' fingertips in 116 open-heart operations. Third, the effect of a gentamicin-collagen sponge on preventing surgical site infections (SSI) was studied in a randomized controlled study with 557 participants. Finally, the incidence, outcome, and risk factors of mediastinitis were studied in over 10,000 patients. With the alternative garment and textile system (cotton group and clean air suit group), the air counts fell from 25 to 7 colony-forming units/m³ (P<0.01). The contamination of the sternal wound was reduced by 46% and that of the leg wound by >90%. In only 17% of operations were both gloves found unpunctured. The frequency of glove perforations and the bacterial counts of hands were found to increase with operation time. With local gentamicin prophylaxis, slightly fewer SSIs (4.0% vs. 5.9%) and cases of mediastinitis (1.1% vs. 1.9%) occurred. We identified 120/10713 cases of postoperative mediastinitis (1.1%). During the study period, the patient population grew significantly older, and the proportions of women and of patients with an ASA score >3 increased significantly. In multivariate logistic regression analysis, the only significant predictor of mediastinitis was obesity. Continuous particle monitoring is a good intraoperative method for controlling the air contamination related to theatre staff behavior during an individual operation. 
When a glove puncture is detected, both gloves should be changed. Before donning a new pair of gloves, renewed disinfection of the hands helps to keep their bacterial counts lower even towards the end of a long operation. The gentamicin-collagen sponge may have beneficial effects on the prevention of SSI, but further research is needed. Mediastinitis is not diminishing. Larger populations at risk, for example the growing proportion of overweight patients, reinforce the importance of surveillance and pose a challenge in focusing preventive measures.

Relevance: 20.00%

Abstract:

Assessment of the outcome of critical illness is complex. Severity scoring systems and organ dysfunction scores are traditional tools in mortality and morbidity prediction in intensive care. Their ability to explain risk of death is impressive for large cohorts of patients, but insufficient for an individual patient. Although events before intensive care unit (ICU) admission are prognostically important, the prediction models utilize data collected at and just after ICU admission. In addition, several biomarkers have been evaluated to predict mortality, but none has proven entirely useful in clinical practice. Therefore, new prognostic markers of critical illness are vital when evaluating the intensive care outcome. The aim of this dissertation was to investigate new measures and biological markers of critical illness and to evaluate their predictive value and association with mortality and disease severity. The impact of delay in emergency department (ED) on intensive care outcome, measured as hospital mortality and health-related quality of life (HRQoL) at 6 months, was assessed in 1537 consecutive patients admitted to medical ICU. Two new biological markers were investigated in two separate patient populations: in 231 ICU patients and 255 patients with severe sepsis or septic shock. Cell-free plasma DNA is a surrogate marker of apoptosis. Its association with disease severity and mortality rate was evaluated in ICU patients. Next, the predictive value of plasma DNA regarding mortality and its association with the degree of organ dysfunction and disease severity was evaluated in severe sepsis or septic shock. Heme oxygenase-1 (HO-1) is a potential regulator of apoptosis. Finally, HO-1 plasma concentrations and HO-1 gene polymorphisms and their association with outcome were evaluated in ICU patients. The length of ED stay was not associated with outcome of intensive care. 
The hospital mortality rate was significantly lower in patients admitted to the medical ICU from the ED than from the non-ED, and the HRQoL in the critically ill at 6 months was significantly lower than in the age- and sex-matched general population. In the ICU patient population, the maximum plasma DNA concentration measured during the first 96 hours in intensive care correlated significantly with disease severity and degree of organ failure and was independently associated with hospital mortality. In patients with severe sepsis or septic shock, the cell-free plasma DNA concentrations were significantly higher in ICU and hospital nonsurvivors than in survivors and showed a moderate discriminative power regarding ICU mortality. Plasma DNA was an independent predictor for ICU mortality, but not for hospital mortality. The degree of organ dysfunction correlated independently with plasma DNA concentration in severe sepsis and plasma HO-1 concentration in ICU patients. The HO-1 -413T/GT(L)/+99C haplotype was associated with HO-1 plasma levels and frequency of multiple organ dysfunction. Plasma DNA and HO-1 concentrations may support the assessment of outcome or organ failure development in critically ill patients, although their value is limited and requires further evaluation.
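
The "discriminative power" of a marker such as cell-free plasma DNA is conventionally summarized by the area under the ROC curve, which equals the Mann-Whitney probability that a randomly chosen nonsurvivor has a higher marker value than a randomly chosen survivor. A minimal sketch with hypothetical concentrations, not data from the study:

```python
def auc_mann_whitney(pos, neg):
    # AUC via the Mann-Whitney statistic: fraction of (pos, neg) pairs
    # in which the positive case has the higher value (ties count 0.5)
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# hypothetical plasma DNA concentrations: ICU nonsurvivors vs survivors
nonsurvivors = [12.1, 9.8, 15.3, 7.0]
survivors = [5.2, 6.1, 9.9, 4.8, 7.0]
print(auc_mann_whitney(nonsurvivors, survivors))  # 0.875
```

An AUC near 0.5 means no discrimination and near 1.0 means perfect separation; "moderate" discriminative power, as reported for plasma DNA and ICU mortality, typically refers to values in between.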

Relevance: 20.00%

Abstract:

Staphylococcus aureus is the second most common bloodstream isolate in both community- and hospital-acquired bacteremias. The clinical course of S. aureus bacteremia (SAB) is determined by its complications, particularly by the development of deep infections and thromboembolic events. Despite the progress of antimicrobial therapy, SAB is still associated with high mortality. However, injection drug users (IDUs) tend to have fewer complications and a better prognosis than nonaddicts, especially in endocarditis. The present study was undertaken to investigate the epidemiology, treatment, and outcome of S. aureus bacteremia and endocarditis in Finland. In particular, differences in bacterial strains and their virulence factors, and in host immune responses, were compared between IDUs and nonaddicts. In Finland, 5045 SAB cases during 1995-2001 were included using the National Infectious Disease Register maintained by the National Public Health Institute. The annual incidence of SAB increased, especially in the elderly. While the increase in incidence may partly be explained by better reporting, it most likely reflects a growing population at risk, affected by such factors as age and/or severe comorbidity. Nosocomial infections accounted for 51% of cases, with no change in their proportion during the study period. The 28-day mortality was 17% and remained unchanged over time. A total of 381 patients with SAB were randomized to receive either standard antibiotic treatment or levofloxacin added to standard treatment. Levofloxacin combination therapy did not decrease mortality or the incidence of deep infections, nor did it speed up recovery during the 3-month follow-up. However, patients with a deep infection appeared to benefit from combination therapy with rifampicin, as suggested also by experimental data. Deep infections were found in 84% of SAB patients within one week after randomization, and they appeared to be more common than previously reported. 
Endocarditis was observed in 74 of 430 patients (17%) with SAB, of whom 20 were IDUs and 54 nonaddicts. Right-sided involvement was diagnosed in 60% of the addicts, whereas 93% of the nonaddicts had left-sided endocarditis. Unexpectedly, IDUs showed extracardiac deep infections, thromboembolic events, and severe sepsis with the same frequency as nonaddicts. The prognosis of endocarditis was better among addicts due to their younger age and lack of underlying diseases, in agreement with earlier reports. In total, all 44 IDUs with SAB were included, and 20 of them had endocarditis. An equal number of nonaddicts with SAB were chosen as group-matched controls. Serological tests were not helpful in identifying patients with a deep infection. No individual S. aureus strain dominated in endocarditis among the addicts. Characterization of the virulence factors of the bacterial strains did not reveal any significant differences between IDUs and nonaddicts.

Relevance: 20.00%

Abstract:

The dissertation deals with remote narrowband measurements of the electromagnetic radiation emitted by lightning flashes. A lightning flash consists of a number of sub-processes. The return stroke, which transfers electrical charge from the thundercloud to the ground, is electromagnetically an impulsive wideband process; that is, it emits radiation at most frequencies in the electromagnetic spectrum, but its duration is only some tens of microseconds. Before and after the return stroke, multiple sub-processes redistribute electrical charges within the thundercloud. These sub-processes can last for tens to hundreds of milliseconds, many orders of magnitude longer than the return stroke. Each sub-process causes radiation with specific time-domain characteristics, having maxima at different frequencies. Thus, if the radiation is measured at a single narrow frequency band, it is difficult to identify the sub-processes, and some sub-processes can be missed altogether. However, narrowband detectors are simple to design and miniaturize. In particular, near the High Frequency (HF, 3 MHz to 30 MHz) band, ordinary shortwave radios can, in principle, be used as detectors. This dissertation utilizes a prototype detector which is essentially a handheld AM radio receiver. Measurements were made in Scandinavia, and several independent data sources were used to identify lightning sub-processes, as well as the distance to each individual flash. It is shown that multiple sub-processes radiate strongly near the HF band. The return stroke usually radiates intensely, but it cannot be reliably identified from the time-domain signal alone. This means that a narrowband measurement is best used to characterize the energy of the radiation integrated over the whole flash, without attempting to identify individual processes. The dissertation analyzes the conditions under which this integrated energy can be used to estimate the distance to the flash. 
It is shown that flash-by-flash variations are large, but the integrated energy is very sensitive to changes in the distance, dropping as approximately the inverse cube root of the distance. Flashes can, in principle, be detected at distances of more than 100 km, but since the ground conductivity can vary, ranging accuracy drops dramatically at distances larger than 20 km. These limitations mean that individual flashes cannot be ranged accurately using a single narrowband detector, and the useful range is limited to at most 30 kilometers. Nevertheless, simple statistical corrections are developed which enable an accurate estimate of the distance to the closest edge of an active storm cell, as well as of its approach speed. The results of the dissertation could therefore have practical applications in real-time short-range lightning detection and warning systems.
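The ranging idea described in the abstract can be sketched numerically. Assuming the integrated flash energy falls off as a power law in distance, a reference measurement at a known distance lets one invert that law for an unknown flash. This is an illustrative sketch only: the function name and reference values are invented, and the exponent is a free parameter (defaulting to 1/3 per the abstract's "inverse cube root") that would in practice have to be calibrated for local ground conductivity.

```python
def estimate_flash_distance(energy, ref_energy, ref_distance_km, exponent=1.0 / 3.0):
    """Invert an assumed power-law falloff E ~ d**(-exponent) to range a flash.

    energy          -- integrated narrowband energy of the unknown flash
    ref_energy      -- integrated energy of a reference flash
    ref_distance_km -- known distance of the reference flash (km)
    """
    return ref_distance_km * (ref_energy / energy) ** (1.0 / exponent)

# With exponent 1/3, a flash at half the reference energy is ranged at
# 10 km * 2**3 = 80 km: small energy differences map to large distance
# differences, which is why per-flash ranging errors grow quickly.
print(estimate_flash_distance(0.5, 1.0, 10.0))
```

The statistical corrections mentioned in the abstract would operate on many such per-flash estimates, not on a single inversion like this one.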

Relevance:

20.00%

Publisher:

Abstract:

Aerosol particles in the atmosphere are known to significantly influence ecosystems, degrade air quality and have negative health effects. Atmospheric aerosols influence climate by cooling the atmosphere and the underlying surface through scattering of sunlight, by warming the atmosphere through absorption of sunlight and of the thermal radiation emitted by the Earth's surface, and by acting as cloud condensation nuclei. Aerosols are emitted from both natural and anthropogenic sources. Depending on their size, they can be transported over significant distances while undergoing considerable changes in their composition and physical properties. Their lifetime in the atmosphere varies from a few hours to a week. New particle formation is a result of gas-to-particle conversion. Once formed, atmospheric aerosol particles may grow by condensation or coagulation, or be removed by deposition processes. In this thesis we describe analyses of air masses, meteorological parameters and synoptic situations to reveal the conditions favourable for new particle formation in the atmosphere. We studied the concentration of ultrafine particles in different types of air masses, and the role of atmospheric fronts and cloudiness in the formation of atmospheric aerosol particles. The dominant role of Arctic and Polar air masses in causing new particle formation was clearly observed at Hyytiälä, Southern Finland, during all seasons, as well as at other measurement stations in Scandinavia. In all seasons and on multi-year average, the Arctic and North Atlantic areas were the sources of nucleation mode particles. In contrast, concentrations of accumulation mode particles and condensation sink values in Hyytiälä were highest in continental air masses arriving at Hyytiälä from Eastern Europe and Central Russia. The most favourable situation for new particle formation during all seasons was cold-air advection after cold-front passages.
Such a period could last a few days, until the next front reached Hyytiälä. The frequency of aerosol particle formation is related to the frequency of low-cloud-amount days in Hyytiälä. Cloudiness of less than 5 octas is one of the factors favouring new particle formation, whereas cloudiness above 4 octas appears to be an important factor preventing particle growth, due to the reduction of solar radiation, one of the important meteorological parameters in atmospheric particle formation and growth.
Keywords: atmospheric aerosols, particle formation, air mass, atmospheric front, cloudiness

Relevance:

20.00%

Publisher:

Abstract:

The current study is a longitudinal investigation into changes in the division of household labour across transitions to marriage and parenthood in the UK. Previous research has noted a more traditional division of household labour, with women performing the majority of housework, amongst spouses and couples with children. However, the bulk of this work has been cross-sectional in nature. The few longitudinal studies that have been carried out have been rather ambiguous about the effect of marriage and parenthood on the division of housework. Theoretically, this study draws on gender construction theory. The key premise of this theory is that gender is something that is performed and created in interaction, and, as a result, something fluid and flexible rather than fixed and stable. The idea that couples ‘do gender’ through housework has been a major theoretical breakthrough. Gender-neutral explanations of the division of household labour, positing rational acting individuals, have failed to explicate why women continue to perform an unequal share of housework, regardless of socio-economic status. By contrast, gender construction theory situates gender as the key process in dividing household labour. By performing and avoiding certain housework chores, couples fulfil social norms of what it means to be a man and a woman, although, given the emphasis on human agency in producing and contesting gender, couples are able to negotiate alternative gender roles which, in turn, feed back into the structure of social norms in an ever-changing societal landscape. This study adds extra depth to the doing-gender approach by testing whether or not couples negotiate specific conjugal and parent roles in terms of the division of household labour. Both transitions are hypothesised to lead to a more traditional division of household labour. Data come from the British Household Panel Survey, a large, nationally representative quantitative survey that has been carried out annually since 1991.
Here, the data track the same 776 couples at two separate time points, 1996 and 2005. OLS regression is used to test whether or not transitions to marriage and parenthood have a significant impact on the division of household labour whilst controlling for a host of relevant socio-economic factors. Results indicate that marriage has no significant effect on how couples partition housework: couples making the transition from cohabitation to marriage show no significant changes in housework arrangements relative to couples who remain cohabiting in both waves. On the other hand, becoming parents does lead to a more traditional division of household labour, even when controlling for the socio-economic factors that accompany the move to parenthood. There is then some evidence that couples use the site of household labour to ‘do parenthood’ and generate identities which both use and inform socially prescribed notions of what it means to be a mother and a father. Support for socio-economic explanations of the division of household labour was mixed, although it remains clear that they alone cannot explain how households divide housework.
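The regression design described above can be illustrated with a minimal two-wave sketch. The study itself fits OLS on BHPS data with many socio-economic controls; this toy version, with invented variable names and numbers, fits a single transition dummy in pure Python to show what a positive parenthood coefficient would mean.

```python
def ols_slope_intercept(x, y):
    """Closed-form simple OLS fit of y = a + b*x."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    var = sum((xi - mean_x) ** 2 for xi in x)
    b = cov / var              # slope: effect of the transition dummy
    a = mean_y - b * mean_x    # intercept: baseline change for non-parents
    return a, b

# Outcome: change in the woman's share of housework between waves (toy data).
# Predictor: 1 = couple became parents between waves, 0 = did not.
became_parent = [0, 0, 0, 0, 1, 1, 1, 1]
share_change = [0.01, -0.02, 0.00, 0.01, 0.08, 0.10, 0.07, 0.11]
intercept, slope = ols_slope_intercept(became_parent, share_change)
# A positive slope means the division of housework became more
# traditional for couples who made the transition to parenthood.
```

With a binary predictor the OLS slope is simply the difference in group means, which is why this design reads directly as "the parenthood effect" once controls are added.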

Relevance:

20.00%

Publisher:

Abstract:

This paper examines empirically the effect that firm reputation has on the determinants of debt maturity. Utilising data from the European primary bond market between 1999 and 2005, I find that the maturity choice of issuers with a higher reputation is less sensitive to macroeconomic conditions, market credit-risk premiums, prevailing firm credit quality and the size of the debt issue. Annualised coupon payments are shown to be a significant factor in determining debt maturity and, once controlled for, reveal a monotonically increasing relationship between credit quality and debt maturity. Finally, I show that issuers lacking a credit rating have an implied credit quality positioned between investment-grade and speculative-grade debt.