573 results for yleinen hyvä


Abstract:

Airway inflammation is a key feature of bronchial asthma, and according to international guidelines, anti-inflammatory treatment is the gold standard of asthma management. At present, only conventional measures (i.e., symptoms, use of rescue medication, PEF variability, and lung function tests) are used both to diagnose asthma and to evaluate the response to anti-inflammatory drugs, so new methods for assessing the degree of airway inflammation are needed. Nitric oxide (NO) is a gas produced in the airways of healthy subjects and, in increased amounts, in asthmatic airways, and it can be measured from exhaled air. Fractional exhaled NO (FENO) is increased in asthma, and the highest concentrations are measured in asthmatic patients not treated with inhaled corticosteroids (ICS). Steroid-treated patients with asthma had FENO levels similar to those of healthy controls. Atopic asthmatics had higher FENO levels than nonatopic asthmatics, indicating that the degree of atopy affects the FENO level. Associations between FENO and bronchial hyperresponsiveness (BHR) occur in asthma. The present study demonstrated that measurement of FENO has good reproducibility and that its short- and long-term variability is reasonable both in healthy subjects and in patients with respiratory symptoms or asthma. We determined the upper normal limit for healthy subjects, 12 ppb, calculated from two different healthy study populations. We showed that patients with respiratory symptoms who did not fulfil the diagnostic criteria of asthma had FENO values significantly higher than those of healthy subjects, but significantly lower than those of asthma patients. The findings further suggest that BHR to histamine is a sensitive indicator of the effect of ICS and a valuable tool for adjusting corticosteroid treatment in mild asthma, and that intermittent treatment periods of a few weeks' duration are insufficient to provide long-term control of BHR in patients with mild persistent asthma. Moreover, during treatment with ICS, changes in BHR and changes in FENO were associated. The FENO level was associated with BHR measured by a direct (histamine challenge) or an indirect method (exercise challenge) in steroid-naïve, symptomatic, non-smoking asthmatics. Although these associations were found only in atopic patients, the FENO level was also increased in nonatopic asthma. It can thus be concluded that assessment of airway inflammation by measuring FENO can be useful for clinical purposes, and the methodology of FENO measurement is now validated. Especially in patients with respiratory symptoms who do not fulfil the diagnostic criteria of asthma, FENO measurement can aid in treatment decisions, and serial measurement of FENO during treatment with ICS can be a complementary or an alternative method of evaluation in patients with asthma.

Abstract:

This thesis is a study of the x-ray scattering properties of tissues and tumours of the breast. Clinical radiography is based on the absorption of x-rays passing through the human body and gives information about the densities of the tissues. Besides being absorbed, x-rays may change their direction within the tissues due to elastic scattering or refraction. Scattering is a nuisance in radiography in general, and in mammography in particular, because it reduces the quality of the images. However, scattered x-rays carry very useful information about the structure of the tissues at the supramolecular level. Some pathologies, such as breast cancer, alter the structure of the tissues, which is especially evident in collagen-rich tissues. On the other hand, the change of direction due to refraction of the x-rays at tissue boundaries can be mapped. The diffraction enhanced imaging (DEI) technique uses a perfect crystal to convert the angular deviations of the x-rays into intensity variations, which can be recorded as images. This technique is of special interest in cases where the densities of the tissues are very similar (as in mammography) and absorption images do not offer enough contrast. This thesis explores the structural differences between healthy and pathological collagen in breast tissue samples by the small-angle x-ray scattering (SAXS) technique and compares these differences with the morphological information found in the DEI images and the histopathology of the same samples. Several breast tissue samples were studied with the SAXS technique at the European Synchrotron Radiation Facility (ESRF) in Grenoble, France. Scattering patterns of the different tissues of the breast were acquired and compared with the histology of the samples. The scattering signals from adipose tissue (fat), connective tissue (collagen) and necrotic tissue were identified. Moreover, a clear distinction could be made between the scattering signals from healthy collagen and from collagen of an invasive tumour. Scattering from collagen is very characteristic: it includes several scattering peaks and features that carry information about the size and spacing of the collagen fibrils in the tissues. It was found that the collagen fibrils in invaded tissue were thinner and had a d-spacing 0.7% longer than fibrils from healthy tissue. The scattering signals from the breast tissues were compared with the histology by building colour-coded maps across the samples, which were also imaged with the DEI technique. There was complete agreement between the scattering maps, the morphological features seen in the images and the histopathological examination. The thesis demonstrates that the x-ray scattering signal can be used to characterize tissues and that it carries important information about the pathological state of breast tissues, thus showing the potential of the SAXS technique as a possible diagnostic tool for breast cancer.
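The axial d-spacing mentioned above is, in SAXS work of this kind, typically obtained from the position of one of the axial collagen reflections via d = 2πn/q_n, where q_n is the position of the n-th order peak. The sketch below illustrates that relation and the quoted 0.7% difference; the peak order, the peak positions and the nominal ~65 nm collagen period are illustrative values, not numbers taken from the thesis.

```python
import numpy as np

def d_spacing(q_peak_nm_inv: float, order: int) -> float:
    """Axial period d (nm) from the position q_n (nm^-1) of the n-th order SAXS peak."""
    return 2.0 * np.pi * order / q_peak_nm_inv

# Illustrative numbers only: a 5th-order axial collagen reflection near the
# nominal ~65 nm period, and a tumour peak shifted to slightly lower q.
n = 5
q_healthy = 2.0 * np.pi * n / 65.0            # peak position for a 65.0 nm period
q_tumour = 2.0 * np.pi * n / (65.0 * 1.007)   # same order, d-spacing 0.7% longer

d_healthy = d_spacing(q_healthy, n)
d_tumour = d_spacing(q_tumour, n)
print(f"d(healthy) = {d_healthy:.2f} nm, d(tumour) = {d_tumour:.2f} nm")
print(f"relative difference = {100 * (d_tumour / d_healthy - 1):.2f} %")  # ~0.7 %
```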

Abstract:

Differentiation of various types of soft tissue is of high importance in medical imaging, because changes in soft tissue structure are often associated with pathologies such as cancer. However, the densities of different soft tissues may be very similar, making them difficult to distinguish in absorption images. This is especially true when the consideration of patient dose limits the available signal-to-noise ratio. Refraction is more sensitive than absorption to changes in density, and small-angle x-ray scattering, on the other hand, contains information about the macromolecular structure of the tissues. Both can be used as potential sources of contrast when soft tissues are imaged, but little is known about the visibility of these signals in realistic imaging situations. In this work, the visibility of small-angle scattering and refraction in the context of medical imaging has been studied using computational methods. The work focuses on analyzer-based imaging, where the information about the sample is recorded in the rocking curve of the analyzer crystal. Computational phantoms based on simple geometrical shapes with differing material properties are used. The objects have realistic dimensions and attenuation properties that could be encountered in real imaging situations, and their scattering properties mimic various features of measured small-angle scattering curves. Ray-tracing methods are used to calculate the refraction and attenuation of the beam, and a scattering halo is accumulated, including the effect of multiple scattering. The changes in the shape of the rocking curve are analyzed with different methods, including diffraction enhanced imaging (DEI), extended DEI (E-DEI) and multiple image radiography (MIR). A wide-angle DEI, called W-DEI, is introduced and its performance is compared with that of the established methods. The results indicate that the differences between the scattered intensities from healthy and malignant breast tissues are distinguishable to some extent with a reasonable dose. In particular, the fraction of total scattering shows large enough differences to serve as a useful source of contrast. The peaks related to the macromolecular structure appear at rather large angles and have intensities that are only a small fraction of the total scattered intensity; such peaks therefore seem to have only limited usefulness in medical imaging. It is also found that W-DEI performs rather well when most of the intensity remains in the direct beam, indicating that dark-field imaging methods may produce the best results when scattering is weak. Altogether, the analysis of scattered intensity is found to be a viable option even in medical imaging, where the patient dose is the limiting factor.
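As a rough illustration of the rocking-curve analysis discussed above, multiple image radiography is commonly formulated in terms of the low-order moments of the measured angular intensity profile: the area gives the apparent absorption, the centroid shift gives the refraction angle, and the growth of the angular variance relative to the intrinsic rocking curve measures ultra-small-angle scattering. The sketch below implements that moment analysis for synthetic Gaussian curves; the Gaussian shapes and all numerical values are illustrative and are not taken from the thesis.

```python
import numpy as np

theta = np.linspace(-20e-6, 20e-6, 4001)      # analyzer angle (rad)

def gaussian(theta, area, centre, sigma):
    return area * np.exp(-0.5 * ((theta - centre) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def moments(theta, curve):
    """Zeroth moment (area), centroid and variance of an angular intensity profile."""
    dtheta = theta[1] - theta[0]
    area = curve.sum() * dtheta
    centroid = (theta * curve).sum() * dtheta / area
    variance = ((theta - centroid) ** 2 * curve).sum() * dtheta / area
    return area, centroid, variance

# Intrinsic rocking curve (no sample) and the curve measured behind a sample that
# attenuates, refracts and broadens the beam; all parameters are illustrative.
intrinsic = gaussian(theta, area=1.0, centre=0.0, sigma=2.0e-6)
measured = gaussian(theta, area=0.6, centre=0.5e-6, sigma=2.5e-6)

a0, c0, v0 = moments(theta, intrinsic)
a1, c1, v1 = moments(theta, measured)

print(f"apparent absorption  -ln(area ratio) = {-np.log(a1 / a0):.3f}")
print(f"refraction angle (centroid shift)    = {c1 - c0:.2e} rad")
print(f"scattering width sqrt(var increase)  = {np.sqrt(v1 - v0):.2e} rad")
```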

Abstract:

Solar ultraviolet (UV) radiation has a broad range of effects on life on Earth. Soon after the mid-1980s, it was recognized that the stratospheric ozone content was declining over large areas of the globe. Because the stratospheric ozone layer protects life on Earth from harmful UV radiation, this led to concern about possible changes in UV radiation due to anthropogenic activity. Prompted by this concern, many stations for monitoring surface UV radiation were founded in the late 1980s and early 1990s. As a consequence, there is an apparent lack of information on UV radiation further in the past: measurements cannot tell us how UV radiation levels have changed on time scales of, for instance, several decades. The aim of this thesis was to improve our understanding of past variations in surface UV radiation by developing techniques for UV reconstruction. Such techniques utilize commonly available meteorological data together with measurements of the total ozone column to reconstruct, or estimate, the amount of UV radiation that reached the Earth's surface in the past. Two different techniques for UV reconstruction were developed. Both are based on first calculating the clear-sky UV radiation with a radiative transfer model. The clear-sky value is then corrected for the effect of clouds based on either (i) sunshine duration or (ii) pyranometer measurements. Both techniques also account for the variations in surface albedo caused by snow, whereas aerosols are included as a typical climatological aerosol load. Using these methods, long time series of reconstructed UV radiation were produced for five European locations: Sodankylä and Jokioinen in Finland, Bergen in Norway, Norrköping in Sweden, and Davos in Switzerland. Both UV reconstruction techniques developed in this thesis account for the greater part of the factors affecting the amount of UV radiation reaching the Earth's surface. They are therefore considered reliable, as is also suggested by their good performance. The pyranometer-based method performs better than the sunshine-based method, especially for daily values. For monthly values, the difference between the methods is smaller, indicating that the sunshine-based method is roughly as good as the pyranometer-based method for assessing long-term changes in surface UV radiation. The time series of reconstructed UV radiation produced in this thesis provide new insight into the past UV radiation climate and how UV radiation has varied over the years. The sunshine-based UV time series in particular, extending back to 1926 at Davos and to 1950 at Sodankylä, also put into perspective the recent changes driven by the ozone decline observed over the last few decades. At Davos, the reconstructed UV over the period 1926-2003 shows considerable variation throughout the entire period, with high values in the mid-1940s, the early 1960s, and the 1990s. Moreover, the variations prior to 1980 were found to be caused primarily by variations in cloudiness, while the increase of 4.5%/decade over the period 1979-1999 was driven by both the decline in the total ozone column and changes in cloudiness. Of the other stations included in this work, both Sodankylä and Norrköping show a clear increase in UV radiation since the early 1980s (3-4%/decade), driven primarily by changes in cloudiness and, to a lesser extent, by the decline in total ozone. At Jokioinen, a weak increase was found, while at Bergen there was no considerable overall change in the UV radiation level.
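The reconstruction scheme summarized above can be sketched as follows: a radiative transfer model gives the clear-sky UV dose for the given ozone column and surface albedo, and this value is scaled by a cloud modification factor derived either from pyranometer data (the ratio of measured to clear-sky global radiation) or from the relative sunshine duration. The sketch below shows that overall structure only; the simple clear-sky parameterisation, the power-law link between the broadband and UV cloud modification factors, and all numerical constants are illustrative assumptions, not the relations actually used in the thesis.

```python
import numpy as np

def clear_sky_uv(ozone_du: float, albedo: float, uv_ref: float = 3.0) -> float:
    """Very crude stand-in for a radiative transfer model: daily clear-sky UV dose
    (kJ/m^2) scaled with the ozone column (radiation amplification factor ~1.1)
    and enhanced over snow. Purely illustrative."""
    raf = 1.1
    ozone_term = (300.0 / ozone_du) ** raf
    albedo_term = 1.0 + 0.5 * albedo             # snow enhancement, illustrative
    return uv_ref * ozone_term * albedo_term

def cmf_from_pyranometer(glob_measured: float, glob_clear: float, b: float = 1.2) -> float:
    """UV cloud modification factor from the broadband one via an assumed power law."""
    return (glob_measured / glob_clear) ** b

def cmf_from_sunshine(sunshine_fraction: float) -> float:
    """UV cloud modification factor from relative sunshine duration (illustrative linear form)."""
    return 0.3 + 0.7 * sunshine_fraction

# Example day: 320 DU ozone, snow-free surface, half-overcast conditions.
uv_clear = clear_sky_uv(ozone_du=320.0, albedo=0.05)
uv_pyr = uv_clear * cmf_from_pyranometer(glob_measured=12.0, glob_clear=20.0)
uv_sun = uv_clear * cmf_from_sunshine(sunshine_fraction=0.5)
print(f"clear-sky UV: {uv_clear:.2f}  pyranometer-based: {uv_pyr:.2f}  sunshine-based: {uv_sun:.2f} kJ/m^2")
```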

Abstract:

The cosmological observations of light from type Ia supernovae, the cosmic microwave background and the galaxy distribution seem to indicate that the expansion of the universe has accelerated during the latter half of its age. Within standard cosmology, this is ascribed to dark energy, a uniform fluid with large negative pressure that gives rise to repulsive gravity but also entails serious theoretical problems. Understanding the physical origin of the perceived accelerated expansion has been described as one of the greatest challenges in theoretical physics today. In this thesis, we discuss the possibility that, instead of dark energy, the acceleration is caused by an effect of nonlinear structure formation on light, ignored in standard cosmology. A physical interpretation of the effect goes as follows: as the initially smooth matter clusters with time into filaments of opaque galaxies, the regions through which the detectable light travels become emptier and emptier relative to the average. Because the emptier a developing void becomes, the faster it expands, the expansion can then accelerate along our line of sight without any local acceleration, potentially obviating the need for the mysterious dark energy. In addition to offering a natural physical interpretation of the acceleration, we have shown that an inhomogeneous model is able to match the main cosmological observations without dark energy, resulting in a concordant picture of the universe with 90% dark matter, 10% baryonic matter and an age of 15 billion years. The model also provides an appealing solution to the coincidence problem: if induced by the voids, the onset of the perceived acceleration naturally coincides with the formation of the voids. Future work includes quantitative predictions for angular deviations and a theoretical derivation of the model to reduce the required phenomenology. A spin-off of the research is a physical classification of cosmic inhomogeneities according to how they could induce accelerated expansion along our line of sight. We have identified three physically distinct mechanisms: global acceleration due to spatial variations in the expansion rate, a faster local expansion rate due to a large local void, and biased light propagation through voids that expand faster than the average. A general conclusion is that the physical properties crucial for accounting for the perceived acceleration are the growth of the inhomogeneities and the inhomogeneities in the expansion rate. The existence of these properties in the real universe is supported by both observational data and theoretical calculations. However, better data and more sophisticated theoretical models are required to vindicate or disprove the conjecture that the inhomogeneities are responsible for the acceleration.

Abstract:

The superconducting (or cryogenic) gravimeter (SG) is based on the levitation of a superconducting sphere in a stable magnetic field created by currents in superconducting coils. Depending on frequency, it is capable of detecting gravity variations as small as 10⁻¹¹ m/s². For a single event, the detection threshold is higher, conservatively about 10⁻⁹ m/s². Due to its high sensitivity and low drift rate, the SG is eminently suitable for the study of geodynamical phenomena through their gravity signatures. I present investigations of Earth dynamics with the superconducting gravimeter GWR T020 at Metsähovi from 1994 to 2005. The history and key technical details of the installation are given, and the data processing methods and the development of the local tidal model at Metsähovi are presented. T020 is part of the worldwide GGP (Global Geodynamics Project) network, which consists of some 20 operating stations, and its data, like those of the other participating SGs, are available to the scientific community. The SG T020 has been used as a long-period seismometer to study microseismicity and the Earth's free oscillations. The annual variation, spectral distribution, amplitude and sources of microseisms at Metsähovi are presented. Free oscillations excited by three large earthquakes were analyzed: the spectra, attenuation and rotational splitting of the modes. The lowest modes of all the different oscillation types are studied, i.e. the radial mode 0S0, the "football mode" 0S2, and the toroidal mode 0T2. The very low level (0.01 nm/s) incessant excitation of the Earth's free oscillations was also detected with T020. The recovery of global and regional variations in gravity with the SG requires the modelling of local gravity effects, the most important of which is hydrology. The variation in the groundwater level at Metsähovi, as measured in a borehole in the fractured bedrock, correlates significantly (0.79) with gravity, and the influence of local precipitation, soil moisture and snow cover is detectable in the gravity record. The gravity effect of the variation in atmospheric mass and that of the non-tidal loading by the Baltic Sea were investigated together, as sea level and air pressure are correlated. Using Green's functions it was calculated that a 1-metre uniform layer of water in the Baltic Sea increases the gravity at Metsähovi by 31 nm/s², with an associated vertical deformation of -11 mm. The observed regression coefficient for sea level is 27 nm/s² per metre, which is 87% of the uniform-layer model value. These studies were combined with the temporal height variations obtained from the GPS data of the Metsähovi permanent station. The long time series at Metsähovi demonstrate the high quality of the data and of the offset and drift corrections. The superconducting gravimeter T020 has proved to be an excellent and versatile tool for studies of Earth dynamics.
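The 87% figure above is simply the ratio of the observed sea-level admittance to the modelled one for a uniform 1 m layer (27/31 ≈ 0.87). The sketch below shows the kind of least-squares regression that yields such an admittance; the synthetic gravity and sea-level series are invented for illustration, and the modelled admittance of 31 nm/s² per metre is the only number taken from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily series: Baltic sea level (m) and a gravity residual (nm/s^2)
# responding to it with an admittance of ~27 nm/s^2 per metre plus noise.
sea_level = 0.3 * np.sin(np.linspace(0, 20, 1000)) + 0.1 * rng.standard_normal(1000)
gravity = 27.0 * sea_level + 5.0 * rng.standard_normal(1000)

# Least-squares regression coefficient (admittance) of gravity on sea level.
admittance, offset = np.polyfit(sea_level, gravity, 1)

modelled_uniform_layer = 31.0   # nm/s^2 for a 1 m uniform layer (from the abstract)
print(f"regression admittance: {admittance:.1f} nm/s^2 per m")
print(f"fraction of uniform-layer model: {100 * admittance / modelled_uniform_layer:.0f} %")
```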

Abstract:

Einstein's general relativity is a classical theory of gravitation: it is a postulate on the coupling between the four-dimensional, continuous spacetime and the matter fields in the universe, and it yields their dynamical evolution. It is believed that general relativity must be replaced by a quantum theory of gravity, at least at the extremely high energies of the early universe and in regions of strong spacetime curvature such as black holes. Various attempts to quantize gravity, including conceptually new models such as string theory, have suggested that modifications to general relativity might show up even at lower energy scales. On the other hand, the late-time acceleration of the expansion of the universe, known as the dark energy problem, might also originate from new gravitational physics. Thus, although there has been no direct experimental evidence contradicting general relativity so far - on the contrary, it has passed a variety of observational tests - it is a question worth asking why the effective theory of gravity should be of the exact form of general relativity. If general relativity is modified, how do the predictions of the theory change? Furthermore, how far can we go with the changes before we are faced with contradictions with experiment? Along with the changes, could there be new phenomena which we could measure to find hints of the form of the quantum theory of gravity? This thesis is on a class of modified gravity theories called f(R) models, and in particular on the effects of changing the theory of gravity on stellar solutions. It is discussed how experimental constraints from measurements in the Solar System restrict the form of f(R) theories. Moreover, it is shown that models which do not differ from general relativity at the weak-field scale of the Solar System can produce very different predictions for dense stars such as neutron stars. Due to the nature of f(R) models, the role of the independent connection of spacetime is emphasized throughout the thesis.
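For readers unfamiliar with the class of theories mentioned above, the gravitational action of f(R) gravity is conventionally written as below; setting f(R) = R recovers general relativity. In the Palatini-type formulation alluded to by the reference to an independent connection, the Ricci scalar is built from a connection varied independently of the metric. This is the standard textbook form, not necessarily the exact conventions used in the thesis.

```latex
% Metric f(R) action; f(R) = R gives the Einstein-Hilbert action of general relativity.
S \;=\; \frac{1}{2\kappa}\int \mathrm{d}^{4}x\,\sqrt{-g}\, f(R) \;+\; S_{\mathrm{m}}\!\left(g_{\mu\nu},\,\psi\right),
\qquad \kappa = 8\pi G .

% Palatini-type variant: the Ricci scalar is built from an independent connection \hat{\Gamma}.
R \;=\; g^{\mu\nu} R_{\mu\nu}\!\bigl(\hat{\Gamma}\bigr).
```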

Abstract:

The structure and mechanical properties of the wood of Norway spruce (Picea abies [L.] Karst.) were studied using small samples from Finland and Sweden. X-ray diffraction (XRD) was used to determine the orientation of the cellulose microfibrils (the microfibril angle, MFA), the dimensions of the cellulose crystallites and the average shape of the cell cross-section. X-ray attenuation and x-ray fluorescence measurements were used to study the chemical composition and the trace element content. Tensile testing with in situ XRD was used to characterise the mechanical properties of wood and the deformation of crystalline cellulose within the wood cell walls. Cellulose crystallites were found to be 192-284 Å long and 28.9-33.4 Å wide in chemically untreated wood, and they were longer and wider in mature wood than in juvenile wood. The MFA distribution of individual Norway spruce tracheids and of larger samples was asymmetric. In individual cell walls, the mean MFA was 19-30 degrees, while the mode of the MFA distribution was 7-21 degrees. Both the mean MFA and the mode of the MFA distribution decreased as a function of the annual ring. Tangential cell walls exhibited a smaller mean MFA and mode of the MFA distribution than radial cell walls. Maceration of the wood material narrowed the MFA distribution and removed contributions observed at around 90 degrees. In the wood of both untreated and fertilised trees, the average shape of the cell cross-section changed from circular, via an ambiguous shape, to rectangular as the cambial age increased. The average shape of the cell cross-section and the MFA distribution did not change as a result of fertilisation. The mass absorption coefficient for x-rays was higher in the wood of fertilised trees than in that of untreated trees, and the wood of fertilised trees contained more of the elements S, Cl, and K but a smaller amount of Mn. Cellulose crystallites were longer in the wood of fertilised trees than in that of untreated trees. Kraft cooking caused widening and shortening of the cellulose crystallites. Tensile tests parallel to the cells showed that if the mean MFA is initially around 10 degrees or smaller, no systematic changes occur in the MFA distribution due to strain. The role of the mean MFA in determining the tensile strength or the modulus of elasticity of wood was not as dominant as reported earlier. Crystalline cellulose elongated much less than the entire samples. The Poisson ratio νca of crystalline cellulose in Norway spruce wood was shown to depend largely on the surroundings of the crystalline cellulose in the cell wall, varying between -1.2 and 0.8. The Poisson ratio was negative in kraft-cooked wood and positive in chemically untreated wood. In chemically untreated wood, νca was larger in mature wood and in latewood than in juvenile wood and earlywood.
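The crystallite dimensions quoted above are, in XRD studies of cellulose, typically estimated from the widths of the diffraction peaks with the Scherrer equation, t = Kλ/(β cos θ), where β is the peak width (FWHM, in radians) and K ≈ 0.9 a shape factor. The sketch below shows that calculation; the wavelength, peak position and width are illustrative values, not data from the thesis.

```python
import numpy as np

def scherrer_size(fwhm_deg: float, two_theta_deg: float,
                  wavelength_A: float = 1.5406, K: float = 0.9) -> float:
    """Apparent crystallite dimension (Å) from the FWHM of a diffraction peak
    via the Scherrer equation t = K * lambda / (beta * cos(theta))."""
    beta = np.deg2rad(fwhm_deg)             # peak width in radians
    theta = np.deg2rad(two_theta_deg / 2)   # Bragg angle
    return K * wavelength_A / (beta * np.cos(theta))

# Illustrative example: a cellulose (200)-type reflection near 2theta = 22.5 deg
# with a 2.6 deg FWHM, measured with Cu K-alpha radiation.
size = scherrer_size(fwhm_deg=2.6, two_theta_deg=22.5)
print(f"apparent crystallite width: {size:.1f} Å")   # of the order of 30 Å
```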

Abstract:

Human-induced climate change has raised the need to predict the future climate and its feedbacks with vegetation. These are studied with global climate models, and to ensure the reliability of the predictions, it is important to have a biosphere description based on the latest scientific knowledge. This work concentrates on modelling the CO2 exchange of the boreal coniferous forest, also studying the factors that control its growing season and how these can be used in modelling. In addition, the modelling of CO2 gas exchange at several scales was studied. A canopy-level CO2 gas exchange model was developed based on the biochemical photosynthesis model. This model was first parameterized using CO2 exchange data obtained by eddy covariance (EC) measurements from a Scots pine forest at Sodankylä. The results were compared with a semi-empirical model that was also parameterized using EC measurements; both models gave satisfactory results. The biochemical canopy-level model was further parameterized at three other coniferous forest sites in Finland and Sweden. At all sites, the two most important biochemical model parameters showed seasonal behaviour, i.e. their temperature responses changed according to the season, and the modelling results improved when these changeover dates were related to temperature indices. During summer the values of the biochemical model parameters were similar at all four sites. Different factors controlling CO2 gas exchange were studied at the four coniferous forests, including how well these factors can be used to predict the initiation and cessation of CO2 uptake. Temperature indices, atmospheric CO2 concentration, surface albedo and chlorophyll fluorescence (CF) were all found to be useful and to have predictive power. In addition, a detailed simulation study of leaf stomata was performed in order to separate physical and biochemical processes; it brought to light the relative contribution and importance of the physical transport processes. The results of this work can be used to improve CO2 gas exchange models for boreal coniferous forests. The meteorological and biological variables that represent the seasonal cycle were studied, and a method for incorporating this cycle into a biochemical canopy-level model was introduced.
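The abstract refers to "the biochemical photosynthesis model" without naming it; the Farquhar-type formulation sketched below, in which leaf assimilation is the minimum of a Rubisco-limited and an electron-transport-limited rate, is the usual choice for such canopy models. The functional form is standard, but all parameter values here are illustrative and are not taken from the thesis.

```python
def farquhar_assimilation(ci, vcmax=50.0, j=100.0, gamma_star=40.0,
                          kc=400.0, ko=250.0, o=210.0, rd=1.0):
    """Leaf net assimilation (umol m-2 s-1) from a Farquhar-type model.

    ci: intercellular CO2 (umol/mol); vcmax, j: Rubisco and electron-transport
    capacities; gamma_star: CO2 compensation point; kc, ko: Michaelis constants
    (umol/mol and mmol/mol); o: O2 (mmol/mol); rd: dark respiration.
    All parameter values are illustrative."""
    a_c = vcmax * (ci - gamma_star) / (ci + kc * (1.0 + o / ko))  # Rubisco-limited rate
    a_j = j * (ci - gamma_star) / (4.0 * ci + 8.0 * gamma_star)   # RuBP-regeneration-limited rate
    return min(a_c, a_j) - rd

for ci in (200.0, 280.0, 400.0):
    print(f"Ci = {ci:5.0f} umol/mol -> A = {farquhar_assimilation(ci):5.2f} umol m-2 s-1")
```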

Abstract:

For efficient fusion energy production, the plasma-facing wall materials of a fusion reactor should allow long-term operation. In the next-step fusion device, ITER, the first-wall region facing the highest heat and particle loads, i.e. the divertor area, will mainly consist of tungsten-based tiles. During reactor operation, the tungsten material is slowly but inevitably saturated with tritium, the relatively short-lived hydrogen isotope used in the fusion reaction. The amount of tritium retained in the wall materials should be minimized and its recycling back to the plasma must be unrestrained, since otherwise it cannot be used for fuelling the plasma. A very expensive and thus economically unviable solution is to replace the first wall frequently; a better solution is to heat the walls to temperatures at which tritium is released. Unfortunately, the exact mechanisms of hydrogen release from tungsten are not known. In this thesis both experimental and computational methods have been used to study the release and retention of hydrogen in tungsten. The experimental work consists of hydrogen implantations into pure polycrystalline tungsten, determination of the hydrogen concentrations using ion beam analyses (IBA), and monitoring of the out-diffusing hydrogen gas with thermal desorption spectrometry (TDS) as the tungsten samples are heated to elevated temperatures. By combining IBA methods with TDS, both the retained amount of hydrogen and the temperatures needed for hydrogen release are obtained. With computational methods, the hydrogen-defect interactions and the implantation-induced irradiation damage can be examined at the atomic level. Multiscale modelling combines results obtained from computational methodologies applicable at different length and time scales. Density functional theory calculations were used to determine the energetics of the elementary processes of hydrogen in tungsten, such as diffusion and trapping at vacancies and surfaces. Results for the energetics of pure tungsten defects were used in the development of a classical bond-order potential describing tungsten defects, for use in molecular dynamics simulations. The developed potential was utilized to determine defect clustering and annihilation properties. These results were further employed in binary collision and rate theory calculations to determine the evolution of the large defect clusters that trap hydrogen in the course of implantation. The computational results for the defect and trapped-hydrogen concentrations were successfully compared with the experimental results. With this multiscale analysis, the experimental results of this thesis and those found in the literature were explained both quantitatively and qualitatively.
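In TDS analyses of this kind, the release temperature of a trap is commonly related to its detrapping energy through a first-order Polanyi-Wigner rate applied during a linear temperature ramp; the peak of the desorption rate then shifts with the trap energy. The sketch below integrates that rate equation; the attempt frequency, ramp rate and trap energies are illustrative values and are not those determined in the thesis.

```python
import numpy as np

K_B = 8.617e-5            # Boltzmann constant (eV/K)

def tds_spectrum(e_trap_eV, nu=1e13, beta=1.0, t_start=300.0, t_end=1500.0, dt=0.01):
    """Desorption rate vs. temperature for first-order detrapping,
    dN/dt = -nu * N * exp(-E/kT), during a linear ramp T = T0 + beta*t."""
    n = 1.0                                     # normalised trapped inventory
    temps, rates = [], []
    T = t_start
    while T < t_end and n > 1e-6:
        rate = nu * n * np.exp(-e_trap_eV / (K_B * T))
        n -= rate * dt
        temps.append(T)
        rates.append(rate)
        T += beta * dt                          # 1 K/s ramp
    return np.array(temps), np.array(rates)

for e in (0.9, 1.2, 1.5):                       # illustrative detrapping energies (eV)
    T, r = tds_spectrum(e)
    print(f"E_trap = {e:.1f} eV -> desorption peak near {T[np.argmax(r)]:.0f} K")
```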

Abstract:

Time-dependent backgrounds in string theory provide a natural testing ground for physics concerning dynamical phenomena that cannot be reliably addressed in the usual quantum field theories and cosmology. A good, tractable example is the rolling tachyon background, which describes the decay of an unstable brane in bosonic and supersymmetric Type II string theories. In this thesis I use boundary conformal field theory, along with random matrix theory and Coulomb gas thermodynamics techniques, to study open and closed string scattering amplitudes off the decaying brane. The calculation of the simplest example, the tree-level amplitude of n open strings, would give the emission rate of open strings; however, even this has remained unknown. I organize the open string scattering computations in a more coherent manner and argue how further progress can be made.

Abstract:

The first observations of solar X-rays date back to the late 1940s. In order to observe solar X-rays, the instruments have to be lifted above the Earth's atmosphere, since high-energy radiation from space is almost totally attenuated by it. This is a good thing for all living creatures, but bad for X-ray astronomers. Detectors observing X-ray emission from space must be placed on board satellites, which makes this particular discipline of astronomy technologically and operationally demanding, as well as very expensive. In this thesis, I have focused on detectors dedicated to observing solar X-rays in the energy range 1-20 keV. The purpose of these detectors was to measure solar X-rays simultaneously with another X-ray spectrometer measuring fluorescence X-ray emission from the lunar surface. The X-ray fluorescence emission is induced by the primary solar X-rays: if the elemental abundances on the Moon are to be determined with fluorescence analysis methods, the shape and intensity of the simultaneous solar X-ray spectrum must be known. The aim of this thesis is to describe the characterization and operation of our X-ray instruments on board two Moon missions, SMART-1 and Chandrayaan-1. The independent solar-science performance of these two nearly identical X-ray spectrometers is also described. The detectors have two features in common: firstly, the primary detection element is made of a single-crystal silicon diode; secondly, the field of view is circular and very large. The data obtained from these detectors are spectra with a 16-second time resolution. Before launching an instrument into space, its performance must be characterized by ground calibrations, and the basic operation of these detectors and their ground calibrations are described in detail. Two C-class flares are analyzed as examples to introduce the spectral fitting process. The first analysis shows the fit of a single spectrum of a C1 flare obtained during the peak phase; the second shows how to derive the time evolution of the fluxes, emission measures (EM) and temperatures through a whole C4 flare with a time resolution of 16 s. The preparatory data analysis procedures required for the spectral fitting are also introduced in detail. Finally, a new solar monitor design equipped with concentrator optics and a moderate field of view is introduced.
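To give an idea of the spectral fitting described above, an isothermal fit derives a single temperature and emission measure by matching a thermal model to the observed spectrum. The sketch below fits a schematic free-free (bremsstrahlung) continuum of the form A·exp(-E/kT)/(E·√kT), ignoring the Gaunt factor, line emission and the instrument response; real solar analyses use full plasma emission codes folded through the detector response, so this is only an illustration of the fitting step, not the procedure of the thesis.

```python
import numpy as np
from scipy.optimize import curve_fit

def isothermal_continuum(energy_keV, norm, kT_keV):
    """Schematic thermal bremsstrahlung photon spectrum (arbitrary units)."""
    return norm * np.exp(-energy_keV / kT_keV) / (energy_keV * np.sqrt(kT_keV))

# Synthetic "observed" spectrum: kT = 1.0 keV (~11.6 MK) plus 5% noise.
rng = np.random.default_rng(1)
energy = np.linspace(1.5, 10.0, 60)                      # keV
truth = isothermal_continuum(energy, norm=5e3, kT_keV=1.0)
observed = truth * (1.0 + 0.05 * rng.standard_normal(energy.size))

popt, pcov = curve_fit(isothermal_continuum, energy, observed, p0=(1e3, 2.0))
norm_fit, kT_fit = popt
print(f"fitted kT = {kT_fit:.2f} keV  (~{kT_fit / 0.08617:.1f} MK)")
```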

Abstract:

The purpose of this study is to analyse the education, employment, and work-life experiences of visually impaired persons in expert jobs. The empirical data consist of 30 thematic interviews (24 visually impaired persons, 1 family member of a visually impaired person, and 5 persons working with diversity issues), of supplementary articles, and of statistics on the socio-economic status of the visually impaired. The interviewees' experiences of education and employment have been analysed with a qualitative method, and the analysis has been deepened by reflecting it against the recent discussion on the concept of diversity. The author's methodological choice as a disability researcher has been to treat the interviewees as co-researchers rather than as objects of research. Accessibility in its different forms is a prerequisite of diversity in the workplace, and this study examines what kind of accessibility is required by visually impaired professionals. Access to working life depends on the attitudes, prejudices and expectations that society has towards a minority group. Social accessibility is connected with internal relationships in the workplace, and achieving it is a bilateral process. Information technology has revolutionised visually impaired people's possibilities of accessing information and performing expert tasks. An accessible environment, good mobility skills, and transportation services enable visually impaired employees to get to their workplaces and to navigate there with ease. Integration has raised the level of education and widened the selection of career options for the visually impaired. However, even visually impaired people with academic degrees often need employment support services, and visually impaired professionals are mainly employed in the public and third sectors. Achieving diversity in the labour market is a multi-actor process: social support services are needed, as well as courage and readiness from employers to hire people with disabilities. The organisations of the visually impaired play an important role in influencing attitudes and in providing peer support. Visually impaired employees need good professional skills, blindness skills, and social courage, and they need to be comfortable with their disability. In the workplace, diversity may manifest itself as diverse ways of working: the work is done by using technical aids or other means of compensating for the lack of eyesight. When an employee must find compensatory solutions for disability-related limitations at work, this also develops his or her problem-solving abilities. Key words: visually impaired, diversity, accessibility, working life

Abstract:

The goal of the study was to determine the kind of communication used in the meetings of the City Council. Councillors' addresses were studied by observing their communication styles, and a questionnaire was used to study councillors' perceptions of communication in the council meetings, i.e. the perceived communication climate. A second goal of the study was to develop a method for analysing communication style. The study was a longitudinal case study. Studies of speech communication at the individual level and of organisational communication at the community level served as the theoretical frame of reference; due to the chosen object of study, political research and the communication context also shaped the frame of reference. Verbal, nonverbal and paraverbal characteristics form the communication style, while the perceived communication climate is dynamic in nature and consists of several characteristics. The research material was gathered from the meetings of the Helsinki City Council held from 1993 to 1996. The communication style was analysed from the meeting addresses (N=1271) given by the permanent members of the council (N=95). The perceived communication climate was studied using a questionnaire modified from Wiio's OCD2 questionnaire. The questionnaire survey was carried out twice: the first time, 58 questionnaires (68%) were returned, and the second time, 49 questionnaires (58%). The method of analysis used for communication style was classification and cross-tabulation. Based on the subset of material for the first year, five communication style categories were defined. Communication style was examined using gender, length of council membership, and political party as subgroups. The results concerning the perceived communication climate were analysed as averages and percentages, and a comparison of the results from the two measuring times revealed how the perceived communication climate changed during the council term. The finding of the study was that the municipal politicians could be placed in all five communication style categories in each year of their four-year council term, and the sizes of the style categories varied only little from year to year. A relatively stable, rather than changing, communication style was characteristic of individuals during the council term. Gender did not explain an individual's communication style; on the other hand, the length of service in the council and, to some extent, the political party were connected with a certain communication style. Changes in style mainly manifested themselves as variation in communication activity, which was highest at the beginning and at the end of the council term. The method of researching communication style developed for this study proved to be functional, though labour-intensive. The perceived communication climate in the Helsinki City Council was good. Women's and men's satisfaction with the communication in the council differed; women's perception became considerably more positive towards the end of the council term. Satisfaction with the communication in the council was characteristic of the Coalition Party and Social Democrat councillors, and dissatisfaction was characteristic of the Green councillors. Long council experience increased satisfaction with communication in the council. Only for a few style categories was it possible to show any connection between communication style and the perceived communication climate. The study confirmed the perception that an individual's communication style in the same communication context is relatively stable also in the long term.

Abstract:

This thesis studies quantile residuals and uses different methodologies to develop test statistics that are applicable to evaluating linear and nonlinear time series models based on continuous distributions. Models based on mixtures of distributions are of special interest, because it turns out that for those models traditional residuals, often referred to as Pearson's residuals, are not appropriate. As such models have become more and more popular in practice, especially with financial time series data, there is a need for reliable diagnostic tools that can be used to evaluate them. The aim of the thesis is to show how such diagnostic tools can be obtained and used in model evaluation. The quantile residuals considered here are defined in such a way that, when the model is correctly specified and its parameters are consistently estimated, they are approximately independent with a standard normal distribution. All the tests derived in the thesis are pure significance-type tests and are theoretically sound in that they properly take into account the uncertainty caused by parameter estimation. In Chapter 2, a general framework based on the likelihood function and smooth functions of univariate quantile residuals is derived that can be used to obtain misspecification tests for various purposes. Three easy-to-use tests aimed at detecting non-normality, autocorrelation, and conditional heteroscedasticity in quantile residuals are formulated. It also turns out that these tests can be interpreted as Lagrange multiplier or score tests, so that they are asymptotically optimal against local alternatives. Chapter 3 extends the concept of quantile residuals to multivariate models. The framework of Chapter 2 is generalized, and tests aimed at detecting non-normality, serial correlation, and conditional heteroscedasticity in multivariate quantile residuals are derived based on it. Score test interpretations are obtained for the serial correlation and conditional heteroscedasticity tests and, in a rather restricted special case, for the normality test. In Chapter 4 the tests are constructed using the empirical distribution function of quantile residuals. The so-called Khmaladze martingale transformation is applied in order to eliminate the uncertainty caused by parameter estimation. Various test statistics are considered, so that critical bounds are obtained for histogram-type plots as well as for quantile-quantile and probability-probability plots of quantile residuals. Chapters 2, 3, and 4 contain simulations and empirical examples that illustrate the finite-sample size and power properties of the derived tests and also show how the tests and the related graphical tools based on residuals are applied in practice.
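As background for the discussion above, the (theoretical) quantile residual of an observation y_t is Φ⁻¹(F(y_t)), where F is the model-implied conditional distribution function and Φ⁻¹ the standard normal quantile function; under a correctly specified model with consistently estimated parameters these are approximately independent standard normal. The sketch below computes quantile residuals for data from a two-component Gaussian mixture, the kind of model for which Pearson residuals are uninformative; the mixture parameters are treated as known here, so the parameter-estimation uncertainty that the thesis's tests properly account for is ignored.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Data from a two-component Gaussian mixture (parameters treated as known).
weights, means, sds = np.array([0.7, 0.3]), np.array([0.0, 3.0]), np.array([1.0, 2.0])
component = rng.choice(2, size=2000, p=weights)
y = rng.normal(means[component], sds[component])

def mixture_cdf(x):
    """Model-implied distribution function F(x) of the two-component mixture."""
    return sum(w * stats.norm.cdf(x, m, s) for w, m, s in zip(weights, means, sds))

# Quantile residuals: r_t = Phi^{-1}(F(y_t)); ~ iid N(0,1) if the model is correct.
quantile_residuals = stats.norm.ppf(mixture_cdf(y))

print("mean :", round(quantile_residuals.mean(), 3))
print("std  :", round(quantile_residuals.std(ddof=1), 3))
print("normality p-value (D'Agostino):", round(stats.normaltest(quantile_residuals).pvalue, 3))
print("lag-1 autocorrelation:", round(np.corrcoef(quantile_residuals[:-1], quantile_residuals[1:])[0, 1], 3))
```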