960 results for Tyson, Ty


Relevance: 10.00%

Abstract:

The TOTEM experiment at the LHC will measure the total proton-proton cross-section with a precision better than 1%, elastic proton scattering over a wide range in momentum transfer -t = p^2 theta^2 up to 10 GeV^2, and diffractive dissociation, including single, double and central diffraction topologies. The total cross-section will be measured with the luminosity-independent method, which requires simultaneous measurement of the total inelastic rate and of elastic proton scattering down to four-momentum transfers of a few 10^-3 GeV^2, corresponding to leading protons scattered at angles of microradians from the interaction point. This will be achieved using silicon microstrip detectors, which offer attractive properties such as good spatial resolution (<20 um), fast response (O(10 ns)) and radiation hardness up to 10^14 n/cm^2. This work reports on the development of an innovative structure at the detector edge that reduces the conventional dead width of 0.5-1 mm to 50-60 um, compatible with the requirements of the experiment.
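For orientation, the luminosity-independent method combines the optical theorem with the measured rates; writing N_el and N_inel for the elastic and inelastic rates and rho for the ratio of the real to the imaginary part of the forward elastic amplitude, the standard relations are (a textbook sketch in natural units, not TOTEM-specific values):

    \[
      \sigma_{\mathrm{tot}} = \frac{16\pi}{1+\rho^{2}}\,
      \frac{\left.\mathrm{d}N_{\mathrm{el}}/\mathrm{d}t\right|_{t=0}}{N_{\mathrm{el}}+N_{\mathrm{inel}}},
      \qquad
      \mathcal{L}\,\sigma_{\mathrm{tot}} = N_{\mathrm{el}}+N_{\mathrm{inel}} .
    \]

Measuring the elastic and inelastic rates simultaneously thus yields both the total cross-section and the luminosity without an external luminosity calibration.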

Relevance: 10.00%

Abstract:

There is a growing need to understand the exchange processes of momentum, heat and mass between an urban surface and the atmosphere, as they affect our quality of life. Understanding the source/sink strengths as well as the mixing mechanisms of air pollutants is particularly important due to their effects on human health and climate. This work aims to improve our understanding of these surface-atmosphere interactions based on the analysis of measurements carried out in Helsinki, Finland. The vertical exchange of momentum, heat, carbon dioxide (CO2) and aerosol particle number was measured with the eddy covariance technique at the urban measurement station SMEAR III, where the concentrations of ultrafine, accumulation mode and coarse particle numbers, nitrogen oxides (NOx), carbon monoxide (CO), ozone (O3) and sulphur dioxide (SO2) were also measured. These measurements were carried out over varying measurement periods between 2004 and 2008. In addition, black carbon mass concentration was measured at the Helsinki Metropolitan Area Council site during three campaigns in 1996-2005. The analysed dataset thus constitutes by far the most comprehensive set of long-term turbulent flux measurements from urban areas reported in the literature. Moreover, simultaneously measured urban air pollutant concentrations and turbulent fluxes were examined for the first time. The complex measurement surroundings enabled us to study the effect of different urban covers on the exchange processes from a single measurement point. The sensible and latent heat fluxes closely followed the intensity of solar radiation, and the sensible heat flux always exceeded the latent heat flux due to anthropogenic heat emissions and the conversion of solar radiation to direct heat in urban structures. This urban heat island effect was most evident during winter nights. The effect of land use cover was seen as higher sensible heat fluxes in more built-up areas than in areas with high vegetation cover. Both aerosol particle and CO2 exchanges were largely affected by road traffic, and the highest diurnal fluxes in the direction of the road reached 10^9 m^-2 s^-1 and 20 µmol m^-2 s^-1, respectively. Local road traffic had the greatest effect on ultrafine particle concentrations, whereas meteorological variables were more important for accumulation mode and coarse particle concentrations. The surroundings of the SMEAR III station served as a source of both particles and CO2, except in summer, when daytime vegetation uptake of CO2 in the vegetation sector exceeded the anthropogenic sources and we observed a downward median flux of 8 µmol m^-2 s^-1. This work improved our understanding of the interactions between an urban surface and the atmosphere in a city located at high latitudes in a semi-continental climate. The results can be utilised in urban planning, as the fraction of vegetation cover and vehicular activity were found to be the major environmental drivers affecting most of the exchange processes. However, in order to understand these exchange and mixing processes on a city scale, more measurements above various urban surfaces, accompanied by numerical modelling, are required.
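To make the eddy covariance technique mentioned above concrete, here is a minimal sketch in Python; the names and the synthetic data are illustrative, and real processing involves further steps such as coordinate rotation, despiking and density corrections:

    import numpy as np

    def eddy_covariance_flux(w, c):
        # Reynolds decomposition: split each series into mean and fluctuation.
        w_prime = w - np.mean(w)  # vertical wind fluctuations (m s-1)
        c_prime = c - np.mean(c)  # scalar fluctuations, e.g. CO2 (umol m-3)
        # The turbulent flux is the covariance <w'c'> over the averaging period.
        return np.mean(w_prime * c_prime)

    # Example: 30 minutes of synthetic 10 Hz data.
    rng = np.random.default_rng(0)
    w = rng.normal(0.0, 0.3, 18000)
    c = 400.0 + 0.5 * w + rng.normal(0.0, 0.2, 18000)
    print(eddy_covariance_flux(w, c))  # positive value = upward (emission) flux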

Relevance: 10.00%

Abstract:

The Earth's ecosystems are protected from the dangerous part of solar ultraviolet (UV) radiation by stratospheric ozone, which absorbs most of the harmful UV wavelengths. Severe depletion of stratospheric ozone has been observed in the Antarctic region, and to a lesser extent in the Arctic and at midlatitudes. Concern about the effects of increasing UV radiation on human beings and the natural environment has led to ground-based monitoring of UV radiation. In order to achieve high-quality UV time series for scientific analyses, proper quality control (QC) and quality assurance (QA) procedures have to be followed. In this work, QC and QA practices are developed for Brewer spectroradiometers and NILU-UV multifilter radiometers, which measure in the Arctic and Antarctic regions, respectively. These practices are applicable to other UV instruments as well. The spectral features of, and the effect of different factors on, UV radiation were studied for the spectral UV time series at Sodankylä. The QA of the Finnish Meteorological Institute's (FMI) two Brewer spectroradiometers included daily maintenance, laboratory characterizations, the calculation of long-term spectral responsivity, data processing and quality assessment. New methods for the cosine correction, the temperature correction and the calculation of long-term changes in spectral responsivity were developed. Reconstructed UV irradiances were used as a QA tool for the spectroradiometer data. The actual cosine correction factors of the two Brewers were found to vary within 1.08-1.12 and 1.08-1.13. The temperature characterization showed a linear dependence between the instrument's internal temperature and the photon counts per cycle. Both Brewers have participated in international spectroradiometer comparisons and have shown good stability. The differences between the Brewers and the portable reference spectroradiometer QASUME remained within 5% during 2002-2010. The features of the spectral UV radiation time series at Sodankylä were analysed for the period 1990-2001. No statistically significant long-term changes in UV irradiance were found, and the results depended strongly on the time period studied. Ozone was the dominant factor affecting UV radiation during springtime, whereas clouds played a more important role during summertime. During this work, the Antarctic NILU-UV multifilter radiometer network was established by the Instituto Nacional de Meteorología (INM) as a joint Spanish-Argentinian-Finnish cooperation project. As part of this work, the QC/QA practices of the network were developed. They included training of the operators, daily maintenance, regular lamp tests and solar comparisons with the travelling reference instrument. Drifts of up to 35% in the sensitivity of the channels of the NILU-UV multifilter radiometers were found during the first four years of operation. This emphasized the importance of proper QC/QA, including regular lamp tests, also for multifilter radiometers. The effects of the drifts were corrected by scaling the channels of the site NILU-UVs to those of the travelling reference NILU-UV. After correction, the mean ratios of erythemally weighted UV dose rates measured during solar comparisons between the reference NILU-UV and the site NILU-UVs were 1.007±0.011 and 1.012±0.012 for Ushuaia and Marambio, respectively, for solar zenith angles of up to 80°.
Solar comparisons between the NILU-UVs and spectroradiometers showed agreement within ±5% near local noon, which can be seen as proof of successful QC/QA procedures and of the transfer of irradiance scales. This work also showed that UV measurements made in the Arctic and Antarctic can be made comparable with each other.
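To make the role of the cosine and temperature corrections concrete, a schematic sketch follows; the functional form, names and coefficient values here are assumptions for illustration, whereas the corrections in this work were derived from laboratory characterizations of each Brewer:

    def correct_counts(raw_counts, cosine_factor=1.10,
                       temp_c=15.0, ref_temp_c=20.0, temp_coeff=0.002):
        # Cosine correction: scale by the measured correction factor
        # (found here to lie roughly between 1.08 and 1.13) to compensate
        # for the non-ideal angular response of the diffuser.
        counts = raw_counts * cosine_factor
        # Temperature correction: remove the linear dependence of the
        # photon counts on the instrument's internal temperature.
        counts /= 1.0 + temp_coeff * (temp_c - ref_temp_c)
        return counts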

Relevance: 10.00%

Abstract:

This work focuses on the role of macroseismology in the assessment of seismicity and probabilistic seismic hazard in Northern Europe. The main type of data under consideration is the set of macroseismic observations available for a given earthquake. The macroseismic questionnaires used to collect earthquake observations from local residents since the late 1800s constitute a special part of the seismological heritage of the region. Information on the earthquakes felt on the coasts of the Gulf of Bothnia between 31 March and 2 April 1883 and on 28 July 1888 was retrieved from contemporary Finnish and Swedish newspapers, while the earthquake of 4 November 1898 GMT is an example of an early systematic macroseismic survey in the region. A data set of more than 1200 macroseismic questionnaires is available for the earthquake in Central Finland on 16 November 1931. Basic macroseismic investigations, including the preparation of new intensity data point (IDP) maps, were conducted for these earthquakes. Previously disregarded usable observations were found in the press. The improved collection of IDPs for the 1888 earthquake shows that this event was a rare occurrence in the area: in contrast to earlier notions, it was felt on both sides of the Gulf of Bothnia. The data on the earthquake of 4 November 1898 GMT were augmented with historical background information discovered in various archives and libraries. This earthquake was of some concern to the authorities, because extra fire inspections were conducted in at least three towns, Tornio, Haparanda and Piteå, located in the centre of the area of perceptibility. The event thus posed an indirect fire hazard, although its magnitude of around 4.6 was minor on the global scale. The distribution of slightly damaging intensities was wider than previously outlined. This may have resulted from the amplification of ground shaking in the soft soils of the coast and river valleys, where most of the population lived. The large data set of the 1931 earthquake provided an opportunity to apply statistical methods and to assess methodologies for dealing with macroseismic intensity. It was evaluated using correspondence analysis. Different approaches, such as gridding, were tested to estimate the macroseismic field from intensity values distributed irregularly in space. In general, the characteristics of intensity warrant careful consideration. A more pervasive view of intensity as an ordinal quantity affected by uncertainties is advocated. A parametric earthquake catalogue comprising entries from both the macroseismic and the instrumental era was used for probabilistic seismic hazard assessment. The parametric-historic methodology was applied to estimate the seismic hazard at a given site in Finland and to prepare a seismic hazard map for Northern Europe. The interpretation of these results is an important issue, because the recurrence times of damaging earthquakes may well exceed thousands of years in an intraplate setting such as Northern Europe. This application may therefore be seen as an example of short-term hazard assessment.
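As an illustration of the gridding approach, and of treating intensity as an ordinal quantity (for which the median, not the mean, is the natural cell statistic), here is a minimal sketch; the function and grid size are hypothetical, not the thesis's actual implementation:

    import numpy as np

    def grid_median_intensity(lons, lats, intensities, cell_deg=0.5):
        # Bin irregularly spaced intensity data points (IDPs) into cells
        # and take the median intensity per cell; the median respects the
        # ordinal character of macroseismic intensity.
        cells = {}
        for lon, lat, val in zip(lons, lats, intensities):
            key = (int(lon // cell_deg), int(lat // cell_deg))
            cells.setdefault(key, []).append(val)
        return {key: float(np.median(vals)) for key, vals in cells.items()}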

Relevance: 10.00%

Abstract:

Radiation therapy (RT) currently plays a significant role in the curative treatment of several cancers. External beam RT is carried out mostly using the megavoltage beams of linear accelerators. Tumor eradication and normal tissue complications correlate with the dose absorbed in the tissues. This dependence is normally steep, so it is crucial that the actual dose within the patient corresponds accurately to the planned dose. All factors in an RT procedure contain uncertainties, requiring strict quality assurance. From a hospital physicist's point of view, technical quality control (QC), dose calculations and methods for verifying the correct treatment location are the most important subjects. The most important element of technical QC is verifying that the radiation production of an accelerator, called the output, remains within narrow acceptance limits. The output measurements are carried out according to a locally chosen dosimetric QC program defining the measurement time interval and the action levels. Dose calculation algorithms need to be configured for the accelerators using measured beam data, and the uncertainty of these data sets the limit for the best achievable calculation accuracy. All these dosimetric measurements require considerable experience, are laborious, take up resources needed for treatments, and are prone to several random and systematic sources of error. Appropriate verification of the treatment location is more important in intensity modulated radiation therapy (IMRT) than in conventional RT, because steep dose gradients are produced within or close to healthy tissues located only a few millimetres from the target volume. This thesis investigated the quality of dosimetric measurements, the efficacy of dosimetric QC programs, the verification of measured beam data, and the effect of positional errors on the dose received by the major salivary glands in head and neck IMRT. A method was developed for estimating the effect of different dosimetric QC programs on the overall uncertainty of dose, and data were provided to facilitate the choice of a sufficient QC program. The method takes into account the local output stability and the reproducibility of the dosimetric QC measurements, and a method based on model fitting of the QC measurement results was proposed for estimating both of these factors. The reduction of random measurement errors and the optimization of the QC procedure were also investigated, and a method and practical suggestions were presented for these purposes. The accuracy of beam data was evaluated in Finnish RT centres, a sufficient accuracy level was estimated for the beam data, and a method based on the use of reference beam data was developed for the QC of beam data. Dosimetric and geometric accuracy requirements were evaluated for head and neck IMRT when the function of the major salivary glands is to be spared; these criteria are based on the dose response obtained for the glands. Random measurement errors could be reduced, enabling the lowering of action levels and the prolongation of the measurement time interval from 1 month to as much as 6 months while maintaining dose accuracy. The combined effect of the proposed methods, suggestions and criteria was found to help avoid maximum dose errors of up to about 8%. In addition, their use may make the strictest recommended overall dose accuracy level of 3% (1 SD) achievable.
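Here is a minimal sketch of the kind of model fitting proposed for the QC results; the linear drift model and all names are assumptions for illustration. Long-term output stability is read from the fitted trend, and the reproducibility of a single QC measurement from the residual scatter:

    import numpy as np

    def analyse_output_qc(days, outputs):
        # Fit a linear drift model to relative output measurements
        # (e.g. measured output / nominal output vs. time in days).
        days = np.asarray(days, float)
        outputs = np.asarray(outputs, float)
        slope, intercept = np.polyfit(days, outputs, 1)
        residuals = outputs - (slope * days + intercept)
        stability_per_year = slope * 365.0
        reproducibility_sd = residuals.std(ddof=2)  # 1 SD, two fitted parameters
        return stability_per_year, reproducibility_sd

If the reproducibility turns out small relative to the action levels, the measurement interval can be prolonged without losing dose accuracy, which is the trade-off the abstract describes.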

Relevance: 10.00%

Abstract:

We currently live in an era characterized by the completion and first runs of the LHC accelerator at CERN, which are hoped to provide the first experimental hints of what lies beyond the Standard Model of particle physics. In addition, the last decade has witnessed a new dawn of cosmology, in which it has truly emerged as a precision science. Largely due to the WMAP measurements of the cosmic microwave background, we now believe we have quantitative control of much of the history of our universe. These two experimental windows offer us not only an unprecedented view of the smallest and largest structures of the universe, but also a glimpse of the very first moments in its history. At the same time, they require theorists to focus on the fundamental challenges awaiting at the boundary of high energy particle physics and cosmology. What were the contents and properties of matter in the early universe? How is one to describe its interactions? What implications do the various models of physics beyond the Standard Model have for the subsequent evolution of the universe? In this thesis, we explore the connection between supersymmetric theories in particular and the evolution of the early universe. First, we provide the reader with a general introduction to modern-day particle cosmology from two angles: on the one hand by reviewing our current knowledge of the history of the early universe, and on the other by introducing the basics of supersymmetry and its derivatives. Subsequently, with the help of the developed tools, we direct the attention to the specific questions addressed in the three original articles that form the main scientific content of the thesis. Each of these papers concerns a distinct cosmological problem, ranging from the generation of the matter-antimatter asymmetry to inflation, and finally to the origin or very earliest stages of the universe. They nevertheless share a common factor in their use of the machinery of supersymmetric theories to address open questions in the corresponding cosmological models.

Relevance: 10.00%

Abstract:

A better understanding of the limiting step in a first-order phase transition, the nucleation process, is of major importance to a variety of scientific fields ranging from atmospheric sciences to nanotechnology and even cosmology. This is because in most phase transitions the new phase is separated from the mother phase by a free energy barrier, which is crossed in a process called nucleation. Nowadays a significant fraction of all atmospheric particles is considered to be produced by vapour-to-liquid nucleation. In atmospheric sciences, as in other fields, the theoretical treatment of nucleation is mostly based on the Classical Nucleation Theory. However, the Classical Nucleation Theory is known to have only limited success in predicting the rate at which vapour-to-liquid nucleation takes place under given conditions. This thesis studies unary homogeneous vapour-to-liquid nucleation from a statistical mechanics viewpoint. We apply Monte Carlo simulations of molecular clusters to calculate the free energy barrier separating the vapour and liquid phases, and compare our results against laboratory measurements and Classical Nucleation Theory predictions. According to our results, the work of adding a monomer to a cluster in equilibrium vapour is accurately described by the liquid drop model applied in the Classical Nucleation Theory once the clusters are larger than some threshold size. The threshold cluster sizes contain only a few to some tens of molecules, depending on the interaction potential and the temperature. However, the error made in modelling the smallest clusters as liquid drops results in an erroneous absolute value for the cluster work of formation throughout the size range, as predicted by the McGraw-Laaksonen scaling law. By calculating correction factors to the Classical Nucleation Theory predictions for the nucleation barriers of argon and water, we show that the corrected predictions produce nucleation rates that are in good agreement with experiments. For the smallest clusters, the deviation between the simulation results and the liquid drop values is accurately modelled by the low-order virial coefficients at modest temperatures and vapour densities, in other words in the validity range of the non-interacting cluster theory of Frenkel, Band and Bijl. Our results do not indicate a need for a size-dependent replacement free energy correction, and they also indicate that the Classical Nucleation Theory predicts the size of the critical cluster correctly. We also present a new method for calculating the equilibrium vapour density, the size dependence of the surface tension and the planar surface tension directly from cluster simulations, and we show how the size dependence of the cluster surface tension at the equimolar surface is a function of the virial coefficients, a result confirmed by our cluster simulations.
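For reference, the liquid drop model used by the Classical Nucleation Theory writes the work of forming an n-molecule cluster from vapour at supersaturation S as follows (standard textbook form, with sigma the planar surface tension and v the molecular volume of the liquid; not notation specific to this thesis):

    \[
      \Delta G(n) = -\,n\,k_{\mathrm B}T \ln S + (36\pi)^{1/3}\, v^{2/3}\sigma\, n^{2/3},
      \qquad
      \Delta G^{*} = \frac{16\pi\, v^{2}\sigma^{3}}{3\,\bigl(k_{\mathrm B}T \ln S\bigr)^{2}} ,
    \]

where the barrier height ΔG* follows from maximizing ΔG(n) over n. The Monte Carlo estimates of the cluster work of formation are compared against this form.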

Relevance: 10.00%

Abstract:

The polar regions are an energy sink of the Earth system: the Sun's rays do not reach the poles at all for half of the year, and hit them only at very low angles for the other half. In summer, solar radiation is the dominant energy source for the polar areas, and therefore even small changes in the surface albedo strongly affect the surface energy balance and thus the speed and amount of snow and ice melt. In winter, the main heat sources for the atmosphere are the cyclones approaching from lower latitudes, and the atmosphere-surface heat transfer takes place through turbulent mixing and longwave radiation, the latter dominated by clouds. The aim of this thesis is to improve knowledge of the surface and atmospheric processes that control the surface energy budget over snow and ice, with particular focus on albedo during the spring and summer seasons, and on horizontal advection of heat, cloud longwave forcing and turbulent mixing during the winter season. The critical importance of a correct albedo representation in models is illustrated through an analysis of the causes of the errors in the surface and near-surface air temperatures produced in a short-range numerical weather forecast by the HIRLAM model. The daily and seasonal variability of snow and ice albedo were then examined by analysing field measurements of albedo carried out in different environments. On the basis of the data analysis, simple albedo parameterizations were derived that can be implemented in thermodynamic sea ice models as well as in numerical weather prediction and climate models. Field measurements of radiation and turbulent fluxes over the Bay of Bothnia (Baltic Sea) also allowed examining the impact of a large albedo change during the melting season on the surface energy and ice mass budgets. Where there are strong contrasts in surface albedo, as in the case of snow-covered areas next to open water, the effect of surface albedo heterogeneity on the downwelling solar irradiance under overcast conditions is very significant, although it is usually not accounted for in single-column radiative transfer calculations. To account for this effect, an effective albedo parameterization based on three-dimensional Monte Carlo radiative transfer calculations was developed, and its performance in the ground-based retrieval of cloud optical depth was illustrated as a potentially relevant application. Finally, the factors causing the large variations of the surface and near-surface temperatures over the Central Arctic during winter were examined. The relative importance of cloud radiative forcing, turbulent mixing and lateral heat advection for the Arctic surface temperature was quantified through the analysis of direct observations from Russian drifting ice stations, with the lateral heat advection calculated from reanalysis products.
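To indicate how simple such albedo parameterizations can be, here is a generic sketch in the spirit of prognostic snow albedo schemes; the decay constants, thresholds and function name are placeholders, not the values derived in this work:

    import math

    def step_snow_albedo(albedo, melting, dt_days,
                         a_min=0.50, a_max=0.85,
                         tau_melt=0.24, linear_decay=0.008):
        # Between snowfalls the albedo decreases from its fresh-snow value
        # a_max towards a_min: exponentially under melting conditions,
        # linearly under cold (non-melting) conditions. A fresh snowfall
        # would reset the albedo to a_max.
        if melting:
            albedo = a_min + (albedo - a_min) * math.exp(-tau_melt * dt_days)
        else:
            albedo = albedo - linear_decay * dt_days
        return max(min(albedo, a_max), a_min)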

Relevance: 10.00%

Abstract:

Solar UV radiation is harmful to life on Earth, but fortunately atmospheric oxygen and ozone absorb the most energetic UVC photons almost entirely. However, part of the UVB radiation and much of the UVA radiation reaches the surface of the Earth, where it affects human health, the environment and materials, and drives atmospheric and aquatic photochemical processes. In order to quantify these effects and processes, ground-based UV measurements and radiative transfer modelling are needed to estimate the amounts of UV radiation reaching the biosphere. Satellite measurements, with their near-global spatial coverage and long-term data continuity, offer an attractive option for estimating surface UV radiation. This work focuses on radiative transfer theory based methods for estimating the UV radiation reaching the surface of the Earth. The objectives of the thesis were to implement the surface UV algorithm originally developed at NASA Goddard Space Flight Center for estimating surface UV irradiance from the measurements of the Dutch-Finnish Ozone Monitoring Instrument (OMI), to improve the original surface UV algorithm especially in relation to snow cover, to validate the OMI-derived daily surface UV doses against ground-based measurements, and to demonstrate how satellite-derived surface UV data can be used to study the effects of UV radiation. The thesis consists of seven original papers and a summary. The summary includes an introduction to the OMI instrument, a review of the methods used for modelling surface UV from satellite data, and the conclusions of the main results of the original papers. The first two papers describe the algorithm used for estimating surface UV amounts from the OMI measurements, as well as the unique Very Fast Delivery processing system developed for processing the OMI data received at the Sodankylä satellite data centre. The third and fourth papers present algorithm improvements related to the surface UV albedo of snow-covered land. The fifth paper presents the results of comparing the OMI-derived daily erythemal doses with those calculated from ground-based measurement data; it gives an estimate of the expected accuracy of the OMI-derived surface UV doses under various atmospheric and other conditions, and discusses the causes of the differences between the satellite-derived and ground-based data. The last two papers demonstrate the use of satellite-derived surface UV data: the sixth paper presents an assessment of photochemical decomposition rates in the aquatic environment, and the seventh presents the use of satellite-derived daily surface UV doses for planning outdoor material weathering tests.
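As an example of the weighting used to turn spectral surface UV into erythemally weighted dose rates, the CIE erythemal action spectrum (McKinlay-Diffey) can be applied as below; this is a standard formula shown as an illustrative sketch, not the OMI algorithm itself:

    import numpy as np

    def cie_erythemal_weight(wl_nm):
        # Piecewise CIE erythemal action spectrum: weight 1 up to 298 nm,
        # then two exponential segments out to 400 nm.
        wl = np.asarray(wl_nm, dtype=float)
        w = np.ones_like(wl)
        mid = (wl > 298.0) & (wl <= 328.0)
        high = (wl > 328.0) & (wl <= 400.0)
        w[mid] = 10.0 ** (0.094 * (298.0 - wl[mid]))
        w[high] = 10.0 ** (0.015 * (140.0 - wl[high]))
        return w

    def erythemal_dose_rate(wl_nm, spectral_irradiance):
        # Integrate the weighted spectrum over wavelength (result in W m-2
        # when the spectrum is given in W m-2 nm-1).
        return np.trapz(cie_erythemal_weight(wl_nm) * spectral_irradiance, wl_nm)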

Relevance: 10.00%

Abstract:

X-ray Raman scattering and x-ray emission spectroscopies were used to study the electronic properties and phase transitions of several condensed matter systems. The experimental work, carried out at the European Synchrotron Radiation Facility, was complemented by theoretical calculations of the x-ray spectra and of the electronic structure. The electronic structure of MgB2 at the Fermi level is dominated by the boron σ and π bands. The high density of states provided by these bands is the key feature of the electronic structure contributing to the high critical temperature of superconductivity in MgB2. The electronic structure of MgB2 can be modified by atomic substitutions, which introduce extra electrons or holes into the bands. X-ray Raman scattering was used to probe the interesting σ- and π-band hole states in pure and aluminum-substituted MgB2. A method for determining the final-state density of electron states from experimental x-ray Raman scattering spectra was examined and applied to experimental data on both pure MgB2 and Mg0.83Al0.17B2. The extracted final-state densities of electron states for the pure and aluminum-substituted samples revealed clear substitution-induced changes in the σ and π bands. The experimental work was supported by theoretical calculations of the electronic structure and the x-ray Raman spectra. X-ray emission at the metal Kβ line was applied to studies of pressure- and temperature-induced spin-state transitions in transition metal oxides. The experimental studies were complemented by cluster multiplet calculations of the electronic structure and the emission spectra. In LaCoO3, evidence for the appearance of an intermediate spin state was found and the presence of a pressure-induced spin transition was confirmed. Pressure-induced changes in the electronic structure of transition metal monoxides were studied experimentally and analyzed using the cluster multiplet approach. The effects of hybridization, bandwidth and crystal field splitting in stabilizing the high-pressure spin state were discussed. Emission spectroscopy at the Kβ line was also applied to FeCO3, and a pressure-induced iron spin-state transition was discovered.

Relevance: 10.00%

Abstract:

This thesis presents a novel application of x-ray Compton scattering to structural studies of molecular liquids. Systematic Compton-scattering experiments on water have been carried out with unprecedented accuracy at third-generation synchrotron-radiation laboratories. The experiments focused on temperature effects in water, the water-to-ice phase transition, quantum isotope effects, and ion hydration. The experimental data is interpreted by comparison with both model computations and ab initio molecular-dynamics simulations. Accordingly, Compton scattering is found to provide unique intra- and intermolecular structural information. This thesis thus demonstrates the complementarity of the technique to traditional real-space probes for studies on the local structure of water and, more generally, molecular liquids.

Relevance: 10.00%

Abstract:

Growing up Vietnamese in Finland: A 12-year follow-up of the well-being and sociocultural adaptation of Vietnamese as children or adolescents and as young adults. This study was a quantitative longitudinal study of the acculturation (cultural change), psychological well-being and sociocultural adaptation of Vietnamese who arrived in Finland as children or adolescents between 1979 and 1991. In the first phase (in 1992), 97 randomly selected Vietnamese comprehensive-school pupils from around the country took part and were compared with their Finnish classmates. The follow-up phase (in 2004) involved 59 of the Vietnamese who had participated in the first phase, by then aged 20 to 31. The aim of the study was to determine which factors predicted acculturation outcomes, while taking into account the effects of age and context on psychological well-being and sociocultural adaptation. Individual acculturation dimensions (language, values and identity) proved more important for psychological well-being and sociocultural adaptation than ethnic, national or bicultural profiles combining these same dimensions. Identity shifted towards the (ethnic) Vietnamese over time, whereas values shifted towards the (national) Finnish. Proficiency in both Finnish and Vietnamese increased over time, with positive effects on both psychological well-being and sociocultural adaptation. Baseline psychological well-being predicted well-being (absence of depression, self-esteem) in adulthood, but sociocultural adaptation (school achievement) as a child or adolescent did not predict educational attainment in adulthood. Better Finnish language skills and weaker identification as Finnish in adulthood, together with the absence of depression and less perceived discrimination as a child or adolescent, distinguished the psychologically better-off (non-depressed) adults from those who were depressed. Higher educational attainment in adulthood was predicted, on the one hand, by less perceived discrimination as a child or adolescent and, on the other, by better Finnish language skills, stronger endorsement of national (Finnish) independence values, and yet weaker identification with Finns in adulthood. The significance of perceived discrimination for psychological well-being, especially in childhood and adolescence, and its long-term effects on psychological well-being and sociocultural adaptation in adulthood demonstrate the need to intervene early in psychological problems and to improve interethnic relations. Keywords: acculturation, psychological well-being, sociocultural adaptation, language, values, identity, Vietnamese, Finland, children, adolescents, young adults

Relevance: 10.00%

Abstract:

The purpose of this study is to analyse the education, employment and work-life experiences of visually impaired persons in expert jobs. The empirical data consist of 30 thematic interviews (24 visually impaired persons, 1 family member of a visually impaired person, and 5 persons working with diversity issues), of supplementary articles, and of statistics on the socio-economic status of the visually impaired. The interviewees' experiences of education and employment were analysed using a qualitative method, and the analysis was deepened by reflecting it against the recent discussion on the concept of diversity. The author's methodological choice as a disability researcher was to treat the interviewees as co-researchers rather than as objects of research. Accessibility in its different forms is a prerequisite of diversity in the workplace, and this study examines what kind of accessibility is required by visually impaired professionals. Access to working life depends on the attitudes, prejudices and expectations that society has towards a minority group. Social accessibility concerns internal relationships in the workplace, and achieving it is a bilateral process. Information technology has revolutionised visually impaired people's possibilities of accessing information and performing expert tasks. An accessible environment, good mobility skills and transportation services enable visually impaired employees to get to their workplaces and to navigate there with ease. Integration has raised the level of education and widened the selection of career options for the visually impaired. However, even visually impaired people with academic degrees often need employment support services. Visually impaired professionals are mainly employed in the public and third sectors. Achieving diversity in the labour market is a multi-actor process: social support services are needed, as well as courage and readiness on the part of employers to hire people with disabilities. The organisations of the visually impaired play an important role in shaping attitudes and providing peer support. Visually impaired employees need good professional skills, blindness skills and social courage, and they need to be comfortable with their disability. In the workplace, diversity may be actualised as diverse ways of working: the work is done using technical aids or other means of compensating for the lack of eyesight. When employees must find compensatory solutions for disability-related limitations at work, this also develops their problem-solving abilities. Key words: visually impaired, diversity, accessibility, working life

Relevance: 10.00%

Abstract:

This study offers a reconstruction and critical evaluation of globalization theory, a perspective that has been central to sociology and cultural studies in recent decades, from the viewpoint of media and communications. As the study shows, sociological and cultural globalization theorists rely heavily on arguments concerning media and communications, especially the so-called new information and communication technologies, in constructing their frameworks. Besides deepening the understanding of globalization theory, the study provides new critical knowledge of the problematic consequences that follow from such strong investment in media and communications in contemporary theory. The book is divided into four parts. The first part presents the research problem, the approach and the theoretical contexts of the study. Following the introduction in Chapter 1, I identify the core elements of globalization theory in Chapter 2. At the heart of globalization theory is the claim that recent decades have witnessed massive changes in the spatio-temporal constitution of society, caused by new media and communications in particular, and that these changes necessitate rethinking the foundations of social theory as a whole. Chapter 3 introduces three paradigms of media research (the political economy of media, cultural studies and medium theory), the discussion of which makes it easier to understand the key issues and controversies that emerge in academic globalization theorists' treatment of media and communications. The next two parts offer a close reading of four theorists whose works I use as entry points into academic debates on globalization. I argue that we can make sense of mainstream positions on globalization by dividing them into two paradigms: on the one hand, media-technological explanations of globalization and, on the other, cultural globalization theory. As examples of the former, I discuss the works of Manuel Castells (Chapter 4) and Scott Lash (Chapter 5). I maintain that their analyses of globalization processes are overly media-centric and result in an unhistorical and uncritical understanding of social power in an era of capitalist globalization. A related evaluation of the second paradigm (cultural globalization theory), as exemplified by Arjun Appadurai and John Tomlinson, is presented in Chapter 6. I argue that owing to their rejection of the importance of nation states and of the notion of cultural imperialism for cultural analysis, and their replacement with a framework of media-generated deterritorializations and flows, these theorists underplay the importance of the neoliberalization of cultures throughout the world. The fourth part (Chapter 7) presents a central finding of this study, namely that the media-centrism of globalization theory can be understood in the context of the emergence of neoliberalism. I find it problematic that at the very time when capitalist dynamics have been strengthened in social and cultural life, advocates of globalization theory have directed attention to media-technological changes and their sweeping socio-cultural consequences, instead of analyzing the powerful material forces that shape society and culture. I further argue that this shift serves not only analytical but also utopian functions, that is, the longing for a better world in times when such longing is otherwise considered impracticable.

Relevance: 10.00%

Abstract:

This study analyzes the formation of the occupational identity of well-educated fixed-term employees. Fixed-term employment contracts among the well-educated labour force are exceptionally common in Finland compared to other European countries. Two groups of modern fixed-term employees are distinguished. The first comprises well-educated women employed in the public sector, whose fixed-term employment often consists of successive periods as temporary substitutes. The other comprises well-educated, upper white-collar men aged over 40, whose fixed-term careers often consist of jobs of a project nature or posts that are filled for a fixed period only.

Method of the study. For the empirical data I interviewed 35 persons (26 women and 9 men) in 33 interviews, one of which was conducted by e-mail and one as a group interview. All the interviews were electronically recorded and coded. All the interviewees have two things in common: fixed-term employment and a formal higher education. Thirteen (13) of them are researchers, four nurses, four midwives, four journalists, and ten project experts. I used the snowball method to get in touch with the interviewees: the first were those recommended by the trade unions and by my personal acquaintances, and these interviewees in turn recommended other potential interviewees. In addition, announcements on the internet pages of the trade unions were used to reach further interviewees. In the analysis I read the research material several times to find the turning points in the narratives the interviewees told. I also searched for the most meaningful stories and for the meaning the interviewees gave to these stories and to the whole narrative. In addition, I paid attention to the co-production of the narrative with the interviewees and analyzed the narrative as performance in order to search for the preferred identities the interviewees perform (Riessman 2001, 698-701). I do not pay much attention to the question of the truth of a narrative in the sense of its correspondence with facts; rather, I think a working-life narrative has two tasks: on the one hand the teller has to relate the facts, and on the other hand to describe what these facts mean to him/her. To emphasize this double nature of a narrative about one's working life, I analyzed the empirical data both by categorizing it according to the cultural models of storytelling (heroic story, comedy, irony and tragedy) and by studying the themes most of the interviewees talked about.

Ethics of the study. I chose to use narrative within qualitative interviews on the grounds that, in my opinion, it is more ethical and more empowering than the more traditional structured interview methods. During the research process I carefully followed the ethical rules of qualitative research. The purpose of the interviews and of the research was explained to the interviewees in a written description of the study, and oral permission to use each interview in this research was obtained. The names and places mentioned in the study have been changed to conceal the actual identity of the interviewees. I shared the analysis with the interviewees by sending each of them the first analysis of their personal interview. In this way I asked them to make sure that their identity was hidden well enough, and I hoped to give the interviewees a chance to look at their narratives, to instigate new actions and to sustain present ones (Smith 2001, 721). I also hoped to open up a new possibility of joint authorship.

Main results. As a result of the study I introduce six models of telling a story. The four typical western cultural models that guide the telling are the heroic story, comedy, tragedy and the satirical story (Hänninen 1999). In addition to these models I found two ways of telling about a career of fixed-term employment that differ significantly from traditional career storytelling. The story models into which the interviewees pour their experience locate the fixed-term employee's work career in an imagined life trajectory and reveal the meaning they give to it. I analyze the many-sided heroic story told by Liisa as an example of how strongly a fixed-term employee fears failure or losing her job; this structure also makes it possible to show that success is felt to be entirely a matter of chance. Tragedy, failure in one's attempt to achieve something, is a model I introduce with the help of Vilppu's story. This narrative draws its meaning both from the sorrow over past failure and from the rise of something new the teller has found. Aino tells her story as a comedy. In introducing her narrative, I suggest that the purpose of comedy, a stronger social consensus, takes on a deeper and darker shade under fixed-term employment: a fixed-term employee has to take his/her place in the work community alone, without the support the community gives to those in permanent positions. In studying the satirical model Rauno uses, I argue that irony both turns the power structures into a carnival and builds free space for the teller of the story and for the listener. Irony also helps in building a consensus, a mutual understanding, between the teller and the listener, and it shows the distance the teller perceives between himself and others. Irony, however, demands some kind of success in one's occupational career, but also at least a minor disappointment in its progress. Helmi tells her story mainly as a detective story; in introducing her narrative, I argue that this story model strengthens the trust in the fairness of society that the teller and the listener share. The analysis also emphasizes the central role of the identity work that fixed-term employment necessitates. Most of the interviewees talked about getting along in working life, and I introduce Sari's narrative as an example of this. In both of these latter narratives, one's personal character and habits are elevated into permanent parts of actual professional expertise, which in turn varies according to the situation. By introducing these models, I show that fixed-term employees have different strategies for coping with their job situations, and that these strategies vary according to their personal motives and situations and the actual purpose of the interview. However, I argue that they feel the space between their hopes and fears to be narrow and insecure. In the research report I also introduce the pieces of the stories, the themes, that the interviewees use to build these survival strategies. They use their personal curriculum vitae or portfolio, their position in the work community and their work morals to build their professional identity. Professional identity is flexible and varies in time and place, but even so it offers a way to anchor one's identity work: it offers a viewpoint on society and a measure of one's position in the surrounding social networks.

As one result of the study, I analyze the position fixed-term employees share on the edge of their work communities, and I summarize the hopes and fears the interviewees have concerning employers, trade unions, educational institutions and society as a whole. In their opinion, solidarity between people has been weakened by the short-sighted power of the economy. The impact of fixed-term employment on one's professional identity and social capital is a many-sided and versatile process: fixed-term employment both strengthens and weakens professional identity, social capital and the building of trust. Fixed-term employment also affects one's day-to-day life by excluding him/her from the norm and by making long-term planning difficult (Jokinen 2005). Regardless of the nature of the job contract, the workers themselves are experts in making the best of their sometimes less than satisfying working life, and they build their professional identity by creatively using their education, work experience and interpersonal relations. However, a long career of short fixed-term employments may seriously change an employee's perception of his/her role: he/she may start concentrating only on coping with an unsatisfactory situation and leave the active improvement of poor working conditions to others. Keywords: narrative, fixed-term employment, occupational identity, work, story model, social capital, career