7 results for Temporal Analysis

in Helda - Digital Repository of the University of Helsinki


Relevance:

30.00%

Publisher:

Abstract:

This thesis is an empirical study of how two words in Icelandic, "nú" and "núna", are used in contemporary Icelandic conversation. My aims in this study are, first, to explain the differences between the temporal functions of "nú" and "núna", and, second, to describe the non-temporal functions of "nú". In the analysis, a focus is placed on comparing the sequential placement of the two words, on their syntactic distribution, and on their prosodic realization. The empirical data comprise 14 hours and 11 minutes of naturally occurring conversation recorded between 1996 and 2003. The selected conversations represent a wide range of interactional contexts including informal dinner parties, institutional and non-institutional telephone conversations, radio programs for teenagers, phone-in programs, and, finally, a political debate on television. The theoretical and methodological framework is interactional linguistics, which can be described as linguistically oriented conversation analysis (CA). A comparison of "nú" and "núna" shows that the two words have different syntactic distributions. "Nú" has a clear tendency to occur in the front field, before the finite verb, while "núna" typically occurs in the end field, after the object. It is argued that this syntactic difference reflects a functional difference between "nú" and "núna". A sequential analysis of "núna" shows that the word refers to an unspecified period of time which includes the utterance time as well as some time in the past and in the future. This temporal relation is referred to as reference time. "Nú", by contrast, is mainly used in three different environments: 1) in temporal comparisons, 2) in transitions, and 3) when the speaker is taking an affective stance. The non-temporal functions of "nú" are divided into three categories: 1) "nú" as a tone particle, 2) "nú" as an utterance particle, and 3) "nú" as a dialogue particle.
"Nú" as a tone particle is syntactically integrated and can occur in two syntactic positions: pre-verbally and post-verbally. I argue that these instances are employed in utterances in which a speaker is foregrounding information or marking it as particularly important. The study shows that, although these instances are typically prosodically non-prominent and unstressed, they are in some cases delivered with stress and with a higher pitch than the surrounding talk. "Nú" as an utterance particle occurs turn-initially and is syntactically non-integrated. By using "nú", speakers show continuity between turns and link new turns to prior ones. These instances initiate either continuations by the same speaker or new turns after speaker shifts. "Nú" as a dialogue particle occurs as a turn of its own. The study shows that these instances register informings in prior turns as unexpected or as a departure from the normal state of affairs. "Nú" as a dialogue particle is often delivered with a prolonged vowel and a recognizable intonation contour. A comparative sequential and prosodic analysis shows that in these cases there is a correlation between the function of "nú" and the intonation contour with which it is delivered. Finally, I argue that, despite the many functions of "nú", all the instances can be said to have a common denominator, which is to display attention towards the present moment and the utterances produced prior to or after the production of "nú". Instead of anchoring the utterances in external time or reference time, these instances position the utterance in discourse-internal time, or discourse time.

Relevance:

30.00%

Publisher:

Abstract:

This thesis covers three subject areas concerning particulate matter in urban air quality: 1) analysis of the measured particulate matter mass concentrations in the Helsinki Metropolitan Area (HMA) at different locations in relation to traffic sources, and at different times of year and day; 2) the evolution of the number concentrations and sizes of traffic-exhaust-originated particulate matter at the local street scale, studied by a combination of a dispersion model and an aerosol process model; 3) analysis of some situations of high particulate matter concentrations with regard to their meteorological origins, especially temperature inversion situations, in the HMA and three other European cities. The prediction of the occurrence of meteorological conditions conducive to elevated particulate matter concentrations in the studied cities is examined, and the performance of current numerical weather forecasting models in air pollution episode situations is considered. The study of the ambient measurements revealed a clear diurnal variation of the PM10 concentrations at the HMA measurement sites, irrespective of the year and season. The diurnal variation of local vehicular traffic flows seemed to have no substantial correlation with the PM2.5 concentrations, indicating that the PM10 concentrations originated mainly from local vehicular traffic (direct emissions and suspension), while the PM2.5 concentrations were mostly of regional and long-range transported origin. The modelling study of traffic exhaust dispersion and transformation showed that the number concentrations of particles originating from street traffic exhaust undergo a substantial change during the first tens of seconds after being emitted from the vehicle tailpipe. The dilution process was shown to dominate the total number concentrations, while condensation and coagulation had only a minimal effect on the Aitken mode number concentrations.
The air pollution episodes included were chosen on the basis of occurring in either winter or spring and having an at least partly local origin. In the HMA, air pollution episodes were shown to be linked to predominantly stable atmospheric conditions with high atmospheric pressure and low wind speeds in conjunction with relatively low ambient temperatures. For the other European cities studied, the best meteorological predictors of elevated PM10 concentrations were shown to be the temporal (hourly) evolution of temperature inversions, atmospheric stability and, in some cases, wind speed. Concerning weather prediction during particulate-matter-related air pollution episodes, the studied models were found to overpredict pollutant dispersion, leading to underprediction of pollutant concentration levels.
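The dominance of dilution in the first tens of seconds can be caricatured with a simple exponential mixing law toward the urban background concentration. This is only a sketch, not the coupled dispersion and aerosol process model used in the thesis; the tailpipe and background concentrations and the 5 s mixing time scale are hypothetical illustration values:

```python
import math

def diluted_concentration(n0, n_bg, tau, t):
    """Exhaust particle number concentration decaying exponentially toward
    the urban background as the plume dilutes (illustrative sketch only)."""
    return n_bg + (n0 - n_bg) * math.exp(-t / tau)

# Hypothetical numbers: tailpipe ~1e9 cm^-3, background ~1e4 cm^-3, tau = 5 s
profile = [diluted_concentration(1e9, 1e4, 5.0, t) for t in (0.0, 5.0, 10.0, 30.0)]
```

Under this caricature the concentration drops by orders of magnitude within half a minute, which is why condensation and coagulation have so little time to act on the number concentrations.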

Relevance:

30.00%

Publisher:

Abstract:

This study offers a reconstruction and critical evaluation of globalization theory, a perspective that has been central for sociology and cultural studies in recent decades, from the viewpoint of media and communications. As the study shows, sociological and cultural globalization theorists rely heavily on arguments concerning media and communications, especially the so-called new information and communication technologies, in the construction of their frameworks. Together with deepening the understanding of globalization theory, the study gives new critical knowledge of the problematic consequences that follow from such strong investment in media and communications in contemporary theory. The book is divided into four parts. The first part presents the research problem, the approach and the theoretical contexts of the study. Following the introduction in Chapter 1, I identify the core elements of globalization theory in Chapter 2. At the heart of globalization theory is the claim that recent decades have witnessed massive changes in the spatio-temporal constitution of society, caused by new media and communications in particular, and that these changes necessitate the rethinking of the foundations of social theory as a whole. Chapter 3 introduces three paradigms of media research (the political economy of media, cultural studies and medium theory), the discussion of which will make it easier to understand the key issues and controversies that emerge in academic globalization theorists' treatment of media and communications. The next two parts offer a close reading of four theorists whose works I use as entry points into academic debates on globalization. I argue that we can make sense of mainstream positions on globalization by dividing them into two paradigms: on the one hand, media-technological explanations of globalization and, on the other, cultural globalization theory.
As examples of the former, I discuss the works of Manuel Castells (Chapter 4) and Scott Lash (Chapter 5). I maintain that their analyses of globalization processes are overtly media-centric and result in an unhistorical and uncritical understanding of social power in an era of capitalist globalization. A related evaluation of the second paradigm (cultural globalization theory), as exemplified by Arjun Appadurai and John Tomlinson, is presented in Chapter 6. I argue that due to their rejection of the importance of nation states and the notion of cultural imperialism for cultural analysis, and their replacement with a framework of media-generated deterritorializations and flows, these theorists underplay the importance of the neoliberalization of cultures throughout the world. The fourth part (Chapter 7) presents a central research finding of this study, namely that the media-centrism of globalization theory can be understood in the context of the emergence of neoliberalism. I find it problematic that at the same time as capitalist dynamics have been strengthened in social and cultural life, advocates of globalization theory have directed attention to media-technological changes and their sweeping socio-cultural consequences, instead of analyzing the powerful material forces that shape society and culture. I further argue that this shift serves not only analytical but also utopian functions, that is, it expresses the longing for a better world in times when such longing is otherwise considered impracticable.

Relevance:

30.00%

Publisher:

Abstract:

While extant studies have greatly advanced our understanding of corruption, we still know little of the processes through which specific practices or events come to be labeled as corruption. In a time when public attention devoted to corruption and other forms of corporate misbehavior has exploded, this thesis raises – and seeks to answer – crucial questions related to how the phenomenon is socially and discursively constructed. What kinds of struggles are manifested in public disputes about corruption? How do constructions of corruption relate with broader conceptions of (il)legitimacy in and around organizations? What are the discursive dynamics involved in the emergence and evolution of corruption scandals? The thesis consists of four essays that each employ different research designs and tackle these questions in slightly different theoretical and methodological ways. The empirical focus is on the media coverage of a number of significant and widely discussed scandals in Norway in the period 2003-2008. By illuminating crucial processes through which conceptions of corruption were constructed, reproduced, and transformed in these scandals, the thesis seeks to paint a more nuanced picture of corruption than what is currently offered in the literature. In particular, the thesis challenges traditional conceptions of corruption as a dysfunctional feature of organizations in and of itself by emphasizing the ambiguous, temporal, context-specific, and at times even contradictory features of corruption in public discussions.

Relevance:

30.00%

Publisher:

Abstract:

In this study we analyze how the ion concentrations in forest soil solution are determined by hydrological and biogeochemical processes. A dynamic model, ACIDIC, was developed, including processes common to dynamic soil acidification models. The model treats up to eight interacting layers and simulates soil hydrology, transpiration, root water and nutrient uptake, cation exchange, dissolution and reactions of Al hydroxides in solution, and the formation of carbonic acid and its dissociation products. It also allows the simultaneous use of preferential and matrix flow paths, enabling the throughfall water to enter the deeper soil layers in macropores without first reacting with the upper layers. Three different combinations of routing the throughfall water via macro- and micropores through the soil profile are presented. The large vertical gradient in the observed total charge was simulated successfully. According to the simulations, the gradient is mostly caused by differences in the intensity of water uptake, sulfate adsorption and organic anion retention at the various depths. The temporal variations in Ca and Mg concentrations were simulated fairly well in all soil layers. For H+, Al and K there was much more variation in the observed than in the simulated concentrations. Flow in macropores is a possible explanation for the apparent disequilibrium of the cation exchange for H+ and K, as the solution H+ and K concentrations have large vertical gradients in the soil. The amount of exchangeable H+ increased in the O and E horizons and decreased in the Bs1 and Bs2 horizons, the net change in the whole soil profile being a decrease. A large part of the decrease of the exchangeable H+ in the illuvial B horizon was caused by sulfate adsorption. The model produces soil water amounts and solution ion concentrations that are comparable to the measured values, and it can be used in both hydrological and chemical studies of soils.
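The carbonic acid chemistry mentioned above can be illustrated with a minimal equilibrium calculation: dissolved CO2 from Henry's law, then bicarbonate from the first dissociation constant at a given pH. The constants are textbook 25 °C values and the soil-air CO2 pressure is a hypothetical illustration value; this is a sketch of the chemistry, not the ACIDIC implementation:

```python
def carbonate_speciation(p_co2_atm, ph, k_h=10**-1.47, k_a1=10**-6.35):
    """Dissolved CO2 (H2CO3*) via Henry's law and bicarbonate via the first
    dissociation constant. Constants are approximate 25 degC textbook values."""
    h = 10**-ph                 # mol/L, from pH
    h2co3 = k_h * p_co2_atm     # mol/L, Henry's law
    hco3 = k_a1 * h2co3 / h     # mol/L, first dissociation equilibrium
    return h2co3, hco3

# Soil air is CO2-enriched relative to the atmosphere; assume ~0.01 atm, pH 5
h2co3, hco3 = carbonate_speciation(0.01, 5.0)
```

At acidic soil pH the bicarbonate fraction stays small, which is why carbonic acid contributes relatively little charge in strongly acidified horizons.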

Relevance:

30.00%

Publisher:

Abstract:

A method was developed for the relative radiometric calibration of a single multitemporal Landsat TM image, of several multitemporal images covering each other, and of several multitemporal images covering different geographic locations. The radiometrically calibrated difference images were used for detecting rapid changes in forest stands. The nonparametric kernel method was applied for change detection. The accuracy of the change detection was estimated by inspecting the image analysis results in the field. The change classification was applied for controlling the quality of the continuously updated forest stand information. The aim was to ensure that all man-made changes and any forest damage were correctly updated, including the attribute and stand delineation information. The image analysis results were compared with the registered treatments and the stand information base. Stands with discrepancies between these two information sources were recommended for field inspection.
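One way a nonparametric kernel approach to change detection can work is sketched below: estimate the density of the calibrated difference-image values with a Gaussian kernel, and flag pixels falling in low-density regions (far from the "no change" bulk near zero) as candidate changes. The bandwidth and density threshold here are arbitrary illustration values, not the thesis's settings:

```python
import math

def gaussian_kde(samples, bandwidth):
    """Return a 1-D Gaussian kernel density estimate as a callable."""
    n = len(samples)
    norm = 1.0 / (n * bandwidth * math.sqrt(2 * math.pi))
    def density(x):
        return norm * sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2) for s in samples)
    return density

def flag_changes(diff_values, bandwidth=0.5, threshold=0.1):
    """Flag difference-image values lying in low-density regions as changed."""
    density = gaussian_kde(diff_values, bandwidth)
    return [density(v) < threshold for v in diff_values]

# Hypothetical band-difference values: most pixels near 0 (no change),
# two large differences (candidate stand changes)
diffs = [0.1, -0.2, 0.0, 0.3, -0.1, 0.2, 4.5, 0.1, -0.3, -4.0]
changed = flag_changes(diffs)
```

The attraction of the kernel estimate is that no parametric distribution needs to be assumed for the "no change" pixels; the bandwidth and threshold would in practice be tuned against field-verified reference stands.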

Relevance:

30.00%

Publisher:

Abstract:

In meteorology, observations and forecasts of a wide range of phenomena (for example, snow, clouds, hail, fog, and tornadoes) can be categorical, that is, they can only have discrete values (e.g., "snow" and "no snow"). Concentrating on satellite-based snow and cloud analyses, this thesis explores methods that have been developed for the evaluation of categorical products and analyses. Different algorithms for satellite products generate different results; sometimes the differences are subtle, sometimes all too visible. In addition to differences between algorithms, the satellite products are influenced by physical processes and conditions, such as diurnal and seasonal variation in solar radiation, topography, and land use. The analysis of satellite-based snow cover analyses from NOAA, NASA, and EUMETSAT, and of snow analyses for numerical weather prediction models from FMI and ECMWF, was complicated by the fact that we did not know the true snow extent, and we were forced simply to measure the agreement between the different products. The Sammon mapping, a multidimensional scaling method, was then used to visualize the differences between the products. The trustworthiness of the results for cloud analyses [EUMETSAT Meteorological Products Extraction Facility cloud mask (MPEF), together with the Nowcasting Satellite Application Facility (SAFNWC) cloud masks provided by Météo-France (SAFNWC/MSG) and the Swedish Meteorological and Hydrological Institute (SAFNWC/PPS)], compared with ceilometers of the Helsinki Testbed, was estimated by constructing confidence intervals (CIs). Bootstrapping, a statistical resampling method, was used to construct the CIs, especially in the presence of spatial and temporal correlation. Reference data for validation are constantly in short supply. In general, the needs of a particular project drive the requirements for evaluation, for example, the accuracy and timeliness of the particular data and methods.
In this vein, we discuss tentatively how data provided by the general public, e.g., photos shared on the Internet photo-sharing service Flickr, can be used as a new source for validation. Results show that such data are of reasonable quality, and their use for case studies can be warmly recommended. Finally, the use of cluster analysis on meteorological in-situ measurements was explored. The Autoclass algorithm was used to construct compact representations of synoptic conditions of fog at Finnish airports.
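The bootstrap construction of CIs mentioned above can be sketched as a plain percentile bootstrap on a hypothetical binary agreement series (1 = cloud mask agrees with the ceilometer, 0 = disagrees). The thesis additionally handled spatial and temporal correlation, e.g., via block-style resampling, which this toy version omits:

```python
import random

def bootstrap_ci(data, stat, n_resamples=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for a statistic:
    resample the data with replacement, recompute the statistic each time,
    and take the alpha/2 and 1 - alpha/2 quantiles of the estimates."""
    rng = random.Random(seed)
    n = len(data)
    estimates = sorted(
        stat([data[rng.randrange(n)] for _ in range(n)])
        for _ in range(n_resamples)
    )
    return (estimates[int((alpha / 2) * n_resamples)],
            estimates[int((1 - alpha / 2) * n_resamples) - 1])

# Hypothetical agreement series between a cloud mask and a ceilometer
agreement = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 1, 0, 1, 1, 1]
mean = lambda xs: sum(xs) / len(xs)
lo, hi = bootstrap_ci(agreement, mean)
```

For correlated series, the independent resampling above understates the interval width; a moving-block bootstrap, which resamples contiguous blocks instead of single observations, is the standard remedy.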