50 results for "Reasonable profits"


Relevance:

10.00%

Publisher:

Abstract:

Optimal Punishment of Economic Crime: A Study on Bankruptcy Crime

This thesis examines whether the punishment practice for bankruptcy crimes is optimal in light of Gary S. Becker's theory of optimal punishment. According to Becker, a punishment is optimal if it eliminates the expected utility of the crime for the offender and, on the other hand, minimizes the cost of the crime to society. The offender's decision process is observed through their expected utility of the crime. The expected utility is calculated from the offender's probability of getting caught, the cost of getting caught and the profit from the crime; all quantities, including the punishment, are measured in monetary terms. The cost of crime to society is assessed by defining the disutility the crime causes to society, calculated from the costs of crime prevention, crime damages and punishment execution, and the probability of getting caught. If the goal is to minimize the profits of crime, the punishments for bankruptcy crimes are not optimal: if debtors decided whether or not to commit the crime solely on the basis of economic considerations, the crime rate would be many times higher than the current rate. The prospective offender relies heavily on non-economic aspects in their decision; most probably, social pressure and a personal commitment to obeying the law are major factors in the prospective criminal's decision-making. Becker's function for measuring the cost to society was not useful for assessing the optimality of a punishment. Its premise that the costs to society correlate with the offender's cost of punishment proves unrealistic when bankruptcy crimes are examined. However, it was observed that the majority of the cost of crime to society is caused by crime damages. This finding supports preventive criminal policy.
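Becker's decision rule summarized above can be sketched numerically: with everything in monetary terms, the expected utility of the crime is zero exactly when the expected cost of punishment equals the profit. The function names and all figures below are illustrative, not taken from the thesis.

```python
def expected_utility(gain, p_caught, punishment):
    """Becker-style expected utility of a crime, all in monetary terms.

    gain        -- profit from the crime if undetected
    p_caught    -- probability of getting caught
    punishment  -- monetary cost of the punishment if caught
    """
    return (1 - p_caught) * gain + p_caught * (gain - punishment)

def optimal_punishment(gain, p_caught):
    """Smallest punishment that drives the expected utility to zero."""
    return gain / p_caught

# Illustrative numbers (not from the thesis): a 50 000 EUR gain and a
# 20 % detection probability require a 250 000 EUR sanction
# (50 000 / 0.2) to eliminate the expected utility of the crime.
print(expected_utility(50_000, 0.2, 250_000))
print(optimal_punishment(50_000, 0.2))
```

A punishment below this threshold leaves a positive expected utility, which is the sense in which the thesis finds current bankruptcy-crime sanctions non-optimal on purely economic grounds.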

Abstract:

In my dissertation I examine the economics of information goods and copyright from two different perspectives. The first belongs to the field of endogenous growth theory. I generalize a "pool of knowledge" type endogenous growth model to a setting in which a patentable innovation has a minimum size, and in which a firm that has patented a new kind of product can lose its monopoly on the product through imitation. Within the model, the welfare and growth effects of imitation and of the required "minimum size" of innovations can be analyzed. The growth-maximizing amount of imitation in the model is always zero, but the welfare-maximizing amount of imitation can be positive. The growth- and welfare-maximizing "minimum size" of a patentable innovation can take any value below the theoretical maximum. In the two latter main chapters of my dissertation I study commercial piracy of information goods with a microeconomic model. The production costs of illegal copies of information goods are low, and almost nonexistent when the copies are distributed, for example, on the Internet. Since pirated copies have many different producers, microeconomic theory would suggest that their price should fall to almost zero, and if this happened, commercial piracy would be impossible. In my model I explain the existence of commercial piracy by assuming that the threat of punishment for piracy depends on how many consumers the pirate offers illegal goods to, and that it therefore affects the market for pirated copies in the same way as a cost of advertising. Increasing the fixed costs of commercial pirates is always in the copyright holder's interest in my model, but increasing the "cost of advertising" is not necessarily so; it may also reduce the profits from sales of legal copies.
This result differs from corresponding earlier results in that it holds even if the information goods in question exhibit no network effects. Earlier models of non-commercial piracy have often yielded the result that illegal copies of an information good can increase the profits from sales of legal copies if the value of legal copies to their users depends on how many other consumers use a similar good, and if the availability of pirated copies sufficiently increases the value of legal copies. In the final main chapter of the dissertation I generalize my model to network industries, and use the generalized model to study in which cases a corresponding result also holds for commercial piracy.

Abstract:

This dissertation explored the ecological dimension of ecologically sustainable forest management in boreal forests, and the factors of the socio-cultural dimension that affect how the concept of ecologically sustainable forest management is defined. My approach was problem-oriented and generalistic-holistic. I examined associations between the abundances of wildlife groups (grouse, large predators, small predators, ungulates) and Siberian flying squirrels, and their co-occurrence with tree structural characteristics at the regional level. The trade-offs between ecological, social and economic sustainability in forestry were explored at the regional scale. I identified a potential 'shopping basket' of regional indicators for ecologically sustainable forest management, combining the relative abundance of Siberian flying squirrels, a wildlife richness index (WRI) for grouse, diversity indices of saw-timber trees and tree age classes, and the proportion of old-growth (> 120 yr) forests. I suggest that the close association between forestry activity, the proportion of young forests (< 40 yr) and the WRI for small predators can be considered a potential 'alarm bell' for regions in which trade-offs (negative relationships) between the economic and ecological components of sustainable forestry are developing. Exploratory analyses revealed negative relationships between forestry activity and a WRI of 16 game species, the WRI for grouse, and tree age diversity. Socially sustainable communities compete less intensively with the ecological components of forests than communities where forestry is important.
Interestingly, forest ownership types (farmers, other private forest owners, the forestry industry, the State) correlated significantly with the co-occurrence of flying squirrels, grouse and diverse forest structural characteristics rather than, for instance, with the total number of protection areas, suggesting that private forest ownership can lead to increased ecological sustainability. I examined forest actors' argumentation to identify characteristics that affect the interpretation of ecologically sustainable forest management. Four argumentation frame types were constructed: information-, work-, experience- and own-position-based. These differed in their emphasis on external experts versus own experiences. The closer ecologically sustainable forest management is to the forest actor's daily life, the more profiled policy tools (counselling, learning through experiences) are needed to guide management behaviour to become more ecologically sound. I illustrated that forest actors interpret, use and understand information through meaningful framing. I analysed the extent to which ecological research information has been taken into account in the Forestry Development Centre TAPIO's recommendations and the revised PEFC Finland criteria. It was found that the politically agreed target for decaying wood was much lower in the PEFC Finland criteria (4 m³) than what could be expected as a socially acceptable level (9 m³) or an ecologically sound one (10-20 m³). I consider it important for scientists to join political discourses and become involved in policy making concerning sustainable forest management, so that they learn to present their results in a way that is meaningful from the user's perspective.
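The abstract does not specify how the diversity indices in the 'shopping basket' were computed; a common choice for such indices is the Shannon index, sketched here on hypothetical tree age-class data (the numbers are made up for illustration).

```python
import math

def shannon_diversity(counts):
    """Shannon diversity index H' = -sum(p_i * ln p_i).

    A common way to summarize e.g. tree age-class diversity; the exact
    index used in the thesis is not specified in the abstract.
    """
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in props)

# Hypothetical age-class distributions of a forestry region (hectares):
even = [100, 100, 100, 100]    # all four age classes equally represented
skewed = [370, 10, 10, 10]     # dominated by one (young) age class

print(shannon_diversity(even))    # maximal for 4 classes: ln(4) ≈ 1.386
print(shannon_diversity(skewed))  # lower: the age structure is less diverse
```

A region dominated by young, even-aged stands scores low on such an index, which is consistent with the negative relationship between forestry activity and tree age diversity reported above.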

Abstract:

The increase in global temperature has been attributed to increased atmospheric concentrations of greenhouse gases (GHG), mainly CO2. The threat of severe and complex socio-economic and ecological implications of climate change has initiated an international process that aims to reduce emissions, to increase C sinks, and to protect existing C reservoirs. The Kyoto Protocol is an offspring of this process. The Kyoto Protocol and its accords state that signatory countries need to monitor their forest C pools, and to follow the guidelines set by the IPCC in the preparation, reporting and quality assessment of the C pool change estimates. The aims of this thesis were i) to estimate the changes in the carbon stocks of vegetation and soil in Finnish forests from 1922 to 2004, ii) to evaluate the applied methodology using empirical data, iii) to assess the reliability of the estimates by means of uncertainty analysis, iv) to assess the effect of forest C sinks on the reliability of the entire national GHG inventory, and finally, v) to present an application of model-based stratification to a large-scale sampling design of soil C stock changes. The applied methodology builds on measured forest inventory data (or modelled stand data), and uses statistical modelling to predict biomasses and litter production, as well as a dynamic soil C model to predict the decomposition of litter. The mean vegetation C sink of Finnish forests from 1922 to 2004 was 3.3 Tg C a⁻¹, and the mean soil C sink was 0.7 Tg C a⁻¹. Soil is slowly accumulating C because the growing stock has increased and soil C stocks are unsaturated relative to the current detritus input, which is higher than at the beginning of the period. Annual estimates of vegetation and soil C stock changes fluctuated considerably during the period and were frequently of opposite sign (e.g. vegetation was a sink while soil was a source).

The inclusion of vegetation sinks in the national GHG inventory of 2003 increased its uncertainty from between -4% and 9% to ± 19% (95% CI), and the further inclusion of upland mineral soils increased it to ± 24%. The uncertainties of annual sinks can be reduced most efficiently by concentrating on the quality of the model input data. Despite the decreased precision of the national GHG inventory, the inclusion of uncertain sinks improves its accuracy owing to the larger sectoral coverage of the inventory. If the national soil sink estimates were prepared by repeated soil sampling of model-stratified sample plots, the uncertainties would be accounted for in the stratum formation and sample allocation; otherwise, the gains in sampling efficiency from stratification remain smaller. The highly variable and frequently opposite annual changes in ecosystem C pools underline the importance of full ecosystem C accounting. If forest C sink estimates are to be used in practice, average sink estimates seem a more reasonable basis than annual estimates: annual forest sinks vary considerably, annual estimates are uncertain, and this uncertainty has severe consequences for the reliability of the total national GHG balance. The estimation of average sinks should still be based on annual or even more frequent data, because the non-linear decomposition process is influenced by the annual climate. The methodology used in this study to predict forest C sinks can be transferred to other countries with some modifications. The ultimate verification of sink estimates should be based on comparison with empirical data, in which case the model-based stratification presented in this study can serve to improve the efficiency of the sampling design.
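The uncertainty figures above come from propagating input uncertainties through the inventory. A minimal Monte Carlo sketch of the idea, with made-up sector means and standard deviations (not the thesis figures), shows how adding a large but uncertain sink widens the confidence interval of the net balance while extending its coverage:

```python
import random
import statistics

random.seed(1)

def mc_total(sectors, n=50_000):
    """Monte Carlo propagation: each sector is (mean, sd) in Tg CO2-eq.
    Returns the mean and the 95 % interval half-width of the summed
    balance. All numbers here are illustrative, not the thesis figures."""
    totals = [sum(random.gauss(m, s) for m, s in sectors) for _ in range(n)]
    return statistics.fmean(totals), 1.96 * statistics.stdev(totals)

# Emission sectors only vs. emissions plus an uncertain forest sink:
emissions = [(60.0, 1.5), (20.0, 1.0)]   # fairly well-known sources
with_sink = emissions + [(-20.0, 6.0)]   # large, uncertain sink

m1, hw1 = mc_total(emissions)
m2, hw2 = mc_total(with_sink)
print(f"sources only: {m1:.1f} ± {hw1:.1f}")
print(f"with sink:    {m2:.1f} ± {hw2:.1f}")
# The interval of the net balance widens (lower precision), but the
# balance covers more sectors (higher accuracy), as noted above.
```

Because independent uncertainties add in quadrature, one dominant, poorly known sink term drives the total uncertainty, which is why the abstract recommends concentrating effort on the quality of the model input data.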

Abstract:

This work focuses on the factors affecting the species richness, abundance and species composition of butterflies and moths in Finnish semi-natural grasslands, with special interest in the effects of grazing management. An additional aim was to evaluate the effectiveness of the support for livestock grazing in semi-natural grasslands included in the Finnish agri-environment scheme. In the first field study, butterfly and moth communities in resumed semi-natural pastures were compared to old, annually grazed pastures and abandoned former pastures. Butterfly and moth species composition in restored pastures came to resemble that observed in old pastures after about five years of resumed cattle grazing, but the diversity of butterflies and moths in resumed pastures remained at a lower level than in old pastures. None of the butterfly and moth species typical of old pastures had become more abundant in restored pastures than in abandoned pastures. It therefore appears that the restoration of butterfly and moth communities inhabiting semi-natural grasslands requires a longer time than was available for monitoring in this study. The second study showed that local habitat quality has a larger impact on the occurrence and abundance of butterflies and moths than grassland patch area or the connectivity of the regional grassland network. This emphasizes the importance of the current and historical management of semi-natural grasslands for butterfly and moth communities. A positive effect of habitat connectivity was observed on the total abundance of declining butterflies and moths, suggesting that these species have their strongest populations in well-connected habitat networks. The species richness and peak abundance of most individual butterfly and moth species were generally highest in taller grassland vegetation than was the case for vascular plants, suggesting that the insects favour less intensive management.
These differences between plants and their insect herbivores may be understood in the light of (1) the higher structural diversity of tall vegetation and (2) the weaker disturbance tolerance of herbivorous insects, which occupy a higher trophic level than plants. The ecological requirements of all species and species groups inhabiting semi-natural grasslands are probably never met at single restricted sites. Therefore, regional implementation of management to create differently managed areas is imperative for the conservation of the various species and species groups dependent on semi-natural grasslands. With limited resources, it might be reasonable to focus much of the management effort on the densest networks of suitable habitat to minimise the risk of extinction of the declining species.

Abstract:

Juvenile idiopathic arthritis (JIA) is a heterogeneous group of childhood chronic arthritides, associated with chronic uveitis in 20% of cases. For JIA patients responding inadequately to conventional disease-modifying anti-rheumatic drugs (DMARDs), biologic therapies such as anti-tumor necrosis factor (anti-TNF) agents are available. This retrospective multicenter study included 258 JIA patients refractory to DMARDs who received biologic agents during 1999-2007. Prior to the initiation of anti-TNFs, growth velocity was delayed in 75% and normal in 25% of the 71 patients assessed. Those with delayed growth demonstrated a significant increase in growth velocity after the initiation of anti-TNFs. The increase in growth rate was unrelated to the pubertal growth spurt, and no change was observed in skeletal maturation before and after anti-TNFs. The strongest predictor of change in growth velocity was the growth rate prior to anti-TNFs; change in inflammatory activity remained a significant predictor even after the decrease in glucocorticoids was taken into account. In JIA-associated uveitis, the impact of two first-line biologic agents, etanercept and infliximab, and of the second- or third-line anti-TNF agent adalimumab was evaluated. Of 108 refractory JIA patients receiving etanercept or infliximab, uveitis occurred in 45 (42%). Uveitis improved in 14 (31%), showed no change in 14 (31%), and worsened in 17 (38%). Uveitis improved more frequently (p=0.047) and the frequency of annual uveitis flares was lower (p=0.015) on infliximab than on etanercept. Of 20 patients taking adalimumab, 19 (95%) had previously failed etanercept and/or infliximab. Uveitis improved in 7 patients (35%), worsened in one (5%), and showed no change in 12 (60%). Those with improved uveitis were younger and had a shorter disease duration. No serious adverse events (AEs) or side-effects were observed, and adalimumab was effective also in arthritis.

Long-term drug survival (i.e. the continuation rate on the drug) with etanercept (n=105) vs. infliximab (n=104) was 68% vs. 68% at 24 months, and 61% vs. 48% at 48 months (p=0.194 in log-rank analysis). The first-line anti-TNF agent was discontinued due to inefficacy (etanercept 28% vs. infliximab 20%, p=0.445), AEs (7% vs. 22%, p=0.002), or inactive disease (10% vs. 16%, p=0.068). Females, patients with systemic JIA (sJIA), and those taking infliximab as the first therapy were at higher risk of treatment discontinuation. One-third switched to a second anti-TNF agent, which was discontinued less often than the first. In conclusion, in refractory JIA, anti-TNFs induced enhanced growth velocity. Four-year treatment survival was comparable between etanercept and infliximab, and switching from a first-line to a second-line agent is a reasonable therapeutic option. During anti-TNF treatment, uveitis improved in one-third of patients with JIA-associated anterior uveitis.
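The drug-survival percentages above are Kaplan-Meier-type continuation rates, compared with a log-rank test. A minimal sketch of the Kaplan-Meier estimator on a made-up cohort (not the study data) illustrates how discontinuations and censored follow-ups combine into a survival curve:

```python
def kaplan_meier(events):
    """Kaplan-Meier drug-survival estimator.

    events: list of (time_on_drug_months, discontinued) pairs, where
    discontinued is True if the drug was stopped at that time and
    False if the patient was censored (still on the drug at the end
    of follow-up). Returns [(time, survival_probability), ...].
    The cohort below is illustrative, not the thesis data.
    """
    at_risk = len(events)
    surv = 1.0
    curve = []
    for t, stopped in sorted(events):
        if stopped:
            # at each discontinuation, survival drops by the fraction
            # of patients still at risk who stop at this time
            surv *= (at_risk - 1) / at_risk
            curve.append((t, surv))
        at_risk -= 1  # censored patients leave the risk set silently
    return curve

cohort = [(6, True), (12, False), (18, True), (24, False),
          (30, True), (48, False), (48, False), (60, True)]
for t, s in kaplan_meier(cohort):
    print(f"{t:>2} months: {s:.2f}")
```

Censoring is what distinguishes this from a naive percentage: patients still on the drug at last follow-up reduce the risk set without counting as discontinuations.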

Abstract:

Acute knee injury is a common event throughout life, usually resulting from a traffic accident, a simple fall, or a twisting injury. Over 90% of patients with acute knee injury undergo radiography. An overlooked fracture or delayed diagnosis can lead to a poor patient outcome. The major aim of this thesis was to study retrospectively the imaging of knee injury, with a special focus on tibial plateau fractures, in patients referred to a level-one trauma center. Multi-detector computed tomography (MDCT) findings of acute knee trauma were studied and compared to radiography, as was the question of whether non-contrast MDCT can detect cruciate ligaments with reasonable accuracy. The prevalence, type, and location of meniscal injuries in magnetic resonance imaging (MRI) were evaluated, particularly to assess the prevalence of unstable meniscal tears in acute knee trauma with tibial plateau fractures. The possibility of analyzing with conventional MRI the signal appearance of menisci repaired with bioabsorbable arrows was also studied. The postoperative use of MDCT was studied in surgically treated tibial plateau fractures, to establish the frequency of and indications for MDCT and to assess the common findings and their clinical impact in a level-one trauma hospital. This thesis focused on MDCT and MRI of knee injuries, and radiographs were analyzed when applicable. Radiography constitutes the basis for imaging acute knee injury, but MDCT can yield information beyond the capabilities of radiography. Especially in severely injured patients, sufficient radiographs are often difficult to obtain, and in those patients radiography is unreliable for ruling out fractures. MDCT detected intact cruciate ligaments with good specificity, accuracy, and negative predictive value, but the assessment of torn ligaments was unreliable. A total of 36% (14/39) of patients with a tibial plateau fracture had an unstable meniscal tear in MRI.
When a meniscal tear is properly detected preoperatively, its treatment can be combined with primary fracture fixation, thus avoiding another operation. The number of meniscal contusions was high; awareness of the imaging features of this meniscal abnormality can help radiologists increase specificity by avoiding false-positive diagnoses of meniscal tears. In MRI, menisci repaired with bioabsorbable arrows showed no difference in signal intensities between patients with an operated and those with an intact ACL. The incidence of menisci with an increased signal intensity extending to the meniscal surface was highest in patients whose surgery had taken place within the previous 18 months. This may indicate that a rather long time is necessary for menisci to heal completely after arrow repair. Whether menisci with an increased signal intensity extending to the meniscal surface represent improper healing or a re-tear, or simply an earlier stage of the natural healing process, remains unclear, and further prospective studies are needed to clarify this. The postoperative use of MDCT in tibial plateau fractures was rather infrequent even in this large trauma center, but when performed, it revealed clinically significant information, thus benefiting patients with regard to treatment.

Abstract:

Airway inflammation is a key feature of bronchial asthma, and according to international guidelines the gold standard of asthma management is anti-inflammatory treatment. To date, only conventional procedures (i.e., symptoms, use of rescue medication, PEF variability, and lung function tests) have been used both to diagnose asthma and to evaluate the results of treatment with anti-inflammatory drugs; new methods for evaluating the degree of airway inflammation are required. Nitric oxide (NO) is a gas produced in the airways of healthy subjects and in increased amounts in asthmatic airways, and it can be measured from exhaled air. Fractional exhaled NO (FENO) is increased in asthma, and the highest concentrations are measured in asthmatic patients not treated with inhaled corticosteroids (ICS). Steroid-treated patients with asthma had FENO levels similar to those of healthy controls. Atopic asthmatics had higher FENO levels than nonatopic asthmatics, indicating that the level of atopy affects FENO. Associations between FENO and bronchial hyperresponsiveness (BHR) occur in asthma. The present study demonstrated that the measurement of FENO has good reproducibility, and that both short- and long-term FENO variability is reasonable in healthy subjects as well as in patients with respiratory symptoms or asthma. We established an upper normal limit of 12 ppb for healthy subjects, calculated from two different healthy study populations. We showed that patients with respiratory symptoms who did not fulfil the diagnostic criteria of asthma had FENO values significantly higher than those of healthy subjects, but significantly lower than those of asthma patients. The findings also suggest that BHR to histamine is a sensitive indicator of the effect of ICS and a valuable tool for the adjustment of corticosteroid treatment in mild asthma.
The findings further suggest that intermittent treatment periods of a few weeks' duration are insufficient to provide long-term control of BHR in patients with mild persistent asthma. Moreover, during treatment with ICS, changes in BHR and changes in FENO were associated. The FENO level was associated with BHR measured by a direct (histamine challenge) or an indirect method (exercise challenge) in steroid-naïve, symptomatic, non-smoking asthmatics. Although these associations were found only in atopics, the FENO level was also increased in nonatopic asthma. It can thus be concluded that the assessment of airway inflammation by measuring FENO can be useful for clinical purposes, and the methodology of FENO measurement is now validated. Especially in patients with respiratory symptoms who do not fulfil the diagnostic criteria of asthma, FENO measurement can aid in treatment decisions. Serial measurement of FENO during treatment with ICS can be a complementary or alternative method of evaluation in patients with asthma.

Abstract:

Differentiation of various types of soft tissue is of high importance in medical imaging, because changes in soft tissue structure are often associated with pathologies, such as cancer. However, the densities of different soft tissues may be very similar, making them difficult to distinguish in absorption images. This is especially true when consideration of the patient dose limits the available signal-to-noise ratio. Refraction is more sensitive than absorption to changes in density, while small-angle x-ray scattering contains information about the macromolecular structure of the tissues. Both can be used as potential sources of contrast when soft tissues are imaged, but little is known about the visibility of these signals in realistic imaging situations. In this work, the visibility of small-angle scattering and refraction in the context of medical imaging has been studied using computational methods. The work focuses on analyzer-based imaging, where the information about the sample is recorded in the rocking curve of the analyzer crystal. Computational phantoms based on simple geometrical shapes with differing material properties are used. The objects have realistic dimensions and attenuation properties that could be encountered in real imaging situations, and their scattering properties mimic various features of measured small-angle scattering curves. Ray-tracing methods are used to calculate the refraction and attenuation of the beam, and a scattering halo is accumulated, including the effect of multiple scattering. The changes in the shape of the rocking curve are analyzed with different methods, including diffraction enhanced imaging (DEI), extended DEI (E-DEI) and multiple image radiography (MIR). A wide-angle DEI, called W-DEI, is introduced and its performance is compared with that of the established methods.
The results indicate that the differences in the scattered intensities from healthy and malignant breast tissues are distinguishable to some extent with a reasonable dose. In particular, the fraction of total scattering shows large enough differences to serve as a useful source of contrast. The peaks related to the macromolecular structure appear at rather large angles and have intensities that are only a small fraction of the total scattered intensity; such peaks seem to have only limited usefulness in medical imaging. It is also found that W-DEI performs rather well when most of the intensity remains in the direct beam, indicating that dark-field imaging methods may produce the best results when scattering is weak. Altogether, the analysis of scattered intensity is found to be a viable option even in medical imaging, where the patient dose is the limiting factor.
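As a rough illustration of the DEI analysis named above, the standard two-image algorithm linearizes the rocking curve at its half-intensity slope points and solves the two measured intensities for an apparent-absorption and a refraction-angle image. The Gaussian rocking curve and all parameter values here are assumptions for the sketch, not the thesis setup:

```python
import math

SIGMA = 1.0  # rocking-curve width parameter (microrad), illustrative

def R(theta):
    """Assumed Gaussian rocking curve of the analyzer crystal."""
    return math.exp(-theta**2 / (2 * SIGMA**2))

def dR(theta):
    """Derivative of the rocking curve."""
    return -theta / SIGMA**2 * R(theta)

THETA_HALF = SIGMA * math.sqrt(2 * math.log(2))  # half-maximum positions

def dei(i_low, i_high):
    """Solve I = I_R * [R(theta) + R'(theta) * dtheta] at the two
    slope points for transmitted intensity I_R and refraction angle."""
    rl, rh = R(-THETA_HALF), R(THETA_HALF)
    dl, dh = dR(-THETA_HALF), dR(THETA_HALF)
    dtheta = (i_high * rl - i_low * rh) / (i_low * dh - i_high * dl)
    i_r = i_low / (rl + dl * dtheta)
    return i_r, dtheta

# Forward-simulate one pixel: 70 % transmission, 0.2 microrad refraction
# shifts the rocking curve; then recover both from the two images.
i_r_true, dt_true = 0.7, 0.2
i_low = i_r_true * R(-THETA_HALF + dt_true)
i_high = i_r_true * R(THETA_HALF + dt_true)
print(dei(i_low, i_high))  # approximately (0.7, 0.2)
```

The recovery is only approximate because DEI linearizes the rocking curve; when scattering broadens the curve as well, more than two images are needed, which is what E-DEI and MIR address.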

Abstract:

The Antarctic system comprises the continent itself, Antarctica, and the ocean surrounding it, the Southern Ocean. The system plays an important part in the global climate due to its size, its high-latitude location and the negative radiation balance of its large ice sheets. Antarctica has also been in focus for several decades due to increased ultraviolet (UV) levels caused by stratospheric ozone depletion, and due to the disintegration of its ice shelves. In this study, measurements were made during three austral summers to study the optical properties of the Antarctic system and to produce radiation information for additional modeling studies related to specific phenomena found in the system. During the summer of 1997-1998, measurements of beam absorption and beam attenuation coefficients, and of downwelling and upwelling irradiance, were made in the Southern Ocean along a S-N transect at 6°E. The attenuation of photosynthetically active radiation (PAR) was calculated and used together with hydrographic measurements to judge whether the phytoplankton in the investigated areas of the Southern Ocean are light limited. By using the Kirk formula, the diffuse attenuation coefficient was linked to the absorption and scattering coefficients. The diffuse attenuation coefficients for PAR (K_PAR) were found to vary between 0.03 and 0.09 m⁻¹. Using the values for K_PAR and the definition of the Sverdrup critical depth, the studied Southern Ocean plankton systems were found not to be light limited. Variability in the spectral and total albedo of snow was studied in the Queen Maud Land region of Antarctica during the summers of 1999-2000 and 2000-2001. The measurement areas were the vicinity of the South African Antarctic research station SANAE 4 and a traverse near the Finnish Antarctic research station Aboa. The midday mean total albedo of snow was between 0.83 (clear skies) and 0.86 (overcast skies) at Aboa, and between 0.81 and 0.83 at SANAE 4.
The mean spectral albedo levels at Aboa and SANAE 4 were very close to each other, and the variations in the spectral albedos were due more to differences in ambient conditions than to variations in snow properties. A Monte Carlo model was developed to study the spectral albedo and to develop a novel non-destructive method for measuring the diffuse attenuation coefficient of snow. The method was based on the decay of upwelling radiation with horizontal distance from a source of downwelling light, which was assumed to be related to the diffuse attenuation coefficient. In the model, the attenuation coefficient obtained from the upwelling irradiance was higher than that obtained using vertical profiles of downwelling irradiance. The model results were compared to field measurements made on dry snow in Finnish Lapland, and they correlated reasonably well. Low-elevation (below 1000 m) blue-ice areas may experience substantial melt-freeze cycles due to absorbed solar radiation and the small heat conductivity of the ice. A two-dimensional (x-z) model was developed to simulate the formation of, and water circulation in, subsurface ponds. The model results show that for a physically reasonable parameter set the formation of liquid water within the ice can be reproduced; the results are, however, sensitive to the chosen parameter values, whose exact values are not well known. Vertical convection and a weak overturning circulation are generated, stratifying the fluid and transporting warmer water downward, thereby causing additional melting at the base of the pond. In a 50-year integration, a global warming scenario, mimicked by an increase in air temperature of 3 degrees per 100 years, leads to a general increase in subsurface water volume. The ice did not disintegrate as a result of the air temperature increase during the 50-year integration.
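The Kirk formula mentioned above relates the diffuse attenuation coefficient to the absorption and scattering coefficients. A sketch using Kirk's Monte Carlo fit is given below; the coefficient values (0.425 and 0.19) and the water properties are assumptions for illustration, as the abstract does not state the exact form used in the thesis.

```python
import math

def kirk_kd(a, b, mu0=0.86):
    """Diffuse attenuation coefficient for downwelling PAR (1/m) from
    Kirk's Monte Carlo relation: Kd = sqrt(a^2 + G*a*b) / mu0, with
    G(mu0) = 0.425*mu0 - 0.19 (coefficients assumed here).

    a   -- absorption coefficient (1/m)
    b   -- scattering coefficient (1/m)
    mu0 -- cosine of the in-water solar zenith angle
    """
    g = 0.425 * mu0 - 0.19
    return math.sqrt(a * a + g * a * b) / mu0

def euphotic_depth(kd):
    """Depth of the 1 % light level: z_eu = ln(100) / Kd."""
    return math.log(100) / kd

# Illustrative clear Southern Ocean water:
kd = kirk_kd(a=0.04, b=0.15)
print(f"Kd   = {kd:.3f} 1/m")  # falls within the reported 0.03-0.09 1/m
print(f"z_eu = {euphotic_depth(kd):.0f} m")
```

With Kd values this small, the euphotic zone is deep; comparing such depths with the mixed-layer depth is the essence of the Sverdrup critical-depth argument used to conclude that the studied plankton systems were not light limited.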

Abstract:

The smoke and fumes of the city: Air protection in Helsinki from 1945 to 1982

This dissertation examines air pollution and air protection in post-war Helsinki. The period studied ends in 1982, when the Air Protection Act entered into force, thus institutionalising air protection in Finland as a socially governed environmental matter. The dissertation is based on the research traditions of environmental politics and urban environmental history. The development of air protection is approached from the perspectives of politicisation and institutionalisation. The dissertation also investigates how air pollution grew into a social issue and presents various discursive ways of analysing air pollution and protection. The primary research material consists of municipal documents and newspapers, while supplementary material includes journal articles and interviews. The event history of air protection is described through an analysis of the material, including source criticism. The social ways of dealing with air pollution and the emergence of air protection are analysed in the light of case-specific air quality disputes, from both factual and discursive perspectives. This approach enables the contextualisation of the development of air protection as part of the local history of post-war Helsinki. The dissertation presents the major sources of air pollution in Helsinki and describes the deterioration of air quality in a society which emphasised the primacy of economic prosperity. The air issue emerged during the 1950s in neighbourhood disputes and escalated into a larger problem in the late 1960s. Concurrently with the formation of the field of environmental protection in Finland, an air protection organisation was established in Helsinki in the 1970s. As a result, air protection became a regular part of municipal government.
Air protection in Helsinki developed from small-scale policies focused on individual cases into a large, institutionalised air protection system managed by experts. The dissertation research material gave rise to the following major research themes: the economic dimension of the air issue, the role of science in the formation of the environmental problem, and the establishment of norms for acceptable air quality and reasonable limits to air pollution in the urban environment. The paper also discusses the inequitable distribution of the negative effects of air pollution between the residents of different districts. The dissertation concludes that air protection in Helsinki became a local success story although it was long marred by inefficiency and partial failure.


Resumo:

This thesis studies binary time series models and their applications in empirical macroeconomics and finance. In addition to previously suggested models, new dynamic extensions are proposed to the static probit model commonly used in the previous literature. In particular, we are interested in probit models with an autoregressive model structure. In Chapter 2, the main objective is to compare the predictive performance of the static and dynamic probit models in forecasting U.S. and German business cycle recession periods. Financial variables, such as interest rates and stock market returns, are used as predictors. The empirical results suggest that the recession periods are predictable and that dynamic probit models, especially those with the autoregressive structure, outperform the static model. Chapter 3 proposes a Lagrange Multiplier (LM) test for the usefulness of the autoregressive structure of the probit model. The finite-sample properties of the LM test are examined with simulation experiments. The results indicate that the two alternative LM test statistics have reasonable size and power in large samples. In small samples, a parametric bootstrap method is suggested to obtain approximately correct size. In Chapter 4, the predictive power of dynamic probit models in predicting the direction of stock market returns is examined. The novel idea is to use the recession forecast (see Chapter 2) as a predictor of the sign of the stock return. The evidence suggests that the signs of U.S. excess stock returns over the risk-free return are predictable both in and out of sample. The new "error correction" probit model yields the best forecasts and also outperforms other predictive models, such as ARMAX models, in terms of statistical and economic goodness-of-fit measures. Chapter 5 generalizes the analysis of the univariate models considered in Chapters 2–4 to the case of a bivariate model.
A new bivariate autoregressive probit model is applied to predict the current state of the U.S. business cycle and growth rate cycle periods. Evidence of predictability of both cycle indicators is obtained, and the bivariate model is found to outperform the univariate models in terms of predictive power.
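The autoregressive probit structure referred to above can be illustrated with a simple specification in which the linear index driving the probit probability depends on its own lag: pi_t = omega + alpha * pi_{t-1} + beta * x_{t-1}, with P(y_t = 1) = Phi(pi_t). A minimal maximum-likelihood sketch of this idea; the parameterisation, variable names, and simulated data are illustrative assumptions, not the thesis's exact models:

```python
import math
import numpy as np
from scipy.optimize import minimize

def Phi(z):
    # Standard normal CDF
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def neg_loglik(params, y, x):
    """Negative log-likelihood of a simple autoregressive probit:
    pi_t = omega + alpha*pi_{t-1} + beta*x_{t-1},  P(y_t = 1) = Phi(pi_t)."""
    omega, alpha, beta = params
    pi, ll = 0.0, 0.0
    for t in range(1, len(y)):
        pi = omega + alpha * pi + beta * x[t - 1]
        p = min(max(Phi(pi), 1e-12), 1.0 - 1e-12)  # guard the logs
        ll += y[t] * math.log(p) + (1 - y[t]) * math.log(1.0 - p)
    return -ll

# Simulate a binary series from known parameters, then re-estimate by ML
rng = np.random.default_rng(0)
true = (-0.2, 0.6, 0.8)
n = 1000
x = rng.normal(size=n)
y = np.zeros(n, dtype=int)
pi = 0.0
for t in range(1, n):
    pi = true[0] + true[1] * pi + true[2] * x[t - 1]
    y[t] = int(rng.random() < Phi(pi))

res = minimize(neg_loglik, x0=np.zeros(3), args=(y, x), method="Nelder-Mead")
```

The lagged index lets past predictor values influence today's recession probability even when they have dropped out of x, which is the feature the dynamic models exploit.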


Resumo:

This thesis examines Finnish consumer culture and its transformation through narratives of consumption. The data consist of the consumer life stories of 39 elderly Finns born between the 1920s and the 1950s, collected through a writing competition. The thesis analyses the virtues that the informants attach to consumption and the use of money, and the consumption ethoses these virtues reflect. Close reading was applied to the analysis and interpretation of the life stories, and on this basis an interpretation of a middle-class consumption ethos was constructed. The theoretical framework draws on research into the normative meanings of consumption, consumption ethoses, and middle-classness. The interpretation is further guided by an understanding of the life course of the generation studied in relation to the development of Finnish consumer society. During the lifetime of this generation, the household developed from the self-sufficient production unit of an agrarian society into an institution of an affluent society centred on consumption and social reproduction. Wage labour, leisure time, and opportunities for consumption increased, and society became increasingly middle-class. The virtues of thrift and modesty found in the life stories show that the peasant consumption ethos remains central, but alongside the idealisation of scarcity the narratives also reveal more modern attitudes towards consumption. I interpret the virtues of sensibleness, ordinariness, and hard work found in the consumer life stories as expressions of middle-classness. In the middle-class consumption ethos, virtuous consumption is more permissive than in the peasant ethos: instead of idealising frugality, it idealises moderate and ordinary consumption. In the middle-class consumption ethos it is acceptable to enjoy, in moderation and sensibly, the prosperity earned through one's own work. Unlike the peasant consumption ethos, the middle-class ethos accepts the pleasure derived from consumption.
The saying "first work, then pleasure" captures the middle-class consumption ethos's attitude towards the consumer culture, and the pleasures, that growing prosperity brings with it.


Resumo:

Inter-enterprise collaboration has become essential for the success of enterprises. As competition increasingly takes place between supply chains and networks of enterprises, there is a strategic business need to participate in multiple collaborations simultaneously. Collaborations based on an open market of autonomous actors set special requirements for the computing facilities supporting the setup and management of these business networks. Currently, the safeguards against privacy threats in collaborations crossing organizational borders are both insufficient and incompatible with the open market. A broader understanding is needed of the architecture of defense structures, and privacy threats must be detected not only at the level of a private person or enterprise but also at the community and ecosystem levels. Control measures must be automated wherever possible in order to keep the cost and effort of collaboration management reasonable. This article contributes to the understanding of the modern inter-enterprise collaboration environment and the privacy threats in it, and presents the automated control measures required to ensure that actors in inter-enterprise collaborations behave correctly to preserve privacy.


Resumo:

Nucleation is the first step of a phase transition, in which small nuclei of the new phase start appearing in the metastable old phase, such as the appearance of small liquid clusters in a supersaturated vapor. Nucleation is important in various industrial and natural processes, including atmospheric new particle formation: between 20 % and 80 % of the atmospheric particle concentration is due to nucleation. These atmospheric aerosol particles have a significant effect on both climate and human health. Simulation methods are often applied when studying phenomena that are difficult or even impossible to measure, or when trying to distinguish between the merits of various theoretical approaches. Such methods include, among others, molecular dynamics and Monte Carlo simulations. In this work, molecular dynamics simulations of the homogeneous nucleation of Lennard-Jones argon have been performed; homogeneous means that the nucleation does not occur on a pre-existing surface. The simulations include runs where the starting configuration is a supersaturated vapor and the nucleation event is observed during the simulation (direct simulations), as well as simulations of a cluster in equilibrium with a surrounding vapor (indirect simulations). The latter type is a necessity when the conditions prevent the occurrence of a nucleation event within a reasonable timeframe in the direct simulations. The effect of various temperature control schemes on the nucleation rate (the rate of appearance of clusters that are equally likely to grow to macroscopic size or to evaporate) was studied and found to be relatively small. The method used to extract the nucleation rate was also found to be of minor importance. The cluster sizes from the direct and indirect simulations were used in conjunction with the nucleation theorem to calculate formation free energies for the clusters in the indirect simulations.
The results agreed with density functional theory but were higher than values from Monte Carlo simulations. The formation energies were also used to calculate the surface tension of the clusters. Comparison of the cluster sizes in the direct and indirect simulations showed that the direct-simulation clusters have more atoms between the liquid-like core of the cluster and the surrounding vapor. Finally, the performance of various nucleation theories in predicting the simulated nucleation rates was investigated; among other things, the results once again highlighted the inadequacy of the classical nucleation theory commonly employed in nucleation studies.
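The nucleation theorem used above links measurable rate data to the critical cluster size: at constant temperature, the slope of ln J against ln S is approximately n* + 1, where J is the nucleation rate, S the supersaturation, and n* the number of molecules in the critical cluster. A minimal sketch of extracting n* from rate data; the function name and the synthetic numbers are illustrative assumptions, not results from this work:

```python
import numpy as np

def critical_size_from_rates(supersaturations, rates):
    """First nucleation theorem: the slope of ln J versus ln S
    approximates n* + 1, where n* is the critical cluster size."""
    slope, _intercept = np.polyfit(np.log(supersaturations), np.log(rates), 1)
    return slope - 1.0

# Synthetic rate data consistent with a critical cluster of 20 molecules:
# ln J is linear in ln S with slope n* + 1 = 21
S = np.array([8.0, 10.0, 12.0, 15.0])
J = 1e20 * S**21
n_star = critical_size_from_rates(S, J)
```

With simulated rates, n* obtained this way can be compared against the cluster sizes counted directly in the simulations, which is the kind of cross-check the formation-free-energy analysis relies on.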