47 results for Value adding
in Helda - Digital Repository of University of Helsinki
Abstract:
This dissertation analyzes the interrelationship between death, the conditions of (wo)man's social being, and the notion of value as it emerges in the fiction of the American novelist Thomas Pynchon (b. 1937). Pynchon's work to date includes six novels, V. (1963), The Crying of Lot 49 (1966), Gravity's Rainbow (1973), Vineland (1990), Mason & Dixon (1997) and Against the Day (2006), as well as several short stories. Death constitutes a central thematic in Pynchon's work, and it emerges through recurrent questions of mortality, suicide, mass destruction, sacrifice, afterlife, entropy, the relationship between the animate and the inanimate, and the limits of representation. In Pynchon, death is never a mere biological given (or event); it is always determined within a certain historical, cultural, and ideological context. Throughout his work, Pynchon questions the strict ontological separation of life and death by showing the relationship between this separation and social power. Conceptual divisions also reflect the relationship between society and its others, and death becomes that through which lines of social demarcation are articulated. Determined as a conceptual and social "other side", death in Pynchon forms a challenge to modern culture and makes an unexpected return: the dead return to haunt the living, the inanimate and the animate fuse, and technoscientific attempts at overcoming and controlling death result in its re-emergence in mass destruction and ecological damage. The questioning of the ontological line also affects the structuration of Pynchon's prose, where the recurrent narrated and narrative desire to reach the limits of representation is openly associated with death. Textualized, death appears in Pynchon's writing as a sudden rupture within the textual functioning, when the "other side", that is, the bare materiality of the signifier, is foregrounded. In this study, Pynchon's cultural criticism and his poetics come together, and I analyze the subversive role of death in his fiction through Jean Baudrillard's genealogy of the modern notion of death from L'échange symbolique et la mort (1976). Baudrillard sees an intrinsic bond between the social repression of death in modernity and the emergence of modern political economy, and in his analysis economy and language appear as parallel systems for generating value (exchange value/sign value). For Baudrillard, the modern notion of death as negativity in relation to the positivity of life, and the fact that death cannot be given a proper meaning, betray an antagonistic relation between death and the notion of value. As a mode of negativity (that is, non-value), death becomes a moment of rupture in relation to value-based thinking, in short, rationalism. Through this rupture emerges a form of thinking Baudrillard labels the symbolic, characterized by ambivalence and the subversion of conceptual opposites.
Abstract:
Diseases caused by the Lancefield group A streptococcus, Streptococcus pyogenes, are amongst the most challenging to clinicians and public health specialists alike. Although severe infections caused by S. pyogenes are relatively uncommon, affecting around 3 per 100,000 of the population per annum in developed countries, the case fatality is high relative to many other infections. Despite a long scientific tradition of studying their occurrence and characteristics, many aspects of their epidemiology remain poorly understood, and potential control measures undefined. Epidemiological studies can play an important role in identifying host, pathogen and environmental factors associated with risk of disease, manifestation of particular syndromes or poor survival. This can be of value in targeting prevention activities, as well as in directing further basic research, potentially paving the way for the identification of novel therapeutic targets. The formation of a European network, Strep-EURO, provided an opportunity to explore epidemiological patterns across Europe. Funded by the Fifth Framework Programme of the European Commission's Directorate-General for Research (QLK2.CT.2002.01398), the Strep-EURO network was launched in September 2002. Twelve participants across eleven countries took part, led by the University of Lund in Sweden. Cases were defined as patients with S. pyogenes isolated from a normally sterile site, or from a non-sterile site in combination with clinical signs of streptococcal toxic shock syndrome (STSS). All participating countries undertook prospective enhanced surveillance between 1st January 2003 and 31st December 2004 to identify cases diagnosed during this period. A standardised surveillance dataset was defined, comprising demographic, clinical and risk factor information collected through a questionnaire. Isolates were collected by the national reference laboratories and characterised according to their M protein using conventional serological and emm gene typing. Descriptive statistics and multivariable analyses were undertaken to compare characteristics of cases between countries and to identify factors associated with increased risk of death or development of STSS. Crude and age-adjusted rates of infection were calculated for each country where a catchment population could be defined. The project succeeded in establishing the first European surveillance network for severe S. pyogenes infections, with 5522 cases identified over the two years. Analysis of data gathered in the eleven countries yielded important new information on the epidemiology of severe S. pyogenes infections in Europe during the 2000s. Comprehensive epidemiological data on these infections were obtained for the first time from France, Greece and Romania. Incidence estimates identified a general north-south gradient, from high to low. Remarkably similar age-standardised rates were observed among the three Nordic participants, between 2.2 and 2.3 per 100,000 population. Rates in the UK were higher still, at 2.9 per 100,000, elevated by an upsurge of infections among injecting drug users. Rates from these northern countries were reasonably close to those observed in the USA and Australia during this period. In contrast, reported rates in the more central and southern countries (Czech Republic, Romania, Cyprus and Italy) were substantially lower, 0.3 to 1.5 per 100,000 population, a likely reflection of poorer uptake of microbiological diagnostic methods within these countries.
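The abstract does not spell out how the crude and age-adjusted rates were computed; the short sketch below illustrates the usual calculation of a crude rate and a directly age-standardised rate per 100,000, using hypothetical age bands, case counts, catchment populations and standard-population weights that are not Strep-EURO data.

    # Crude and directly age-standardised incidence rates per 100,000.
    # All numbers below are hypothetical illustrations, not Strep-EURO data.

    AGE_BANDS = ["0-14", "15-44", "45-64", "65+"]

    cases     = {"0-14": 12, "15-44": 48, "45-64": 61, "65+": 74}          # observed cases
    catchment = {"0-14": 900_000, "15-44": 2_100_000,
                 "45-64": 1_400_000, "65+": 800_000}                        # population at risk
    standard  = {"0-14": 0.17, "15-44": 0.42, "45-64": 0.25, "65+": 0.16}   # standard population weights

    def crude_rate(cases, catchment, per=100_000):
        """Total cases divided by total catchment population."""
        return per * sum(cases.values()) / sum(catchment.values())

    def age_standardised_rate(cases, catchment, standard, per=100_000):
        """Direct standardisation: weight each age-specific rate by the standard population."""
        return sum(standard[a] * per * cases[a] / catchment[a] for a in AGE_BANDS)

    print(f"crude rate: {crude_rate(cases, catchment):.2f} per 100,000")
    print(f"age-standardised rate: {age_standardised_rate(cases, catchment, standard):.2f} per 100,000")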
Analysis of project data brought some new insights into risk factors for severe S. pyogenes infection, especially the importance of injecting drug use in the UK, with infections in this group fundamentally reshaping the epidemiology of these infections during this period. Several novel findings arose through this work, including the high degree of congruence in seasonal patterns between countries and the seasonal changes in case fatality rates. Elderly patients, those with compromised immune systems, those who developed STSS and those infected with an emm/M78, emm/M5, emm/M3 or emm/M1 type were found to be most likely to die as a result of their infection, whereas those diagnosed with cellulitis, septic arthritis, puerperal sepsis or with non-focal infection were associated with low risk of death, as were infections occurring during October. Analysis of augmented data from the UK found use of NSAIDs to be significantly associated with development of STSS, adding further fuel to the debate surrounding the role of NSAIDs in the development of severe disease. As a largely community-acquired infection, occurring sporadically and diffusely throughout the population, opportunities for control of severe infections caused by S. pyogenes remain limited, primarily involving contact chemoprophylaxis where clusters arise. Analysis of UK Strep-EURO data was used to quantify the risk to household contacts of cases, forming the basis of national guidance on the management of infection. Vaccines currently under development could offer a more effective control programme in future. Surveillance of invasive infections caused by S. pyogenes is of considerable public health importance as a means of identifying long- and short-term trends in incidence, allowing the need for, or impact of, public health measures to be evaluated. As a dynamic pathogen co-existing with a dynamic population, new opportunities for exploitation of its human host are likely to arise periodically, and as such continued monitoring remains essential.
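The abstract reports multivariable analyses of factors associated with death but does not name the model used; a logistic regression on a case line-listing is a common choice and is sketched below with entirely hypothetical variable names and simulated data, purely as an illustration of the kind of analysis described.

    # Illustrative multivariable analysis of case fatality. The model choice
    # (logistic regression), variable names and data are assumptions, not
    # details taken from the Strep-EURO study.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    np.random.seed(0)

    # Hypothetical line-listing: one row per case.
    df = pd.DataFrame({
        "died":      np.random.binomial(1, 0.2, 500),
        "age_group": np.random.choice(["0-14", "15-44", "45-64", "65+"], 500),
        "stss":      np.random.binomial(1, 0.1, 500),
        "emm_type":  np.random.choice(["emm1", "emm3", "emm28", "other"], 500),
    })

    # Logistic regression of death on age group, STSS and emm type.
    model = smf.logit("died ~ C(age_group) + stss + C(emm_type)", data=df).fit(disp=False)

    # Odds ratios with 95% confidence intervals.
    odds = np.exp(pd.concat([model.params, model.conf_int()], axis=1))
    odds.columns = ["OR", "2.5%", "97.5%"]
    print(odds)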
Composition operators, Aleksandrov measures and value distribution of analytic maps in the unit disc
Abstract:
A composition operator is a linear operator that precomposes any given function with another function, which is held fixed and called the symbol of the composition operator. This dissertation studies such operators and questions related to their theory in the case when the functions to be composed are analytic in the unit disc of the complex plane. Thus the subject of the dissertation lies at the intersection of analytic function theory and operator theory. The work contains three research articles. The first article is concerned with the value distribution of analytic functions. In the literature there are two different conditions which characterize when a composition operator is compact on the Hardy spaces of the unit disc. One condition is in terms of the classical Nevanlinna counting function, defined inside the disc, and the other condition involves a family of certain measures called the Aleksandrov (or Clark) measures and supported on the boundary of the disc. The article explains the connection between these two approaches from a function-theoretic point of view. It is shown that the Aleksandrov measures can be interpreted as kinds of boundary limits of the Nevanlinna counting function as one approaches the boundary from within the disc. The other two articles investigate the compactness properties of the difference of two composition operators, which is beneficial for understanding the structure of the set of all composition operators. The second article considers this question on the Hardy and related spaces of the disc, and employs Aleksandrov measures as its main tool. The results obtained generalize those existing for the case of a single composition operator. However, there are some peculiarities which do not occur in the theory of a single operator. The third article studies the compactness of the difference operator on the Bloch and Lipschitz spaces, improving and extending results given in the previous literature. Moreover, in this connection one obtains a general result which characterizes the compactness and weak compactness of the difference of two weighted composition operators on certain weighted Hardy-type spaces.
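For readers unfamiliar with the objects involved, the two compactness conditions referred to above are usually stated as follows; these are standard definitions and results from the literature, not formulas quoted from the thesis. For an analytic self-map $\varphi$ of the unit disc $\mathbb{D}$, the composition operator and the Nevanlinna counting function are

    C_\varphi f = f \circ \varphi,
    \qquad
    N_\varphi(w) = \sum_{z \in \varphi^{-1}(w)} \log\frac{1}{|z|},
    \qquad w \in \mathbb{D} \setminus \{\varphi(0)\},

and Shapiro's theorem states that $C_\varphi$ is compact on the Hardy space $H^2$ if and only if $N_\varphi(w) = o(\log(1/|w|))$ as $|w| \to 1^-$. The Aleksandrov (Clark) measures $\mu_\alpha$, one for each $\alpha$ on the unit circle, are defined through the Herglotz representation

    \operatorname{Re}\frac{\alpha + \varphi(z)}{\alpha - \varphi(z)}
    = \int_{\partial\mathbb{D}} \frac{1 - |z|^2}{|\zeta - z|^2}\, d\mu_\alpha(\zeta),
    \qquad z \in \mathbb{D},

and, by a result of Cima and Matheson, compactness of $C_\varphi$ on $H^2$ is equivalent to every $\mu_\alpha$ being absolutely continuous, i.e. to the vanishing of all their singular parts.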
Abstract:
A better understanding of the limiting step in a first-order phase transition, the nucleation process, is of major importance to a variety of scientific fields ranging from atmospheric sciences to nanotechnology and even to cosmology. This is due to the fact that in most phase transitions the new phase is separated from the mother phase by a free energy barrier. This barrier is crossed in a process called nucleation. Nowadays it is considered that a significant fraction of all atmospheric particles is produced by vapor-to-liquid nucleation. In atmospheric sciences, as well as in other scientific fields, the theoretical treatment of nucleation is mostly based on a theory known as the Classical Nucleation Theory. However, the Classical Nucleation Theory is known to have only limited success in predicting the rate at which vapor-to-liquid nucleation takes place at given conditions. This thesis studies unary homogeneous vapor-to-liquid nucleation from a statistical mechanics viewpoint. We apply Monte Carlo simulations of molecular clusters to calculate the free energy barrier separating the vapor and liquid phases and compare our results against laboratory measurements and Classical Nucleation Theory predictions. According to our results, the work of adding a monomer to a cluster in equilibrium vapour is accurately described by the liquid drop model applied by the Classical Nucleation Theory, once the clusters are larger than some threshold size. The threshold cluster sizes contain only a few or some tens of molecules, depending on the interaction potential and temperature. However, the error made in modeling the smallest clusters as liquid drops results in an erroneous absolute value for the cluster work of formation throughout the size range, as predicted by the McGraw-Laaksonen scaling law. By calculating correction factors to Classical Nucleation Theory predictions for the nucleation barriers of argon and water, we show that the corrected predictions produce nucleation rates that are in good agreement with experiments. For the smallest clusters, the deviation between the simulation results and the liquid drop values is accurately modelled by the low-order virial coefficients at modest temperatures and vapour densities, or in other words, in the validity range of the non-interacting cluster theory by Frenkel, Band and Bijl. Our results do not indicate a need for a size-dependent replacement free energy correction. The results also indicate that Classical Nucleation Theory predicts the size of the critical cluster correctly. We also present a new method for the calculation of the equilibrium vapour density, the size dependence of the surface tension and the planar surface tension directly from cluster simulations. We also show that the size dependence of the cluster surface tension at the equimolar surface is a function of the virial coefficients, a result confirmed by our cluster simulations.
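For reference, the liquid drop model referred to above can be summarised by the standard Classical Nucleation Theory expressions below; these are textbook formulas, included only to make the abstract's terminology concrete, where $S$ is the saturation ratio, $\sigma$ the planar surface tension and $v_l$ the molecular volume of the liquid. The work of forming an $n$-molecule cluster, the critical size, the barrier height and the nucleation rate read

    W(n) = -n\,k_B T \ln S + \sigma A(n), \qquad A(n) = (36\pi)^{1/3} v_l^{2/3} n^{2/3},

    n^{*} = \frac{32\pi \sigma^3 v_l^2}{3\,(k_B T \ln S)^3}, \qquad
    \Delta G^{*} = W(n^{*}) = \frac{16\pi \sigma^3 v_l^2}{3\,(k_B T \ln S)^2}, \qquad
    J = J_0 \exp\!\left(-\frac{\Delta G^{*}}{k_B T}\right).

The work of adding a monomer mentioned in the abstract corresponds to the difference $W(n+1) - W(n)$, and the correction factors computed in the thesis adjust the absolute value of $W(n)$ obtained from this liquid drop approximation.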
Abstract:
This thesis has two themes: biofouling and antifouling in the paper industry. Biofouling means unwanted microbial accumulation on surfaces, causing e.g. disturbances in industrial processes and contamination of medical devices or of water distribution networks. Antifouling focuses on preventing accumulation of biofilms in undesired places. Deinococcus geothermalis is a pink-pigmented, thermophilic bacterium that is extremely resistant towards radiation, UV light and desiccation, and it is known as a biofouler of paper machines, forming firm and biocide-resistant biofilms on stainless steel surfaces. The compact structure of the biofilm microcolonies of D. geothermalis E50051 and their adhesion to abiotic surfaces were investigated by confocal laser scanning microscopy combined with carbohydrate-specific, fluorescently labelled lectins. The extracellular polymeric substance in D. geothermalis microcolonies was found to be a composite of at least five different glycoconjugates contributing to adhesion and functioning as structural elements, putative storages for water, gliding motility and likely also protection. The adhesion threads that D. geothermalis seems to use to adhere to an abiotic surface and to anchor itself to the neighbouring cells were shown to be protein. Four protein components of type IV pilin were identified. In addition, lectin staining showed that the adhesion threads were covered with galactose-containing glycoconjugates. The threads were not exposed on planktic cells, indicating their primary role in adhesion and in biofilm formation. I investigated by quantitative real-time PCR the presence of D. geothermalis in biofilms, deposits, process waters and paper end products from 24 paper and board mills. The primers designed for this purpose were targeted to the 16S rRNA gene of D. geothermalis. We found D. geothermalis DNA from 9 machines, in total in 16 of the 120 mill samples searched. The total bacterial content in those samples varied between 10⁷ and 3 × 10¹⁰ 16S rRNA gene copies g⁻¹. The proportion of D. geothermalis in those same samples was minor, 0.03 to 1.3% of the total bacterial content. Nevertheless, D. geothermalis may endanger paper quality, as its DNA was found in an end product. As an antifouling method towards biofilms we studied electrochemical polarization. Two novel instruments were designed for this work. The double biofilm analyzer was designed to search for a polarization program that would eradicate D. geothermalis biofilm from stainless steel under conditions simulating the paper mill environment. The Radbox instrument was designed to study the generation of reactive oxygen species during the polarization that was effective in antifouling of D. geothermalis. We found that a cathodic character and a pulsed mode of polarization were required to detach D. geothermalis biofilm from stainless steel. We also found that the efficiency of polarization was good on submerged biofilms and poor on splash-area biofilms. Adding the oxidative biocides bromochloro-5,5-dimethylhydantoin, 2,2-dibromo-2-cyanoacetamide or peracetic acid gave additive value with polarization, being active on splash-area biofilms. We showed that the cathodically weighted pulsed polarization that was active in removing D. geothermalis was also effective in generating reactive oxygen species. It is possible that the antifouling effect relied on the generation of ROS on the polarized steel surfaces. An antifouling method successful against D. geothermalis, which is a tenacious biofouler and possesses a high tolerance to oxidative stressors, could also be functional against other biofoulers and applicable in wet industrial processes elsewhere.
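As an illustration of the quantitative real-time PCR step, the sketch below shows how threshold-cycle (Ct) values might be converted to 16S rRNA gene copy numbers through a standard curve and combined into a proportion of D. geothermalis; the curve parameters and sample Ct values are hypothetical and not taken from the thesis.

    # Converting qPCR Ct values to gene copy numbers via a standard curve and
    # estimating the proportion of D. geothermalis 16S rRNA gene copies.
    # Curve parameters and Ct values are assumed for illustration only.

    SLOPE, INTERCEPT = -3.32, 38.0   # assumed standard curve: Ct = SLOPE*log10(copies) + INTERCEPT

    def ct_to_copies(ct: float) -> float:
        """Invert the standard curve to get gene copies per reaction."""
        return 10 ** ((ct - INTERCEPT) / SLOPE)

    # Hypothetical measurements for one mill sample (per gram of deposit).
    ct_total_bacteria = 12.5   # universal 16S rRNA primers
    ct_d_geothermalis = 24.0   # D. geothermalis-specific primers

    total_copies = ct_to_copies(ct_total_bacteria)
    dgeo_copies  = ct_to_copies(ct_d_geothermalis)

    print(f"total bacterial 16S copies: {total_copies:.2e} per g")
    print(f"D. geothermalis copies:     {dgeo_copies:.2e} per g")
    print(f"proportion: {100 * dgeo_copies / total_copies:.2f} %")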
Abstract:
ERP system implementations have spread so rapidly that they now represent a must-have across industries. ERP systems are viewed as the cost of doing business. Yet, research that adopts the resource-based view of the business value of ERP systems concludes that companies may gain competitive advantage when they successfully manage their ERP projects, when they carefully reengineer the organization, and when they use the system in line with the organizational strategies. This thesis contributes to the literature on ERP business value by examining key drivers of ERP business value in organizations. The first research paper investigates how ERP systems with different degrees of system functionality are correlated with the development of business performance after the completion of the ERP projects. The companies with a better perceived system functionality obtained efficiency benefits in the first two years of post-implementation. However, in the third year there is no significant difference in efficiency benefits between successfully and less successfully managed ERP projects. The second research paper examines what business process changes occur in companies implementing ERP for different motivations and how these changes impact business performance. The findings show that companies reported process changes mainly in terms of workflow changes. In addition, the companies with a business-led motivation focused more on the average costs of each additional unit of input. Companies with a technology-led motivation focused more on the benefits coming from the fit of the system with the organizational processes. The third research paper considers the role of alignment between ERP and business strategies in the realization of business value from ERP use. The findings show that strategic alignment and business process changes are significantly correlated with the perceived benefits of ERP at three levels: internal efficiency, customers and financial. Overall, by combining quantitative and qualitative research methods, this thesis puts forward a model that illustrates how successfully managed ERP projects, aligned with the business strategy, have automating and informating effects on processes that ultimately improve customer service and reduce the companies’ costs.
Abstract:
Many service management studies have suggested that service providers benefit from having long-term relationships with customers, but the argument from a customer perspective has remained vague. However, especially in the business-to-business context, an analysis of financial value creation seems appropriate also from a customer perspective. Hence, the aim of this study is to develop a framework for understanding monetary value creation in professional service assignments from a customer perspective. The contribution of this study is an improved insight into, and a framework for understanding, financial value creation from a customer perspective in a professional service delivery process. The sources of monetary differences between transactional and long-term service providers are identified and quantified in case settings. This study contributes to the existing literature in service and relationship management by extending the customer’s viewpoint from perceived value to measurable monetary value. The contribution to professional services lies in the process focus as opposed to the outcome focus, which is often accentuated in the existing professional services literature. The findings from the qualitative data suggest that a customer company may benefit from having an improved understanding of the service delivery (service assignment) process and the factors affecting the monetary value creation during the process. It is suggested that long-term relationships with service providers create financial value in the case settings in the short term. The findings also indicate that by using the improved understanding, a customer company can make more informed decisions when selecting a service provider for a specific assignment. Mirel Leino is associated with CERS, the Center for Relationship Marketing and Service Management at the Swedish School of Economics and Business Administration.
Abstract:
During the past years, we have witnessed the widespread use of websites in communication in business-to-business relationships. If developed appropriately, such communication can result in numerous positive implications for business relationships, amplifying the importance of designing website communication that meets customer needs. In doing so, an understanding of the value of website communication for customers is crucial. The study develops a theoretical framework of customer value of website communication in business-to-business relationships. Theoretically, the study builds on the interaction approach to industrial marketing, different approaches to customer value, and inter-organisational communication theory. The empirical part involves a case study with a seller and nine different customer companies in the elevator industry. The data collection encompasses interviews and observations of representatives from the customer companies, interviews with the seller and an analysis of various reports of the seller. The continuous iteration between the theory and the case study resulted in the integrated approach to customer value and in the holistic theoretical framework of customer value of website communication in business-to-business relationships. The framework incorporates and elicits the meanings of different components of customer value: website communication characteristics that act as drivers of customer value, customer consequences – both benefits and sacrifices, customer end-states as the final goals that guide customer actions, and different types of linkages between these components. Compared to extant research on customer value, the study offers a more holistic framework of customer value that depicts its complexity and richness. In addition, it portrays customer value in the neglected context of website communication. The findings of the study can be used as tools in any analysis of customer value. They are also of relevance in designing appropriate website communication as well as in developing effective website communication strategies. Nataša Golik Klanac is associated with the Centre for Relationship Marketing and Service Management (CERS) at Hanken.
Abstract:
This thesis introduces a practice-theoretical approach to understanding customer value formation to be used in the field of service marketing and management. In contrast to current studies that try to understand value formation by analysing customers as independent actors and thinkers, it is suggested in this work that customer value formation can be better understood by analysing how value is formed in the practices and contexts of the customers. The theoretical approach developed in this thesis is applied in an empirical study of family cruises. The theoretical analysis in this thesis results in a new approach for understanding customer value formation. Customer value is, according to this new approach, something that is formed in practice, meaning that value is formed in constellations of the customer and contextual elements such as tools, physical spaces and contextually embedded images and know-how. This view differs from the current views that tend to see value as subjectively created, co-created, perceived or experienced by the customer. The new approach has implications not only for how we view customer value, but also for the methods and techniques we can use to understand customer value in empirical studies. It is also suggested that services could in fact be reconceptualised as practices. According to the stance presented in this thesis, the empirical analysis of customer value should not focus on individual customers, but should instead take the contextual entity of practices as its unit of analysis. Therefore, ethnography is chosen as the method for exploring how customer value is formed in practice in the case of family cruises on a specific cruise vessel. The researcher studied six families, as well as the context of the cruise vessel, using various techniques including non-participant observation, participant observation and interviews, in order to create an ethnographic understanding of the practices carried out on board. Twenty-one different practices are reported and discussed in order to provide the necessary insight into customer value formation that can be used as input for service development.