23 results for finite-time attractiveness in probability
Abstract:
This thesis explores the relationship between humans and ICTs (information and communication technologies). As ICTs increasingly penetrate all spheres of social life, their role as mediators – between people, between people and information, and even between people and the natural world – is expanding, and they are increasingly shaping social life. Yet we still know little about how our lives are affected by their growing role. Our understanding of the actors and forces driving the accelerating adoption of new ICTs in all areas of life is also fairly limited. This thesis addresses these problems by interpretively exploring the link between ICTs and the shaping of society at home, in the office, and in the community. The thesis builds on empirical material gathered in three research projects, presented in four separate essays. The first project explores computerized office work through a case study. The second is a regional development project aiming to increase ICT knowledge and use in 50 small-town families. In the third, the second project is compared to three other longitudinal development projects funded by the European Union. Using theories that treat the human-ICT relationship as intertwined, the thesis provides a multifaceted description of life with ICTs in the contemporary information society. By oscillating between empirical and theoretical investigations and balancing between determinist and constructivist conceptualisations of the human-ICT relationship, I construct a dialectical theoretical framework that can be used for studying socio-technical contexts in society. This framework helps us see how societal change stems from the complex social processes that surround routine everyday actions. For example, interacting with and through ICTs may change individuals’ perceptions of time and space, social roles, and the proper ways to communicate – changes which at some point result in societal change in terms of, for example, new ways of acting and knowing things.
Abstract:
Images and brands have been topics of great interest in both academia and practice for a long time. The company’s image, which in this study is considered equivalent to the actual corporate brand, has become a strategic issue and one of the company’s most valuable assets. In contrast to mainstream corporate branding research, which focuses on consumer images as steered and managed by the company, the present study takes a genuinely consumer-focused view. The question asked is: how do consumers perceive the company, and especially, how are their experiences of the company over time reflected in the corporate image? The findings indicate that consumers’ corporate images can be seen as constructed through dynamic relational processes based on a multifaceted network of earlier images from multiple sources over time. The essential finding is that corporate images have a heritage. The thesis introduces the concept of image heritage, which stands for the consumer’s earlier company-related experiences from multiple sources over time. In other words, consumers construct their images of the company based on earlier recalled images, perhaps dating back many years. Corporate images therefore have roots, an image heritage, on which the images are constructed in the present. For companies, image heritage is a key to understanding consumers, and thereby also a key to consumer-focused branding strategies and activities. As image heritage is the consumer’s interpretation base and context for image constructions here and now, branding strategies and activities that meet this consumer reality have the potential to become more effective. This thesis is positioned in the tradition of The Nordic School of Marketing Thought and introduces a relational, dynamic perspective into branding through consumers’ image heritage. Anne Rindell is associated with CERS, the Center for Relationship Marketing and Service Management at the Swedish School of Economics and Business Administration.
Abstract:
Candida species are an important cause of nosocomial bloodstream infections in hospitalized patients worldwide, with associated high mortality, excess length of stay and costs. A main contributor to candidemia is profound immunosuppression due to serious underlying conditions or intensive treatments, which leads to an increasing number of susceptible patients. The rank order of causative Candida species varies over time and in different geographic locations. The aim of this study was to obtain information on the epidemiology of candidemia in Finland and to identify trends in incidence, causative species, and patient populations at risk. In order to reveal possible outbreaks and to assess the value of one molecular typing method, restriction enzyme analysis (REA), in epidemiological studies, we analyzed C. albicans bloodstream isolates in the Uusimaa region in Southern Finland over eight years. Data from the National Infectious Disease Register were used to assess the incidence and epidemiological features of candidemia cases. In Helsinki University Central Hospital (HUCH), all patients with a blood culture yielding any Candida spp. were identified from laboratory log-books and from the Finnish Hospital Infection Program. All patients with a stored blood culture isolate of C. albicans were identified through microbiology laboratory log-books, and the stored isolates were genotyped with REA at the National Institute for Health and Welfare (formerly KTL). The incidence of candidemia in Finland is relatively low by international standards, but it increased between the 1990s and the 2000s. The incidence was highest in males >65 years of age, whereas incidence rates for patients <1-15 years of age were lower during the 2000s than during the 1990s. In HUCH the incidence of candidemia remained low and constant during our 18 years of observation, but a significant shift in the patient populations at risk was observed, associated with patients treated in intensive care units, such as premature neonates and surgical patients. The predominant causative species in Finland and in HUCH is C. albicans, but the proportion of C. glabrata increased considerably. The crude one-month case fatality remained high, between 28% and 33%. REA differentiated efficiently between C. albicans blood culture isolates, and no clusters were observed in the hospitals involved, despite abundant transfer of patients among them. Candida spp. are an important cause of nosocomial bloodstream infections in Finland, and continued surveillance is necessary to determine overall trends and patient groups at risk, and to reduce the impact of these infections in the future. Molecular methods provide an efficient tool for the investigation of suspected outbreaks and should also be available in Finland in the future.
Abstract:
The aim of this dissertation is to model economic variables with a mixture autoregressive (MAR) model. The MAR model is a generalization of the linear autoregressive (AR) model and consists of K linear autoregressive components. At any given point in time, one of these autoregressive components is randomly selected to generate a new observation for the time series. The mixture probability can be constant over time or a direct function of some observable variable. Many economic time series have properties that cannot be described by linear and stationary time series models, and a nonlinear autoregressive model such as the MAR model can be a plausible alternative for such series. In this dissertation the MAR model is used to model stock market bubbles and the relationship between inflation and the interest rate. In the case of the inflation rate we arrive at a MAR model in which the inflation process is less mean-reverting when inflation is high than when it is at a normal level, while the interest rate moves one-for-one with expected inflation. We use data from the Livingston survey as a proxy for inflation expectations and find that survey inflation expectations are not perfectly rational. According to our results, information stickiness plays an important role in expectation formation, and survey participants have a tendency to underestimate inflation. A MAR model is also used to model stock market bubbles and crashes. This model has two regimes: the bubble regime and the error-correction regime. In the error-correction regime the price depends on a fundamental factor, the price-dividend ratio, whereas in the bubble regime the price is independent of fundamentals. In this model a stock market crash is usually caused by a regime switch from the bubble regime to the error-correction regime. According to our empirical results, bubbles are related to low inflation. Our model also implies that bubbles influence the investment return distribution in both the short and the long run.
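As a rough illustration of the mechanism described in this abstract, the sketch below simulates a two-regime mixture autoregressive process in Python. It is not the author's estimation code; the AR(1) components, their parameters and the constant mixing probabilities are made-up values chosen only to mimic a "normal" regime and a more persistent "high-inflation" regime.

```python
import numpy as np

def simulate_mar(n, phis, sigmas, weights, seed=0):
    """Simulate a mixture autoregressive (MAR) process.

    At each time step one AR(1) component is drawn at random with the
    given (constant) mixture probabilities, and that component generates
    the next observation.  All parameters here are illustrative only.
    """
    rng = np.random.default_rng(seed)
    y = np.zeros(n)
    for t in range(1, n):
        k = rng.choice(len(weights), p=weights)               # pick a regime
        y[t] = phis[k] * y[t - 1] + rng.normal(scale=sigmas[k])
    return y

# Two regimes: a strongly mean-reverting "normal" regime and a nearly
# non-stationary, more persistent regime, loosely mirroring the inflation
# application described in the abstract.
series = simulate_mar(n=500,
                      phis=[0.3, 0.97],
                      sigmas=[0.5, 1.0],
                      weights=[0.8, 0.2])
```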
Abstract:
Drug-induced liver injury is one of the most frequent reasons for the removal of a drug from the market. In recent years there has been pressure to develop more cost-efficient, faster and easier ways to investigate drug-induced toxicity in order to recognize hepatotoxic drugs in the earlier phases of drug development. A High Content Screening (HCS) instrument is an automated microscope equipped with image analysis software. It makes image analysis faster and reduces the risk of human error by always analyzing the images in the same way. Because the amounts of drug and time needed for the analysis are smaller and multiple parameters can be analyzed from the same cells, the method should be more sensitive, effective and cheaper than conventional cytotoxicity assays. Liver cells are rich in mitochondria, and many drugs exert their toxicity on hepatocyte mitochondria. Mitochondria produce the majority of the ATP in the cell through oxidative phosphorylation; they maintain biochemical homeostasis in the cell and participate in cell death. A mitochondrion is divided into two compartments by the inner and outer mitochondrial membranes, and oxidative phosphorylation takes place at the inner mitochondrial membrane. When released, cytochrome c, a protein that is part of the respiratory chain, activates caspase cascades, which leads to apoptosis. The aim of this study was to implement, optimize and compare mitochondrial toxicity HCS assays in live and fixed cells in two cellular models: the human HepG2 hepatoma cell line and rat primary hepatocytes. Three hepato- and mitochondria-toxic drugs (staurosporine, rotenone and tolcapone) were used. Cells were treated with the drugs and incubated with the fluorescent probes, and the images were then analyzed using a Cellomics ArrayScan VTI reader. Finally, the results obtained with the optimized methods were compared to each other and to the results of conventional cytotoxicity assays, ATP and LDH measurements. After optimization, the live-cell method and rat primary hepatocytes were selected for the experiments. Staurosporine was the most toxic of the three drugs and damaged the cells most quickly. Rotenone was less toxic, but its results were more reproducible, so it would serve as a good positive control in screening. Tolcapone was the least toxic. So far, the conventional cytotoxicity assays worked better than the HCS methods, and further optimization is needed to make the HCS method more sensitive; this was not possible in this study due to time constraints.
Abstract:
This thesis studies optimisation problems related to modern large-scale distributed systems, such as wireless sensor networks and wireless ad-hoc networks. The concrete tasks that we use as motivating examples are the following: (i) maximising the lifetime of a battery-powered wireless sensor network, (ii) maximising the capacity of a wireless communication network, and (iii) minimising the number of sensors in a surveillance application. A sensor node consumes energy both when it is transmitting or forwarding data, and when it is performing measurements. Hence task (i), lifetime maximisation, can be approached from two different perspectives. First, we can seek optimal data flows that make the most out of the energy resources available in the network; such optimisation problems are examples of so-called max-min linear programs. Second, we can conserve energy by putting redundant sensors into sleep mode; we arrive at the sleep scheduling problem, in which the objective is to find an optimal schedule that determines when each sensor node is asleep and when it is awake. In a wireless network, simultaneous radio transmissions may interfere with each other. Task (ii), capacity maximisation, therefore gives rise to another scheduling problem, the activity scheduling problem, in which the objective is to find a minimum-length conflict-free schedule that satisfies the data transmission requirements of all wireless communication links. Task (iii), minimising the number of sensors, is related to the classical graph problem of finding a minimum dominating set. However, if we are not only interested in detecting an intruder but also in locating the intruder, it is not sufficient to solve the dominating set problem; formulations such as minimum-size identifying codes and locating–dominating codes are more appropriate. This thesis presents approximation algorithms for each of these optimisation problems, i.e., for max-min linear programs, sleep scheduling, activity scheduling, identifying codes, and locating–dominating codes. Two complementary approaches are taken. The main focus is on local algorithms, which are constant-time distributed algorithms. The contributions include local approximation algorithms for max-min linear programs, sleep scheduling, and activity scheduling. In the case of max-min linear programs, tight upper and lower bounds are proved for the best possible approximation ratio that can be achieved by any local algorithm. The second approach is the study of centralised polynomial-time algorithms in local graphs – these are geometric graphs whose structure exhibits spatial locality. Among other contributions, it is shown that while identifying codes and locating–dominating codes are hard to approximate in general graphs, they admit a polynomial-time approximation scheme in local graphs.
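As a small, centralised illustration of the max-min linear programs mentioned in this abstract (not the local, constant-time algorithms that are the thesis's actual contribution), the sketch below uses the standard reformulation that introduces an auxiliary variable for the minimum objective value and solves a toy instance with SciPy. All coefficients are invented.

```python
import numpy as np
from scipy.optimize import linprog

# Max-min LP:  maximise  min_i  c_i . x   subject to  A x <= b,  x >= 0.
# Standard reformulation: add a scalar t and maximise t subject to
# c_i . x - t >= 0 for every i.  The coefficients below are illustrative only.
C = np.array([[1.0, 2.0],      # objective rows c_i (e.g. lifetime of node i)
              [3.0, 1.0]])
A = np.array([[1.0, 1.0]])     # resource constraints A x <= b
b = np.array([1.0])

n_obj, n_var = C.shape
# Decision vector is (x_1, ..., x_n, t); linprog minimises, so minimise -t.
cost = np.zeros(n_var + 1)
cost[-1] = -1.0

# c_i . x - t >= 0   is rewritten as   -c_i . x + t <= 0
A_ub = np.vstack([np.hstack([-C, np.ones((n_obj, 1))]),
                  np.hstack([A, np.zeros((len(A), 1))])])
b_ub = np.concatenate([np.zeros(n_obj), b])

res = linprog(cost, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None)] * n_var + [(None, None)])
print("optimal min objective:", -res.fun, "x =", res.x[:n_var])
```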
Abstract:
We use parallel weighted finite-state transducers to implement a part-of-speech tagger, which obtains state-of-the-art accuracy when used to tag the Europarl corpora for Finnish, Swedish and English. Our system consists of a weighted lexicon and a guesser combined with a bigram model factored into two weighted transducers. We use both lemmas and tag sequences in the bigram model, which guarantees reliable bigram estimates.
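The tagger described here is built from weighted finite-state transducers; as a generic, much-simplified illustration of the underlying idea (combining per-word tag weights from a lexicon with tag-bigram weights and decoding with the Viterbi algorithm), the toy Python sketch below uses an invented three-word lexicon and made-up probabilities, not the authors' models or toolchain.

```python
import math

# Toy weighted lexicon P(tag | word) and tag-bigram model P(tag | previous tag).
# The tag set, words and probabilities are all invented for illustration.
LEXICON = {
    "the":   {"DET": 0.99, "NOUN": 0.01},
    "dog":   {"NOUN": 0.90, "VERB": 0.10},
    "barks": {"VERB": 0.80, "NOUN": 0.20},
}
BIGRAM = {
    ("<s>", "DET"): 0.6, ("<s>", "NOUN"): 0.3, ("<s>", "VERB"): 0.1,
    ("DET", "NOUN"): 0.8, ("DET", "VERB"): 0.1, ("DET", "DET"): 0.1,
    ("NOUN", "VERB"): 0.6, ("NOUN", "NOUN"): 0.3, ("NOUN", "DET"): 0.1,
    ("VERB", "DET"): 0.4, ("VERB", "NOUN"): 0.4, ("VERB", "VERB"): 0.2,
}

def viterbi(words):
    """Return the cheapest tag sequence under lexical + tag-bigram log-costs."""
    # best[tag] = (cost of the best partial path ending in tag, that path)
    best = {"<s>": (0.0, [])}
    for word in words:
        new_best = {}
        for tag, p_lex in LEXICON[word].items():
            candidates = []
            for prev, (cost, path) in best.items():
                p_big = BIGRAM.get((prev, tag), 1e-6)   # smooth unseen bigrams
                candidates.append((cost - math.log(p_big) - math.log(p_lex),
                                   path + [tag]))
            new_best[tag] = min(candidates)
        best = new_best
    return min(best.values())[1]

print(viterbi(["the", "dog", "barks"]))   # expected: ['DET', 'NOUN', 'VERB']
```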
Abstract:
There are numerous formats for writing spellcheckers for open-source systems, and many language descriptions have been written in these formats. Similarly, for word hyphenation by computer there are TeX rules for many languages. In this paper we demonstrate a method for converting these spell-checking lexicons and hyphenation rule sets into finite-state automata, and present a new finite-state based system for writer’s tools used in current open-source software such as Firefox, OpenOffice.org and enchant, via the spell-checking library voikko.
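The conversion described in the paper targets full spell-checking lexicons and TeX hyphenation patterns; as a much-reduced illustration of the basic idea of turning a word list into a deterministic acceptor, the Python sketch below builds a plain trie and uses it to check spellings. It is toy code under my own assumptions, not the authors' toolchain.

```python
def build_trie(words):
    """Build a deterministic acceptor (a plain trie) from a word list."""
    root = {}
    for word in words:
        node = root
        for ch in word:
            node = node.setdefault(ch, {})
        node["<final>"] = True          # mark an accepting state
    return root

def accepts(trie, word):
    """Return True if the automaton accepts the word, i.e. the spelling is known."""
    node = trie
    for ch in word:
        if ch not in node:
            return False
        node = node[ch]
    return node.get("<final>", False)

lexicon = build_trie(["cat", "cats", "car", "dog"])
print(accepts(lexicon, "cats"))   # True
print(accepts(lexicon, "cab"))    # False
```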