16 results for Application time

in Helda - Digital Repository of the University of Helsinki


Relevance:

60.00%

Publisher:

Abstract:

The aim of this study was to evaluate the feasibility of pit and fissure sealants and the effectiveness of two sealant methods applied in everyday practice in public dental health care in Finland. The two sealant methods were evaluated for their effectiveness in preventing dentin caries and for sealant retention, and their application times were compared. The survival of sealed first and second molars was followed for nine and 13 years, respectively. Caries risk evaluation and observed increased caries risk were the basis for considering sealant application. A questionnaire, sent to all public dental health centers in Finland, monitored the attitudes of the dental profession towards sealant application and explored the current policies used as well as changes noted in the sealant application protocol. DMFT (Decayed, Missing or Filled Teeth) index values collected from the health centers were evaluated. The difference in caries rate between the two methods investigated was highly significant. Compared to the glass ionomer cement (GIC) sealant method, the effectiveness of the resin-based (RB) method in preventing dentin caries was 74% and the rate difference 3%. The relative risk of detectable dentin caries for RB-sealed vs. GIC-sealed surfaces was 0.3 (95% CI 0.12, 0.57). The retention rate of sealants was higher with RB than with GIC (P<0.001), and application of the RB sealant material was less time-consuming than application of the GIC sealant. Occlusal dentin caries lesions were found in 4% and proximal caries in less than 2% of sealed teeth. The majority of respondents reported applying sealants on a systematic basis along with caries-risk evaluation. Health centers sealing over suspected or detected enamel caries had a lower average DMFT index value (1.0) than health centers applying sealants by alternative criteria (1.2).
It is concluded that the RB sealant method is more effective than the GIC sealant method in preventing dentin caries. Sealant maintenance may increase the costs of a sealant program. Occlusal caries management may be improved if sealant policies shift towards an interceptive approach, i.e., applying sealants over detected or suspected enamel caries lesions instead of sealing sound teeth in a preventive manner.
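For readers unfamiliar with the effect measures quoted above, the following sketch shows how relative risk, preventive effectiveness, and rate difference relate to each other. The 2x2 counts are hypothetical, chosen only so that the figures land near the reported values; the thesis's actual counts are not given in the abstract.

```python
# Hypothetical counts (not from the thesis), chosen so that the figures come
# out near the reported values (RR ~ 0.3, effectiveness ~ 74%, difference ~ 3%).
rb_caries, rb_total = 1, 100    # dentin caries on RB-sealed surfaces
gic_caries, gic_total = 4, 100  # dentin caries on GIC-sealed surfaces

risk_rb = rb_caries / rb_total
risk_gic = gic_caries / gic_total

rr = risk_rb / risk_gic                        # relative risk, RB vs GIC
effectiveness = (1 - rr) * 100                 # preventive effectiveness, %
rate_difference = (risk_gic - risk_rb) * 100   # absolute rate difference, %-points

print(f"RR = {rr:.2f}, effectiveness = {effectiveness:.0f}%, "
      f"rate difference = {rate_difference:.0f} %-points")
```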

Relevance:

40.00%

Publisher:

Abstract:

Human sport doping control analysis is a complex and challenging task for anti-doping laboratories. The List of Prohibited Substances and Methods, updated annually by the World Anti-Doping Agency (WADA), consists of hundreds of chemically and pharmacologically different low and high molecular weight compounds. This poses a considerable challenge for laboratories, which must analyze for all of them in a limited amount of time from a limited sample aliquot. The continuous expansion of the Prohibited List obliges laboratories to keep their analytical methods updated and to investigate newly available methodologies. In this thesis, an accurate mass-based analysis employing liquid chromatography - time-of-flight mass spectrometry (LC-TOFMS) was developed and validated to improve the power of doping control analysis. New analytical methods were developed utilizing the high mass accuracy and high information content obtained by TOFMS to generate comprehensive and generic screening procedures. The suitability of LC-TOFMS for comprehensive screening was demonstrated for the first time in the field, with mass accuracies better than 1 mDa. Further attention was given to generic sample preparation, an essential part of screening analysis, to rationalize the whole workflow and minimize the need for several separate sample preparation methods. Utilizing both positive and negative ionization allowed the detection of almost 200 prohibited substances. Automatic data processing produced a Microsoft Excel based report highlighting the entries fulfilling the criteria of the reverse database search (retention time (RT), mass accuracy, isotope match). The quantitative performance of LC-TOFMS was demonstrated with morphine, codeine and their intact glucuronide conjugates. After a straightforward sample preparation, the compounds were analyzed directly without the need for hydrolysis, solvent transfer, evaporation or reconstitution.
The hydrophilic interaction chromatography technique (HILIC) provided good chromatographic separation, which was critical for the morphine glucuronide isomers. A wide linear range (50-5000 ng/ml) with good precision (RSD<10%) and accuracy (±10%) was obtained, showing performance comparable to or better than that of other methods in use. In-source collision-induced dissociation (ISCID) allowed confirmation analysis with three diagnostic ions with a median mass accuracy of 1.08 mDa and repeatable ion ratios fulfilling WADA's identification criteria. The suitability of LC-TOFMS for screening of high molecular weight doping agents was demonstrated with plasma volume expanders (PVE), namely dextran and hydroxyethyl starch (HES). The specificity of the assay was improved, since interfering matrix compounds were removed by size exclusion chromatography (SEC). ISCID produced three characteristic ions with an excellent mean mass accuracy of 0.82 mDa at physiological concentration levels. In summary, by combining TOFMS with proper sample preparation and chromatographic separation, the technique can be utilized extensively in doping control laboratories for comprehensive screening of chemically different low and high molecular weight compounds, for quantification of threshold substances and even for confirmation. LC-TOFMS rationalized the workflow in doping control laboratories by simplifying the screening scheme, expediting reporting and minimizing analysis costs. Therefore LC-TOFMS can be exploited widely in doping control, and the need for several separate analysis techniques is reduced.
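The mass-accuracy criterion mentioned above (better than 1 mDa) can be illustrated with a short sketch. The conversion between mDa and ppm error is standard arithmetic; the measured m/z value below is a made-up example, while the theoretical m/z of protonated morphine is a standard monoisotopic value.

```python
def mass_error(measured_mz, theoretical_mz):
    """Return the mass error in mDa and in ppm."""
    delta = measured_mz - theoretical_mz
    return delta * 1000.0, delta / theoretical_mz * 1e6

# Protonated morphine [M+H]+: theoretical monoisotopic m/z ~ 286.1438.
# The "measured" value is an invented example, not thesis data.
mda, ppm = mass_error(286.1441, 286.1438)
print(f"error = {mda:.2f} mDa ({ppm:.2f} ppm)")
print("within 1 mDa" if abs(mda) <= 1.0 else "outside 1 mDa")
```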

Relevance:

30.00%

Publisher:

Abstract:

Drug Analysis without Primary Reference Standards: Application of LC-TOFMS and LC-CLND to Biofluids and Seized Material. Primary reference standards for new drugs, metabolites, designer drugs or rare substances may not be obtainable within a reasonable period of time, or their availability may be hindered by extensive administrative requirements. Standards are usually costly and may have a limited shelf life. Finally, many compounds are not available commercially and some not at all. A new approach within forensic and clinical drug analysis involves substance identification based on accurate mass measurement by liquid chromatography coupled with time-of-flight mass spectrometry (LC-TOFMS) and quantification by LC coupled with chemiluminescence nitrogen detection (LC-CLND), which possesses an equimolar response to nitrogen. Formula-based identification relies on the fact that the accurate mass of an ion from a chemical compound corresponds to the elemental composition of that compound. Single-calibrant nitrogen-based quantification is feasible with a nitrogen-specific detector, since approximately 90% of drugs contain nitrogen. A method was developed for toxicological drug screening in 1 ml urine samples by LC-TOFMS. A large target database of exact monoisotopic masses was constructed, representing the elemental formulae of reference drugs and their metabolites. Identification was based on matching the sample component's measured parameters with those in the database, including accurate mass and retention time, if available. In addition, an algorithm for isotopic pattern match (SigmaFit) was applied. Differences in ion abundance in urine extracts did not affect the mass accuracy or the SigmaFit values. For routine screening practice, a mass tolerance of 10 ppm and a SigmaFit tolerance of 0.03 were established. Seized street drug samples were analysed instantly by LC-TOFMS and LC-CLND, using a dilute-and-shoot approach.
In the quantitative analysis of amphetamine, heroin and cocaine findings, the mean relative difference between the results of LC-CLND and the reference methods was only 11%. In blood specimens, liquid-liquid extraction recoveries for basic lipophilic drugs were first established and the validity of the generic extraction recovery-corrected single-calibrant LC-CLND was then verified with proficiency test samples. The mean accuracy was 24% and 17% for plasma and whole blood samples, respectively, all results falling within the confidence range of the reference concentrations. Further, metabolic ratios for the opioid drug tramadol were determined in a pharmacogenetic study setting. Extraction recovery estimation, based on model compounds with similar physicochemical characteristics, produced clinically feasible results without reference standards.
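The reverse database search described above (accurate mass within 10 ppm, plus retention time when available) can be sketched as follows. The database entries and retention times are illustrative placeholders, though the monoisotopic masses are standard values; the RT window is an assumption, and the SigmaFit isotope-pattern match is vendor-specific and omitted here.

```python
# Hypothetical target database; masses are standard monoisotopic values for
# amphetamine (C9H13N) and cocaine (C17H21NO4), retention times are invented.
DB = [
    {"name": "amphetamine", "mass": 135.1048, "rt": 4.2},
    {"name": "cocaine",     "mass": 303.1471, "rt": 7.8},
]

def screen(measured_mass, measured_rt, ppm_tol=10.0, rt_tol=0.2):
    """Return names of database entries matching the measured feature."""
    hits = []
    for entry in DB:
        ppm = abs(measured_mass - entry["mass"]) / entry["mass"] * 1e6
        rt_ok = entry["rt"] is None or abs(measured_rt - entry["rt"]) <= rt_tol
        if ppm <= ppm_tol and rt_ok:
            hits.append(entry["name"])
    return hits

print(screen(303.1474, 7.85))  # → ['cocaine']
```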

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this thesis is to examine the role of trade durations in price discovery. The motivation for using trade durations in the study of price discovery is that durations are robust to many microstructure effects that introduce a bias in the measurement of returns volatility. A further motivation is that it is difficult to think of economic variables that are genuinely useful in determining the source of volatility at arbitrarily high frequencies. The dissertation contains three essays. In the first essay, the role of trade durations in price discovery is examined with respect to the volatility pattern of stock returns. The theory on volatility is associated with the theory on the information content of trade, central to market microstructure theory. The first essay documents that volatility per transaction is related to the intensity of trade, and that there is a strong relationship between the stochastic process of trade durations and trading variables. In the second essay, the role of trade durations in price discovery is examined with respect to the quantification of risk due to a trading volume of a certain size. The theory on volume is intrinsically associated with the stock volatility pattern. The essay documents that volatility increases, in general, when traders choose to trade with large transactions. In the third essay, the role of trade durations in price discovery is examined with respect to the information content of a trade. The theory on the information content of a trade is associated with the theory on the rate of price revisions in the market. The essay documents that short durations are associated with information; thus, traders are compensated for responding quickly to information.

Relevance:

30.00%

Publisher:

Abstract:

Multiple Perspectives on Networks: Conceptual Development, Application and Integration in an Entrepreneurial Context. The purpose of this thesis is to enhance cross-fertilization between three different approaches to network research. The business network approach may contribute in terms of how relationships are created, developed and how tie content changes within ties, not only between them. The social network approach adds to the discussion by offering concepts of structural change on a network level. The network approach in entrepreneurship contributes by emphasizing network content, governance and structure as a way of understanding and capturing networks. This is discussed in the conceptual articles, Articles 2 and 3. The ultimate purpose of this thesis is to develop a theoretical and empirical understanding of network development processes. This is fulfilled by presenting a theoretical framework, which offers multiple views on process as a developmental outcome. The framework implies that change ought to be captured both within and among relationships over time in the firm as well as in the network. Consequently, changes in structure and interaction taking place simultaneously need to be included when doing research on network development. The connection between micro and macro levels is also stressed. Therefore, the entrepreneur or firm level needs to be implemented together with the network level. The surrounding environment impacts firm and network development and vice versa and hence needs to be integrated. Further, it is necessary to view network development not only as a way forward but to include both progression and regression as inevitable parts of the process. Finally, both stability and change should be taken into account as part of network development. Empirical results in Article 1 show support for a positive impact of networks on SME internationalization. 
Article 4 compares the networks of novice, serial and portfolio entrepreneurs, but the empirical results show little support for differences in networks by type of entrepreneur. The results demonstrate that network interaction and structure are not directly affected by the type of entrepreneur involved. Instead, they indicate that network structure and interaction are shaped more by the development phase of the firm. This in turn is in line with the theoretical implications, which state that the network and the firm affect each other's development as they co-evolve.

Relevance:

30.00%

Publisher:

Abstract:

Since the emergence of service marketing, the focus of service research has evolved. Currently the focus of research is shifting towards value co-created by the customer. Consequently, value creation is increasingly less fixed to a specific time or location controlled by the service provider. However, present service management models, although acknowledging customer participation and accessibility, have not considered the role of the empowered customer who may perform the service at various locations and time frames. The present study expands this scope and provides a framework for exploring customer perceived value from a temporal and spatial perspective. The framework is used to understand and analyse customer perceived value and to explore customer value profiles. It is proposed that customer perceived value can be conceptualised as a function of technical, functional, temporal and spatial value dimensions. These dimensions are suggested to have value-increasing and value-decreasing facets. This conceptualisation is empirically explored in an online banking context and it is shown that time and location are more important value dimensions relative to the technical and functional dimensions. The findings demonstrate that time and location are important not only in terms of having the possibility to choose when and where the service is performed. Customers also value an efficient and optimised use of time and a private and customised service location. The study demonstrates that time and location are not external elements that form the service context, but service value dimensions, in addition to the technical and functional dimensions. This thesis contributes to existing service management research through its framework for understanding temporal and spatial dimensions of perceived value. 
Practical implications of the study are that time and location need to be considered as service design elements in order to differentiate the service from other services and create additional value for customers. Also, because of increased customer control and the importance of time and location, it is increasingly relevant for service providers to provide a facilitating arena for customers to create value, rather than trying to control the value creation process. Kristina Heinonen is associated with CERS, the Center for Relationship Marketing and Service Management at the Swedish School of Economics and Business Administration

Relevance:

30.00%

Publisher:

Abstract:

This study examined the Greeks of options and the trading results of delta hedging strategies under three different time units, or option-pricing models: calendar time, trading time and continuous time using discrete approximation (CTDA) time. The CTDA time model is a pricing model that, among other things, accounts for intraday and weekend patterns in volatility. For the CTDA time model, some additional theta measures, believed to be usable in trading, were developed. The study appears to verify that there were differences in the Greeks under different time units. It also revealed that these differences influence the delta hedging of options or portfolios. Although it is difficult to say which of the time models is most usable, as this depends greatly on the trader's view of the passing of time, on market conditions and on the portfolio in question, the CTDA time model can be viewed as an attractive alternative.
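As a minimal illustration of how the choice of time unit changes a Greek (this is a plain Black-Scholes sketch, not the thesis's CTDA model): the same annualized theta implies different per-day time decay depending on whether a year is counted as 365 calendar days or 252 trading days. All parameter values below are arbitrary examples.

```python
from math import erf, exp, log, pi, sqrt

def norm_pdf(x):
    return exp(-0.5 * x * x) / sqrt(2.0 * pi)

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call_theta(S, K, r, sigma, T):
    """Annualized Black-Scholes theta of a European call."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return (-S * norm_pdf(d1) * sigma / (2.0 * sqrt(T))
            - r * K * exp(-r * T) * norm_cdf(d2))

# At-the-money call, 30 calendar days to expiry (arbitrary example values).
S, K, r, sigma, T = 100.0, 100.0, 0.03, 0.25, 30 / 365
theta = bs_call_theta(S, K, r, sigma, T)
print(f"time decay per calendar day: {theta / 365:.4f}")
print(f"time decay per trading day:  {theta / 252:.4f}")
```

The per-trading-day decay is 365/252, i.e. about 45%, larger than the per-calendar-day decay, which is one reason a trader's choice of time unit matters for hedging.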

Relevance:

30.00%

Publisher:

Abstract:

The likelihood ratio test of cointegration rank is the most widely used test for cointegration. Many studies have shown that its finite-sample distribution is not well approximated by the limiting distribution. The article introduces bootstrap and fast double bootstrap (FDB) algorithms for the likelihood ratio test and evaluates them by Monte Carlo simulation experiments. It finds that the performance of the bootstrap test is very good. The more sophisticated FDB produces a further improvement in cases where the performance of the asymptotic test is very unsatisfactory and the ordinary bootstrap does not work as well as it might. Furthermore, the Monte Carlo simulations provide a number of guidelines on when the bootstrap and FDB tests can be expected to work well. Finally, the tests are applied to US interest rates and international stock price series. It is found that the asymptotic test tends to overestimate the cointegration rank, while the bootstrap and FDB tests choose the correct cointegration rank.
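The bootstrap testing scheme evaluated above can be sketched in its generic form: resample under the null hypothesis, recompute the test statistic, and take the p-value as the fraction of bootstrap statistics at least as extreme as the observed one. The example below applies this to a simple mean test on synthetic data; the article applies the same logic to the likelihood ratio statistic for cointegration rank.

```python
import random

random.seed(1)
data = [random.gauss(0.3, 1.0) for _ in range(100)]  # synthetic sample

def statistic(sample):
    return abs(sum(sample) / len(sample))  # |mean|, testing H0: mean = 0

obs = statistic(data)
# Impose the null by centering the data before resampling.
centered = [x - sum(data) / len(data) for x in data]

B = 999
boot = [statistic(random.choices(centered, k=len(centered)))
        for _ in range(B)]
p_value = (1 + sum(b >= obs for b in boot)) / (B + 1)
print(f"bootstrap p-value: {p_value:.3f}")
```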

Relevance:

30.00%

Publisher:

Abstract:

The low predictive power of implied volatility in forecasting subsequently realized volatility is a well-documented empirical puzzle. As suggested by e.g. Feinstein (1989), Jackwerth and Rubinstein (1996), and Bates (1997), we test whether unrealized expectations of jumps in volatility could explain this phenomenon. Our findings show that expectations of infrequently occurring jumps in volatility are indeed priced in implied volatility. This has two important consequences. First, implied volatility is actually expected to exceed realized volatility over long periods of time, only to fall far below realized volatility during infrequent periods of very high volatility. Second, the slope coefficient in the classic forecasting regression of realized volatility on implied volatility is very sensitive to the discrepancy between ex ante expected and ex post realized jump frequencies. If the in-sample frequency of positive volatility jumps is lower than the market's ex ante assessment, the classic regression test tends to reject the hypothesis of informational efficiency even when markets are informationally efficient.

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates persistence patterns in the Helsinki Exchanges. The patterns are analyzed using both a time and a price approach. It is hypothesized that arrival times are related to movements in prices. Accordingly, the arrival times are treated as durations and formulated as an Autoregressive Conditional Duration (ACD) model, as in Engle and Russell (1998). The prices are treated as price changes and formulated as a GARCH process including duration measures. The research question follows from market microstructure predictions about price intensities, defined as the time between price changes. Microstructure theory states that long transaction durations may be associated with both no news and bad news; accordingly, short durations should be related to high volatility and long durations to low volatility. As a result, the spread tends to be larger during periods of intensive trading. The main findings of this study are that 1) arrival times are positively autocorrelated and 2) long durations are associated with low volatility in the market.
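The ACD model of Engle and Russell (1998) referenced above specifies the conditional expected duration recursively: psi_i = omega + alpha·x_{i-1} + beta·psi_{i-1}, with observed duration x_i = psi_i·eps_i for i.i.d. unit-mean innovations. A minimal simulation of an ACD(1,1) with exponential innovations is sketched below; parameter values are arbitrary illustrative choices. The positive autocorrelation of the simulated durations mirrors the paper's first finding.

```python
import random

random.seed(42)
omega, alpha, beta = 0.1, 0.15, 0.8   # alpha + beta < 1 => stationary

psi, x = 2.0, 2.0   # start at the unconditional mean omega / (1 - alpha - beta)
durations = []
for _ in range(2000):
    psi = omega + alpha * x + beta * psi   # conditional expected duration
    x = psi * random.expovariate(1.0)      # unit-mean exponential innovation
    durations.append(x)

m = sum(durations) / len(durations)
num = sum((durations[i] - m) * (durations[i + 1] - m)
          for i in range(len(durations) - 1))
den = sum((d - m) ** 2 for d in durations)
corr = num / den   # lag-1 sample autocorrelation
print(f"mean duration {m:.2f}, lag-1 autocorrelation {corr:.2f}")
```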

Relevance:

30.00%

Publisher:

Abstract:

Bootstrap likelihood ratio tests of cointegration rank are commonly used because they tend to have rejection probabilities that are closer to the nominal level than the rejection probabilities of the corresponding asymptotic tests. The effect of bootstrapping the test on its power is largely unknown. We show that a new computationally inexpensive procedure can be applied to the estimation of the power function of the bootstrap test of cointegration rank. The bootstrap test is found to have a power function close to that of the level-adjusted asymptotic test. The bootstrap test estimates the level-adjusted power of the asymptotic test highly accurately. The bootstrap test may have low power to reject the null hypothesis of cointegration rank zero, or underestimate the cointegration rank. An empirical application to Euribor interest rates is provided as an illustration of the findings.

Relevance:

30.00%

Publisher:

Abstract:

Irritable bowel syndrome (IBS) is a common multifactorial functional intestinal disorder whose pathogenesis is not completely understood. Increasing scientific evidence suggests that microbes are involved in the onset and maintenance of IBS symptoms. The microbiota of the human gastrointestinal (GI) tract constitutes a massive and complex ecosystem consisting mainly of obligate anaerobic microorganisms, making the use of culture-based methods demanding and prone to misinterpretation. To overcome these drawbacks, an extensive panel of species- and group-specific real-time PCR assays for the accurate quantification of bacteria from fecal samples was developed, optimized, and validated. As a result, the target bacteria were detectable at a minimum concentration of approximately 10 000 bacterial genomes per gram of fecal sample, which corresponds to a sensitivity sufficient to detect subpopulations constituting 0.000001% of the total fecal microbiota. The real-time PCR panel, covering both commensal and pathogenic microorganisms, was used to compare the intestinal microbiota of patients suffering from IBS with that of a healthy control group devoid of GI symptoms. Both the IBS and control groups showed considerable individual variation in gut microbiota composition. Sorting the IBS patients according to symptom subtype (diarrhea-predominant, constipation-predominant, and alternating type) revealed that lower amounts of Lactobacillus spp. were present in the samples of diarrhea-predominant IBS patients, whereas constipation-predominant IBS patients carried increased amounts of Veillonella spp. In the screening of intestinal pathogens, 17% of IBS samples tested positive for Staphylococcus aureus, whereas no positive cases were discovered among the healthy controls. Furthermore, the methodology was applied to monitor the effects of a multispecies probiotic supplementation on the GI microbiota of IBS sufferers.
In the placebo-controlled, double-blind probiotic intervention trial of IBS patients, each supplemented probiotic strain was detected in fecal samples. The intestinal microbiota remained stable during the trial, except for Bifidobacterium spp., which increased in the placebo group and decreased in the probiotic group. The combination of assays developed and applied in this thesis has an overall coverage of 300-400 known bacterial species, along with a number of as yet unknown phylotypes. Hence, it provides a good means of studying the intestinal microbiota, irrespective of intestinal condition and health status. In particular, it allows screening and identification of microbes putatively associated with IBS. The alterations in the gut microbiota discovered here support the hypothesis that microbes are likely to contribute to the pathophysiology of IBS. The central question is whether the microbiota changes described represent the cause of, rather than the effect of, disturbed gut physiology. Therefore, more studies are needed to determine the role and importance of individual microbial species or groups in IBS. In addition, it is essential that the microbial alterations observed in this study be confirmed using a larger set of IBS samples of different subtypes, preferably from various geographical locations.
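The reported sensitivity can be checked with simple arithmetic. The figure of roughly 1e12 bacterial cells per gram of feces used below is a common order-of-magnitude assumption, not a number from the abstract.

```python
detection_limit = 1e4   # bacterial genomes per gram (from the abstract)
total_per_gram = 1e12   # assumed total bacterial cells per gram of feces

fraction_percent = detection_limit / total_per_gram * 100
print(f"{fraction_percent:.6g}% of the total fecal microbiota")
```

With these numbers the detectable subpopulation is 1e4 / 1e12 = 1e-8 of the total, i.e. the 0.000001% quoted in the abstract.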

Relevance:

30.00%

Publisher:

Abstract:

In this thesis, the possibility of extending Dirac's quantization condition for magnetic monopoles to noncommutative space-time is investigated. The three publications that this thesis is based on are all directly linked to this investigation. Noncommutative solitons have been found within certain noncommutative field theories, but it is not known whether they possess only topological charge or also magnetic charge. This is a consequence of the fact that the noncommutative topological charge need not coincide with the noncommutative magnetic charge, although the two are equivalent in the commutative context. The aim of this work is to begin to fill this gap in knowledge. The method of investigation is perturbative and leaves open the question of whether a nonperturbative source for the magnetic monopole can be constructed, although some aspects of such a generalization are indicated. The main result is that while the noncommutative Aharonov-Bohm effect can be formulated in a gauge-invariant way, the quantization condition of Dirac is not satisfied in the case of a perturbative source for the point-like magnetic monopole.
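For reference, the commutative condition being generalized is Dirac's quantization condition, which in Gaussian natural units (ħ = c = 1) reads (conventions differ between sources by factors of 2π):

```latex
e\,g \;=\; \frac{n}{2}, \qquad n \in \mathbb{Z},
```

where e is the electric charge of the test particle and g the magnetic charge of the monopole. The thesis's main result is that a perturbative noncommutative source fails to satisfy an analogue of this condition.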

Relevance:

30.00%

Publisher:

Abstract:

In this thesis the current status and some open problems of noncommutative quantum field theory are reviewed. The introduction aims to put these theories in their proper context as a part of the larger program to model the properties of quantized space-time. Throughout the thesis, special focus is put on the role of noncommutative time and how its nonlocal nature presents us with problems. Applications in scalar field theories as well as in gauge field theories are presented. The infinite nonlocality of space-time introduced by the noncommutative coordinate operators leads to interesting structure and new physics. High energy and low energy scales are mixed, causality and unitarity are threatened and in gauge theory the tools for model building are drastically reduced. As a case study in noncommutative gauge theory, the Dirac quantization condition of magnetic monopoles is examined with the conclusion that, at least in perturbation theory, it cannot be fulfilled in noncommutative space.

Relevance:

30.00%

Publisher:

Abstract:

Physical inactivity has become a major threat to public health worldwide. Finnish health and welfare policies emphasize that the working population should maintain good health and functioning until their normal retirement age and remain healthy and independent later in life. Health behaviours such as physical activity potentially play an important role in reaching this target, as physical activity contributes to better physical fitness and to a reduced risk of major chronic diseases. The aim of this study was to examine, first, whether the volume and intensity of leisure-time physical activity affect subsequent physical health functioning, sickness absence and disability retirement, and second, how leisure-time physical activity of moderate and vigorous intensity changes after the transition to retirement. This study is part of the ongoing Helsinki Health Study. The baseline data were collected by questionnaires in 2000-02 among employees of the City of Helsinki aged 40 to 60. The follow-up survey data were collected in 2007. Data on sickness absence were obtained from the employer's (City of Helsinki) sickness absence registers, and pension data were obtained from the Finnish Centre for Pensions. Leisure-time physical activity was measured in four grades of intensity and classified according to physical activity recommendations, considering both the volume and intensity of physical activity. Statistical techniques including analysis of covariance, logistic regression, Cox proportional hazards models and Poisson regression were used. Employees who were vigorously active during leisure time in particular had better physical health functioning than those who were physically inactive. High physical activity in particular contributed to the maintenance of good physical health functioning.
High physical activity also reduced the risk of subsequent sickness absence as well as the risk of all-cause disability retirement and of retirement due to musculoskeletal and mental causes. Among those who transferred to old-age retirement, moderate-intensity leisure-time physical activity increased on average by more than half an hour per week, and the prevalence of physical inactivity decreased. Such changes were not observed among those who remained employed or who transferred to disability retirement. This prospective cohort study provided novel results on the effects of leisure-time physical activity on health-related functioning and on changes in leisure-time physical activity after retirement. Although the benefits of moderate-intensity physical activity for health are well known, these results suggest the importance of vigorous physical activity for subsequent health-related functioning. Thus, vigorous physical activity to enhance fitness should be given more emphasis from a public health perspective. In addition, physical activity should be encouraged among those who are about to retire.