14 results for Reliability benefits

in Helda - Digital Repository of the University of Helsinki


Relevance: 20.00%

Abstract:

In this study, a quality assessment method based on sampling of primary laser inventory units (microsegments) was analysed. The accuracy of a laser inventory carried out in Kuhmo was analysed as a case study, with field sample plots measured on the sampled microsegments in the Kuhmo inventory area. Two main questions were considered: did the ALS-based inventory meet the accuracy requirements set for the provider, and how should a reliable, cost-efficient and independent quality assessment be undertaken? The agreement between the control measurements and the ALS-based inventory was analysed in four ways: 1) root mean squared errors (RMSEs) and bias were calculated; 2) scatter plots with 95% confidence intervals were plotted and the placement of the identity line was checked; 3) Bland-Altman plots were drawn, in which the mean difference in each attribute between the control method and the ALS method was plotted against the average value of the attribute; and 4) tolerance limits were defined and combined with the Bland-Altman plots. The RMSE values were compared to a reference study from which the accuracy requirements for the service provider had been set. The accuracy requirements in Kuhmo were met; however, comparison of RMSE values proved difficult. Field control measurements are costly and time-consuming, but they are considered robust. Control measurements may nevertheless contain errors of their own, which are difficult to take into account. Under the Bland-Altman approach, neither of the compared methods is assumed to be exact, so it offers a fair way to interpret the assessment results. It was suggested that tolerance limits, specified in the order and combined with Bland-Altman plots, be adopted in practice, and that bias should additionally be calculated for the total area. Some other approaches to quality control were briefly examined. No method was found to fulfil all the required demands of statistical reliability, cost-efficiency, time-efficiency, simplicity and speed of implementation. Some benefits and shortcomings of the studied methods were discussed.
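The agreement statistics described in the abstract (RMSE, bias, and Bland-Altman limits of agreement) can be sketched in a few lines of Python. The function name, the 1.96 multiplier for the limits, and the example stand volumes are illustrative assumptions, not values from the study:

```python
import numpy as np

def agreement_stats(control, als, k=1.96):
    """Compare control field measurements with ALS-based estimates.

    Returns the RMSE, the bias (mean difference), and the Bland-Altman
    limits of agreement (bias +/- k * SD of the differences).
    """
    control = np.asarray(control, dtype=float)
    als = np.asarray(als, dtype=float)
    diff = als - control
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return {
        "rmse": float(np.sqrt(np.mean(diff ** 2))),
        "bias": float(bias),
        "loa_lower": float(bias - k * sd),  # Bland-Altman lower limit
        "loa_upper": float(bias + k * sd),  # Bland-Altman upper limit
        # x-axis of a Bland-Altman plot: the mean of the two methods
        "means": (als + control) / 2,
    }

# Hypothetical stand volumes (m^3/ha) on sampled microsegments
field = [210.0, 185.0, 240.0, 150.0, 300.0]
laser = [205.0, 192.0, 230.0, 158.0, 310.0]
stats = agreement_stats(field, laser)
print(round(stats["rmse"], 2), round(stats["bias"], 2))  # → 8.22 2.0
```

A point falling outside the limits of agreement (or outside tolerance limits superimposed on the plot) flags a microsegment where the two methods disagree more than expected.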

Relevance: 20.00%

Abstract:

The sustainability of food production has increasingly attracted the attention of consumers, farmers, food and retailing companies, and politicians. One manifestation of this attention is the growing interest in organic foods. Organic agriculture has the potential to enhance the ecological modernisation of food production by implementing the organic method as a preventative innovation that simultaneously produces environmental and economic benefits. However, in addition to the challenges facing organic farming, the small market share of organic products in many countries today, and in Finland in particular, risks undermining the achievement of such benefits. The problems identified as hindrances to increased consumption of organic food are the poor availability, limited variety and high prices of organic products, complicated buying decisions, and the difficulty of conveying the intangible value of organic foods. Small volumes and sporadic markets, high costs, a lack of market information, and poor supply reliability are obstacles to increasing the volume of organic production and processing. These problems shift the focus from a single actor to the entire supply chain and require solutions that involve more interaction among the actors within the organic chain. As an entity, the organic food chain has received very little scholarly attention. Researchers have mainly approached the organic chain from the perspective of a single actor, or have described its structure rather than the interaction between the actors. Consequently, interaction among the primary actors in organic chains, i.e. farmers, manufacturers, retailers and consumers, has largely gone unexamined. The purpose of this study is to shed light on the interaction of the primary actors within a whole organic chain in relation to the ecological modernisation of food production. This information is organised into a conceptual framework to help illuminate this complex field.
This thesis integrates the theories and concepts of three approaches: food system studies, supply chain management and ecological modernisation. Through a case study, a conceptual system framework is developed and applied to a real-life situation. The thesis is supported by research published in four articles. All examine the same organic chains through case studies, but each approaches the problem from a different, complementary perspective. The findings indicated that, despite coherent values emphasising responsibility, the organic chains were only loosely integrated as a system. The focus was on product flow, leaving other aspects of value creation largely aside. Communication with consumers was rare, and none of the actors had taken a leading role in developing the market for organic products. Such a situation provides unsuitable conditions for the ecological modernisation of food production through organic food and calls for contributions from stakeholders other than those directly involved in the product chain. The findings inspired a revision of the original conceptual framework. The revised framework, the three-layer framework, distinguishes the different layers of interaction: by gradually enlarging the chain orientation, the different but interrelated layers become visible. A framework is thus provided for further research and for understanding the practical implications of the performance of organic food chains. The revised framework both provides an ideal model for organic chains in relation to ecological modernisation and demonstrates a situation consistent with the empirical evidence.

Relevance: 20.00%

Abstract:

Knowing the chromosomal regions or actual genes affecting the traits under selection would add information to the selection decisions, potentially leading to a higher genetic response. The first objective of this study was to map quantitative trait loci (QTL) affecting economically important traits in the Finnish Ayrshire population. The second objective was to investigate the effects of using QTL information in marker-assisted selection (MAS) on the genetic response and on the linkage disequilibrium between different parts of the genome. Whole-genome scans were carried out on a granddaughter design with 12 half-sib families and a total of 493 sons. Twelve traits were studied: milk yield, protein yield, protein content, fat yield, fat content, somatic cell score (SCS), mastitis treatments, other veterinary treatments, days open, fertility treatments, non-return rate, and calf mortality. The average spacing of the typed markers was 20 cM, with 2 to 14 markers per chromosome. Associations between markers and traits were analyzed with multiple marker regression. Significance was determined by permutation, with genome-wise P-values obtained by Bonferroni correction. The benefits of MAS were investigated by simulation: a conventional progeny-testing scheme was compared to a scheme in which QTL information was used within families to select among full sibs in the male path. Two QTL on different chromosomes were modelled, and the effects of different starting frequencies of the favourable alleles and different sizes of the QTL effects were evaluated. A large number of QTL, 48 in total, were detected at 5% or higher chromosome-wise significance. QTL for milk production were found on 8 chromosomes, for SCS on 6, for mastitis treatments on 1, for other veterinary treatments on 5, for days open on 7, for fertility treatments on 7, for calf mortality on 6, and for non-return rate on 2 chromosomes. In the simulation study, the total genetic response was faster with MAS than with conventional selection, and the advantage of MAS persisted over the studied generations. The rate of response and the difference between the selection schemes clearly reflected the changes in the allele frequencies of the favourable QTL. The disequilibrium between the polygenes and the QTL was always negative and was larger for QTL of larger effect. The disequilibrium between the two QTL was larger for QTL of large effect, and it was somewhat larger with MAS in scenarios with starting frequencies below 0.5 for QTL of moderate size and below 0.3 for large QTL. In conclusion, several QTL affecting economically important traits of dairy cattle were detected. Further studies are needed to verify these QTL, check their presence in the current breeding population, look for pleiotropy, and fine-map the most interesting QTL regions. The results of the simulation studies show that using MAS together with embryo transfer to pre-select young bulls within families is a useful approach for increasing the genetic merit of AI bulls compared to conventional selection.
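The permutation approach to significance testing mentioned above can be illustrated with a minimal sketch. The test statistic (squared correlation), the simulated family data, and the QTL effect size are hypothetical choices made for illustration, not the study's actual procedure:

```python
import numpy as np

rng = np.random.default_rng(42)

def permutation_pvalue(marker, trait, n_perm=2000):
    """Empirical p-value for a marker-trait association.

    Test statistic: squared correlation between the marker genotype
    (e.g. which paternal allele a son inherited) and the son's trait
    value. Permuting traits across sons breaks any true linkage,
    giving the null distribution of the statistic.
    """
    marker = np.asarray(marker, dtype=float)
    trait = np.asarray(trait, dtype=float)
    obs = np.corrcoef(marker, trait)[0, 1] ** 2
    hits = 0
    for _ in range(n_perm):
        perm = rng.permutation(trait)
        if np.corrcoef(marker, perm)[0, 1] ** 2 >= obs:
            hits += 1
    # +1 correction keeps the estimate strictly positive
    return (hits + 1) / (n_perm + 1)

# Hypothetical half-sib family: allele 0/1 inherited from the sire,
# with the trait shifted upward for allele-1 sons (a simulated QTL).
alleles = rng.integers(0, 2, size=100)
trait = 0.8 * alleles + rng.normal(0.0, 1.0, size=100)
p = permutation_pvalue(alleles, trait)
```

Chromosome-wise thresholds come directly from such permutations; a genome-wise threshold is then obtained by Bonferroni correction over chromosomes, as in the study.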

Relevance: 20.00%

Abstract:

Comprehensive two-dimensional gas chromatography (GC×GC) offers enhanced separation efficiency, reliability in qualitative and quantitative analysis, capability to detect low quantities, and information on the whole sample and its components. These features are essential in the analysis of complex samples, in which the number of compounds may be large or the analytes of interest are present at trace level. This study involved the development of instrumentation, data analysis programs and methodologies for GC×GC and their application in studies on qualitative and quantitative aspects of GC×GC analysis. Environmental samples were used as model samples. Instrumental development comprised the construction of three versions of a semi-rotating cryogenic modulator in which modulation was based on two-step cryogenic trapping with continuously flowing carbon dioxide as coolant. Two-step trapping was achieved by rotating the nozzle spraying the carbon dioxide with a motor. The fastest rotation and highest modulation frequency were achieved with a permanent magnetic motor, and modulation was most accurate when the motor was controlled with a microcontroller containing a quartz crystal. Heated wire resistors were unnecessary for the desorption step when liquid carbon dioxide was used as coolant. With use of the modulators developed in this study, the narrowest peaks were 75 ms at base. Three data analysis programs were developed allowing basic, comparison and identification operations. Basic operations enabled the visualisation of two-dimensional plots and the determination of retention times, peak heights and volumes. The overlaying feature in the comparison program allowed easy comparison of 2D plots. An automated identification procedure based on mass spectra and retention parameters allowed the qualitative analysis of data obtained by GC×GC and time-of-flight mass spectrometry. 
In the methodological development, sample preparation (extraction and clean-up) and GC×GC methods were developed for the analysis of atmospheric aerosol and sediment samples. Dynamic sonication-assisted extraction was well suited for atmospheric aerosols collected on a filter. A clean-up procedure utilising normal-phase liquid chromatography with ultraviolet detection worked well for removing aliphatic hydrocarbons from a sediment extract. GC×GC with flame ionisation detection or quadrupole mass spectrometry provided good reliability in the qualitative analysis of target analytes; however, GC×GC with time-of-flight mass spectrometry was needed for the analysis of unknowns. The automated identification procedure that was developed was efficient in the analysis of large data files, but manual searching and analyst knowledge remain invaluable as well. Quantitative analysis was examined in terms of calibration procedures and the effect of matrix compounds on GC×GC separation. In addition to calibration in GC×GC with summed peak areas or peak volumes, a simplified area calibration based on the normal GC signal can be used to quantify compounds in samples analysed by GC×GC, so long as certain qualitative and quantitative prerequisites are met. In a study of the effect of matrix compounds on GC×GC separation, it was shown that the quality of the separation of PAHs is not significantly disturbed by the amount of matrix, and that quantitativeness suffers only slightly in the presence of matrix when the amount of target compounds is low. The benefits of GC×GC in the analysis of complex samples easily outweigh the technique's minor drawbacks. The developed instrumentation and methodologies performed well for environmental samples, but they could also be applied to other complex samples.
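The basic data-analysis step behind the two-dimensional plots, folding the raw detector trace by the modulation period, can be sketched as follows. The sampling rate, modulation period, and function names are illustrative assumptions, not the parameters of the instruments built in the study:

```python
import numpy as np

def fold_to_2d(signal, sampling_hz, modulation_s):
    """Fold a raw 1-D GC×GC detector trace into a 2-D array.

    Each modulation cycle becomes one column: first-dimension
    retention time runs across columns, and the fast second-dimension
    separation runs down the rows.
    """
    pts_per_cycle = int(round(sampling_hz * modulation_s))
    n_cycles = len(signal) // pts_per_cycle
    trimmed = np.asarray(signal[: n_cycles * pts_per_cycle], dtype=float)
    return trimmed.reshape(n_cycles, pts_per_cycle).T

def peak_volume(plane, row_slice, col_slice):
    """Sum the intensities over a rectangular region of the 2-D plot,
    a simple stand-in for the peak-volume integration performed by
    the data analysis programs."""
    return float(plane[row_slice, col_slice].sum())

# Hypothetical trace: 100 Hz detector, 5 s modulation period,
# one narrow peak eluting at 31.2 s.
sampling_hz, modulation_s = 100, 5.0
t = np.arange(0, 60, 1 / sampling_hz)
trace = np.exp(-((t - 31.2) ** 2) / 0.02)
plane = fold_to_2d(trace, sampling_hz, modulation_s)
print(plane.shape)  # → (500, 12)
vol = peak_volume(plane, slice(100, 150), slice(6, 7))
```

The 75 ms base widths reported for the modulated peaks are what make a fast detector and this cycle-wise reshaping necessary.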

Relevance: 20.00%

Relevance: 20.00%

Abstract:

Background: The aging population is placing increasing demands on surgical services at the same time as the supply of professional labor is decreasing and the economic situation is worsening. Under growing financial constraints, successful operating room management will be one of the key issues in the struggle for technical efficiency. This study focused on several issues affecting operating room efficiency. Materials and methods: The current formal operating room management in Finland, and the performance metrics and information systems used to support it, were explored using a postal survey. We also studied the feasibility of a wireless patient tracking system as a tool for managing the process. The reliability of the system, as well as the accuracy and precision of its automatically recorded time stamps, were analyzed. In a prospective setting, the benefits of a separate anesthesia induction room were compared with the traditional way of working, in which anesthesia is induced in the operating room. Using computer simulation, several models of parallel processing for the operating room were compared with the traditional model with respect to cost-efficiency. Moreover, international differences in operating room times for two common procedures, laparoscopic cholecystectomy and open lung lobectomy, were investigated. Results: The managerial structure of Finnish operating units was not clearly defined. Operating room management information systems were found to be out of date, offering little support for online evaluation of the care process. Only about half of the information systems provided information in real time. Operating room performance was most often measured by the number of procedures per time unit, operating room utilization, and turnover time. The wireless patient tracking system was found to be feasible for hospital use.
Automatic documentation by the system facilitated patient flow management by increasing process transparency via more available and accurate data, while reducing staff workload. Every parallel workflow model was more cost-efficient than the traditional practice of performing anesthesia induction in the operating room. Mean operating times for the two common procedures differed by 50% among eight hospitals in different countries. Conclusions: The structure of daily operative management of an operating room warrants redefinition. Performance measures as well as information systems require updating. Parallel workflows are more cost-efficient than the traditional induction-in-room model.
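The cost-efficiency argument for parallel workflows can be illustrated with a toy capacity model, far simpler than the computer simulation used in the study; all times and the blocking rule below are hypothetical assumptions:

```python
def cases_per_day(induction_min, surgery_min, turnover_min,
                  day_min=480, parallel=False):
    """Cases completed in one operating room day.

    Serial model: the OR is blocked for induction + surgery + turnover.
    Parallel model: induction happens in a separate induction room
    while the OR is turned over, so the OR is blocked only for
    surgery + max(turnover, induction) per case (a simplification).
    """
    if parallel:
        cycle = surgery_min + max(turnover_min, induction_min)
    else:
        cycle = induction_min + surgery_min + turnover_min
    return day_min // cycle

# Hypothetical times (minutes): induction 25, surgery 90, turnover 15
serial = cases_per_day(25, 90, 15)                    # cycle 130 min
parallel = cases_per_day(25, 90, 15, parallel=True)   # cycle 115 min
print(serial, parallel)  # → 3 4
```

Even this crude model shows the mechanism: overlapping induction with turnover shortens the per-case cycle, so an extra case fits into the same staffed day.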

Relevance: 20.00%

Abstract:

“Corporate governance deals with the ways in which suppliers of finance to firms assure themselves of getting a return on their investment” (Shleifer and Vishny, 1997, p. 737). According to La Porta et al. (1999), research in corporate finance relevant for most countries should focus on the incentives and capabilities of controlling shareholders to treat themselves preferentially at the expense of minority shareholders. Accordingly, this thesis sets out to answer a number of research questions regarding the role of large shareholders in public firms that have so far received little attention in the literature. A common theme in the essays stems from the costs and benefits of individual large-block owners and the role of control contestability from the perspective of outside minority shareholders. The first essay empirically examines whether there are systematic performance differences between family-controlled and non-family-controlled firms in Western Europe. In contrast to the widely held view that family control penalizes firm value, the essay shows that publicly traded family firms outperform comparable firms. In the second essay, we present both theoretical and empirical analyses of the effects of control contestability on firm valuation. Consistent with the theoretical model, the empirical results show that minority shareholders benefit from a more contestable control structure. The third essay explores the effects of individual large-block owners on top management turnover and board appointments in Finnish listed firms. The results indicate that firm performance is an important determinant of management and board restructurings, and that for certain types of turnover decisions the corporate governance structure influences the performance/turnover sensitivity. In the fourth essay, we investigate the relation between governance structure and dividend policy in Finnish listed firms. We find evidence in support of the outcome agency model of dividends, which holds that lower agency conflicts should be associated with higher dividend payouts.

Relevance: 20.00%

Abstract:

Several researchers are of the opinion that there are many benefits to using the object-oriented paradigm in information systems development. If the object-oriented paradigm is used, the development of information systems may, for example, be faster and more efficient. On the other hand, the paradigm also has several problems: it is often considered complex, it is often difficult to make use of the reuse concept, and it is still immature in some areas. Although the object-oriented paradigm has several interesting features, there is still little comprehensive knowledge of the benefits and problems associated with it. The objective of this study was to investigate, and to gain more understanding of, the benefits and problems of the object-oriented paradigm. A review of previous studies was made, and twelve benefits and twelve problems were established. These benefits and problems were then analysed, studied and discussed. Furthermore, a survey and case studies were conducted to gather knowledge on the benefits and problems Finnish software companies had experienced with the object-oriented paradigm. One hundred and four companies answered the survey, which was sent to all Finnish software companies with five or more employees. The case studies were conducted with six large Finnish software companies. The major finding was that Finnish software companies were exceptionally positive towards object-oriented information systems development and had experienced very few of the proposed problems. Finally, two models for further research were developed: the first presents connections between the benefits, and the second between the problems.

Relevance: 20.00%

Abstract:

This work examines stable isotope ratios of carbon, oxygen and hydrogen in the annual growth rings of trees. The isotopic composition of wood cellulose is used as a tool to study past climate, a method that benefits from the accurate and precise dating provided by dendrochronology. In this study, the origin, nature and strength of the climatic correlations are examined on different temporal scales and at different sites in Finland. The carbon isotopic signal originates in photosynthetic fractionation, and the basic physical and chemical fractionations involved are reasonably well understood. This was confirmed by measuring instantaneous photosynthetic discrimination on Scots pine (Pinus sylvestris L.). The internal conductance of CO2 was recognized to have a significant impact on the observed fractionation, and further investigations are suggested to quantify its role in controlling the isotopic signal of photosynthates. The isotopic composition of the produced biomass can potentially be affected by a variety of external factors that induce physiological changes in trees. The response of the carbon isotopic signal in tree ring cellulose to changes in resource availability was assessed in a manipulation experiment, which showed that the signal remained relatively stable despite changes in the water and nitrogen available to the tree. Palaeoclimatic reconstructions are typically based on functions describing the empirical relationship between isotopic and climatic parameters. These empirical relationships may change depending on the site conditions, species and timeframe studied. Annual variation in the carbon and oxygen isotopic composition of Scots pine tree rings was studied in northern and central eastern Finland, and annual variation in the carbon, oxygen and hydrogen isotopic ratios of latewood in oak (Quercus robur L.) tree rings was studied in southern Finland. At all of the studied sites, at least one of the studied isotope ratios was shown to record climate strongly enough to be used in climatic reconstructions. Using the observed relationships, four-century-long climate reconstructions from living Scots pine were created for northern and central eastern Finland. The temporal stability of the relationships among three proxy indicators, tree ring growth and carbon and oxygen isotopic composition, was also studied over the four-hundred-year period. Isotope ratios measured from tree rings in Finland were shown to be sensitive indicators of climate. Increasing understanding of the environmental controls and physiological mechanisms affecting tree ring isotopic composition will enable more accurate interpretation of isotope data. This study also demonstrated that by measuring multiple isotopes and physical proxies from the same tree rings, additional information on tree physiology can be obtained. Isotopic ratios measured from tree ring cellulose thus provide a means to improve the reliability of climate reconstructions.
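The photosynthetic fractionation underlying the carbon signal is commonly described by Farquhar-type discrimination models. The sketch below uses textbook fractionation constants (a ≈ 4.4‰ for stomatal diffusion, b ≈ 27‰ for Rubisco carboxylation, am ≈ 1.8‰ for dissolution and liquid-phase diffusion) and hypothetical CO2 concentrations; it is an illustration of the standard model, not the model fitted in this work. The second function shows why internal (mesophyll) conductance matters: drawing chloroplast CO2 (cc) below intercellular CO2 (ci) lowers the predicted discrimination.

```python
def discrimination_simple(ci_over_ca, a=4.4, b=27.0):
    """Simplified Farquhar model of C3 photosynthetic 13C
    discrimination (per mil): diffusion through stomata (a)
    plus carboxylation by Rubisco (b)."""
    return a + (b - a) * ci_over_ca

def discrimination_with_gm(ca, ci, cc, a=4.4, am=1.8, b=27.0):
    """Version resolving internal (mesophyll) conductance: CO2 is
    drawn down further, from ci to cc, at the chloroplast, which
    lowers the observed discrimination relative to the simple model
    evaluated at ci."""
    return (a * (ca - ci) + am * (ci - cc) + b * cc) / ca

# Hypothetical concentrations in ppm: ca = 400, ci = 280, cc = 230
print(round(discrimination_simple(280 / 400), 2))       # → 20.22
print(round(discrimination_with_gm(400, 280, 230), 2))  # → 17.07
```

The gap between the two values is the internal-conductance effect the study flags as needing quantification.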

Relevance: 20.00%

Abstract:

The research analyzes product quality from a customer perspective in the case of the wood products industry. Of specific interest is to better understand how environmental quality is perceived from a customer perspective. The empirical material comprises four data-sets from Finland, Germany and the UK, collected during 1992–2004. The methods consist of a set of quantitative statistical analyses. The results indicate that perceived quality from a customer perspective can be represented by a multidimensional and hierarchical construct with tangible and intangible dimensions that is common to different markets and products. This applies to wood products but also, more generally, to at least some other construction materials. For wood products, tangible product quality has two main sub-dimensions: technical quality and appearance. For product intangibles, a few main quality dimensions seem to be detectable: the quality of intangibles related to the physical product, such as environmental issues and product-related information; supplier-related characteristics; and service and sales personnel behavior. Environmental quality and information are often perceived as inter-related. Technical performance and appearance are the most important considerations for customers in the case of wood products. Organizational customers in particular also clearly consider certain intangible quality dimensions to be important, such as service and supplier reliability. High technical quality may be considered a “license to operate”, but product appearance and intangible quality provide potential for differentiation to attract certain market segments. Intangible quality issues are those on which Nordic suppliers underperform in comparison to their Central European competitors in the important German markets. Environmental quality may not have been used to its full extent to attract customers. One possibility is to increase the availability of environment-related information, or to develop environment-related product characteristics that also provide some individual benefits. Information technology provides clear potential to facilitate information-based quality improvements, which was clearly recognized by the Finnish forest industry as early as the 1990s. The results indeed indicate that wood products markets are segmented with regard to quality demands.

Relevance: 20.00%

Abstract:

This paper describes the cost-benefit analysis of digital long-term preservation (LTP) that was carried out in the context of the Finnish National Digital Library Project (NDL) in 2010. The analysis was based on the assumption that as many as 200 archives, libraries, and museums will share an LTP system. The term ‘system’ shall be understood as encompassing not only information technology, but also human resources, organizational structures, policies and funding mechanisms. The cost analysis shows that an LTP system will incur, over the first 12 years, cumulative costs of €42 million, i.e. an average of €3.5 million per annum. Human resources and investments in information technology are the major cost factors. After the initial stages, the analysis predicts annual costs of circa €4 million. The analysis compared scenarios with and without a shared LTP system. The results indicate that a shared system will have remarkable benefits. At the development and implementation stages, a shared system shows an advantage of €30 million against the alternative scenario consisting of five independent LTP solutions. During the later stages, the advantage is estimated at €10 million per annum. The cumulative cost benefit over the first 12 years would amount to circa €100 million.
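The headline figures are internally consistent, as a quick check shows. The split of the 12-year horizon into development/implementation and later stages (here 5 + 7 years) is an assumption made only to reproduce the circa €100 million cumulative benefit; the abstract does not state the split:

```python
# Cumulative cost: EUR 42 million over the first 12 years
years = 12
cumulative_cost = 42.0
print(cumulative_cost / years)  # → 3.5  (average EUR million per annum)

# Benefit of one shared system vs five independent LTP solutions:
# EUR 30 million at the development/implementation stages, then
# EUR 10 million per annum. Assuming 7 "later stage" years within
# the 12-year horizon:
dev_benefit = 30.0
later_benefit = 10.0 * 7
print(dev_benefit + later_benefit)  # → 100.0  (circa EUR 100 million)
```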