816 results for New Venture Performance
Abstract:
After the restructuring of the power supply industry, which in Finland, for instance, took place in the mid-1990s, free competition was introduced for the production and sale of electricity. Nevertheless, natural monopolies are found to be the most efficient form of production in the transmission and distribution of electricity, and therefore such companies remained franchised monopolies. To prevent the misuse of the monopoly position and to guarantee the rights of the customers, regulation of these monopoly companies is required. One of the main objectives of the restructuring process has been to increase the cost efficiency of the industry. Simultaneously, demands for service quality are increasing. Therefore, many regulatory frameworks are being, or have been, reshaped so that companies are provided with stronger incentives for efficiency and quality improvements. Performance benchmarking has in many cases a central role in the practical implementation of such incentive schemes. Economic regulation with performance benchmarking attached to it provides companies with directing signals that tend to affect their investment and maintenance strategies. Since asset lifetimes in electricity distribution are typically many decades, investment decisions have far-reaching technical and economic effects. This doctoral thesis addresses the directing signals of incentive regulation and performance benchmarking in the field of electricity distribution. The theory of efficiency measurement and the most common regulation models are presented. The chief contributions of this work are (1) a new kind of analysis of the regulatory framework, in which the actual directing signals of the regulation and benchmarking for the electricity distribution companies are evaluated, (2) a methodology and a software tool for analysing the directing signals of regulation and benchmarking in the electricity distribution sector, and (3) an analysis of real-life regulatory frameworks using the developed methodology, together with further development of the regulation model from the viewpoint of the directing signals. The results of this study have played a key role in the development of the Finnish regulatory model.
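The abstract does not name the specific efficiency-measurement technique behind the benchmarking, but data envelopment analysis (DEA) is among the most common in distribution-network regulation. The sketch below is an illustration under that assumption only, not the thesis's model: it computes an input-oriented, constant-returns-to-scale DEA efficiency score for each company with scipy, and the company figures are entirely hypothetical.

```python
# Illustrative only: input-oriented, constant-returns-to-scale DEA (CCR model),
# a common efficiency-measurement technique in distribution-network benchmarking.
# Company data below are hypothetical, not taken from the thesis.
import numpy as np
from scipy.optimize import linprog

# inputs: [operational cost (M EUR), network length (1000 km)]
X = np.array([[4.0, 1.2],
              [6.5, 2.0],
              [3.0, 0.9],
              [5.0, 1.1]])
# outputs: [energy delivered (GWh), customers (1000s)]
Y = np.array([[400, 30],
              [520, 45],
              [310, 22],
              [350, 28]])

def dea_efficiency(k, X, Y):
    """Input-oriented CRS efficiency of unit k (1.0 = on the frontier)."""
    n, m = X.shape                               # n units, m inputs
    s = Y.shape[1]                               # s outputs
    c = np.r_[1.0, np.zeros(n)]                  # minimise theta; vars = [theta, lambda_1..n]
    A_in = np.hstack([-X[[k]].T, X.T])           # inputs: sum_j lam_j x_ij <= theta * x_ik
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])  # outputs: sum_j lam_j y_rj >= y_rk
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[k]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

for k in range(len(X)):
    print(f"company {k}: efficiency = {dea_efficiency(k, X, Y):.3f}")
```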
Abstract:
The primary objective is to identify the critical factors that naturally affect a performance measurement system. It is important to make correct decisions about measurement systems, which are grounded in a complex business environment; the performance measurement system itself involves highly complex, non-linear factors. The Six Sigma methodology is seen as one potential approach at every organisational level. It is linked to performance and financial measurement as well as to the analytical thinking on which the management viewpoint depends. The complex systems are also connected to the study of customer relationships. The primary throughput can be seen as a new, well-defined performance measurement structure, supported by an analytical multifactor system. At the same time, these critical factors should also be seen as a business innovation opportunity. This master's thesis is divided into two theoretical parts; the empirical part combines action-oriented and constructive research approaches with an empirical case study. The secondary objective is to seek a competitive advantage factor with a new analytical tool and Six Sigma thinking. Process and product capabilities are linked to the contribution of the complex system, and the critical barriers are identified through the performance measurement system. The secondary throughput can be recognised as product and process cost efficiencies, achieved through a management advantage. The potential of performance measurement is related to different forms of productivity analysis, and productivity can be seen as an essential part of the competitive advantage factor.
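The link drawn between process and product capability and Six Sigma performance can be made concrete with the standard capability indices. The snippet below is a generic worked example only; the measurements and specification limits are hypothetical, not data from the case study.

```python
# Illustrative Six Sigma capability calculation with hypothetical data;
# not figures from the thesis's case company.
import numpy as np

# hypothetical process measurements and specification limits
measurements = np.array([10.02, 9.98, 10.05, 9.97, 10.01, 10.03, 9.99, 10.00])
LSL, USL = 9.90, 10.10                     # lower/upper specification limits

mu = measurements.mean()
sigma = measurements.std(ddof=1)           # sample standard deviation

cp = (USL - LSL) / (6 * sigma)                     # potential capability
cpk = min(USL - mu, mu - LSL) / (3 * sigma)        # capability allowing for an off-centre mean
sigma_level = 3 * cpk                              # rough short-term sigma level

print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}, ~{sigma_level:.1f} sigma process")
```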
Abstract:
The Cherenkov light flashes produced by extensive air showers are very short in time. A high-bandwidth, fast-digitizing readout can therefore minimize the influence of the background from the light of the night sky and improve the performance of Cherenkov telescopes. The time structure of the Cherenkov image can further be used in single-dish Cherenkov telescopes as an additional parameter to reduce the background from unwanted hadronic showers. A description of an analysis method which makes use of the time information, and the subsequent improvement in the performance of the MAGIC telescope (especially after the upgrade with an ultra-fast 2 GSamples/s digitization system in February 2007), is presented. The use of timing information in the analysis of the new MAGIC data reduces the background by a factor of two, which in turn results in an enhancement of the flux sensitivity to point-like sources by a factor of about 1.4, as tested on observations of the Crab Nebula.
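The quoted factor of about 1.4 follows from Poisson counting statistics: the significance of a point-source detection scales as the signal divided by the square root of the background, so halving the background improves the sensitivity by roughly the square root of two. A minimal sketch of that step:

```latex
% Detection significance vs. background level (background-dominated regime)
S \;\propto\; \frac{N_{\mathrm{signal}}}{\sqrt{N_{\mathrm{background}}}}
\qquad\Longrightarrow\qquad
\frac{S_{\mathrm{with\ timing}}}{S_{\mathrm{without\ timing}}}
  \;=\; \sqrt{\frac{N_{\mathrm{bg}}}{N_{\mathrm{bg}}/2}} \;=\; \sqrt{2} \;\approx\; 1.4
```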
Abstract:
Aim: The aim of this study was to test different modelling approaches, including a new framework, for predicting the spatial distribution of richness and composition of two insect groups. Location: The western Swiss Alps. Methods: We compared two community modelling approaches: the classical method of stacking binary predictions obtained from individual species distribution models (binary stacked species distribution models, bS-SDMs), and various implementations of a recent framework (spatially explicit species assemblage modelling, SESAM) based on four steps that integrate the different drivers of the assembly process in a unique modelling procedure. We used: (1) five methods to create bS-SDM predictions; (2) two approaches for predicting species richness, by summing individual SDM probabilities or by modelling the number of species (i.e. richness) directly; and (3) five different biotic rules based either on ranking probabilities from SDMs or on community co-occurrence patterns. Combining these various options resulted in 47 implementations for each taxon. Results: Species richness of the two taxonomic groups was predicted with good accuracy overall, and in most cases bS-SDM did not produce a biased prediction exceeding the actual number of species in each unit. In the prediction of community composition, bS-SDM often also yielded the best evaluation score. In the case of poor performance of bS-SDM (i.e. when bS-SDM overestimated the prediction of richness), the SESAM framework improved predictions of species composition. Main conclusions: Our results differed from previous findings using community-level models. First, we show that overprediction of richness by bS-SDM is not a general rule, thus highlighting the relevance of producing good individual SDMs to capture the ecological filters that are important for the assembly process. Second, we confirm the potential of SESAM when richness is overpredicted by bS-SDM; limiting the number of species for each unit and applying biotic rules (here using the ranking of SDM probabilities) can improve predictions of species composition.
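The two assembly rules being compared can be sketched for a single spatial unit as follows. The occurrence probabilities are hypothetical and the code is only a simplified illustration of binary stacking versus a SESAM-style probability-ranking rule, not the authors' implementation.

```python
# Simplified illustration of bS-SDM stacking vs. the SESAM probability-ranking
# rule for one spatial unit; probabilities are hypothetical.
import numpy as np

species = ["sp_A", "sp_B", "sp_C", "sp_D", "sp_E"]
# occurrence probabilities from individual SDMs for one grid cell
p = np.array([0.82, 0.64, 0.55, 0.31, 0.12])
threshold = 0.5

# (1) binary stacked SDM: threshold each species, then count the presences
bs_sdm_community = [s for s, pi in zip(species, p) if pi >= threshold]
bs_sdm_richness = len(bs_sdm_community)

# (2) SESAM with the probability-ranking rule: predict richness as the sum of
#     probabilities (macroecological constraint), then admit the top-ranked species
sesam_richness = int(round(p.sum()))
order = np.argsort(p)[::-1]
sesam_community = [species[i] for i in order[:sesam_richness]]

print("bS-SDM :", bs_sdm_richness, bs_sdm_community)
print("SESAM  :", sesam_richness, sesam_community)
```

With these hypothetical probabilities, binary stacking admits three species while the richness constraint caps the SESAM community at two, which is exactly the situation in which the framework curbs overprediction.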
Abstract:
Coxiella burnetii and members of the genus Rickettsia are obligate intracellular bacteria. Since cultivation of these organisms requires dedicated techniques, their diagnosis usually relies on serological or molecular biology methods. Immunofluorescence is considered the gold standard for detecting antibody reactivity towards these organisms. Here, we assessed the performance of a new automated epifluorescence immunoassay (InoDiag) to detect IgM and IgG against C. burnetii, Rickettsia typhi and Rickettsia conorii. A total of 213 sera were tested with the InoDiag assay: 63 samples from Q fever, 20 from spotted fever rickettsiosis, 6 from murine typhus and 124 controls. InoDiag results were compared to micro-immunofluorescence. For acute Q fever, the sensitivity of phase 2 IgG was only 30% with a cutoff of 1 arbitrary unit (AU). In patients with acute Q fever and positive IF IgM, sensitivity reached 83% with the same cutoff. Sensitivity for chronic Q fever was 100%, whereas sensitivity for past Q fever was 65%. Sensitivities for Mediterranean spotted fever and murine typhus were 91% and 100%, respectively. Both assays exhibited good specificity in the control groups, ranging from 79% in sera from patients with unrelated diseases or EBV positivity to 100% in sera from healthy patients. In conclusion, the InoDiag assay exhibits excellent performance for the diagnosis of chronic Q fever but very low IgG sensitivity for acute Q fever, likely due to the low reactivity of the phase 2 antigens present on the glass slide. This defect is partially compensated by the detection of IgM. Because it exhibits a good negative predictive value, the InoDiag assay is valuable for ruling out chronic Q fever. For the diagnosis of rickettsial diseases, the sensitivity of the InoDiag method is similar to that of conventional immunofluorescence.
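The figures reported are the usual two-by-two contingency metrics; the sketch below shows how sensitivity, specificity and the negative predictive value mentioned in the conclusion are derived. The counts are hypothetical, not the study's raw data.

```python
# Hypothetical 2x2 counts for an automated serology assay vs. the reference
# micro-immunofluorescence result; not the actual InoDiag study data.
def diagnostic_metrics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)          # positives correctly detected
    specificity = tn / (tn + fp)          # negatives correctly ruled out
    npv = tn / (tn + fn)                  # probability disease is absent when the test is negative
    return sensitivity, specificity, npv

sens, spec, npv = diagnostic_metrics(tp=19, fp=5, fn=1, tn=119)
print(f"sensitivity={sens:.2f}  specificity={spec:.2f}  NPV={npv:.2f}")
```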
Abstract:
OBJECTIVE: To review and update the conceptual framework, indicator content and research priorities of the Organisation for Economic Co-operation and Development's (OECD) Health Care Quality Indicators (HCQI) project, after a decade of collaborative work. DESIGN: A structured assessment was carried out using a modified Delphi approach, followed by a consensus meeting, to assess the suite of HCQI for international comparisons, agree on revisions to the original framework and set priorities for research and development. SETTING: International group of countries participating in OECD projects. PARTICIPANTS: Members of the OECD HCQI expert group. RESULTS: A reference matrix, based on a revised performance framework, was used to map and assess all seventy HCQI routinely calculated by the OECD expert group. A total of 21 indicators were agreed to be excluded, owing to concerns about: (i) relevance; (ii) international comparability, particularly where heterogeneous coding practices might induce bias; (iii) feasibility, when the number of countries able to report was limited and the added value did not justify sustained effort; and (iv) actionability, for indicators that were unlikely to improve on the basis of targeted policy interventions. CONCLUSIONS: The revised OECD framework for HCQI represents a new milestone in a long-standing international collaboration among a group of countries committed to building common ground for performance measurement. The expert group believes that the continuation of this work is paramount to provide decision makers with a validated toolbox to act directly on quality improvement strategies.
Abstract:
This study aimed to analyze and assess the use and perception of electronic health records (EHRs) by nurses. The study sample included 113 nurses from different shifts of primary health facilities in Catalonia, Spain, serving adult as well as pediatric outpatients and using EHRs throughout the year 2010. A majority of the sample (87.5%) were women and 12.5% were men. The average age was 44.27 years and the average time working in primary healthcare was 47.15 months. A majority (80.4%) had received specific training on the use of the EHR and 19.6% had not. The use of the application required additional technical support (mean: 3.42), and respondents considered it necessary to learn more about the performance of the application (mean: 3.50). The relationship between the nurses' average ratings of the EHR and age shows no statistically significant linear relationship (r = -0.002, p-value = 0.984). Regarding how long they have used the EHR, there is a significant correlation (r = -0.304, p-value = 0.00): the longer a nurse has been using the EHR, the greater the degree of satisfaction shown. In addition, there are no significant differences between nurses' perceptions of the EHR across gender (t = -0.421, p-value = 0.675). Nurses assessed the contribution of the EHR to their daily nursing care work as positive (average score: 2.55/5). Considering that the usability of the EHR is assessed as satisfactory, the nurses' perceptions show that training must also be taken into account, and they underline the need for additional technical support during the EHR implementation process. In this way, the positive perception that nurses have of information and communication technology in general, and of the EHR in particular, may be increased.
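The two tests reported, a Pearson correlation between satisfaction and time using the EHR and an independent-samples t-test across gender, can be reproduced on comparable survey data along the following lines; the arrays here are randomly generated placeholders, not the study data.

```python
# Hypothetical illustration of the two tests reported in the study:
# Pearson correlation (satisfaction vs. months using the EHR) and an
# independent-samples t-test (satisfaction by gender). Not the real survey data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
months_using_ehr = rng.integers(1, 60, size=113)
satisfaction = 2.0 + 0.02 * months_using_ehr + rng.normal(0, 0.5, size=113)
gender = rng.choice(["F", "M"], size=113, p=[0.875, 0.125])

r, p_corr = stats.pearsonr(months_using_ehr, satisfaction)
t, p_ttest = stats.ttest_ind(satisfaction[gender == "F"], satisfaction[gender == "M"])

print(f"Pearson r = {r:.3f} (p = {p_corr:.3f})")
print(f"t = {t:.3f} (p = {p_ttest:.3f})")
```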
Abstract:
Autologous blood transfusion (ABT) is an efficient way to increase sport performance. It is also the most challenging doping method to detect. At present, individual follow-up of haematological variables via the athlete biological passport (ABP) is used to detect it. Quantification of a novel hepatic peptide called hepcidin may be a new alternative to detect ABT. In this prospective clinical trial, healthy subjects received a saline injection for the control phase, after which they donated blood that was stored and then transfused 36 days later. The impact of ABT on hepcidin as well as haematological parameters, iron metabolism, and inflammation markers was investigated. Blood transfusion had a particularly marked effect on hepcidin concentrations compared to the other biomarkers, which included haematological variables. Hepcidin concentrations increased significantly: 12 hr and 1 day after blood reinfusion, these concentrations rose by seven- and fourfold, respectively. No significant change was observed in the control phase. Hepcidin quantification is a cost-effective strategy that could be used in an "ironomics" strategy to improve the detection of ABT. Am. J. Hematol. 91:467-472, 2016.
Abstract:
High-resolution mass spectrometry (HRMS) has traditionally been associated with qualitative and research analysis, and triple-quadrupole mass spectrometry (QQQ-MS) with quantitative and routine analysis. This view is now being challenged and, for this reason, we have evaluated the quantitative LC-MS performance of a new high-resolution mass spectrometer, a Q-orbitrap-MS, and compared the results with those obtained on a recent QQQ-MS. High-resolution full-scan (HR-FS) and MS/MS acquisitions were tested with real plasma extracts or pure standards. Limits of detection, dynamic range, mass accuracy and false positive or false negative detections were determined or investigated with protease inhibitors, tyrosine kinase inhibitors, steroids and metanephrines. Our quantitative results show that today's available HRMS instruments are reliable and sensitive quantitative instruments, comparable to QQQ-MS in quantitative performance. Taking into account their versatility, user-friendliness and robustness, we believe that HRMS should increasingly be seen as a key instrument in quantitative LC-MS analyses. In this scenario, most targeted LC-HRMS analyses should be performed by HR-FS, recording virtually "all" ions. In addition to absolute quantification, HR-FS will allow the relative quantification of hundreds of metabolites in plasma, revealing an individual's metabolome and exposome. This phenotyping of known metabolites should promote HRMS in the clinical environment. A few other LC-HRMS analyses should be performed in single-ion-monitoring or MS/MS mode when increased sensitivity and/or detection selectivity is necessary.
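Mass accuracy, one of the figures of merit evaluated, is conventionally expressed in parts per million relative to the theoretical m/z; the snippet below is a generic illustration with hypothetical masses, not values from the paper.

```python
# Mass accuracy in ppm for a high-resolution full-scan measurement; the m/z
# values are hypothetical examples, not values from the paper.
def mass_error_ppm(measured_mz, theoretical_mz):
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

theoretical = 721.3203          # hypothetical theoretical m/z of an analyte
measured = 721.3224             # hypothetical measured m/z
err = mass_error_ppm(measured, theoretical)
print(f"mass error = {err:.1f} ppm")          # ~2.9 ppm
within_window = abs(err) <= 5                 # e.g. a +/- 5 ppm extraction window
print("within 5 ppm extraction window:", within_window)
```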
Abstract:
BACKGROUND: For the past decade, (18)F-fluoro-ethyl-l-tyrosine (FET) and (18)F-fluoro-deoxy-glucose (FDG) positron emission tomography (PET) have been used for the assessment of patients with brain tumors. However, direct comparison studies have reported only limited numbers of patients. Our purpose was to compare the diagnostic performance of FET and FDG-PET. METHODS: We examined studies published between January 1995 and January 2015 in the PubMed database. To be included, a study had to: (i) use FET and FDG-PET for the assessment of patients with an isolated brain lesion and (ii) use histology as the gold standard. Analysis was performed on a per patient basis. Study quality was assessed with STARD and QUADAS criteria. RESULTS: Five studies (119 patients) were included. For the diagnosis of brain tumor, FET-PET demonstrated a pooled sensitivity of 0.94 (95% CI: 0.79-0.98) and pooled specificity of 0.88 (95% CI: 0.37-0.99), with an area under the curve of 0.96 (95% CI: 0.94-0.97), a positive likelihood ratio (LR+) of 8.1 (95% CI: 0.8-80.6), and a negative likelihood ratio (LR-) of 0.07 (95% CI: 0.02-0.30), while FDG-PET demonstrated a sensitivity of 0.38 (95% CI: 0.27-0.50) and specificity of 0.86 (95% CI: 0.31-0.99), with an area under the curve of 0.40 (95% CI: 0.36-0.44), an LR+ of 2.7 (95% CI: 0.3-27.8), and an LR- of 0.72 (95% CI: 0.47-1.11). Target-to-background ratios of neither FDG nor FET, however, allowed distinction between low- and high-grade gliomas (P > .11). CONCLUSIONS: For brain tumor diagnosis, FET-PET performed much better than FDG-PET and should be preferred when assessing a new isolated brain tumor. For glioma grading, however, both tracers showed similar performance.
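The likelihood ratios quoted follow directly from the pooled sensitivity and specificity (LR+ = sensitivity / (1 - specificity), LR- = (1 - sensitivity) / specificity). The quick check below uses the pooled point estimates from the abstract; the small discrepancy for FET's LR+ reflects the bivariate pooling model used in the meta-analysis rather than this point-estimate formula.

```python
# Likelihood ratios derived from the pooled sensitivity/specificity reported in
# the abstract (point-estimate formula only).
def likelihood_ratios(sensitivity, specificity):
    lr_pos = sensitivity / (1 - specificity)
    lr_neg = (1 - sensitivity) / specificity
    return lr_pos, lr_neg

for tracer, sens, spec in [("FET", 0.94, 0.88), ("FDG", 0.38, 0.86)]:
    lr_pos, lr_neg = likelihood_ratios(sens, spec)
    print(f"{tracer}: LR+ = {lr_pos:.1f}, LR- = {lr_neg:.2f}")
```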
Abstract:
The fundamental question in the transition economies of Eastern Europe and the former Soviet Union has been whether privatisation and market liberalisation have had an effect on the performance of former state-owned enterprises. This study examines the effect of privatisation, capital market discipline, price liberalisation and international price exposure on the restructuring of large Russian enterprises. The performance indicators are sales, profitability, labour productivity and stock market valuations. The results do not show performance differences between state-owned and privatised enterprises. On the other hand, the expansion of the de novo private sector has been strong. New enterprises have significantly higher sales growth, profitability and labour productivity. However, the results indicate a diminishing effect of ownership. An international stock market listing has a significant positive effect on profitability, while the effect of a domestic stock market listing is insignificant. International price exposure has a significant positive effect on profitability and labour productivity. However, international enterprises have higher profitability only when operating in price-liberalised markets. The main results of the study are strong evidence of the positive effects of international linkages on enterprise restructuring and the larger-than-expected role of new enterprises in the Russian economy.
Abstract:
Performance standards for positron emission tomography (PET) were developed so that systems from different generations and manufacturers could be compared. This resulted in the NEMA methodology in North America and the IEC methodology in Europe. In practice, the NEMA NU 2-2001 standard is the method of choice today. These standardized methods allow assessment of the physical performance of new commercial dedicated PET/CT tomographs. The point spread in image formation is one of the factors that blur the image; the phenomenon is often called the partial volume effect. Several methods for correcting for partial volume are under research, but no real agreement exists on how best to correct for it. The influence of the effect varies in different clinical settings, and it is likely that new methods are needed to solve this problem. Most clinical PET work is done in the field of oncology, where whole-body PET combined with CT is the standard investigation today. Despite the progress in PET imaging techniques, visualization and especially quantification of small lesions remain a challenge. In addition to partial volume, movement of the object is a significant source of error; the main causes of movement are respiratory and cardiac motion. Most new commercial scanners are capable of respiratory gating in addition to cardiac gating, and this technique has been used in patients with cancer of the thoracic region and in patients studied for radiation therapy planning. For routine cardiac applications such as the assessment of viability and perfusion, only cardiac gating has been used. However, new targets such as plaque imaging or molecular imaging of new therapies require better control of cardiac motion, which is also affected by respiratory motion. To overcome these problems in cardiac work, a dual gating approach has been proposed. In this study we investigated the physical performance of a new whole-body PET/CT scanner with the NEMA standard, compared methods for partial volume correction in PET studies of the brain, and developed and tested a new robust method for dual cardiac-respiratory gated PET with phantom, animal and human data. Results from the performance measurements showed the feasibility of the new scanner design in 2D and 3D whole-body studies. Partial volume could be corrected for, but no single method among those tested was best, as the correction also depends on the radiotracer and its distribution; new methods need to be developed for proper correction. The dual gating algorithm developed is shown to handle dual-gated data, preserving quantification and clearly eliminating the majority of contraction and respiration movement.
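The dual cardiac-respiratory gating described can be pictured as a two-dimensional binning of list-mode events by cardiac and respiratory phase, with each bin reconstructed separately. The sketch below is only a schematic illustration under that assumption, not the algorithm developed in the thesis; the timing signals are simulated.

```python
# Schematic illustration of dual cardiac-respiratory gating: each list-mode
# event is assigned to a (cardiac phase, respiratory phase) bin. Simplified
# sketch with simulated timing, not the gating algorithm developed in the study.
import numpy as np

N_CARDIAC_GATES = 8
N_RESP_GATES = 4

rng = np.random.default_rng(1)
event_times = np.sort(rng.uniform(0, 60.0, size=100_000))   # seconds, hypothetical
cardiac_period = 0.9                                         # s, e.g. from an ECG trigger (hypothetical)
resp_period = 4.0                                            # s, e.g. from a respiration signal (hypothetical)

cardiac_phase = (event_times % cardiac_period) / cardiac_period   # 0..1
resp_phase = (event_times % resp_period) / resp_period            # 0..1

cardiac_gate = np.minimum((cardiac_phase * N_CARDIAC_GATES).astype(int), N_CARDIAC_GATES - 1)
resp_gate = np.minimum((resp_phase * N_RESP_GATES).astype(int), N_RESP_GATES - 1)

# events per dual gate: each cell would be reconstructed as a separate image
counts = np.zeros((N_CARDIAC_GATES, N_RESP_GATES), dtype=int)
np.add.at(counts, (cardiac_gate, resp_gate), 1)
print(counts)
```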
Abstract:
Statistics show that the expanding service sector already accounts for three quarters of GDP in the developed economies. Moreover, there is abundant evidence of high variation in productive performance across the service industries. This suggests divergent technological and institutional trajectories within the tertiary sector. While conceptual knowledge on services and their performance has accumulated substantially, the overall picture of productivity and competitiveness is still inconclusive. As noted by a number of authors, research on service productivity is still in its infancy. The purpose of this paper is to develop further the analytical framework of service productivity. The approach is based on the notion that service definitions, classifications and performance measurement are strongly interdependent. Given the ongoing restructuring of business activities with higher information content, it is argued that the dichotomy between manufacturing and services should not be taken too far. Industrial evolution also suggests that the official industry classifications are increasingly outdated, and new taxonomies for empirical research are therefore needed. Based on previous analyses and new insights, the paper clarifies the debated concept of service productivity and identifies the critical dimensions by which the service industries cluster. It is also demonstrated that these dimensions make it possible to construct new service taxonomies which bear essentially on productivity opportunities at the business level. Needless to say, the key determinant explaining the development and potential of productivity growth is innovation activity; as an extensive topic of research, however, service innovation is tackled here only in a cursory way. The paper is constructed as follows: the first section focuses on the conceptual issues and the evolving nature of service activities. A workable definition of service should capture the diversity of service activities, as well as the aspects of service processes, comprehensively. The distinctions and similarities between services and manufacturing are discussed, too. Section 2 deals with service productivity, a persistent and controversial issue in academic literature and policy. Alongside an assessment of the strengths and weaknesses of the main schools of thought, new insights based on value creation are brought in. Industry classifications and taxonomies are discussed in Section 3. It begins with a short analysis of the official classifications and their evaluation from the perspective of empirical research. Using well-known examples, it is shown that the taxonomies of the manufacturing industries have a clear analogy in business services. Although there is a growing interest in regrouping services as well, the work to date has been less systematic and inherently qualitative. Based on the earlier contributions, a three-dimensional service taxonomy is constructed which highlights the key dimensions of productive performance. The main findings and implications are summed up in Section 4.
Abstract:
Viruses are among the most important pathogens present in water contaminated with feces or urine and represent a serious risk to human health. Four procedures for concentrating viruses from sewage have been compared in this work, three of which were developed in the present study. Viruses were quantified using PCR techniques. According to statistical analysis and the sensitivity of detection of human adenoviruses (HAdV), JC polyomaviruses (JCPyV) and noroviruses genogroup II (NoV GGII): (i) a new procedure, the elution and skimmed-milk flocculation procedure (ESMP), based on elution of the viruses with glycine-alkaline buffer followed by organic flocculation with skimmed milk, was found to be the most efficient method when compared to (ii) ultrafiltration and glycine-alkaline elution, (iii) a lyophilization-based method and (iv) ultracentrifugation and glycine-alkaline elution. Through the analysis of replicate sewage samples, ESMP showed reproducible results, with a coefficient of variation (CV) of 16% for HAdV, 12% for JCPyV and 17% for NoV GGII. Using spiked samples, the viral recoveries were estimated at 30-95% for HAdV, 55-90% for JCPyV and 45-50% for NoV GGII. ESMP was validated in a field study using twelve 24-h composite sewage samples collected at an urban sewage treatment plant in the north of Spain, which yielded 100% positive samples with mean values of HAdV, JCPyV and NoV GGII similar to those observed in other studies. Although all of the methods compared in this work yield consistently high values of virus detection and recovery in urban sewage, some require expensive laboratory equipment. ESMP is an effective low-cost procedure which allows a large number of samples to be processed simultaneously and can easily be standardized for routine use in laboratories performing water monitoring. Moreover, in the present study, the CV was applied and proposed as a parameter for evaluating and comparing methods for detecting viruses in sewage samples.
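The coefficient of variation used to compare the concentration methods is simply the relative standard deviation across replicate samples; a minimal sketch with hypothetical quantification values (not the study's measurements):

```python
# Coefficient of variation (CV) across replicate sewage samples, the parameter
# proposed for comparing concentration methods; the genome-copy values below
# are hypothetical, not the study's measurements.
import numpy as np

def cv_percent(values):
    values = np.asarray(values, dtype=float)
    return values.std(ddof=1) / values.mean() * 100

# hypothetical genome copies per litre quantified by qPCR in replicate samples
hadv_replicates = [1.2e6, 1.5e6, 1.1e6, 1.4e6, 1.3e6]
print(f"CV (HAdV replicates) = {cv_percent(hadv_replicates):.1f}%")
```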