951 results for measurement data


Relevance:

30.00%

Publisher:

Abstract:

The main objective of this research is to create a performance measurement system for the accounting services of a large paper industry company. The thesis compares different performance measurement systems, two of which are selected, presented and compared in more detail. The Performance Prism is the framework used in this research; it uses success maps to determine objectives, and its target areas are divided into five groups: stakeholder satisfaction, stakeholder contribution, strategy, processes and capabilities. The creation of the measurement system began by identifying stakeholders and defining their objectives. A success map was created based on these objectives, and measures were derived from the objectives and the success map. The data needed for each measure was then defined. The final measurement system contains just over 40 measures, each with a specific target level and owner. The number of measures is fairly large, but as this is the first version of the measurement system, the amount is acceptable.
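One way to make the abstract's final step concrete is a small catalogue of measures, each carrying its Performance Prism perspective, a target level and an owner. The sketch below is an assumption about how such a catalogue could be recorded; the measure names, targets and owners are invented, not taken from the thesis.

```python
from dataclasses import dataclass

@dataclass
class Measure:
    """One measure in the catalogue: its Performance Prism perspective,
    a specific target level and a named owner (all values illustrative)."""
    name: str
    perspective: str   # one of the five Performance Prism groups
    target: float
    owner: str

catalogue = [
    Measure("invoice processing time (days)", "processes", 3.0, "AP team lead"),
    Measure("internal customer satisfaction (1-5)", "stakeholder satisfaction",
            4.0, "service manager"),
]
print(len(catalogue))  # → 2
```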

Relevance:

30.00%

Publisher:

Abstract:

Solar radiation is an important factor for plant growth, and its availability to understory crops is strongly modified by trees in an Agroforestry System (AFS). Coffee trees (Coffea arabica cv. Obatã IAC 1669-20) were planted at a 3.4 x 0.9 m spacing inside and alongside rows of a monocrop of 12-year-old rubber trees (Hevea spp.) in Piracicaba-SP, Brazil (22º42'30" S, 47º38'00" W, altitude 546 m). One-year-old coffee plants exposed to 25, 30, 35, 40, 45, 80, 90, 95 and 100% of total solar radiation were evaluated for their biophysical parameters of solar radiation interception and capture. The Goudriaan (1977) model for radiation attenuation, as adapted by Bernardes et al. (1998), fit the measured data well. Coffee plants tolerated a decrease in solar radiation availability down to 50% without a reduction in growth or LAI, which was approximately 2 m².m-2 under this condition. Further reductions in the availability of solar radiation reduced LAI (to 1.5 m².m-2), leading to poor land cover and solar radiation interception and, consequently, to reduced growth.
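Goudriaan-type attenuation models are, at their core, exponential extinction laws. The sketch below uses the generic Beer-Lambert form to show how interception drops with LAI; the extinction coefficient k = 0.5 is an assumed, illustrative value, not a parameter from the study.

```python
import math

def intercepted_fraction(lai: float, k: float = 0.5) -> float:
    """Fraction of incident solar radiation intercepted by a canopy under
    an exponential (Beer-Lambert-type) extinction law; k is an assumed
    extinction coefficient, LAI in m2 leaf per m2 ground."""
    return 1.0 - math.exp(-k * lai)

# The abstract's two conditions: LAI ~ 2.0 down to 50% radiation, ~ 1.5 below it
print(round(intercepted_fraction(2.0), 3))  # → 0.632
print(round(intercepted_fraction(1.5), 3))  # → 0.528
```

The drop from 0.632 to 0.528 mirrors the abstract's point that a lower LAI means poorer interception and, in turn, reduced growth.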

Relevance:

30.00%

Publisher:

Abstract:

The development of new methodologies and tools for determining soil water content is of fundamental importance to irrigation practice. The objective of this study was to evaluate soil matric potential using a mercury tensiometer and a puncture digital tensiometer, and to compare the gravimetric soil moisture values obtained by the tensiometric system with those obtained by the neutron attenuation technique. Four experimental plots were maintained at different soil moisture levels by irrigation. Three replicates of each type of tensiometer were installed at a depth of 0.20 m. Based on the soil matric potential and the soil water retention curve, the corresponding gravimetric soil moisture was determined and then compared with that obtained by the neutron attenuation technique. The results showed no difference between the two tensiometric methods at soil matric potentials above -40 kPa. However, under drier soil, when water was replaced by irrigation, the soil matric potential read by the puncture digital tensiometer was lower than that of the mercury tensiometer.
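The abstract converts matric potential to moisture through the soil water retention curve. A common parameterisation of that curve is van Genuchten (1980); this is an assumption here, since the study does not say which model it used, and all parameter values below are illustrative.

```python
import math

def van_genuchten_theta(h_kpa, theta_r, theta_s, alpha, n):
    """Volumetric water content from matric potential via the
    van Genuchten (1980) retention curve; alpha in 1/kPa, h_kpa in kPa
    (negative = suction). Parameter values used below are illustrative,
    not fitted to the study's soil."""
    m = 1.0 - 1.0 / n
    h = abs(h_kpa)
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * h) ** n) ** m

# Moisture at the study's -40 kPa threshold vs a wetter -10 kPa state
wet = van_genuchten_theta(-10.0, theta_r=0.065, theta_s=0.41, alpha=0.075, n=1.89)
dry = van_genuchten_theta(-40.0, theta_r=0.065, theta_s=0.41, alpha=0.075, n=1.89)
print(round(wet, 3), round(dry, 3))
```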

Relevance:

30.00%

Publisher:

Abstract:

Longitudinal surveys are increasingly used to collect event history data on person-specific processes such as transitions between labour market states. Survey-based event history data pose a number of challenges for statistical analysis, including survey errors due to sampling, non-response, attrition and measurement. This study deals with non-response, attrition and measurement errors in event history data and with the bias they cause in event history analysis. The study also discusses some choices faced by a researcher using longitudinal survey data for event history analysis and demonstrates their effects: whether a design-based or a model-based approach is taken, which subset of the data to use and, if a design-based approach is taken, which weights to use. The study takes advantage of combined longitudinal survey-register data: the Finnish subset of the European Community Household Panel (FI ECHP) survey for waves 1–5 was linked at the person level with longitudinal register data. Unemployment spells were used as the study variables of interest. Lastly, a simulation study was conducted to assess the statistical properties of the Inverse Probability of Censoring Weighting (IPCW) method in a survey data context. The study shows how combined longitudinal survey-register data can be used to analyse and compare the non-response and attrition processes, test the type of the missingness mechanism and estimate the size of the bias due to non-response and attrition. In our empirical analysis, initial non-response turned out to be a more important source of bias than attrition. Reported unemployment spells were subject to seam effects, omissions and, to a lesser extent, overreporting. The use of proxy interviews tended to cause spell omissions. An often-ignored phenomenon, classification error in reported spell outcomes, was also found in the data.
Neither the Missing At Random (MAR) assumption about the non-response and attrition mechanisms, nor the classical assumptions about measurement errors, turned out to be valid. Measurement errors in both spell durations and spell outcomes were found to bias estimates from event history models; low measurement accuracy affected the estimates of the baseline hazard most. The design-based estimates based on data from respondents to all waves of interest, weighted by the last-wave weights, displayed the largest bias. Using all the available data, including the spells of attriters up to the time of attrition, helped to reduce attrition bias. Lastly, the simulation study showed that the IPCW correction to the design weights reduces bias due to dependent censoring in design-based Kaplan-Meier and Cox proportional hazards model estimators. The study discusses the implications of the results for survey organisations collecting event history data, for researchers using surveys for event history analysis, and for researchers developing methods to correct for non-sampling biases in event history data.
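The IPCW correction amounts to re-weighting each subject in the risk set by the inverse of its estimated probability of remaining uncensored. A minimal sketch of a weighted Kaplan-Meier estimator follows (didactic, not the thesis's code); with unit weights it reduces to the ordinary estimator.

```python
import numpy as np

def weighted_kaplan_meier(time, event, weight):
    """Kaplan-Meier survival steps with per-subject weights, e.g. inverse
    probability of censoring weights (IPCW). weight = 1 for every subject
    reproduces the ordinary Kaplan-Meier estimator."""
    time, event, weight = map(np.asarray, (time, event, weight))
    surv, steps = 1.0, []
    for t in np.unique(time[event == 1]):
        at_risk = weight[time >= t].sum()                  # weighted risk set
        deaths = weight[(time == t) & (event == 1)].sum()  # weighted events
        surv *= 1.0 - deaths / at_risk
        steps.append((float(t), float(surv)))
    return steps

# Toy unemployment spells (months) with unit weights -> ordinary KM
steps = weighted_kaplan_meier([2, 3, 3, 5, 8], [1, 1, 0, 1, 1], [1, 1, 1, 1, 1])
print(steps)
```

In the IPCW setting, the unit weights would be replaced by the inverse of each subject's estimated probability of still being under observation at each event time.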

Relevance:

30.00%

Publisher:

Abstract:

This study concerns performance measurement and management in a collaborative network. Collaboration between companies has increased in recent years due to the turbulent operating environment. The literature shows a need for more comprehensive research on performance measurement in networks and on the use of measurement information in their management. This study examines the development process and uses of a performance measurement system supporting performance management in a collaborative network. There are two main research questions: how to design a performance measurement system for a collaborative network, and how to manage performance in a collaborative network. The work can be characterised as a qualitative single case study. The empirical data was collected in a Finnish collaborative network consisting of a leading company and a reseller network. The work is based on five research articles applying various research methods. The research questions are examined at the network level and at the level of a single network partner. The study contributes to the earlier literature by producing a new and deeper understanding of network-level performance measurement and management. A three-step process model is presented to support the design of the performance measurement system; this process model has been tested in another collaborative network. The study also examines the factors affecting the design process of the measurement system. The results show that a participatory development style, the network culture and outside facilitators have a positive effect on the design process. The study increases understanding of how to manage performance in a collaborative network and what kinds of uses of performance information can be identified in one. The results show that a performance measurement system is an applicable tool for managing the performance of a network. 
The results reveal that trust and openness increased during the use of the performance measurement system, and operations became more transparent. The study also presents a management model for evaluating the maturity of performance management in a collaborative network. The model is a practical tool that helps to analyse the current stage of a collaborative network's performance management and to develop it further.

Relevance:

30.00%

Publisher:

Abstract:

R,S-sotalol, a β-blocker drug with class III antiarrhythmic properties, is prescribed to patients with ventricular, atrial and supraventricular arrhythmias. A simple and sensitive HPLC-fluorescence method is described for the quantification of the R,S-sotalol racemate in 500 µl of plasma. R,S-sotalol and its internal standard (atenolol) were eluted after 5.9 and 8.5 min, respectively, from a 4-µm C18 reverse-phase column using a mobile phase consisting of 80 mM KH2PO4, pH 4.6, and acetonitrile (95:5, v/v) at a flow rate of 0.5 ml/min, with detection at λex = 235 nm and λem = 310 nm. The method, validated on the basis of R,S-sotalol measurements in spiked blank plasma, showed 20 ng/ml sensitivity, 20-10,000 ng/ml linearity, and 2.9 and 4.8% intra- and interassay precision, respectively. Plasma sotalol concentrations were determined by applying this method to five high-risk patients with atrial fibrillation admitted to the Emergency Service of the Medical School Hospital, who received sotalol, 160 mg po, as a loading dose. Blood samples were collected from a peripheral vein at zero, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0, 8.0, 12.0 and 24.0 h after drug administration, and a two-compartment open model was applied. The data obtained, expressed as means, were: CMAX = 1230 ng/ml, TMAX = 1.8 h, AUCT = 10645 ng h ml-1, Kab = 1.23 h-1, α = 0.95 h-1, β = 0.09 h-1, t(1/2)β = 7.8 h, ClT/F = 3.94 ml min-1 kg-1, and Vd/F = 2.53 l/kg. Good systemic availability and fast absorption were obtained. Drug distribution and total body clearance were reduced to the same extent when patients and healthy volunteers were compared; consequently, the elimination half-life remained unchanged. Thus, the method described in the present study is useful for therapeutic drug monitoring, pharmacokinetic investigation and pharmacokinetic-pharmacodynamic studies of sotalol in patients with tachyarrhythmias.
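As a sanity check on the reported parameters, the terminal half-life follows directly from the β-phase rate constant (t(1/2)β = ln 2 / β); the computation below uses only the mean values quoted in the abstract.

```python
import math

# Mean parameters reported in the abstract
kab = 1.23    # absorption rate constant, 1/h
alpha = 0.95  # distribution-phase rate constant, 1/h
beta = 0.09   # terminal elimination rate constant, 1/h

t_half_beta = math.log(2) / beta
print(round(t_half_beta, 1))  # → 7.7, consistent with the reported 7.8 h
```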

Relevance:

30.00%

Publisher:

Abstract:

In the past decade, customer loyalty programs have become very popular, and almost every retail chain seems to have one. Through loyalty programs, companies are able to collect information about customer behavior and use it in business and marketing management to guide decision making and resource allocation. The benefits for loyalty program members are often monetary, which affects the profitability of the program. Not all loyalty program members are equally profitable: some purchase products at the recommended retail price and some buy only discounted products. If the company spends a similar amount of resources on all members, the customer margin is lower for customers who buy only discounted products. It is therefore vital for a company to measure the profitability of its members in order to calculate customer value. Several different customer value metrics can be used for this; in recent years, customer lifetime value in particular has received a lot of attention and is seen as superior to other customer value metrics. In this master's thesis, customer lifetime value is implemented for the case company's customer loyalty program. The data was collected from the loyalty program's database and covers the year 2012 on the Finnish market. The data was not complete enough to take full advantage of customer lifetime value; it is concluded that a new key performance indicator, customer margin, should be introduced in order to run the loyalty program's business profitably. Through the customer margin, the company would be able to compute customer lifetime value on a regular basis, enabling efficient resource allocation in marketing.
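The thesis does not spell out which CLV formula the case company would use; a common, simple infinite-horizon form (Gupta-Lehmann style) is CLV = m·r / (1 + d - r), where m is the annual customer margin, r the retention rate and d the discount rate. The sketch below uses invented numbers to illustrate the abstract's point that members buying only discounted products carry a much lower value.

```python
def customer_lifetime_value(margin: float, retention: float, discount: float) -> float:
    """Infinite-horizon CLV: margin * retention / (1 + discount - retention).
    A generic textbook formula, not necessarily the case company's model."""
    return margin * retention / (1.0 + discount - retention)

# Illustrative members: full-price buyer vs discount-only buyer (margin EUR/year)
print(round(customer_lifetime_value(120.0, retention=0.8, discount=0.1), 2))  # → 320.0
print(round(customer_lifetime_value(40.0, retention=0.8, discount=0.1), 2))   # → 106.67
```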

Relevance:

30.00%

Publisher:

Abstract:

Performance measurement produces information about the operation of business processes; on the basis of this information, the performance of the company can be monitored and improved. A balanced performance measurement system can monitor performance from several perspectives, so that business processes can be led according to the company strategy. A major part of a company's costs originates from purchased goods and services, which are the output of the buying process; this emphasises the importance of reliable performance measurement of the purchasing process. The study reviews the theory of balanced performance measurement and designs a framework for a performance measurement system for the purchasing process. The designed balanced performance measurement system is tested in a case company, paying attention to the available data and to other environmental enablers. The system is tested and improved during the test period, with particular attention to the definition and scaling of objectives; the development initiatives identified are carried out, especially in the scaling of indicators. Finally, the results of the study are evaluated, and conclusions and areas for further research are proposed.

Relevance:

30.00%

Publisher:

Abstract:

The aim of the present study was to measure full epidermal thickness, stratum corneum thickness, rete length, dermal papilla widening and suprapapillary epidermal thickness in psoriasis patients using a light microscope and computer-supported image analysis. The data obtained were analyzed in terms of patient age, type of psoriasis, total body surface area involvement, scalp and nail involvement, duration of psoriasis, and family history of the disease. The study was conducted on 64 patients and 57 controls whose skin biopsies were examined by light microscopy. The acquired microscopic images were transferred to a computer and measurements were made using image analysis. The skin biopsies, taken from different body areas, were examined for different parameters such as epidermal, corneal and suprapapillary epidermal thickness. The most prominent increase in thickness was detected in the palmar region. Corneal thickness was more pronounced in patients with scalp involvement than in patients without scalp involvement (t = -2.651, P = 0.008). The most prominent increase in rete length was observed in the knees (median: 491 µm, t = 10.117, P = 0.000). The difference in rete length between patients with a positive and a negative family history was significant (t = -3.334, P = 0.03), being 27% greater in psoriasis patients without a family history. The differences in dermal papilla distances among patients were very small. We conclude that microscope-supported thickness measurements provide objective results.
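The group comparisons above rest on two-sample t statistics. A minimal sketch of Welch's t statistic follows; the data are invented rete-length values for two illustrative patient groups, not the study's measurements.

```python
import math

def welch_t(a, b):
    """Welch's two-sample t statistic (unequal variances)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

# Toy rete lengths (um) for two illustrative groups of four patients each
t_stat = welch_t([520, 500, 480, 510], [400, 420, 390, 410])
print(round(t_stat, 2))  # → 9.11
```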

Relevance:

30.00%

Publisher:

Abstract:

The aim of this work is to invert the ionospheric electron density profile from riometer (Relative Ionospheric Opacity Meter) measurements. The new riometer-type instrument KAIRA (Kilpisjärvi Atmospheric Imaging Receiver Array) is used to measure the cosmic HF radio noise absorption taking place in the D-region ionosphere between 50 and 90 km. To invert the electron density profile, synthetic data are used to estimate the unknown parameter Neq with a spline-height method, which works by parameterising the electron density profile at different altitudes. Moreover, a smoothing-prior method is used to sample from the posterior distribution by truncating the prior covariance matrix. The smoothing-prior approach makes it easier to explore the posterior with the MCMC (Markov chain Monte Carlo) method.
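The smoothing-prior MCMC idea can be sketched independently of the KAIRA forward model: a Gaussian misfit term plus a second-difference smoothing prior, sampled with random-walk Metropolis. Everything below (grid, noise level, prior weight) is invented for illustration and stands in for, rather than reproduces, the riometer absorption model.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 20                                       # altitude grid points
profile_true = np.sin(np.linspace(0, np.pi, n))
y = profile_true + rng.normal(0, 0.1, n)     # noisy synthetic measurement

def log_post(x, lam=50.0, sigma=0.1):
    """Gaussian data misfit + second-difference smoothing prior (log density)."""
    misfit = -0.5 * np.sum((y - x) ** 2) / sigma ** 2
    smooth = -0.5 * lam * np.sum(np.diff(x, 2) ** 2)
    return misfit + smooth

x, lp, samples = y.copy(), log_post(y), []
for _ in range(5000):
    prop = x + rng.normal(0, 0.05, n)        # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:  # Metropolis accept/reject
        x, lp = prop, lp_prop
    samples.append(x.copy())

post_mean = np.mean(samples[2500:], axis=0)  # posterior mean profile
print(post_mean.shape)
```

The smoothing term penalises curvature in the sampled profile, which is the role the truncated prior covariance plays in the thesis's formulation.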

Relevance:

30.00%

Publisher:

Abstract:

In the present study, we modeled a reaching task as a two-link mechanism. The upper arm and forearm motion trajectories during vertical arm movements were estimated from angular accelerations measured with dual-axis accelerometers. A data set of reaching synergies from able-bodied individuals was used to train a radial basis function artificial neural network with upper arm/forearm tangential angular accelerations. For the specific movements it was trained on, the radial basis function network predicted forearm motion from new upper arm trajectories with high correlation (mean, 0.9149-0.941); for all other movements, prediction was low (range, 0.0316-0.8302). The results suggest that the proposed algorithm generalizes well over similar motions and subjects. Such networks may be used as a high-level controller that predicts forearm kinematics from voluntary movements of the upper arm. This methodology is suitable for restoring upper limb function in individuals with motor disabilities of the forearm, but not of the upper arm, and the developed control paradigm is applicable to upper-limb orthotic systems employing functional electrical stimulation. The proposed approach is of particular significance for humans with spinal cord injuries in a free-living environment. The measurement system with dual-axis accelerometers developed for this study may further be applied to the evaluation of movement during the course of rehabilitation: training-related changes in the synergies apparent from movement kinematics would characterize the extent and course of recovery. As such, a simple system using this methodology is of particular importance for stroke patients. The results underline the important issue of upper-limb coordination.
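The core of the approach, a radial basis function network mapping upper-arm kinematics to forearm kinematics, can be sketched with Gaussian basis features and a least-squares fit of the output weights. The coupling below (forearm acceleration as a fixed nonlinear function of upper-arm acceleration) is an invented stand-in for a recorded synergy, not the study's data.

```python
import numpy as np

def rbf_features(x, centers, width):
    """Gaussian radial basis features of a 1-D input signal."""
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * width ** 2))

# Invented synergy: forearm acceleration as a static nonlinear
# function of upper-arm acceleration over one movement cycle
t = np.linspace(0.0, 1.0, 200)
upper = np.sin(2.0 * np.pi * t)        # upper-arm tangential angular accel.
forearm = np.tanh(1.5 * upper)         # coupled forearm accel. (illustrative)

centers = np.linspace(-1.0, 1.0, 15)   # RBF centers across the input range
Phi = rbf_features(upper, centers, width=0.2)
w, *_ = np.linalg.lstsq(Phi, forearm, rcond=None)   # train output weights

pred = Phi @ w
corr = float(np.corrcoef(pred, forearm)[0, 1])
print(round(corr, 3))  # high correlation, as in the trained-movement case
```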

Relevance:

30.00%

Publisher:

Abstract:

Phosphatidylserine (PS) exposure occurs during the cell death program, and fluorescein-labeled lactadherin permits the detection of PS exposure earlier than annexin V in suspended cell lines. Adherent cell lines were studied for this apoptosis-associated phenomenon to determine whether PS probing methods are reliable, since specific membrane damage may occur during harvesting. Apoptosis was induced in the human tongue squamous carcinoma cell line (Tca8113) and the adenoid cystic carcinoma cell line (ACC-2) by arsenic trioxide. Cells were harvested with a modified procedure and labeled with lactadherin and/or annexin V. PS exposure was localized by confocal microscopy and apoptosis was quantified by flow cytometry. The detachment procedure without trypsinization did not induce cell damage. In competition binding experiments, phospholipid vesicles competed for more than 95 and 90% of lactadherin binding but only about 75 and 70% of annexin V binding to Tca8113 and ACC-2 cells, respectively. These data indicate that PS exposure occurs in three stages during the cell death program and that fluorescein-labeled lactadherin permits the detection of early PS exposure. A similar pattern of PS exposure was observed in the two malignant cell lines despite their different adherence, suggesting that this pattern is common in adherent cells. Both lactadherin and annexin V could be used in adherent Tca8113 and ACC-2 cell lines when an appropriate harvesting procedure was used. Lactadherin is more sensitive than annexin V for the detection of PS exposure, as the physical structure of PS in membrane blebs and on the condensed apoptotic cell surface may be more conducive to binding lactadherin than annexin V.

Relevance:

30.00%

Publisher:

Abstract:

The main goal of this master's thesis was to find out how to improve customer experience management and measurement. The study is a qualitative case study in which data were collected through interviews; in addition, some of the company's customer experience measurement methods were analyzed. The theoretical background is applied in practice through interviews with five representatives of the case company. In the case company, management has launched a customer experience focused program and given guidelines for customer experience improvement, and customer experience is measured with different methods, one example being asking customers about their willingness to recommend the company. In order to improve customer experience management, the case company should define what it means by customer experience and what kind of customer experience it aims to create. After an encounter, the customer should be left with feelings of satisfaction, positivity and trust. The company should focus on making its processes easy and fluent. Customer experience management should be improved through systematic planning and by combining and standardizing different measures; in addition, some channel-based measures should be used. The measurement conducted should be more customer focused, and the case company should form an understanding of which touch points are the most relevant to measure.

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents an overview of the Open Data research area and establishes the research evidence base through a Systematic Mapping Study (SMS). A total of 621 publications published between 2005 and 2014 were identified, of which 243 were selected in the review process. The thesis highlights the implications of the proliferation of Open Data principles in the emerging era of accessibility, reusability and sustainability of data transparency. The findings of the mapping study are described with quantitative and qualitative measures based on organizational affiliation, country, year of publication, research method, star rating and the units of analysis identified. The units of analysis were categorized into development lifecycle, linked open data, type of data, technical platforms, organizations, ontology and semantics, adoption and awareness, intermediaries, security and privacy, and supply of data, which are important components of quality open data applications and services. The results of the mapping study help organizations (such as academia, government and industry), researchers and software developers to understand the existing trends in open data, the latest research developments and the demand for future research. In addition, the proposed conceptual framework of Open Data research can be adopted and expanded to strengthen and improve current open data applications.