915 results for measurement error model


Relevance: 30.00%

Abstract:

This article deals with a contour error controller (CEC) applied to a high-speed biaxial table. The CEC works alongside the table's axis controllers, supporting them. In the early stages of the investigation, it was observed that its main weakness is imprecision when tracking non-linear contours at high speeds. The objectives of this work are to show that this problem is caused by the inaccuracy of the contour error mathematical model and to propose modifications to it. An additional term is included, yielding a more accurate value of the contour error and enabling the use of this type of motion controller at higher feedrates. The responses from simulated and experimental tests are compared with those of a conventional PID controller and of the non-corrected CEC in order to analyse the effectiveness of the proposed controller on the system. The main conclusions are that the proposed contour error mathematical model is simple, accurate and almost insensitive to the feedrate, and that a 20:1 reduction of the integral absolute contour error is achievable.
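As a minimal illustration of the kind of correction the abstract describes, the sketch below compares the common first-order contour error estimate for a biaxial table with a curvature-corrected estimate of the form found in the general contouring literature; the exact additional term proposed in the article is not reproduced here, and the tracking errors, tangent angle and local radius are hypothetical inputs.

```python
import numpy as np

def contour_error_linear(ex, ey, theta):
    """First-order estimate: component of the tracking error normal to the path tangent."""
    return -ex * np.sin(theta) + ey * np.cos(theta)

def contour_error_corrected(ex, ey, theta, radius):
    """Curvature-corrected estimate for a locally circular reference path.

    A second-order term proportional to the path curvature (1/radius) is added;
    it becomes significant when the tangential error grows, i.e. at high feedrates.
    The sign convention assumes the path curves toward the positive normal direction.
    """
    eps_n = -ex * np.sin(theta) + ey * np.cos(theta)   # normal component
    eps_t = ex * np.cos(theta) + ey * np.sin(theta)    # tangential component
    return eps_n + eps_t**2 / (2.0 * radius)

# Hypothetical tracking errors (mm), path tangent angle (rad) and local radius (mm)
ex, ey, theta, radius = 0.08, -0.05, np.deg2rad(30.0), 20.0
print(f"linear estimate:    {contour_error_linear(ex, ey, theta):.5f} mm")
print(f"corrected estimate: {contour_error_corrected(ex, ey, theta, radius):.5f} mm")
```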

Relevance: 30.00%

Abstract:

Longitudinal surveys are increasingly used to collect event history data on person-specific processes such as transitions between labour market states. Survey-based event history data pose a number of challenges for statistical analysis. These challenges include survey errors due to sampling, non-response, attrition and measurement. This study deals with non-response, attrition and measurement errors in event history data and the bias they cause in event history analysis. The study also discusses some choices faced by a researcher using longitudinal survey data for event history analysis and demonstrates their effects. These choices include whether a design-based or a model-based approach is taken, which subset of the data to use and, if a design-based approach is taken, which weights to use. The study takes advantage of the possibility of using combined longitudinal survey and register data. The Finnish subset of the European Community Household Panel (FI ECHP) survey for waves 1–5 was linked at the person level with longitudinal register data. Unemployment spells were used as the study variables of interest. Lastly, a simulation study was conducted in order to assess the statistical properties of the Inverse Probability of Censoring Weighting (IPCW) method in a survey data context. The study shows how combined longitudinal survey and register data can be used to analyse and compare the non-response and attrition processes, test the type of missingness mechanism and estimate the size of the bias due to non-response and attrition. In our empirical analysis, initial non-response turned out to be a more important source of bias than attrition. Reported unemployment spells were subject to seam effects, omissions and, to a lesser extent, overreporting. The use of proxy interviews tended to cause spell omissions. An often-ignored phenomenon, classification error in reported spell outcomes, was also found in the data. Neither the Missing At Random (MAR) assumption about the non-response and attrition mechanisms nor the classical assumptions about measurement errors turned out to be valid. Measurement errors in both spell durations and spell outcomes were found to cause bias in estimates from event history models. Low measurement accuracy affected the estimates of the baseline hazard most. The design-based estimates based on data from respondents to all waves of interest, weighted by the last-wave weights, displayed the largest bias. Using all the available data, including the spells of attriters up to the time of attrition, helped to reduce attrition bias. Lastly, the simulation study showed that the IPCW correction to design weights reduces the bias due to dependent censoring in design-based Kaplan-Meier and Cox proportional hazards model estimators. The study discusses the implications of the results for survey organisations collecting event history data, researchers using surveys for event history analysis, and researchers who develop methods to correct for non-sampling biases in event history data.
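A minimal sketch of the IPCW idea mentioned in the abstract, assuming simulated spell data with a single binary covariate that drives dropout: the censoring survival curve is estimated within covariate strata, inverse-probability-of-censoring weights are formed, and a weighted estimate of survival at a fixed horizon is compared with a naive Kaplan-Meier estimate. This is a generic illustration, not the estimators or weights used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def km(time, event):
    """Kaplan-Meier survival curve as arrays of (event times, S(t))."""
    order = np.argsort(time)
    time, event = time[order], event[order]
    ts, s, surv = np.unique(time[event == 1]), 1.0, []
    for t in ts:
        at_risk = np.sum(time >= t)
        d = np.sum((time == t) & (event == 1))
        s *= 1.0 - d / at_risk
        surv.append(s)
    return ts, np.array(surv)

def km_at(ts, surv, tau):
    """Evaluate a Kaplan-Meier step function at time tau."""
    i = np.searchsorted(ts, tau, side="right") - 1
    return 1.0 if i < 0 else surv[i]

# Spell durations and dropout both depend on a binary covariate x -> dependent censoring
n = 20000
x = rng.binomial(1, 0.5, n)
t_event = rng.exponential(np.where(x == 1, 6.0, 14.0))
t_censor = rng.exponential(np.where(x == 1, 5.0, 25.0))
time = np.minimum(t_event, t_censor)
event = (t_event <= t_censor).astype(int)

tau = 10.0
true_S = 0.5 * np.exp(-tau / 6.0) + 0.5 * np.exp(-tau / 14.0)

# Naive (unweighted) Kaplan-Meier ignores the dependent censoring
naive_S = km_at(*km(time, event), tau)

# IPCW: weight "still ongoing at tau" indicators by 1 / G(tau | x),
# where G is the censoring survival estimated within each covariate stratum
ipcw_terms = np.empty(n)
for g in (0, 1):
    m = x == g
    G_tau = km_at(*km(time[m], 1 - event[m]), tau)   # censoring treated as the "event"
    ipcw_terms[m] = (time[m] > tau) / max(G_tau, 1e-3)
ipcw_S = ipcw_terms.mean()

print(f"true S({tau})  = {true_S:.3f}")
print(f"naive KM       = {naive_S:.3f}")
print(f"IPCW estimate  = {ipcw_S:.3f}")
```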

Relevance: 30.00%

Abstract:

This study concerns performance measurement and management in a collaborative network. Collaboration between companies has increased in recent years due to the turbulent operating environment. The literature shows that there is a need for more comprehensive research on performance measurement in networks and on the use of measurement information in their management. This study examines the development process and uses of a performance measurement system supporting performance management in a collaborative network. There are two main research questions: how to design a performance measurement system for a collaborative network, and how to manage performance in a collaborative network. The work can be characterised as a qualitative single case study. The empirical data were collected in a Finnish collaborative network, which consists of a leading company and a reseller network. The work is based on five research articles applying various research methods. The research questions are examined at the network level and at the level of a single network partner. The study contributes to the earlier literature by producing a new and deeper understanding of network-level performance measurement and management. A three-step process model is presented to support the design of the performance measurement system. The process model has been tested in another collaborative network. The study also examines the factors affecting the process of designing the measurement system. The results show that a participatory development style, network culture and outside facilitators have a positive effect on the design process. The study increases understanding of how to manage performance in a collaborative network and of what kinds of uses of performance information can be identified in such a network. The results show that the performance measurement system is an applicable tool for managing the performance of a network. The results reveal that trust and openness increased during the utilisation of the performance measurement system, and operations became more transparent. The study also presents a management model that evaluates the maturity of performance management in a collaborative network. The model is a practical tool that helps to analyse the current stage of performance management in a collaborative network and to develop it further.

Relevance: 30.00%

Abstract:

R,S-sotalol, a β-blocker drug with class III antiarrhythmic properties, is prescribed to patients with ventricular, atrial and supraventricular arrhythmias. A simple and sensitive HPLC-fluorescence method is described for the quantification of the R,S-sotalol racemate in 500 µl of plasma. R,S-sotalol and its internal standard (atenolol) were eluted after 5.9 and 8.5 min, respectively, from a 4-µm C18 reversed-phase column using a mobile phase consisting of 80 mM KH2PO4, pH 4.6, and acetonitrile (95:5, v/v) at a flow rate of 0.5 ml/min, with detection at λex = 235 nm and λem = 310 nm. The method, validated on the basis of R,S-sotalol measurements in spiked blank plasma, showed a sensitivity of 20 ng/ml, linearity over 20-10,000 ng/ml, and intra- and interassay precisions of 2.9 and 4.8%, respectively. Plasma sotalol concentrations were determined by applying this method to five high-risk patients with atrial fibrillation admitted to the Emergency Service of the Medical School Hospital, who received sotalol, 160 mg po, as a loading dose. Blood samples were collected from a peripheral vein at 0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0, 8.0, 12.0 and 24.0 h after drug administration. A two-compartment open model was applied. The data obtained, expressed as means, were: Cmax = 1230 ng/ml, Tmax = 1.8 h, AUCt = 10645 ng h ml-1, Kab = 1.23 h-1, α = 0.95 h-1, β = 0.09 h-1, t1/2β = 7.8 h, ClT/F = 3.94 ml min-1 kg-1, and Vd/F = 2.53 l/kg. Good systemic availability and fast absorption were observed. Drug distribution and total body clearance were reduced to the same extent when patients and healthy volunteers were compared, and consequently the elimination half-life remained unchanged. Thus, the method described in the present study is useful for therapeutic drug monitoring, pharmacokinetic investigations and pharmacokinetic-pharmacodynamic studies of sotalol in patients with tachyarrhythmias.
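As a worked illustration of the kind of quantities reported above (the two-compartment fit itself is not reproduced), the sketch below computes the AUC by the trapezoidal rule, the terminal rate constant β from a log-linear fit of the last sampling points, the half-life t1/2β = ln 2 / β, and the apparent clearance ClT/F = Dose/AUC. The concentration values and body weight are hypothetical, not the patient data of the study.

```python
import numpy as np

# Hypothetical plasma concentration-time profile after a 160 mg oral dose
t = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0, 8.0, 12.0, 24.0])   # h
c = np.array([0.0, 520.0, 980.0, 1200.0, 1230.0, 1100.0,
              960.0, 720.0, 540.0, 330.0, 115.0])                         # ng/ml
dose_ng = 160e6        # 160 mg expressed in ng
weight_kg = 70.0       # assumed body weight

# AUC(0-24 h) by the linear trapezoidal rule (ng h / ml)
auc = np.sum(np.diff(t) * (c[1:] + c[:-1]) / 2.0)

# Terminal rate constant beta from a log-linear fit of the last four sampling points
beta = -np.polyfit(t[-4:], np.log(c[-4:]), 1)[0]   # 1/h
t_half = np.log(2.0) / beta                        # h

# Apparent total clearance and volume of distribution
cl_f = dose_ng / auc / weight_kg / 60.0            # ml min-1 kg-1
vd_f = cl_f * 60.0 / beta / 1000.0                 # l/kg, from Vd = Cl / beta

print(f"AUC(0-24) = {auc:.0f} ng h/ml")
print(f"beta = {beta:.3f} 1/h, t1/2(beta) = {t_half:.1f} h")
print(f"ClT/F = {cl_f:.2f} ml min-1 kg-1, Vd/F = {vd_f:.2f} l/kg")
```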

Relevance: 30.00%

Abstract:

Wind energy has attracted great expectations owing to the risks of global warming and of accidents at nuclear power plants. Nowadays, wind farms are often constructed in areas of complex terrain. A potential wind farm site must be thoroughly surveyed and its wind climatology analyzed before any hardware is installed. Therefore, modeling of Atmospheric Boundary Layer (ABL) flows over complex terrains containing, e.g., hills, forests and lakes is of great interest in wind energy applications, as it can help in locating and optimizing wind farms. Numerical modeling of wind flows using Computational Fluid Dynamics (CFD) has become a popular technique during the last few decades. Due to the inherent flow variability and large-scale unsteadiness typical of ABL flows in general, and especially over complex terrains, the flow can be difficult to predict accurately enough using the Reynolds-Averaged Navier-Stokes (RANS) equations. Large-Eddy Simulation (LES) resolves the largest and thus most important turbulent eddies and models only the small-scale motions, which are more universal than the large eddies and thus easier to model. Therefore, LES is expected to be more suitable for this kind of simulation, although it is computationally more expensive than the RANS approach. With the fast development of computers and open-source CFD software in recent years, the application of LES to atmospheric flows is becoming increasingly common. The aim of this work is to simulate atmospheric flows over realistic and complex terrains by means of LES. Evaluation of potential inland wind park locations is the main application for these simulations. Development of the LES methodology to simulate atmospheric flows over realistic terrains is reported in the thesis. The work also aims at validating the LES methodology at real scale. In the thesis, LES is carried out for flow problems ranging from basic channel flows to real atmospheric flows over one of the most recent real-life complex-terrain test cases, the Bolund hill. All the simulations reported in the thesis are carried out using a new OpenFOAM®-based LES solver. The solver uses a fourth-order time-accurate Runge-Kutta scheme and a fractional-step method. Moreover, the development of the LES methodology pays special attention to two boundary conditions: the upstream (inflow) and wall boundary conditions. The upstream boundary condition is generated using the so-called recycling technique, in which the instantaneous flow properties are sampled on a plane downstream of the inlet and mapped back to the inlet at each time step. This technique develops the upstream boundary-layer flow together with the inflow turbulence without using any precursor simulation and thus within a single computational domain. The roughness of the terrain surface is modeled by implementing a new wall function into OpenFOAM® during the thesis work. Both the recycling method and the newly implemented wall function are validated for channel flows at relatively high Reynolds numbers before applying them to atmospheric flow applications. After validating the LES model on simple flows, simulations are carried out for atmospheric boundary-layer flows over two types of hills: first, two-dimensional wind-tunnel hill profiles, and second, the Bolund hill located in Roskilde Fjord, Denmark. For the two-dimensional wind-tunnel hills, the study focuses on the overall flow behavior as a function of the hill slope. Moreover, the simulations are repeated using another wall function suitable for smooth surfaces, which already existed in OpenFOAM®, in order to study the sensitivity of the flow to surface roughness in ABL flows. The simulated results obtained with the two wall functions are compared against the wind-tunnel measurements. It is shown that LES using the implemented wall function produces overall satisfactory results for the turbulent flow over the two-dimensional hills. The prediction of the flow separation and reattachment length for the steeper hill is closer to the measurements than in other numerical studies reported in the past for the same hill geometry. The field measurement campaign performed over the Bolund hill provides the most recent field-experiment dataset for the mean flow and the turbulence properties. A number of research groups have simulated the wind flow over the Bolund hill. Due to the challenging features of the hill, such as its almost vertical slope, it is considered an ideal experimental test case for validating micro-scale CFD models for wind energy applications. In this work, the simulated results obtained for two wind directions are compared against the field measurements. It is shown that the present LES can reproduce the complex turbulent wind flow structures over a complicated terrain such as the Bolund hill. In particular, the present LES results show the best prediction of the turbulent kinetic energy, with an average error of 24.1%, which is 43% smaller than for any other model results reported in the past for the Bolund case. Finally, the validated LES methodology is demonstrated by simulating the wind flow over the existing Muukko wind farm located in south-eastern Finland. The simulation is carried out for only one wind direction, and the results for the instantaneous and time-averaged wind speeds are briefly reported. The demonstration case is followed by a discussion of the practical aspects of LES for wind resource assessment over a realistic inland wind farm.
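The thesis does not spell out its wall function here, but the sketch below illustrates the generic atmospheric rough-wall log law on which such wall functions are typically based: given the resolved velocity at the first off-wall cell centre and an aerodynamic roughness length, it returns the friction velocity and the wall shear stress used as the wall boundary condition. The roughness length, cell height and velocity are hypothetical, and this is not the specific implementation added to OpenFOAM® in the work.

```python
import numpy as np

KAPPA = 0.41   # von Karman constant

def friction_velocity(u_p, z_p, z0):
    """Friction velocity u* from the rough-wall log law  u(z) = (u*/kappa) ln(z/z0)."""
    return KAPPA * u_p / np.log(z_p / z0)

def wall_shear_stress(u_p, z_p, z0, rho=1.225):
    """Wall shear stress tau_w = rho * u*^2 applied at the rough surface."""
    u_star = friction_velocity(u_p, z_p, z0)
    return rho * u_star**2

# Hypothetical near-wall values: first cell centre at 2 m, resolved velocity 6 m/s,
# roughness length 0.03 m (open grassland)
u_p, z_p, z0 = 6.0, 2.0, 0.03
print(f"u*    = {friction_velocity(u_p, z_p, z0):.3f} m/s")
print(f"tau_w = {wall_shear_stress(u_p, z_p, z0):.3f} Pa")
```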

Relevance: 30.00%

Abstract:

This research concerns different statistical methods that help to increase the demand forecasting accuracy of company X's forecasting model. The current forecasting process was analyzed in detail; as a result, a graphical scheme of its logical algorithm was developed. Based on the analysis of the algorithm and of the forecasting errors, the potential directions for future improvements of the model's accuracy were gathered into a complete list. Three improvement directions were chosen for further practical research; on their basis, three test models were created and verified. The novelty of this work lies in the methodological approach of the original analysis of the model, which identified its critical points, as well as in the uniqueness of the developed test models. The results of the study formed the basis of a grant from the Government of St. Petersburg.

Relevance: 30.00%

Abstract:

Several methods have been described to measure intraocular pressure (IOP) in clinical and research settings. However, the measurement of time-varying IOP with high accuracy, particularly in situations that alter corneal properties, has not been reported until now. The present report describes a computerized system capable of recording the transitory variability of IOP, which is sufficiently sensitive to reliably measure ocular pulse peak-to-peak values. We also describe its characteristics and discuss its applicability to research and clinical studies. The device consists of a pressure transducer, a signal conditioning unit and an analog-to-digital converter coupled to a video acquisition board. A modified Cairns trabeculectomy was performed in 9 Oryctolagus cuniculus rabbits to obtain changes in the IOP decay parameters and to evaluate the utility and sensitivity of the recording system. The device was effective for the study of kinetic parameters of IOP, such as the decay pattern and the ocular pulse waves due to the cardiac and respiratory rhythms. In addition, there was a significant increase in the derivative of the IOP versus time curve when pre- and post-trabeculectomy recordings were compared. The present procedure excludes the influence of corneal thickness and of errors related to individual operator ability. Clinical complications due to saline infusion and pressure overload were not observed during biomicroscopic evaluation. Among the disadvantages of the procedure are the requirement for anesthesia and its restriction to acute recordings rather than chronic protocols. Finally, the method described may provide a reliable alternative for the study of dynamic alterations of ocular pressure in man and may facilitate the investigation of the pathogenesis of glaucoma.
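As an illustration of the kinetic parameters mentioned above, the sketch below takes a synthetic digitized IOP trace (an exponential decay with a superimposed cardiac pulse), fits the decay time constant by a log-linear regression on the per-second baseline and reports the mean peak-to-peak pulse amplitude; the sampling rate, decay constant and pulse amplitude are invented for the example and do not come from the study.

```python
import numpy as np

fs = 100.0                         # sampling rate, Hz (hypothetical)
t = np.arange(0, 60.0, 1.0 / fs)   # 60 s recording

# Synthetic trace: exponential IOP decay toward 12 mmHg plus a ~2 mmHg
# peak-to-peak cardiac ocular pulse at 3 Hz (rabbit-like heart rate)
iop = 12.0 + 18.0 * np.exp(-t / 25.0) + 1.0 * np.sin(2 * np.pi * 3.0 * t)

# Decay constant: log-linear fit of the baseline (pulse averaged out per second);
# the known asymptote of the synthetic trace (12 mmHg) is subtracted first
baseline = iop.reshape(-1, int(fs)).mean(axis=1)        # one value per second
t_sec = np.arange(baseline.size)
slope, _ = np.polyfit(t_sec, np.log(baseline - 12.0), 1)
tau = -1.0 / slope                                      # time constant, s

# Pulse amplitude: peak-to-peak value within each one-second window
windows = iop.reshape(-1, int(fs))
pulse_pp = (windows.max(axis=1) - windows.min(axis=1)).mean()

print(f"decay time constant ~ {tau:.1f} s")
print(f"mean pulse peak-to-peak ~ {pulse_pp:.2f} mmHg")
```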

Relevance: 30.00%

Abstract:

The Caco-2 cell line has been used as an in vitro model to predict the permeability of the human intestinal barrier. The predictive potential of the assay relies on an appropriate in-house validation of the method. The objective of the present study was to develop a single HPLC-UV method for the identification and quantitation of marker drugs and to determine the suitability of the Caco-2 cell permeability assay. A simple chromatographic method was developed for the simultaneous determination of both passively transported drugs (propranolol, carbamazepine, acyclovir, and hydrochlorothiazide) and actively transported drugs (vinblastine and verapamil). Separation was achieved on a C18 column with step-gradient elution (acetonitrile and an aqueous solution of ammonium acetate, pH 3.0) at a flow rate of 1.0 mL/min and UV detection at 275 nm over a total run time of 35 min. The method was validated and found to be specific, linear, precise, and accurate. This chromatographic system can readily be used on a routine basis, and its use can be extended to other permeability models. The results obtained in the Caco-2 bi-directional transport experiments confirmed the validity of the assay, given that high and low permeability profiles were identified and P-glycoprotein functionality was established.
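The bi-directional transport experiments mentioned above are normally summarized by the apparent permeability coefficient Papp = (dQ/dt) / (A · C0) and by the efflux ratio Papp(B→A)/Papp(A→B), which indicates P-glycoprotein activity. The sketch below computes both from hypothetical cumulative-transport data; the surface area, donor concentration and measured amounts are invented for the example.

```python
import numpy as np

def papp(t_s, q_nmol, area_cm2, c0_nmol_per_ml):
    """Apparent permeability (cm/s): slope of cumulative amount vs time / (A * C0)."""
    dq_dt = np.polyfit(t_s, q_nmol, 1)[0]          # nmol/s
    return dq_dt / (area_cm2 * c0_nmol_per_ml)     # (nmol/s) / (cm2 * nmol/cm3) = cm/s

# Hypothetical bi-directional transport of a P-gp substrate across a Caco-2 monolayer
t = np.array([0.0, 900.0, 1800.0, 2700.0, 3600.0])  # s
q_ab = np.array([0.0, 0.9, 1.8, 2.6, 3.5])          # nmol, apical -> basolateral
q_ba = np.array([0.0, 4.1, 8.0, 12.2, 16.1])        # nmol, basolateral -> apical
area, c0 = 1.12, 100.0                              # cm2, nmol/ml (= nmol/cm3)

p_ab = papp(t, q_ab, area, c0)
p_ba = papp(t, q_ba, area, c0)
print(f"Papp A->B = {p_ab:.2e} cm/s")
print(f"Papp B->A = {p_ba:.2e} cm/s")
print(f"efflux ratio = {p_ba / p_ab:.1f}  (> 2 suggests active efflux, e.g. P-gp)")
```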

Relevance: 30.00%

Abstract:

Fluid handling systems such as pump and fan systems are found to have a significant potential for energy efficiency improvements. To realize this energy-saving potential, easily implementable methods are needed to monitor the system output, because information is needed both to identify inefficient operation of the fluid handling system and to control the output of the pumping system according to process needs. Model-based pump or fan monitoring methods implemented in variable-speed drives have proven able to give information on the system output without additional metering; however, the current model-based methods may not be usable or sufficiently accurate over the whole operating range of the fluid handling device. To apply model-based system monitoring to a wider selection of systems and to improve the accuracy of the monitoring, this paper proposes a new method for pump and fan output monitoring with variable-speed drives. The method uses a combination of already known operating point estimation methods. Laboratory measurements are used to verify the benefits and applicability of the improved estimation method, and the new method is compared with five previously introduced model-based estimation methods. According to the laboratory measurements, the new estimation method is the most accurate and reliable of the model-based estimation methods.
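One widely used family of the known estimation methods referred to above infers the operating point from the drive's own rotational-speed and shaft-power estimates, using the pump's published curves scaled with the affinity laws. The sketch below shows that idea; the nominal curve points, speed and power value are hypothetical, and the combined method proposed in the paper is not reproduced here.

```python
import numpy as np

# Hypothetical pump characteristic curves at the nominal speed n0 (from a datasheet)
n0 = 1450.0                                   # rpm
q_nom = np.array([0., 10., 20., 30., 40.])    # flow rate, l/s
p_nom = np.array([2.2, 3.0, 3.9, 4.6, 5.0])   # shaft power, kW
h_nom = np.array([32., 31., 28., 23., 16.])   # head, m

def operating_point(n, p_shaft):
    """Estimate (flow, head) from speed and shaft power via the QP-curve method.

    Affinity laws: Q ~ n, H ~ n^2, P ~ n^3, so the nominal curves are first
    scaled to the current speed and the flow is then interpolated from the power.
    """
    k = n / n0
    q_scaled = k * q_nom
    p_scaled = k**3 * p_nom
    h_scaled = k**2 * h_nom
    q = np.interp(p_shaft, p_scaled, q_scaled)   # P(Q) assumed monotonic here
    h = np.interp(q, q_scaled, h_scaled)
    return q, h

# Drive-estimated quantities (hypothetical): 1200 rpm, 2.6 kW shaft power
q_est, h_est = operating_point(1200.0, 2.6)
print(f"estimated flow ~ {q_est:.1f} l/s, head ~ {h_est:.1f} m")
```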

Relevance: 30.00%

Abstract:

In this study, the effects of hot-air drying conditions on the color, water holding capacity, and total phenolic content of dried apple were investigated using an artificial neural network as an intelligent modeling system. A genetic algorithm was then used to optimize the drying conditions. Apples were dried at different temperatures (40, 60, and 80 °C) and at three air flow rates (0.5, 1, and 1.5 m/s). Using leave-one-out cross-validation, simulated and experimental data were in good agreement, with an error below 2.4%. Optimal values of the quality index were found at 62.9 °C and 1.0 m/s using the genetic algorithm.
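A minimal sketch of the modeling step described above, assuming scikit-learn is available: a small feed-forward network maps drying temperature and air speed to a quality attribute and is evaluated with leave-one-out cross-validation. The data are randomly generated placeholders, and the network size, scaling and error metric are choices made for the example rather than those of the study; the genetic-algorithm optimization step is not shown.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(1)

# Placeholder dataset: temperature (deg C), air speed (m/s) -> e.g. total phenolic content
X = np.array([[T, v] for T in (40, 60, 80) for v in (0.5, 1.0, 1.5) for _ in range(3)],
             dtype=float)
y = (100 - 0.5 * (X[:, 0] - 60) ** 2 / 40 - 10 * (X[:, 1] - 1.0) ** 2
     + rng.normal(0, 1.0, len(X)))

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0),
)

# Leave-one-out cross-validation: each sample predicted by a model trained on the rest
y_pred = cross_val_predict(model, X, y, cv=LeaveOneOut())
mape = np.mean(np.abs((y - y_pred) / y)) * 100
print(f"LOOCV mean absolute percentage error: {mape:.1f} %")
```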

Relevance: 30.00%

Abstract:

This thesis introduces heat demand forecasting models generated using data mining algorithms. The forecast spans one full day and can be used to regulate the heat consumption of buildings. For training the data mining models, two years of heat consumption data from a case building and weather measurement data from the Finnish Meteorological Institute are used. The thesis utilizes Microsoft SQL Server Analysis Services data mining tools to generate the data mining models and the CRISP-DM process framework to carry out the research. The results show that the built models can predict heat demand at best with mean absolute percentage errors of 3.8% for the 24-h profile and 5.9% for the full day. A deployment model for integrating the generated data mining models into an existing building energy management system is also discussed.
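The accuracy figures above are mean absolute percentage errors (MAPE); for reference, the sketch below shows how this metric is computed for an hourly 24-hour forecast against the metered consumption. The demand values are invented placeholders, not the case-building data.

```python
import numpy as np

def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return np.mean(np.abs((actual - forecast) / actual)) * 100.0

# Placeholder 24-hour heat demand profile (kW) and a forecast for it
hours = np.arange(24)
actual = 50.0 + 20.0 * np.sin(2 * np.pi * (hours - 6) / 24)
forecast = actual * (1 + np.random.default_rng(2).normal(0.0, 0.04, 24))

print(f"24-h profile MAPE: {mape(actual, forecast):.1f} %")
```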

Relevance: 30.00%

Abstract:

This research examines the concept of social entrepreneurship, a fairly new business model that has become increasingly popular in recent years. Growing environmental awareness and concrete examples of the impact created by social entrepreneurship have encouraged entrepreneurs to address social problems. Business activities are used to try to redress society's failures. The purpose of doing business is no longer necessarily just generating profits; the business is run in order to create social change with the profit gained from operations. Successful social entrepreneurship requires a specific character, constant creativity and a strong desire to create social change. It requires constant balancing between two major objectives: both financial and non-financial issues need to be considered, but not at the expense of one another. While pursuing the social purpose, the business must be run in highly competitive markets. Therefore, both factors need to be integrated equally into the organization, as they are complementary, not mutually exclusive. Business does not exist without society, and society cannot move forward without business. Social entrepreneurship, its value creation, measurement tools and reporting practices are discussed in this research. An extensive theoretical basis is covered and used to support the findings from the studied case enterprises. Most attention is given to the concept of Social Return on Investment (SROI), and the case enterprises are analyzed through the SROI process. Social enterprises are mostly small or medium sized, which naturally sets some limitations on implementing measurement tools. The question of resources requires the most attention and therefore sets the biggest constraints. However, the size of the company does not determine everything; the nature of the business and the type of social purpose always need to be considered. The mission may be so concrete and transparent that any kind of measurement would be superfluous. Implementing measurement tools may be of great benefit or a huge financial burden. Thus, the very first thing to consider carefully is whether measuring value creation is needed at all.
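For reference, the Social Return on Investment ratio discussed above is conventionally expressed as the discounted value of the monetised social outcomes divided by the value of the inputs invested. The sketch below states this standard form with placeholder figures; it is a general convention, not the specific calculation used for the case enterprises.

```python
def sroi(social_values, discount_rate, investment):
    """SROI ratio: discounted value of monetised social outcomes per unit of investment."""
    npv = sum(v / (1 + discount_rate) ** t for t, v in enumerate(social_values, start=1))
    return npv / investment

# Hypothetical: 50 000 EUR invested, 30 000 EUR of monetised social value per year for 3 years
print(f"SROI = {sroi([30_000, 30_000, 30_000], 0.035, 50_000):.2f}")
```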

Relevance: 30.00%

Abstract:

Time series analysis has gone through several developmental stages before arriving at the current modern approaches. These can be broadly categorized as the classical and the modern approach to time series analysis. In the classical approach, the basic aim of the analysis is to describe the major behaviour of the series without necessarily dealing with its underlying structure. By contrast, the modern approach strives to summarize the behaviour of the series through its underlying structure, so that the series can be represented explicitly. In other words, this approach studies the series structurally. The components that make up the observations, such as the trend, seasonality, regression and disturbance terms, are modelled explicitly before everything is put together into a single state space model, which gives a natural interpretation of the series. The aim of this diploma work is to apply in practice the modern, state space approach to time series analysis, more specifically the dynamic linear model, to trend analysis of Ionosonde measurement data. The data are a time series of the peak height of the F2 layer, denoted hmF2, which is the height of maximum electron density. In addition, the work investigates the connection between solar activity and the peak height of the F2 layer. Based on the results, the peak height of the F2 layer decreased during the observation period and shows a nonlinear positive correlation with solar activity.
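A minimal sketch of the kind of dynamic linear model used for such trend analysis, assuming the statsmodels package is available: a local linear trend state space model is fitted to a synthetic hmF2-like series and the smoothed level and slope components are extracted. The series here is randomly generated, and the specification (local linear trend only, no solar-activity regressor or seasonal term) is a simplification of what a full analysis would use.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)

# Synthetic monthly hmF2-like series (km): slow downward trend plus noise
n = 240
t = np.arange(n)
y = 300 - 0.03 * t + rng.normal(0, 5.0, n)

# Local linear trend model: observation = level + noise,
# level_t = level_{t-1} + slope_{t-1} + w_t,  slope_t = slope_{t-1} + v_t
model = sm.tsa.UnobservedComponents(y, level="local linear trend")
result = model.fit(disp=False)

level = result.level.smoothed    # smoothed trend (level) component
slope = result.trend.smoothed    # smoothed slope component
print(f"estimated trend at start / end: {level[0]:.1f} / {level[-1]:.1f} km")
print(f"mean estimated slope: {slope.mean():.3f} km per time step")
```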

Relevance: 30.00%

Abstract:

A new approach to the determination of the thermal parameters of high-power batteries is introduced here. Local heat flux measurement with a gradient heat flux sensor (GHFS) allows determination of the cell thermal parameters at different points on the cell surface. The suggested methodology is non-destructive, as it does not require deep discharge of the cell or the application of any charge/discharge cycles during the measurement of its thermal parameters. The complete procedure is demonstrated on a high-power Li-ion pouch cell and verified on a sample with well-known thermal parameters. A comparison of the experimental results with conventional thermal characterization methods shows an acceptably low error. The dependence of the cell thermal parameters on the state of charge (SoC) and on the measurement point on the surface was studied with the proposed measurement approach.
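As a simplified illustration of what "thermal parameters" means here, the sketch below estimates a lumped thermal resistance and heat capacity from a step-response experiment in which a constant heat flow is applied and the temperature rise is recorded: R follows from the steady-state rise and C from the fitted time constant. The data are synthetic, and this first-order lumped model is a generic stand-in rather than the GHFS-based procedure of the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic step-response data: constant heat flow into the cell, first-order rise
P = 2.0                       # applied heat flow, W (hypothetical)
R_true, C_true = 3.0, 80.0    # K/W, J/K
t = np.arange(0, 1800, 10.0)  # s
dT = P * R_true * (1 - np.exp(-t / (R_true * C_true))) + rng.normal(0, 0.05, t.size)

# Thermal resistance from the steady-state temperature rise
R_est = dT[-10:].mean() / P

# Time constant from a log-linear fit of the early exponential approach
mask = t < 600
z = np.log(1.0 - dT[mask] / (P * R_est))
tau_est = -1.0 / np.polyfit(t[mask], z, 1)[0]
C_est = tau_est / R_est

print(f"R ~ {R_est:.2f} K/W (true {R_true}), C ~ {C_est:.0f} J/K (true {C_true})")
```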

Relevance: 30.00%

Abstract:

The paper builds on a review of some expected, and some quite surprising, results regarding country estimates of genuine saving for the year 2000, a sustainability indicator developed by a World Bank research team. We examine this indicator, founded on neoclassical welfare theory, and discuss one of its major problems. Theoretical developments from ecological economics are then considered, together with insights from Georgescu-Roegen's approach to the production process, in search of an alternative approach. A model with potentially fruitful contributions in this direction is reviewed; it points out the course that efforts could take to enable sustainability evaluations based on a more realistic set of interrelated monetary and biophysical indicators.
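For context, the World Bank's genuine saving indicator (also called adjusted net saving) is conventionally built from national accounts aggregates roughly as sketched below; the exact depletion and damage terms vary between versions of the indicator, and the figures used here are placeholders, not the year-2000 country estimates discussed in the paper.

```python
def genuine_saving(gross_national_saving, fixed_capital_consumption,
                   education_expenditure, natural_resource_depletion,
                   pollution_damage):
    """Genuine (adjusted net) saving; all terms in the same units, e.g. percent of GNI."""
    return (gross_national_saving
            - fixed_capital_consumption
            + education_expenditure
            - natural_resource_depletion
            - pollution_damage)

# Placeholder values expressed as percent of GNI
gs = genuine_saving(gross_national_saving=22.0, fixed_capital_consumption=12.0,
                    education_expenditure=4.5, natural_resource_depletion=6.0,
                    pollution_damage=1.5)
print(f"genuine saving = {gs:.1f} % of GNI")
```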