904 results for Measurement-based
Abstract:
Indicators are widely used by organizations as a way of evaluating, measuring and classifying organizational performance. As part of performance evaluation systems, indicators are often shared or compared across internal sectors or with other organizations. However, indicators can be vague and imprecise, and can also lack semantics, making comparison with other indicators difficult. This paper therefore presents a knowledge model, based on an ontology, that can represent indicators semantically and generically, dealing with their imprecision and vagueness and thus facilitating comparison. Semantic technologies are shown to be suitable for this solution, as they can represent the complex data involved in comparing indicators.
Abstract:
Incorporating the Material Balance Principle (MBP) into industrial and agricultural performance measurement systems with pollutant factors has been on the rise in recent years. Many conventional methods of performance measurement have proven incompatible with material flow conditions. This study addresses the issue of eco-efficiency measurement adjusted for pollution, taking into account material flow conditions and the MBP requirements, in order to provide ‘real’ measures of performance that can serve as guides when making policies. We develop a new approach that integrates a slacks-based measure to enhance the Malmquist-Luenberger Index with a material balance condition reflecting the conservation of matter. This model is compared with a similar model that incorporates the MBP using the trade-off approach, to measure the productivity and eco-efficiency trends of power plants. The results reveal similar findings for both models, substantiating the robustness and applicability of the model proposed in this paper.
Abstract:
In this study, we developed a DEA-based performance measurement methodology that is consistent with performance assessment frameworks such as the Balanced Scorecard. The methodology developed in this paper takes into account the direct or inverse relationships that may exist among the dimensions of performance to construct appropriate production frontiers. The production frontiers we obtained are deemed appropriate as they consist solely of firms with desirable levels for all dimensions of performance. These levels should be at least equal to the critical values set by decision makers. The properties and advantages of our methodology against competing methodologies are presented through an application to a real-world case study from retail firms operating in the US. A comparative analysis between the new methodology and existing methodologies explains the failure of the existing approaches to define appropriate production frontiers when directly or inversely related dimensions of performance are present and to express the interrelationships between the dimensions of performance.
Abstract:
Conventional tools for the measurement of laser spectra (e.g. optical spectrum analysers) capture data averaged over a considerable time period. However, the generation spectrum of many laser types may involve spectral dynamics whose relatively fast time scale is determined by their cavity round-trip period, calling for instrumentation featuring both high temporal and spectral resolution. Such real-time spectral characterisation becomes particularly challenging if the laser pulses are long, or if they have continuous or quasi-continuous wave radiation components. Here we combine optical heterodyning with a technique of spatio-temporal intensity measurements that allows the characterisation of such complex sources. Fast, round-trip-resolved spectral dynamics of cavity-based systems are obtained in real time, with a temporal resolution of one cavity round trip and a frequency resolution defined by its inverse (85 ns and 24 MHz respectively are demonstrated). We also show how, under certain conditions for quasi-continuous wave sources, the spectral resolution can be further increased by a factor of 100 by direct extraction of phase information from the heterodyned dynamics or by using double time scales within the spectrogram approach.
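The heterodyne principle behind this abstract can be sketched numerically: mixing the source with a local oscillator produces a beat note in the recorded intensity, and an FFT of the record resolves it with a frequency resolution equal to the inverse of the record length. The sample rate, record length and beat frequency below are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

# Hypothetical digitizer settings (assumptions, not from the paper).
fs = 1.0e9                          # sample rate, Hz
t = np.arange(0, 10e-6, 1.0 / fs)   # 10 us record -> 100 kHz resolution
f_beat = 24.0e6                     # beat note between source and local oscillator, Hz

# Heterodyne intensity: a DC term plus a cosine at the beat frequency.
signal = 1.0 + 0.5 * np.cos(2 * np.pi * f_beat * t)

# FFT of the mean-subtracted record; the peak bin gives the beat frequency.
spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(len(t), d=1.0 / fs)
f_recovered = freqs[np.argmax(spectrum)]   # ≈ 24 MHz
```

The frequency resolution here is 1 / (10 µs) = 100 kHz; a longer record, or the phase-extraction trick mentioned in the abstract, would refine it further.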
Abstract:
The paper examines the performance measurement and performance management practice of Hungarian companies, using data from the "Versenyben a világgal" ("In Competition with the World") Competitiveness research program. Our goal is to examine the background of decision support: to characterise companies' performance measurement practice and evaluate its consistency, also tracking how the tendencies observed in our earlier research have developed. The information sources, performance indicators and analytical tools regarded as important/useful, or used regularly, by company managers were evaluated using the analytical framework developed for our earlier research (orientation, balance, consistency, supporting role). In assessing the information system's role in supporting different activities, we also compared the opinions of managers responsible for different areas and examined the characteristics of different company groups.
Abstract:
The results that can be achieved by applying lean management to healthcare services are becoming increasingly clear, and the sector is consequently seeing dynamic growth in lean applications. Research warns, however, that the results obtained with lean management can only be sustained if the use of the tools is accompanied by a transformation of the organisational culture. Tracking cultural change requires its continuous evaluation, yet, to the authors' knowledge, no instrument has so far been developed for lean-specific measurement of organisational culture. In this paper, after reviewing the relevant literature, the authors therefore develop a lean culture questionnaire, and then describe its testing and the results. They conclude that, on the basis of this first test, the questionnaire presented here needs further development.
Abstract:
The paper examines the performance measurement and performance management practice of Hungarian companies, using data from the 2009 survey of the "Versenyben a világgal" ("In Competition with the World") Competitiveness research program. Our goal is to examine the background of decision support: to characterise companies' performance measurement practice and evaluate its consistency, also tracking how the tendencies observed in our earlier (similar 1996, 1999 and 2004) surveys have developed. Companies' performance measurement practice, including the information sources, performance indicators and analytical tools regarded as important/useful, or used regularly, by company managers, was evaluated using the analytical framework developed for our earlier research (orientation, balance, consistency, supporting role). In assessing the information system's role in supporting different activities, we also compared the opinions of managers responsible for different areas and examined the effects of various company characteristics (company size, type of ownership, main activity, etc.).
Abstract:
A company's competitiveness depends significantly on the satisfaction of its consumers, one determining element of which is the relationship between expected and perceived quality. As the internet has become one of today's defining channels, quality expectations have been formulated for it as well, giving an important role to the definition of online service quality (e-sq) and, connected to it, the measurement of online customer satisfaction. The first part of this study provides a literature review of theories concerning consumers' perception and evaluation of online service quality, which form the basis of online customer satisfaction measurement. The different measurement methods are then presented, with particular attention to the widely discussed E-S-QUAL and E-RecS-QUAL scales. The review focuses on theories that apply to websites that also offer online purchasing. The remainder of the article contains two empirical studies. The first discusses the state of satisfaction measurement at Hungarian companies. The second, using the E-S-QUAL and E-RecS-QUAL scales, analyses in detail the most important relationships between the dimensions of e-service quality and customer satisfaction, and briefly examines the reliability and validity of the scales applied. The main purpose of the research is to reveal and present relationships that are also relevant for practitioners.
Abstract:
The starting point of the article is the fact that the fundamental task of accounting, and within it financial reporting, is to provide information useful for decision making to the stakeholders of a business. Data produced by mapping economic phenomena into accounting terms can only be used as information if the users of financial statements understand the underlying assumptions of that mapping. The first part of the article, setting out from the general definition of measurement, introduces the concepts of measurement and valuation as applied in accounting, describing their interconnections and basic characteristics. Following this, based on the international (IFRS) and Hungarian regulations, the paper sketches the valuation framework currently used in financial reporting. The third part of the article analyses the theoretical background of the effective regulation, covering the connection between accounting measurement and financial performance (income), and finally presents and evaluates the main elements of criticism concerning measurement in accounting.
Abstract:
Clusters are aggregations of atoms or molecules, generally intermediate in size between individual atoms and aggregates large enough to be called bulk matter. Clusters can also be called nanoparticles, because their size is on the order of nanometers or tens of nanometers. A new field called nanostructured materials has begun to take shape which takes advantage of these atom clusters. The ultra-small size of the building blocks leads to dramatically different properties, and it is anticipated that such atomically engineered materials can be tailored to perform as no previous material could. The idea of the ionized cluster beam (ICB) thin film deposition technique was first proposed by Takagi in 1972. It was based upon using a supersonic jet source to produce, ionize and accelerate beams of atomic clusters onto substrates in a vacuum environment. Conditions for the formation of cluster beams suitable for thin film deposition have only recently been established, following twenty years of effort. Zinc clusters over 1,000 atoms in average size have been synthesized both in our lab and in that of Gspann. More recently, other methods of synthesizing clusters and nanoparticles, using different types of cluster sources, have come under development. In this work, we studied different aspects of nanoparticle beams. The work includes refinement of a model of the cluster formation mechanism, development of a new real-time, in situ cluster size measurement method, and study of the use of ICB in the fabrication of semiconductor devices. The formation process of the vaporized-metal cluster beam was simulated and investigated using classical nucleation theory and one-dimensional gas flow equations. Zinc cluster sizes predicted at the nozzle exit are in good quantitative agreement with experimental results in our laboratory. A novel in situ real-time mass, energy and velocity measurement apparatus has been designed, built and tested.
This small time-of-flight mass spectrometer is suitable for use in our cluster deposition systems and does not suffer from problems affecting other methods of cluster size measurement, such as the requirement for specialized ionizing lasers, inductive electrical or electromagnetic coupling, dependence on the assumption of homogeneous nucleation, limits on the measurable size, and the lack of real-time capability. Measured ion energies using the electrostatic energy analyzer are in good accordance with values obtained from computer simulation. The velocity (v) is measured by pulsing the cluster beam and measuring the delay between the pulse and the analyzer output current. The mass of a particle is then calculated from m = 2E/v². The error in the measured value of the background gas mass is on the order of 28% of the mass of one N₂ molecule, which is negligible for the measurement of large clusters. This resolution in cluster size measurement is very acceptable for our purposes. Selective area deposition onto conducting patterns overlying insulating substrates was demonstrated using intense, fully-ionized cluster beams. Parameters influencing the selectivity are ion energy, repelling voltage, the ratio of the conductor to insulator dimension, and substrate thickness.
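The mass relation m = 2E/v² can be illustrated with hypothetical numbers; the energy, path length and delay below are assumptions for the sketch, not measured values from the dissertation.

```python
# Hypothetical time-of-flight mass estimate: kinetic energy E comes from the
# electrostatic energy analyzer, velocity v from the pulse-to-detector delay.
# All input values here are illustrative assumptions.
E_eV = 500.0                       # ion kinetic energy, eV (assumed)
flight_path_m = 0.25               # beam path length, m (assumed)
delay_s = 2.0e-4                   # pulse-to-detector delay, s (assumed)

eV_to_J = 1.602176634e-19          # joules per electronvolt
amu = 1.66053906660e-27            # kg per atomic mass unit

v = flight_path_m / delay_s        # velocity, m/s  (1250 m/s here)
m_kg = 2 * E_eV * eV_to_J / v**2   # mass from m = 2E/v^2, in kg
m_amu = m_kg / amu                 # mass in atomic mass units
cluster_atoms = m_amu / 65.38      # rough size in zinc atoms (Zn ≈ 65.38 u)
```

With these assumed inputs the estimate lands near a thousand zinc atoms, the scale of cluster the abstract describes.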
Abstract:
Fluorescence-enhanced optical imaging is an emerging non-invasive and non-ionizing modality for breast cancer diagnosis. Various optical imaging systems are currently available, although most are limited by bulky instrumentation or by their inability to flexibly image different tissue volumes and shapes. Hand-held optical imaging systems are a recent development offering improved portability, but are currently limited to surface mapping. Herein, a novel optical imager, consisting primarily of a hand-held probe and a gain-modulated intensified charge coupled device (ICCD) detector, is developed for both surface and tomographic breast imaging. The unique features of this hand-held probe based optical imager are its ability to: (i) image large tissue areas (5×10 sq. cm) in a single scan, (ii) reduce overall imaging time using a unique measurement geometry, and (iii) perform tomographic imaging for tumor three-dimensional (3-D) localization. Frequency-domain experimental phantom studies have been performed on slab geometries (650 ml) under different target depths (1-2.5 cm), target volumes (0.45, 0.23 and 0.10 cc), fluorescence absorption contrast ratios (1:0, 1000:1 to 5:1), and numbers of targets (up to 3), using Indocyanine Green (ICG) as the fluorescence contrast agent. An approximate extended Kalman filter based inverse algorithm has been adapted for 3-D tomographic reconstructions. Single fluorescence targets were reconstructed when located: (i) up to 2.5 cm deep (at 1:0 contrast ratio) and 1.5 cm deep (up to 10:1 contrast ratio) for the 0.45 cc target; and (ii) 1.5 cm deep for a target as small as 0.10 cc at 1:0 contrast ratio. In the case of multiple targets, two targets as close as 0.7 cm were tomographically resolved when located 1.5 cm deep.
It was observed that performing multi-projection (here dual) based tomographic imaging using a priori target information from surface images, improved the target depth recovery over using single projection based imaging. From a total of 98 experimental phantom studies, the sensitivity and specificity of the imager was estimated as 81-86% and 43-50%, respectively. With 3-D tomographic imaging successfully demonstrated for the first time using a hand-held based optical imager, the clinical translation of this technology is promising upon further experimental validation from in-vitro and in-vivo studies.
Abstract:
This research addresses design considerations for environmental monitoring platforms for the detection of hazardous materials using System-on-a-Chip (SoC) design. The design considerations focus on improving key areas such as: (1) sampling methodology; (2) context awareness; and (3) sensor placement. These design considerations for environmental monitoring platforms using wireless sensor networks (WSN) are applied to the detection of methylmercury (MeHg) and the environmental parameters affecting its formation (methylation) and degradation (demethylation). The sampling methodology investigates a proof of concept for the monitoring of MeHg using three primary components: (1) chemical derivatization; (2) preconcentration using the purge-and-trap (P&T) method; and (3) sensing using Quartz Crystal Microbalance (QCM) sensors. This part of the study focuses on the measurement of inorganic mercury (Hg) (e.g., Hg2+) and applies lessons learned to organic Hg (e.g., MeHg) detection. Context awareness of a WSN and its sampling strategies is enhanced by using spatial analysis techniques, namely geostatistical analysis (i.e., classical variography and ordinary point kriging), to help predict the phenomena of interest at unmonitored locations (i.e., locations without sensors). This aids in making more informed decisions on control of the WSN (e.g., communications strategy, power management, resource allocation, sampling rate and strategy, etc.), and improves the precision of controllability by adding potentially significant information about unmonitored locations. Two types of sensors are investigated in this study for near-optimal placement in a WSN: (1) environmental (e.g., humidity, moisture, temperature, etc.) and (2) visual (e.g., camera) sensors. The near-optimal placement of environmental sensors is found using a strategy that minimizes the variance of the spatial analysis based on randomly chosen points representing the sensor locations.
Spatial analysis is performed using geostatistical analysis, and optimization occurs through Monte Carlo analysis. Visual sensor placement is accomplished for omnidirectional cameras operating in a WSN using an optimal placement metric (OPM), which is calculated for each grid point based on line-of-sight (LOS) in a defined number of directions, taking known obstacles into consideration. Optimal areas for camera placement are determined from the areas generating the largest OPMs. Statistical analysis is performed using Monte Carlo analysis with a varying number of obstacles and cameras in a defined space.
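A grid-based LOS metric of the kind described can be sketched as follows. This is a simplified illustration under assumptions of my own (a small grid, eight ray directions, OPM taken as total unobstructed line-of-sight length); the study's exact metric and obstacle model may differ.

```python
# Score each grid cell for an omnidirectional camera by summing how far
# line-of-sight extends in 8 directions before hitting a known obstacle
# or the grid boundary. Grid size and obstacle block are hypothetical.
GRID = 10
obstacles = {(4, 4), (4, 5), (5, 4), (5, 5)}   # assumed obstacle block

DIRS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
        (0, 1), (1, -1), (1, 0), (1, 1)]

def opm(cell):
    """Optimal placement metric: total unobstructed LOS length over 8 rays."""
    if cell in obstacles:
        return 0
    score = 0
    for dr, dc in DIRS:
        r, c = cell
        while True:
            r, c = r + dr, c + dc
            if not (0 <= r < GRID and 0 <= c < GRID) or (r, c) in obstacles:
                break
            score += 1                 # one more visible cell along this ray
    return score

scores = {(r, c): opm((r, c)) for r in range(GRID) for c in range(GRID)}
best = max(scores, key=scores.get)     # candidate camera location
```

Cells with the largest scores correspond to the "optimal areas of camera placement" the abstract refers to; a Monte Carlo loop over random obstacle layouts would sit on top of this scoring step.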
Abstract:
Non-Destructive Testing (NDT) of deep foundations has become an integral part of the industry's standard manufacturing processes. It is not unusual for the evaluation of the integrity of the concrete to include the measurement of ultrasonic wave speeds. Numerous methods have been proposed that use the propagation speed of ultrasonic waves to check the integrity of concrete for drilled shaft foundations. All such methods evaluate the integrity of the concrete inside the cage and between the access tubes. The integrity of the concrete outside the cage remains to be considered in order to locate the border between the concrete and the soil and thereby obtain the diameter of the drilled shaft. It is also economical to obtain the diameter with the Cross-Hole Sonic Logging (CSL) system that is already used to check the integrity of the inside concrete, so that the drilled shaft diameter can be determined without having to set up another NDT device. This proposed new method is based on the installation of galvanized tubes outside the shaft across from each inside tube, and on performing the CSL test between the inside and outside tubes. From the experimental work performed, a model is developed to evaluate the relationship between the thickness of concrete and the ultrasonic wave properties using signal processing. The experimental results show that there is a direct correlation between the concrete thickness outside the cage and the maximum amplitude of the received signal obtained from frequency-domain data. This study demonstrates how this new method of measuring the diameter of drilled shafts during construction using an NDT method overcomes the limitations of currently used methods. In the other part of the study, a new method is proposed to visualize and quantify the extent and location of defects.
It is based on a color change in the frequency amplitude of the signal recorded by the receiver probe at the location of defects, and is called Frequency Tomography Analysis (FTA). Time-domain data of the signals propagated between tubes are transferred to the frequency domain using the Fast Fourier Transform (FFT), and the distribution of frequency amplitudes is then evaluated. This method is employed after CSL has determined a high probability of an anomaly in a given area, and is applied to improve location accuracy and to further characterize the feature. The technique has very good resolution and clarifies the exact depth of any void or defect through the length of the drilled shaft for voids inside the cage. The last part of the study also evaluates the effect of voids inside and outside the reinforcement cage, and of corrosion in the longitudinal bars, on the strength and axial load capacity of drilled shafts. The objective is to quantify the extent of loss in axial strength and stiffness of drilled shafts due to the presence of different types of symmetric voids and corrosion throughout their lengths.
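The time-to-frequency step behind FTA can be sketched as follows. The sampling rate, dominant frequency and decay constant below are illustrative assumptions, not values from the study; the point is only the FFT transfer from time-domain to frequency-amplitude data.

```python
import numpy as np

# Simulate a received ultrasonic record: a decaying pulse at an assumed
# dominant frequency, sampled at an assumed rate.
fs = 500_000.0                        # sampling rate, Hz (assumed)
t = np.arange(0, 2e-3, 1 / fs)        # 2 ms record
f0 = 40_000.0                         # dominant ultrasonic component, Hz (assumed)
received = np.sin(2 * np.pi * f0 * t) * np.exp(-t / 1e-3)

# FFT: the amplitude spectrum at each depth is what FTA colour-codes.
amplitude = np.abs(np.fft.rfft(received))
freqs = np.fft.rfftfreq(len(received), d=1 / fs)
dominant = freqs[np.argmax(amplitude)]   # peak near the 40 kHz component
```

Repeating this per depth station and mapping `amplitude` to a colour scale along the shaft yields the tomographic picture the abstract describes.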
Abstract:
Voice communication systems such as Voice-over-IP (VoIP), Public Switched Telephone Networks, and Mobile Telephone Networks are an integral means of human tele-interaction. These systems pose distinctive challenges due to their unique characteristics, such as low volume, burstiness, and stringent delay/loss requirements across heterogeneous underlying network technologies. Effective quality evaluation methodologies are important for system development and refinement, particularly when they adopt user-feedback-based measurement. Presently, most evaluation models are system-centric (Quality of Service, or QoS-based), which prompted us to explore a user-centric (Quality of Experience, or QoE-based) approach as a step towards the human-centric paradigm of system design. We research an affect-based QoE evaluation framework which attempts to capture users' perception while they are engaged in voice communication. Our modular approach consists of feature extraction from multiple information sources, including various affective cues, and different classification procedures such as Support Vector Machines (SVM) and k-Nearest Neighbor (kNN). The experimental study is illustrated in depth with a detailed analysis of results. The evidence collected demonstrates the feasibility of our approach for QoE evaluation and suggests considering human affective attributes in modeling user experience.
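The classification step can be illustrated with a minimal k-nearest-neighbour sketch. The two-dimensional synthetic features and binary labels below are stand-ins for the paper's affective features and QoE ratings, not its data.

```python
import numpy as np

# Synthetic stand-in training data: two clusters of 2-D "affective feature"
# vectors with hypothetical QoE labels (1 = satisfied, 0 = annoyed).
rng = np.random.default_rng(0)
good = rng.normal(loc=[1.0, 1.0], scale=0.2, size=(20, 2))
poor = rng.normal(loc=[-1.0, -1.0], scale=0.2, size=(20, 2))
X = np.vstack([good, poor])
y = np.array([1] * 20 + [0] * 20)

def knn_predict(x, k=3):
    """Majority vote among the k training samples nearest to x."""
    dists = np.linalg.norm(X - x, axis=1)     # Euclidean distance to each sample
    nearest = y[np.argsort(dists)[:k]]        # labels of the k nearest
    return int(round(nearest.mean()))         # majority label

label = knn_predict(np.array([0.9, 1.1]))     # falls in the "satisfied" cluster
```

In the framework described, the extracted affective features would replace the synthetic `X`, and an SVM could be swapped in for `knn_predict` as the alternative classifier.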
Abstract:
A major portion of hurricane-induced economic loss originates from damage to building structures. This damage is typically grouped into three main categories: exterior, interior, and contents damage. Although the latter two, in most cases, cause more than 50% of the total loss, little has been done to investigate the physical damage process and unveil the interdependence of interior damage parameters. Building interior and contents damage is mainly due to wind-driven rain (WDR) intrusion through building envelope defects, breaches, and other functional openings. The limited research and consequent knowledge gaps are in large part due to the complexity of damage phenomena during hurricanes and the lack of established measurement methodologies to quantify rainwater intrusion. This dissertation focuses on devising methodologies for large-scale experimental simulation of tropical cyclone WDR and measurement of rainwater intrusion, to acquire benchmark test-based data for the development of a hurricane-induced building interior and contents damage model. Target WDR parameters derived from tropical cyclone rainfall data were used to simulate the WDR characteristics at the Wall of Wind (WOW) facility. The proposed WDR simulation methodology presents detailed procedures for the selection of the type and number of nozzles, formulated based on the tropical cyclone WDR study. The simulated WDR was then used to experimentally investigate the mechanisms of rainwater deposition/intrusion in buildings. A test-based dataset of two rainwater intrusion parameters that quantify the distribution of directly impinging raindrops and surface runoff rainwater over the building surface, the rain admittance factor (RAF) and the surface runoff coefficient (SRC) respectively, was developed using common shapes of low-rise buildings.
The dataset was applied to a newly formulated WDR estimation model to predict the volume of rainwater ingress through envelope openings such as wall and roof deck breaches and window sill cracks. The validation of the new model using experimental data indicated reasonable estimation of rainwater ingress through envelope defects and breaches during tropical cyclones. The WDR estimation model and experimental dataset of WDR parameters developed in this dissertation work can be used to enhance the prediction capabilities of existing interior damage models such as the Florida Public Hurricane Loss Model (FPHLM).