904 results for Distributed measurement and control
Abstract:
Mathematics Subject Classification: 26A33, 93C83, 93C85, 68T40
Abstract:
Measuring and compensating the pivot points of five-axis machine tools is challenging and very time-consuming. This paper presents a newly developed approach for automatic measurement and compensation of pivot-point positional errors on five-axis machine tools. The machine rotary-axis errors are measured using a circular test. The method has been tested on five-axis machine tools with a swivel-table configuration. Results show that up to 99% of the positional errors of the rotary axis can be compensated using this approach.
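The circular test mentioned above essentially reduces to locating the true center of rotation from points probed along a circular path. Below is a minimal sketch of that step, assuming a simple least-squares (Kasa) circle fit; the offset values, noise level, and function names are illustrative, not the paper's implementation.

```python
import numpy as np

def fit_circle(x, y):
    """Least-squares (Kasa) circle fit: returns center (cx, cy) and radius r."""
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x**2 + y**2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + cx**2 + cy**2)
    return cx, cy, r

# Illustrative probed points from a rotary-axis circular test,
# with a small (hypothetical) pivot-point offset and measurement noise.
theta = np.linspace(0, 2 * np.pi, 36, endpoint=False)
true_offset = np.array([0.012, -0.007])                       # mm, hypothetical
pts = 50.0 * np.c_[np.cos(theta), np.sin(theta)] + true_offset
pts += np.random.normal(scale=0.001, size=pts.shape)

cx, cy, r = fit_circle(pts[:, 0], pts[:, 1])
print(f"estimated pivot offset: ({cx:.4f}, {cy:.4f}) mm, radius {r:.3f} mm")
# The estimated offset would then be written into the controller's
# rotary-axis (pivot-point) compensation parameters.
```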
Abstract:
Five-axis machine tools are becoming increasingly popular as customers demand more complex machined parts. In high-value manufacturing, machine tools are essential to producing high-accuracy products. High-accuracy manufacturing requires producing parts repeatably and precisely, in compliance with the defined design specifications. The performance of machine tools is often affected by geometric errors arising from a variety of causes, including incorrect tool offsets, errors in the centres of rotation, and thermal growth. As a consequence, it can be difficult to produce highly accurate parts consistently. It is therefore essential to ensure that machine tools are verified in terms of their geometric and positioning accuracy. When machine tools are verified in this way, the resulting numerical values of positional accuracy and process capability can be used to define design-for-verification rules and algorithms, so that machined parts can be produced without scrap and with little or no post-process measurement. In this paper the benefits of machine tool verification are listed, and a case study is used to demonstrate the implementation of robust machine tool performance measurement and diagnostics using a ballbar system.
Abstract:
Video streaming over Transmission Control Protocol (TCP) networks has become a popular and highly demanded service, but its quality assessment, in both objective and subjective terms, has not been properly addressed. In this paper, a full analytic model of a no-reference objective metric, pause intensity (PI), for video quality assessment is presented on the basis of statistical analysis. The model characterizes the behavior of the video playout buffer in connection with the network performance (throughput) and the video playout rate. This allows instant quality measurement and control without requiring a reference video. PI specifically addresses the quality issue in terms of the continuity of playout of TCP streaming videos, which cannot be properly measured by other objective metrics such as peak signal-to-noise ratio, structural similarity, and buffer underrun or pause frequency. The performance of the analytical model is rigorously verified by simulation results and subjective tests using a range of video clips. It is demonstrated that PI is closely correlated with viewers' opinion scores regardless of the vastly different composition of the individual elements, such as pause duration and pause frequency, which jointly constitute this new quality metric. It is also shown that the correlation performance of PI is consistent and content independent. © 2013 IEEE.
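As a rough illustration of the buffer-level view described above, the sketch below simulates a playout buffer fed at a varying network throughput and drained at a constant playout rate, and reports the fraction of session time spent paused along with the pause count. This is only one plausible reading of a pause-based metric under assumed parameters (startup threshold, throughput trace), not the paper's PI model.

```python
import random

def simulate_playout(throughputs, playout_rate, startup_buffer=2.0, dt=1.0):
    """Simulate a TCP video playout buffer (measured in seconds of video).

    throughputs  : per-interval download rates, in seconds of video per second
    playout_rate : 1.0 means real-time playback
    Returns (paused_time, pause_count, total_time).
    """
    buffered, playing = 0.0, False
    paused_time, pause_count, total_time = 0.0, 0, 0.0
    for thr in throughputs:
        buffered += thr * dt
        if playing:
            if buffered >= playout_rate * dt:
                buffered -= playout_rate * dt
            else:                          # buffer underrun -> playback stalls
                playing = False
                pause_count += 1
                paused_time += dt
        else:
            paused_time += dt
            if buffered >= startup_buffer:  # rebuffering threshold reached
                playing = True
        total_time += dt
    return paused_time, pause_count, total_time

random.seed(1)
thr = [random.uniform(0.6, 1.4) for _ in range(600)]   # illustrative throughput trace
paused, pauses, total = simulate_playout(thr, playout_rate=1.0)
print(f"fraction of time paused: {paused/total:.2%}, pauses: {pauses}")
```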
Abstract:
Incorporating the Material Balance Principle (MBP) into industrial and agricultural performance measurement systems with pollutant factors has been on the rise in recent years, and many conventional methods of performance measurement have proven incompatible with material flow conditions. This study addresses eco-efficiency measurement adjusted for pollution, taking into account material flow conditions and the MBP requirements, in order to provide 'real' measures of performance that can serve as guides when making policies. We develop a new approach that integrates a slacks-based measure into the Malmquist-Luenberger Index, enhanced by a material balance condition that reflects the conservation of matter. The model is compared with a similar model that incorporates the MBP through a trade-off approach to measure productivity and eco-efficiency trends of power plants. Both models yield similar findings, substantiating the robustness and applicability of the proposed model.
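The material balance condition referred to above requires that material bound in the inputs reappear either in the desirable outputs or in the pollutants, so any adjustment of input/output slacks must preserve that identity. A minimal, hypothetical check for fuel carbon and CO2 at a power plant is sketched below; the emission factors and figures are illustrative, not taken from the study.

```python
# Hypothetical material balance check for one power plant:
# carbon entering in fuel must leave as CO2 (electricity carries no material),
# so reported CO2 must equal emission_factor * fuel.
CARBON_PER_TON_COAL = 0.75        # t C per t coal (illustrative)
CO2_PER_TON_CARBON = 44.0 / 12.0  # stoichiometric ratio

def co2_from_fuel(coal_tons: float) -> float:
    return coal_tons * CARBON_PER_TON_COAL * CO2_PER_TON_CARBON

def balance_holds(coal_tons: float, reported_co2: float, tol: float = 1e-6) -> bool:
    """True if reported emissions are consistent with the material balance."""
    return abs(co2_from_fuel(coal_tons) - reported_co2) <= tol * max(1.0, reported_co2)

# An efficiency projection that cuts emissions without cutting fuel use
# (or changing the fuel mix) violates the balance and would be screened out:
observed = (1000.0, co2_from_fuel(1000.0))          # feasible point
proposed = (1000.0, co2_from_fuel(1000.0) * 0.8)    # infeasible "improvement"
print(balance_holds(*observed), balance_holds(*proposed))   # True False
```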
Abstract:
The solubility of telmisartan (form A) in nine organic solvents (chloroform, dichloromethane, ethanol, toluene, benzene, 2-propanol, ethyl acetate, methanol, and acetone) was determined by a laser monitoring technique at temperatures from 277.85 to 338.35 K. The solubility of telmisartan (form A) in all nine solvents increased with temperature, as did the rates at which the solubility increased, except in chloroform and dichloromethane. The mole-fraction solubility in chloroform is higher than that in dichloromethane, and both are one order of magnitude higher than those in the other seven solvents at the experimental temperatures. The solubility data were correlated with the modified Apelblat equation and the λh equation. The results show that the λh equation agrees better with the experimental data than the Apelblat equation; the relative root mean square deviations (σ) of the λh equation range from 0.004 to 0.45 %. The dissolution enthalpies, entropies, and Gibbs energies of telmisartan in these solvents were estimated from the van't Hoff equation and the Gibbs equation. The melting point and the fusion enthalpy of telmisartan were determined by differential scanning calorimetry.
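For reference, the commonly used textbook forms of the correlation equations named above are given below, with x the mole-fraction solubility, T the absolute temperature, T_m the melting temperature, and A, B, C, λ, h fitted parameters; these are the standard forms, not parameter values from this work.

```latex
% Modified Apelblat equation
\ln x = A + \frac{B}{T} + C \ln T

% Buchowski--Ksiazczak (\lambda h) equation
\ln\!\left[1 + \frac{\lambda\,(1 - x)}{x}\right]
  = \lambda h \left(\frac{1}{T} - \frac{1}{T_m}\right)

% van't Hoff relation used for the dissolution enthalpy and entropy
\ln x = -\frac{\Delta H_d}{R\,T} + \frac{\Delta S_d}{R}
```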
Abstract:
In this study we examine the performance measurement and performance management practice of Hungarian companies, using data from the "Competing with the World" (Versenyben a világgal) research program. Our goal is to examine the background of decision support: to characterize companies' performance measurement practice, assess its consistency, and track how the tendencies observed in our earlier research have evolved. The information sources, performance indicators, and analytical tools that company executives consider important or useful, or use regularly, were evaluated with the analytical framework developed in our earlier research (orientation, balance, consistency, supporting role). When assessing how the information system supports different activities, we also compared the views of managers responsible for the different areas and examined the characteristics of different company groups.
Abstract:
However much a bank (or company, or insurer) would like to concentrate only on its business, it cannot avoid the financial (credit, market, operational, and other) risks, which it must measure and cover. Full hedging is either very expensive or not even possible, so to avoid insolvency every business unit has to hold some risk-free, liquid capital. Coherent risk measurement is needed: the capital allocated has to reflect the risks, yet an allocation problem arises even when the risks are measured well. Thanks to diversification effects, the total risk of a portfolio is generally smaller than the sum of the risks of its sub-portfolios. Coherent capital allocation therefore has to address how much capital to assign to each sub-portfolio, that is, how to distribute the benefits of diversification "fairly"; this yields each asset's contribution to the overall risk. Using game theory and examples of compound options, the study demonstrates the coherent measurement and allocation of risks, draws attention to the dangers of inconsistencies, and examines how far the risk measurement methods applied in practice, in particular value at risk (VaR), meet the requirements set by theory.
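For readers less familiar with the framework, a coherent risk measure in the sense used above (following Artzner et al.) is a functional ρ on portfolio payoffs satisfying the four standard axioms below; VaR in general fails subadditivity, which is at the core of the concern the abstract raises about methods used in practice.

```latex
\begin{aligned}
&\text{Monotonicity:}           && X \le Y \;\Rightarrow\; \rho(X) \ge \rho(Y)\\
&\text{Subadditivity:}          && \rho(X + Y) \le \rho(X) + \rho(Y)\\
&\text{Positive homogeneity:}   && \rho(\lambda X) = \lambda\,\rho(X), \quad \lambda \ge 0\\
&\text{Translation invariance:} && \rho(X + a) = \rho(X) - a, \quad a \in \mathbb{R}
\end{aligned}
```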
Abstract:
Reviewing perceptions of competitiveness, the author reveals the origin and ambiguity of the concept of "national competitiveness", which is often confused with the development of countries and the competitiveness of their enterprises. He investigates the role of transnational companies and governments in shaping countries' positions in the world economy, presents a critique of the measurement of countries' "national competitiveness", and strongly opposes its ideological use to justify antisocial measures.
Abstract:
The starting point of the article is the fact that the fundamental task of accounting, and within it financial reporting, is to provide information useful for decision making to the stakeholders who come into contact with an enterprise. For the data produced by the accounting representation of economic phenomena to be usable as information, the users of financial statements must understand the underlying assumptions of that representation. The first part of the article, setting out from the general concept of measurement, introduces the concepts of measurement and valuation as applied in accounting, describing their interconnections and basic characteristics. Following this, based on the international (IFRS) and the Hungarian regulations, it sketches the valuation framework currently used in financial reporting. The third part analyses the theoretical background of the regulation, covering the connection between accounting measurement and financial performance (income), and finally presents and evaluates the main lines of criticism concerning measurement in accounting.
Abstract:
Clusters are aggregations of atoms or molecules, generally intermediate in size between individual atoms and aggregates large enough to be called bulk matter. Clusters can also be called nanoparticles, because their size is on the order of nanometers or tens of nanometers. A new field, called nanostructured materials, has begun to take shape that takes advantage of these atom clusters. The ultra-small size of the building blocks leads to dramatically different properties, and it is anticipated that such atomically engineered materials can be tailored to perform as no previous material could.

The idea of the ionized cluster beam (ICB) thin-film deposition technique was first proposed by Takagi in 1972. It was based on using a supersonic jet source to produce, ionize, and accelerate beams of atomic clusters onto substrates in a vacuum environment. Conditions for forming cluster beams suitable for thin-film deposition have only recently been established, following twenty years of effort. Zinc clusters over 1,000 atoms in average size have been synthesized both in our laboratory and in that of Gspann. More recently, other methods of synthesizing clusters and nanoparticles, using different types of cluster sources, have come under development.

In this work, we studied different aspects of nanoparticle beams. The work includes refinement of a model of the cluster formation mechanism, development of a new real-time, in situ cluster size measurement method, and study of the use of ICB in the fabrication of semiconductor devices.

The formation process of the vaporized-metal cluster beam was simulated and investigated using classical nucleation theory and one-dimensional gas flow equations. Zinc cluster sizes predicted at the nozzle exit are in good quantitative agreement with experimental results in our laboratory.

A novel in situ, real-time mass, energy, and velocity measurement apparatus has been designed, built, and tested. This small time-of-flight mass spectrometer is suitable for use in our cluster deposition systems and does not suffer from the problems of other cluster size measurement methods, such as the need for specialized ionizing lasers, inductive electrical or electromagnetic coupling, dependence on the assumption of homogeneous nucleation, limits on the measurable size, and lack of real-time capability. Measured ion energies from the electrostatic energy analyzer are in good accordance with values obtained from computer simulation. The velocity v is measured by pulsing the cluster beam and measuring the delay between the pulse and the analyzer output current. The mass of a particle is then calculated as m = 2E/v². The error in the measured mass of the background gas is on the order of 28% of the mass of one N2 molecule, which is negligible for the measurement of large clusters. This resolution in cluster size measurement is very acceptable for our purposes.

Selective area deposition onto conducting patterns overlying insulating substrates was demonstrated using intense, fully ionized cluster beams. Parameters influencing the selectivity are ion energy, repelling voltage, the ratio of the conductor to insulator dimensions, and substrate thickness.
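The mass determination described above combines the energy-analyzer reading with the time-of-flight velocity. Below is a minimal sketch of that arithmetic, assuming a hypothetical flight-path length and illustrative readings, not the dissertation's actual instrument parameters.

```python
# Cluster mass from energy analyzer + time of flight:  v = L / t,  m = 2 E / v^2
E_CHARGE = 1.602176634e-19    # C  (energy of 1 eV in joules)
AMU      = 1.66053906660e-27  # kg (atomic mass unit)

def cluster_mass_amu(energy_eV: float, flight_path_m: float, flight_time_s: float) -> float:
    """Mass (in atomic mass units) of a singly charged cluster."""
    v = flight_path_m / flight_time_s   # velocity from the pulsed-beam delay
    E = energy_eV * E_CHARGE            # kinetic energy from the energy analyzer
    return 2.0 * E / v**2 / AMU

# Illustrative numbers: 500 eV cluster, 0.5 m flight path, 410 microsecond delay
mass = cluster_mass_amu(500.0, 0.5, 410e-6)
print(f"~{mass:.0f} amu  (~{mass / 65.38:.0f} Zn atoms)")   # 65.38 amu per Zn atom
```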
Abstract:
Historically, memory has been evaluated by examining how much is remembered; a more recent conception of memory, however, focuses on the accuracy of memories. Under this accuracy-oriented conception, unlike the quantity-oriented approach, memory does not always deteriorate over time. A possible explanation for this seemingly surprising finding lies in the metacognitive processes of monitoring and control. These processes allow people to withhold responses of which they are unsure, or to adjust the precision of responses to a level broad enough to be correct. The ability to accurately report memories has implications for investigators who interview witnesses to crimes and for those who evaluate witness testimony.

This research examined the amount of information provided, the accuracy, and the precision of responses given during immediate and delayed interviews about a videotaped mock crime. The interview format was manipulated such that either a single free-narrative response was elicited or a series of yes/no or cued questions was asked. Instructions from the interviewer indicated that participants should stress either being informative or being accurate. The interviews were then transcribed and scored.

Results indicate that accuracy rates remained stable and high after a one-week delay. Compared to those interviewed immediately, participants interviewed after a delay provided less information and less precise responses. Participants in the free-narrative condition were the most accurate. Participants in the cued-questions condition provided the most precise responses. Participants in the yes/no questions condition were the most likely to say "I don't know". The results indicate that people are able to monitor their memories and modify their reports to maintain high accuracy. When control over precision was not possible, as in the yes/no condition, people said "I don't know" to maintain accuracy. When both withholding responses and adjusting precision were possible, people used both methods. Concerns that memories reported after a long retention interval might be inaccurate thus appear unfounded.
Abstract:
A number of factors influence the information processing needs of organizations, particularly with respect to the coordination and control mechanisms within a hotel. The authors use a theoretical framework to illustrate alternative mechanisms that can be used to coordinate and control hotel operations.
Abstract:
The eggs of the dengue fever vector Aedes aegypti can undergo an extended quiescence period while hosting a fully developed first-instar larva within the chorion. As a result of this life history stage, pharate larvae can withstand months of dormancy inside the egg, where they depend on stored reserves of maternal origin. This adaptation, known as pharate first-instar quiescence, allows A. aegypti to cope with fluctuations in water availability. An examination of this fundamental adaptation has shown that there are trade-offs associated with it.

Aedes aegypti mosquitoes are frequently associated with urban habitats that may contain metal pollution. My research has demonstrated that the duration of this quiescence, and the extent of nutritional depletion associated with it, affect the physiology and survival of larvae that hatch in a suboptimal habitat; nutrient reserves decrease during pharate first-instar quiescence and alter subsequent larval and adult fitness. The duration of quiescence compromises metal tolerance physiology and is coupled to a decrease in metallothionein mRNA levels. My findings also indicate that even low levels of environmentally relevant larval metal stress alter the parameters that determine vector capacity.

My research has also demonstrated that extended pharate first-instar quiescence can elicit a plastic response, resulting in an adult phenotype distinct from adults reared from short-quiescence eggs. Extended pharate first-instar quiescence affects the performance and reproductive fitness of the adult female mosquito, as well as the nutritional status of its progeny via maternal effects, in an adaptive manner: anticipatory phenotypic plasticity results from the duration of pharate first-instar quiescence, and alternative phenotypes may exist for this mosquito, with quiescence serving as a cue that possibly signals the environmental conditions following a dry period. My findings may explain, in part, A. aegypti's success as a vector and its geographic distribution, and have implications for its vector capacity and control.
Abstract:
Virtual machines (VMs) are powerful platforms for building agile datacenters and emerging cloud systems. However, resource management for a VM-based system is still a challenging task. First, the complexity of application workloads, as well as the interference among competing workloads, makes it difficult to understand the resource demands their VMs need to meet their Quality of Service (QoS) targets. Second, the dynamics of the applications and the system also make it difficult to maintain the desired QoS target as the environment changes. Third, the transparency of virtualization is a hurdle for the guest-layer application and the host-layer VM scheduler to cooperate to improve application QoS and system efficiency. This dissertation addresses the above challenges through fuzzy-modeling and control-theory-based VM resource management. First, a fuzzy-logic-based nonlinear modeling approach is proposed to accurately capture a VM's complex demands for multiple types of resources, automatically and online, based on the observed workload and resource usage. Second, to enable fast adaptation in resource management, the fuzzy modeling approach is integrated with a predictive controller to form a new Fuzzy Modeling Predictive Control (FMPC) approach, which can quickly track applications' QoS targets and optimize resource allocations under dynamic changes in the system. Finally, to address the limitations of black-box resource management solutions, a cross-layer optimization approach is proposed to enable cooperation between a VM's host and guest layers and further improve application QoS and resource usage efficiency. The proposed approaches are prototyped on a Xen-based virtualized system and evaluated with representative benchmarks, including TPC-H, RUBiS, and TerraFly. The results demonstrate that the fuzzy-modeling-based approach improves the accuracy of resource prediction by up to 31.4% compared to conventional regression approaches. The FMPC approach substantially outperforms the traditional linear-model-based predictive control approach in meeting application QoS targets on an oversubscribed system, and it can manage dynamic VM resource allocations and migrations for over 100 concurrent VMs across multiple hosts with good efficiency. Finally, the cross-layer optimization approach further improves the performance of a virtualized application by up to 40% when resources are contended by dynamic workloads.
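As a toy illustration of the feedback idea behind the modeling-plus-control approach described above (not the dissertation's FMPC controller), the sketch below adjusts a VM's CPU cap each interval from the observed QoS error, using a crude fuzzy-style gain schedule; the workload trace, thresholds, and parameter names are hypothetical.

```python
def fuzzy_gain(error_ratio: float) -> float:
    """Crude fuzzy-style gain schedule: react harder to larger QoS violations."""
    e = abs(error_ratio)
    if e < 0.05:
        return 0.0     # "negligible" -> leave the allocation alone
    if e < 0.20:
        return 0.25    # "small"
    if e < 0.50:
        return 0.60    # "moderate"
    return 1.0         # "large"

def control_step(cpu_cap, observed_latency, target_latency,
                 cap_min=0.1, cap_max=4.0):
    """One control interval: return the next CPU cap (in cores) for the VM."""
    error = (observed_latency - target_latency) / target_latency
    # Positive error (too slow) -> grow the cap; negative -> reclaim resources.
    new_cap = cpu_cap * (1.0 + fuzzy_gain(error) * (0.5 if error > 0 else -0.2))
    return min(cap_max, max(cap_min, new_cap))

# Illustrative trace: latency spikes while the workload is contended, then relaxes.
cap, target = 1.0, 100.0   # start with 1 core; 100 ms QoS target
for latency in [95, 140, 220, 180, 120, 90, 80]:
    cap = control_step(cap, latency, target)
    print(f"observed {latency:>3} ms -> next CPU cap {cap:.2f} cores")
```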