895 results for Measurement based model identification
Abstract:
Quantitative descriptive analysis (QDA) is used to describe the nature and the intensity of sensory properties from a single evaluation of a product, whereas temporal dominance of sensations (TDS) is primarily used to identify dominant sensory properties over time. Previous studies with TDS have focused on model systems; this is the first study to use a sequential approach, i.e. QDA followed by TDS, to measure sensory properties of a commercial product category, using the same set of trained assessors (n = 11). The main objectives of this study were to: (1) investigate the benefits of using a sequential approach of QDA and TDS and (2) explore the impact of sample composition on taste and flavour perceptions in blackcurrant squashes. The present study proposes an alternative way of determining the choice of attributes for TDS measurement, based on data obtained from previous QDA studies where available. Both methods indicated that the flavour profile was primarily influenced by the level of dilution and the complexity of sample composition combined with blackcurrant juice content. In addition, artificial sweeteners were found to modify the quality of sweetness and could also contribute to bitter notes. Using QDA and TDS in tandem was shown to be more beneficial than either method on its own, enabling a more complete sensory profile of the products.
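TDS data are typically summarised as dominance-rate curves: at each time point, the proportion of evaluations in which an attribute was cited as dominant. A minimal sketch of that computation (the span-based data layout, function name, and attribute names are illustrative, not taken from the study):

```python
def dominance_rates(citations, attributes, times):
    """TDS dominance rate: for each attribute, the proportion of evaluations
    in which it was the dominant attribute at each requested time point.
    `citations` holds one list per evaluation of (start, end, attribute) spans."""
    rates = {a: [] for a in attributes}
    n = len(citations)
    for t in times:
        counts = {a: 0 for a in attributes}
        for spans in citations:
            for start, end, attr in spans:
                if start <= t < end:        # this attribute was dominant at time t
                    counts[attr] += 1
                    break                   # at most one dominant attribute per evaluation
        for a in attributes:
            rates[a].append(counts[a] / n)
    return rates
```

Plotting these rates against time, one curve per attribute, gives the usual TDS curves from which dominance periods are read off.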
Abstract:
The existing seismic isolation systems are based on well-known and accepted physical principles, but they still have some functional drawbacks. As an attempt at improvement, the Roll-N-Cage (RNC) isolator has recently been proposed. It is designed to achieve a balance in controlling isolator displacement demands and structural accelerations. It provides in a single unit all the necessary functions of vertical rigid support, horizontal flexibility with enhanced stability, resistance to low service loads and minor vibration, and hysteretic energy dissipation. It is characterized by two unique features: a self-braking (buffer) mechanism and a self-recentering mechanism. This paper presents an advanced representation of the main and unique features of the RNC isolator using an available finite element code called SAP2000. The validity of the obtained SAP2000 model is then checked using experimental, numerical and analytical results. The paper then investigates the merits and demerits of activating the built-in buffer mechanism for both structural pounding mitigation and isolation efficiency. It addresses the problem of passive alleviation of possible inner pounding within the RNC isolator, which may arise from the activation of its self-braking mechanism under severe excitations such as near-fault earthquakes. The results show that the obtained finite element model can closely match and accurately predict the overall behavior of the RNC isolator with small errors. Moreover, the inherent buffer mechanism of the RNC isolator could mitigate or even eliminate direct structure-to-structure pounding under severe excitation, given the limited separation gaps between adjacent structures.
In addition, increasing the inherent hysteretic damping of the RNC isolator can efficiently limit its peak displacement, and with it the severity of any inner pounding that develops, thereby alleviating or even eliminating the negative effects the buffer mechanism might otherwise have on the overall RNC-isolated structural responses.
Abstract:
We propose a general procedure for solving incomplete-data estimation problems. The procedure can be used to find the maximum likelihood estimate or to solve estimating equations in difficult cases such as estimation with the censored or truncated regression model, the nonlinear structural measurement error model, and the random effects model. The procedure is based on the general principle of stochastic approximation and the Markov chain Monte Carlo method. Applying the theory of adaptive algorithms, we derive conditions under which the proposed procedure converges. Simulation studies also indicate that the proposed procedure consistently converges to the maximum likelihood estimate for the structural measurement error logistic regression model.
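The stochastic-approximation idea can be illustrated on a toy incomplete-data problem: estimating the mean of a unit-variance normal from right-censored data, imputing the censored values by sampling (here by simple rejection sampling rather than a full Markov chain) and applying a Robbins-Monro update along the complete-data score. All function names, data, and settings below are illustrative, not the paper's procedure:

```python
import random

def truncnorm_sample(mu, lo, rng):
    # Rejection sampling from N(mu, 1) truncated to [lo, inf)
    while True:
        x = rng.gauss(mu, 1.0)
        if x >= lo:
            return x

def sa_mle_censored(obs, cens_at, n_cens, iters=2000, seed=0):
    """Stochastic-approximation MLE of mu for N(mu, 1) data with right-censoring.
    `obs` are fully observed values; `n_cens` values were censored at `cens_at`."""
    rng = random.Random(seed)
    mu = sum(obs) / len(obs)          # crude starting value
    n = len(obs) + n_cens
    for k in range(1, iters + 1):
        # Simulation step: impute the censored observations given current mu
        imputed = [truncnorm_sample(mu, cens_at, rng) for _ in range(n_cens)]
        # Robbins-Monro update along the complete-data score, gain 1/k
        score = sum(x - mu for x in obs) + sum(x - mu for x in imputed)
        mu += score / (n * k)
    return mu
```

With decreasing gains the iterates settle at the value where the expected complete-data score vanishes, which is exactly the maximum likelihood condition.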
Abstract:
Work presented at the CPE-POWERENG 2016 Conference, 29 June to 1 July 2016, Bydgoszcz, Poland.
Abstract:
Previous research shows that correlations tend to increase in magnitude when individuals are aggregated across groups. This suggests that uncorrelated constellations of personality variables (such as the primary scales of Extraversion and Neuroticism) may display much higher correlations in aggregate factor analysis. We hypothesize and report that individual-level factor analysis can be explained in terms of Giant Three (or Big Five) descriptions of personality, whereas aggregate-level factor analysis can be explained in terms of Gray's physiologically based model. Although alternative interpretations exist, aggregate-level factor analysis may correctly identify the basis of an individual's personality as a result of the better reliability of measures due to aggregation. We discuss the implications of this form of analysis in terms of construct validity, personality theory, and its applicability in general. Copyright (C) 2003 John Wiley & Sons, Ltd.
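The aggregation effect is easy to reproduce: a weak shared group-level factor swamped by individual noise yields a small individual-level correlation, but averaging within groups shrinks the noise and inflates the correlation of the group means. A self-contained simulation (all parameter values are illustrative):

```python
import random

def corr(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

def aggregation_demo(groups=40, per_group=50, loading=0.3, seed=0):
    """Return (individual-level r, aggregate-level r) for two traits
    that share only a weak group-level factor."""
    rng = random.Random(seed)
    xs, ys, gx, gy = [], [], [], []
    for _ in range(groups):
        shared = rng.gauss(0, 1)        # group-level factor driving both traits
        bx = [loading * shared + rng.gauss(0, 1) for _ in range(per_group)]
        by = [loading * shared + rng.gauss(0, 1) for _ in range(per_group)]
        xs += bx; ys += by
        gx.append(sum(bx) / per_group)  # group means: individual noise averages out
        gy.append(sum(by) / per_group)
    return corr(xs, ys), corr(gx, gy)
```

With these settings the individual-level correlation is near zero while the correlation of group means is large, mirroring the reliability argument in the abstract.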
Abstract:
Queensland fruit fly, Bactrocera (Dacus) tryoni (QFF), is arguably the most costly horticultural insect pest in Australia. Despite this, no model is available to describe its population dynamics and aid in its management. This paper describes a cohort-based model of the population dynamics of the Queensland fruit fly. The model is primarily driven by weather variables, and so can be used at any location where appropriate meteorological data are available. In the model, the life cycle is divided into a number of discrete stages to allow physiological processes to be defined as accurately as possible. Eggs develop and hatch into larvae, which develop into pupae, which emerge as either teneral females or males. Both females and males can enter reproductive and over-wintering life stages, and there is a trapped male life stage to allow model predictions to be compared with trap catch data. All development rates are temperature-dependent. Daily mortality rates are temperature-dependent, but may also be influenced by moisture, density of larvae in fruit, fruit suitability, and age. Eggs, larvae and pupae all have constant establishment mortalities, causing a defined proportion of individuals to die upon entering that life stage. Transfer from one immature stage to the next is based on physiological age. In the adult life stages, transfer between stages may require additional and/or alternative functions. Maximum lifetime fecundity is 1400 eggs per female, and the maximum daily oviposition rate is 80 eggs per female per day. The actual number of eggs laid by a female on any given day is restricted by temperature, density of larvae in fruit, suitability of fruit for oviposition, and female activity. Activity of reproductive females and males, which affects reproduction and trapping, decreases with rainfall. Trapping of reproductive males is determined by activity, temperature and the proportion of males in the active population. Limitations of the model are discussed.
Despite these limitations, the model provides useful agreement with trap catch data, and allows key areas for future research to be identified. These critical gaps in the current state of knowledge exist despite over 50 years of research on this key pest. By explicitly attempting to model the population dynamics of this pest we have clearly identified the research areas that must be addressed before progress can be made in developing the model into an operational tool for the management of Queensland fruit fly. (C) 2003 Published by Elsevier B.V.
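The stage-transfer logic described above (a constant establishment mortality on entry, daily mortality, and transfer once accumulated physiological age reaches a threshold) can be sketched as follows; the simple degree-day rule and all parameter values are illustrative, not the paper's fitted functions:

```python
def degree_days(temp, base=10.0):
    """Daily physiological-age increment under a simple degree-day rule."""
    return max(0.0, temp - base)

def run_stage(n0, temps, dd_required, establishment_mortality, daily_survival):
    """Advance one immature cohort through a single life stage.
    Returns (survivors transferring to the next stage, day index of transfer),
    or (0.0, None) if the temperature series ends before transfer."""
    n = n0 * (1.0 - establishment_mortality)  # constant loss on entering the stage
    age = 0.0                                  # physiological age in degree-days
    for day, t in enumerate(temps):
        n *= daily_survival                    # daily mortality (weather effects folded in)
        age += degree_days(t)
        if age >= dd_required:                 # transfer is based on physiological age
            return n, day
    return 0.0, None
```

Chaining `run_stage` for eggs, larvae and pupae, with stage-specific parameters, reproduces the egg-to-adult flow of the cohort model in miniature.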
Abstract:
We show how the measurement-induced model of quantum computation proposed by Raussendorf and Briegel (2001, Phys. Rev. Lett., 86, 5188) can be adapted to a nonlinear optical interaction. This optical implementation requires a Kerr nonlinearity, a single-photon source, a single-photon detector and fast feed-forward. Although nondeterministic optical quantum information proposals such as that suggested by KLM (2001, Nature, 409, 46) do not require a Kerr nonlinearity, they do require complex reconfigurable optical networks. The proposal in this paper has the benefit of a single static optical layout with fixed device parameters, where the algorithm is defined by the final measurement procedure.
Abstract:
The particle-based lattice solid model developed to study the physics of rocks and the nonlinear dynamics of earthquakes is refined by incorporating intrinsic friction between particles. The model provides a means for studying the causes of seismic wave attenuation, as well as frictional heat generation, fault zone evolution, and localisation phenomena. A modified velocity-Verlet scheme that allows friction to be precisely modelled is developed. This is a difficult computational problem given that a discontinuity must be accurately simulated by the numerical approach (i.e., the transition from static to dynamic frictional behaviour). This is achieved using a half-time-step integration scheme. At each half time step, a nonlinear system is solved to compute the static frictional forces and states of touching particle-pairs. Improved efficiency is achieved by adaptively adjusting the time step increment, depending on the particle velocities in the system. The total energy is calculated and verified to remain constant to a high precision during simulations. Numerical experiments show that the model can be applied to the study of earthquake dynamics, the stick-slip instability, heat generation, and fault zone evolution. Such experiments may lead to a conclusive resolution of the heat flow paradox and improved understanding of earthquake precursory phenomena and dynamics. (C) 1999 Academic Press.
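For reference, the standard velocity-Verlet scheme updates velocities in two half steps; in the modified scheme described above, the nonlinear solve for static frictional forces would sit at the half-step point. A plain one-particle sketch (without any friction solve), which conserves the energy of a harmonic oscillator to high precision as the abstract's verification step suggests:

```python
def velocity_verlet(x, v, force, dt, steps, m=1.0):
    """Velocity-Verlet with explicit half-step velocity updates.
    A frictional-force solve could be inserted at each half step."""
    a = force(x) / m
    for _ in range(steps):
        v += 0.5 * dt * a      # first half-step velocity update
        x += dt * v            # full-step position update
        a = force(x) / m       # recompute acceleration at the new position
        v += 0.5 * dt * a      # second half-step velocity update
    return x, v
```

For a unit-mass, unit-stiffness oscillator the total energy 0.5*v**2 + 0.5*x**2 stays at its initial value up to O(dt^2) fluctuations, which is the property the paper verifies for the full particle system.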
Abstract:
Ultra-wideband (UWB) radar has been extensively investigated, both theoretically and practically, for the identification of buried artifacts. Ground penetrating radar (GPR) concentrates on the identification of lightly buried land mines, unexploded ordnance (UXO) and archeological targets. The same technology is proposed in a similar context for the rapid identification of in vivo implanted metallic prostheses. The technique is based on resonance-based target identification, and the paper investigates UWB scattering from a metallic hip prosthesis in free space as a first step in the identification process.
Abstract:
Domain-specific information retrieval is increasingly in demand. Not only domain experts, but also average non-expert users are interested in searching for domain-specific (e.g., medical and health) information from online resources. However, a typical problem for average users is that the search results are always a mixture of documents with different levels of readability. Non-expert users may want to see documents with higher readability at the top of the list, so the search results need to be re-ranked in descending order of readability. It is often not practical for domain experts to manually label the readability of documents in large databases, so computational models of readability need to be investigated. However, traditional readability formulas are designed for general-purpose text and are insufficient for the technical materials involved in domain-specific information retrieval, while more advanced algorithms such as the textual coherence model are computationally expensive for re-ranking a large number of retrieved documents. In this paper, we propose an effective and computationally tractable concept-based model of text readability. In addition to the textual genres of a document, our model also takes into account domain-specific knowledge, i.e., how the domain-specific concepts contained in the document affect the document's readability. Three major readability formulas are proposed and applied to health and medical information retrieval. Experimental results show that our proposed readability formulas lead to remarkable improvements, in terms of correlation with users' readability ratings, over four traditional readability measures.
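One computationally cheap way to realise such a concept-based formula is to blend a surface measure (sentence and word length) with the average difficulty of the domain concepts a document mentions. The weights, the mini concept dictionary, and the function names below are all illustrative, not the paper's actual formulas:

```python
import re

# Hypothetical mini concept dictionary: concept -> difficulty in [0, 1]
CONCEPT_DIFFICULTY = {"myocardial infarction": 0.9, "hypertension": 0.6, "headache": 0.1}

def surface_score(text):
    """Surface readability: longer sentences and words -> harder (higher score)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-zA-Z']+", text)
    asl = len(words) / len(sentences)              # average sentence length
    awl = sum(len(w) for w in words) / len(words)  # average word length
    return 0.5 * asl + 5.0 * awl                   # illustrative weights

def concept_score(text):
    """Average difficulty of the domain concepts found in the text."""
    found = [d for c, d in CONCEPT_DIFFICULTY.items() if c in text.lower()]
    return sum(found) / len(found) if found else 0.0

def readability(text, alpha=0.7):
    """Blend surface and concept-based difficulty; alpha is an illustrative weight."""
    return (1 - alpha) * surface_score(text) + alpha * 100 * concept_score(text)
```

Re-ranking retrieved documents in ascending order of this score would then put the easier-to-read documents at the top of the list, as the abstract proposes.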
Abstract:
In this paper we propose an alternative method for measuring the efficiency of Decision Making Units (DMUs) that allows the presence of variables with both negative and positive values. The model is applied to data on the notional effluent processing system to compare the results with recently developed methods: the Modified Slacks-Based Model suggested by Sharp et al (2007) and the Range Directional Measure developed by Silva Portela et al (2004). A further example explores the advantages of using the new model.
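For context, the Range Directional Measure cited above handles negative data by projecting each unit toward the ideal point along ranges of possible improvement. A sketch for the single-input, single-output case under variable returns to scale, using SciPy's LP solver; the data and the function name are purely illustrative:

```python
import numpy as np
from scipy.optimize import linprog

def rdm_score(X, Y, j0):
    """Range Directional Measure for DMU j0 (one row per DMU in X and Y).
    Maximise beta s.t.  X^T lam <= x0 - beta*Rx,  Y^T lam >= y0 + beta*Ry,
    sum(lam) = 1, lam >= 0.  beta = 0 means the DMU is efficient."""
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    n = X.shape[0]
    rx = X[j0] - X.min(axis=0)          # input improvement ranges
    ry = Y.max(axis=0) - Y[j0]          # output improvement ranges
    # Decision variables: [beta, lam_1..lam_n]; minimise -beta
    c = np.concatenate(([-1.0], np.zeros(n)))
    A_ub = np.vstack([
        np.column_stack([rx, X.T]),     # X^T lam + beta*rx <= x0
        np.column_stack([ry, -Y.T]),    # -Y^T lam + beta*ry <= -y0
    ])
    b_ub = np.concatenate([X[j0], -Y[j0]])
    A_eq = np.concatenate(([0.0], np.ones(n)))[None, :]   # convexity: sum(lam) = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]
```

Because the direction is defined by ranges rather than the (possibly negative) data values themselves, the score stays meaningful when inputs or outputs are negative.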
Abstract:
Purpose - The purpose of this study is to develop a performance measurement model for service operations using the analytic hierarchy process approach. Design/methodology/approach - The study reviews current relevant literature on performance measurement and develops a model for performance measurement. The model is then applied to the intensive care units (ICUs) of three different hospitals in developing nations. Six focus group discussions were undertaken, involving experts from the specific area under investigation, in order to develop an understandable performance measurement model that was both quantitative and hierarchical. Findings - A combination of outcome, structure and process-based factors was used as a foundation for the model. The analyses of the links between them were used to reveal the relative importance of each factor and its associated sub-factors. It was considered an effective quantitative tool by the stakeholders. Research limitations/implications - This research only applies the model to ICUs in healthcare services. Practical implications - Performance measurement is an important area within the operations management field. Although numerous models are routinely being deployed both in practice and research, there is always room for improvement. The present study proposes a hierarchical quantitative approach, which considers both subjective and objective performance criteria. Originality/value - This paper develops a hierarchical quantitative model for service performance measurement. It considers success factors with respect to outcomes, structure and processes with the involvement of the concerned stakeholders, based upon the analytic hierarchy process approach. The model is applied to the ICUs of hospitals in order to demonstrate its effectiveness, providing a comparative international study of service performance measurement in ICUs of hospitals in three different countries. © Emerald Group Publishing Limited.
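The core of the analytic hierarchy process is turning a pairwise-comparison matrix into a priority vector and checking its consistency. A minimal sketch using the geometric-mean approximation to the principal eigenvector; the comparison matrix in the usage example is illustrative, not taken from the study:

```python
import math

RANDOM_INDEX = {3: 0.58, 4: 0.90, 5: 1.12}   # Saaty's random-index values

def ahp_priorities(M):
    """Priority vector via the geometric-mean approximation to the
    principal eigenvector of a pairwise-comparison matrix."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]
    total = sum(gm)
    return [g / total for g in gm]

def consistency_ratio(M, w):
    """Saaty consistency ratio; below 0.1 is conventionally acceptable."""
    n = len(M)
    lam = sum(sum(M[i][j] * w[j] for j in range(n)) / w[i]
              for i in range(n)) / n          # estimate of lambda_max
    ci = (lam - n) / (n - 1)                  # consistency index
    return ci / RANDOM_INDEX[n]
```

For a hierarchy like the one in the study, these local priorities would be computed at each level and multiplied down the tree to score the ICUs on the overall goal.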
Abstract:
We have attempted to bring together two areas which are challenging for both IS research and practice: forms of coordination and management of knowledge in the context of global, virtual software development projects. We developed a more comprehensive, knowledge-based model of how coordination can be achieved, and illustrated the heuristic and explanatory power of the model when applied to global software projects experiencing different degrees of success. We first reviewed the literature on coordination and determined what is known about coordination of knowledge in global software projects. From this we developed a new, distinctive knowledge-based model of coordination, which was then employed to analyze two case studies of global software projects, at SAP and Baan, to illustrate the utility of the model.
Abstract:
Purpose: Short product life cycles and/or mass customization necessitate reconfiguration of the operational enablers of a supply chain (SC) from time to time in order to harness high levels of performance. The purpose of this paper is to identify the key operational enablers under a stochastic environment on which practitioners should focus while reconfiguring a SC network. Design/methodology/approach: The paper used an interpretive structural modeling (ISM) approach that presents a hierarchy-based model and the mutual relationships among the enablers. The contextual relationships needed for developing the structural self-interaction matrix (SSIM) among the various enablers are realized by conducting experiments through simulation of a hypothetical SC network. Findings: The research identifies various operational enablers having a high driving power towards the assumed performance measures. These enablers require maximum attention and are of strategic importance while reconfiguring a SC. Practical implications: ISM provides a useful tool for SC managers to strategically adopt and focus on the key enablers which have comparatively greater potential for enhancing SC performance under given operational settings. Originality/value: The present research recognizes the importance of SC flexibility under the premise of reconfiguration of the operational units in order to harness high SC performance. Given the digraph resulting from ISM, the decision maker can focus on the key enablers for effective reconfiguration. The study is one of the first efforts to develop contextual relations among operational enablers for the SSIM through integration of discrete-event simulation with ISM. © Emerald Group Publishing Limited.
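In ISM, the SSIM is converted to a binary adjacency matrix, closed transitively to obtain the final reachability matrix, and then partitioned into levels that define the hierarchy of the digraph. A compact sketch of those two steps (the three-enabler adjacency example in the usage is illustrative):

```python
def reachability(adj):
    """Final reachability matrix: reflexive-transitive closure (Warshall)."""
    n = len(adj)
    r = [[adj[i][j] or i == j for j in range(n)] for i in range(n)]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                r[i][j] = r[i][j] or (r[i][k] and r[k][j])
    return r

def level_partition(r):
    """ISM level partitioning: repeatedly peel off the elements whose
    reachability set (within the remaining elements) is contained in
    their antecedent set; the first level is the top of the hierarchy."""
    n = len(r)
    remaining, levels = set(range(n)), []
    while remaining:
        level = {i for i in remaining
                 if {j for j in remaining if r[i][j]} <=
                    {j for j in remaining if r[j][i]}}
        levels.append(sorted(level))
        remaining -= level
    return levels
```

Enablers in the later (deeper) levels have the higher driving power, which is exactly the property the paper uses to single out the strategically important enablers.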
Abstract:
Respiratory-volume monitoring is an indispensable part of mechanical ventilation. Here we present a new method of respiratory-volume measurement based on a single fibre-optic long-period sensor of bending and the correlation between torso curvature and lung volume. Unlike the commonly used air-flow-based measurement methods, the proposed sensor is drift-free and immune to air leaks. In the paper, we explain the working principle of the sensor and a two-step calibration-test measurement procedure, and present results that establish a linear correlation between the change in the local thorax curvature and the change of the lung volume. We also discuss the advantages and limitations of these sensors with respect to the current standards. © 2013 IEEE.
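Given the reported linear correlation, the calibration step reduces to fitting a straight line between measured curvature changes and reference lung-volume changes (e.g. from a spirometer), then inverting it during monitoring. A minimal sketch with made-up calibration numbers; the function names are illustrative:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b (the calibration step)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def volume_from_curvature(dk, a, b):
    """Predict lung-volume change from a thorax-curvature change reading."""
    return a * dk + b
```

Because the mapping is a fixed line rather than an integral of flow, a constant sensor offset only shifts the intercept and does not accumulate, which is the drift-free property claimed in the abstract.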