787 results for expert system, fuzzy logic, pan stage models, supervisory control
Abstract:
Process models are used by information professionals to convey semantics about the business operations in a real-world domain intended to be supported by an information system. The understandability of these models is vital to their use in information systems development. In this paper, we examine two factors that we predict will influence the understanding of a business process that novice developers obtain from a corresponding process model: the content presentation form chosen to articulate the business domain, and the user characteristics of the novice developers working with the model. Our experimental study provides evidence that novice developers obtain similar levels of understanding when confronted with an unfamiliar or a familiar process model. However, previous modeling experience, the use of English as a second language, and previous work experience in BPM are important influencing factors of model understanding. Our findings suggest that education and research in process modeling should increase the focus on human factors and how they relate to content and content presentation formats for different modeling tasks. We discuss implications for practice and research.
Abstract:
Over recent years a significant amount of research has been undertaken to develop prognostic models that can be used to predict the remaining useful life of engineering assets. Implementations by industry have had only limited success. By design, models are subject to specific assumptions and approximations, some of which are mathematical, while others relate to practical implementation issues such as the amount of data required to validate and verify a proposed model. Therefore, appropriate model selection for successful practical implementation requires not only a mathematical understanding of each model type, but also an appreciation of how a particular business intends to utilise a model and its outputs. This paper discusses business issues that need to be considered when selecting an appropriate modelling approach for trial. It also presents classification tables and process flow diagrams to assist industry and research personnel in selecting appropriate prognostic models for predicting the remaining useful life of engineering assets within their specific business environment. The paper then explores the strengths and weaknesses of the main prognostic model classes to establish what makes them better suited to certain applications than to others, and summarises how each has been applied to engineering prognostics. Consequently, this paper should provide a starting point for young researchers first considering options for remaining useful life prediction. The models described in this paper are knowledge-based (expert and fuzzy), life expectancy (stochastic and statistical), artificial neural network, and physical models.
Abstract:
Business practices vary from one company to another and often need to change in response to changing business environments. To satisfy different business practices, enterprise systems need to be customized; to keep up with ongoing business practice changes, they need to be adapted. Because of rigidity and complexity, the customization and adaptation of enterprise systems often takes excessive time, with potential failures and budget shortfalls. Moreover, enterprise systems often hold business back because they cannot be rapidly adapted to support business practice changes. Extensive literature has addressed this issue by identifying success or failure factors, implementation approaches, and project management strategies. Those efforts were aimed at learning lessons from post-implementation experiences to help future projects. This research looks into the issue from a different angle. It attempts to address it by delivering a systematic method for developing flexible enterprise systems which can be easily tailored to different business practices or rapidly adapted when business practices change. First, this research examines the role of system models in the context of enterprise system development, and the relationship of system models with software programs in the contexts of computer-aided software engineering (CASE), model driven architecture (MDA) and workflow management systems (WfMS). Then, by applying the analogical reasoning method, this research initiates a concept of model driven enterprise systems. The novelty of model driven enterprise systems is that they extract system models from software programs and allow system models to stay independent of software programs. In the paradigm of model driven enterprise systems, system models act as instructors to guide and control the behavior of software programs; software programs function by interpreting instructions in system models.
This mechanism exposes the opportunity to tailor such a system by changing its system models. To make this possible, system models should be represented in a language which can be easily understood by human beings and effectively interpreted by computers. In this research, various semantic representations are investigated to support model driven enterprise systems. The significance of this research is 1) the transplantation of the successful structure for flexibility in modern machines and WfMS to enterprise systems; and 2) the advancement of MDA by extending the role of system models from guiding system development to controlling system behaviors. This research contributes to the area of enterprise systems from three perspectives: 1) a new paradigm of enterprise systems, in which enterprise systems consist of two essential elements, system models and software programs, which are loosely coupled and can exist independently; 2) semantic representations, which can effectively represent business entities, entity relationships, business logic and information processing logic in a semantic manner, and which are the key enabling techniques of model driven enterprise systems; and 3) a brand new role for system models: traditionally, system models guide developers in writing system source code; this research promotes system models to controlling the behaviors of enterprise systems.
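The core idea above — software programs that function by interpreting instructions held in a separate system model — can be illustrated with a minimal sketch. This is not the paper's method or representation language; the model format, field names and `run_model` interpreter are invented for illustration. Tailoring the system means editing the model data, not the program.

```python
# Illustrative sketch of the model-driven idea: the "system model" is plain
# data, kept separate from the "software program" (the interpreter below).
# All names (steps, run_model, the step kinds) are hypothetical.

def run_model(model, record):
    """Interpret a system model: apply each step's rule to the record."""
    for step in model["steps"]:
        if step["kind"] == "set":
            record[step["field"]] = step["value"]
        elif step["kind"] == "check":
            if record.get(step["field"]) != step["value"]:
                record["status"] = "rejected"
                return record
    record["status"] = "approved"
    return record

# Two different "business practices" expressed as two models, one interpreter:
model_a = {"steps": [{"kind": "check", "field": "country", "value": "AU"},
                     {"kind": "set", "field": "tax", "value": 0.10}]}
model_b = {"steps": [{"kind": "set", "field": "tax", "value": 0.20}]}

order = {"country": "AU"}
result_a = run_model(model_a, dict(order))
result_b = run_model(model_b, dict(order))
```

Swapping `model_a` for `model_b` changes system behavior without touching `run_model`, which is the loose coupling of models and programs the abstract argues for.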
Abstract:
In the cancer research field, most in vitro studies still rely on two-dimensional (2D) cultures. However, the trend is rapidly shifting towards three-dimensional (3D) culture systems, because 3D models better recapitulate the microenvironment of cells and therefore yield cellular and molecular responses that more accurately describe the pathophysiology of cancer. By adopting technology platforms established by the tissue engineering discipline, it is now possible to grow cancer cells in extracellular matrix (ECM)-like environments and to dictate the biophysical and biochemical properties of the matrix. In addition, 3D models can be modified to recapitulate different stages of cancer progression, for instance from initial tumor development to metastasis. Inevitably, recapitulating a heterotypic condition comprising more than one cell type requires a more complex 3D model. To date, 3D models for studying prostate cancer (CaP)-bone interactions are still lacking. Therefore, the aim of this study was to establish a co-culture model that allows investigation of direct and indirect CaP-bone interactions. First, 3D polyethylene glycol (PEG)-based hydrogel cultures for CaP cells were developed and growth conditions were optimised. Characterization of the 3D hydrogel cultures shows that LNCaP cells form a multicellular mass that resembles an avascular tumor. Besides the difference in cell morphology, the response of LNCaP cells in 3D to stimulation with the androgen analogue R1881 differs from that of cells in 2D cultures. This discrepancy between 2D and 3D cultures is likely associated with cell-cell contact, density and ligand-receptor interactions. Following the 3D monoculture study, a 3D direct co-culture model of CaP cells and a human tissue-engineered bone construct (hTEBC) was developed.
Interactions between the CaP cells and human osteoblasts (hOBs) resulted in elevation of Matrix Metalloproteinase 9 (MMP9) for PC-3 cells and Prostate Specific Antigen (PSA) for LNCaP cells. To further investigate the paracrine interaction of CaP cells and hOBs, a 3D indirect co-culture model was developed, in which LNCaP cells embedded within PEG hydrogels were co-cultured with hTEBC. The cellular changes observed were found to reflect the early events of CaP colonizing the bone site. Interestingly, in the absence of androgens, up-regulation of PSA and other kallikreins was also detected in the co-culture compared to the LNCaP monoculture. This non-androgenic stimulation could be triggered by soluble factors secreted by the hOBs, such as Interleukin-6. There was also a previously undescribed decrease in alkaline phosphatase (ALP) activity and down-regulation of hOB genes when co-cultured with LNCaP cells; these genes include transforming growth factor β1 (TGFβ1), osteocalcin and Vimentin. However, no changes to epithelial markers (e.g. E-cadherin, Cytokeratin 8) were observed in either cell type in the co-culture. These intriguing, previously undescribed changes observed in the co-cultures enrich the basic knowledge of the CaP cell-bone interaction. This study demonstrates the feasibility and versatility of our established 3D models, which can be adapted to test various hypotheses pertaining to the underlying mechanisms of bone metastasis and could provide a vehicle for anticancer drug screening in the future.
Abstract:
The configuration of comprehensive Enterprise Systems to meet the specific requirements of an organisation still consumes significant resources. The consequences of failing implementation projects are severe and may even threaten the organisation's existence. This paper proposes a method which aims at increasing the efficiency of Enterprise Systems implementations. First, we argue that process modelling languages featuring different degrees of abstraction already exist for different user groups and are used for different purposes, which makes it necessary to integrate them; we describe how to do this using the meta models of the involved languages. Second, we motivate that an integrated process model based on the integrated meta model needs to be configurable, and elaborate on the mechanisms by which this model configuration can be achieved. We introduce a business example using SAP modelling techniques to illustrate the proposed method.
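The notion of a configurable process model can be sketched in a few lines. This is a deliberately reduced illustration, not the paper's meta-model-based mechanism: a reference model lists tasks, some marked optional, and a configuration keeps or drops the optional ones to derive an organisation-specific variant. The task names and the `configure` helper are invented.

```python
# Hedged sketch of model configuration: a reference process model with
# optional elements, and a configuration that switches them on or off.
# Task names and structure are hypothetical, for illustration only.

reference_model = [
    {"task": "check credit", "optional": True},
    {"task": "approve order", "optional": False},
    {"task": "export customs form", "optional": True},
]

def configure(model, enabled):
    """Keep mandatory tasks; keep optional tasks only if explicitly enabled."""
    return [e["task"] for e in model
            if not e["optional"] or e["task"] in enabled]

# A domestic-sales variant: credit check enabled, customs export dropped.
variant = configure(reference_model, enabled={"check credit"})
```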
Abstract:
In this paper, a recently introduced model-based method for precedent-free fault detection and isolation (FDI) is modified to deal with multiple input, multiple output (MIMO) systems and is applied to an automotive engine with exhaust gas recirculation (EGR) system. Using normal behavior data generated by a high fidelity engine simulation, the growing structure multiple model system (GSMMS) approach is used to construct dynamic models of normal behavior for the EGR system and its constituent subsystems. Using the GSMMS models as a foundation, anomalous behavior is detected whenever statistically significant departures of the most recent modeling residuals away from the modeling residuals displayed during normal behavior are observed. By reconnecting the anomaly detectors (ADs) to the constituent subsystems, EGR valve, cooler, and valve controller faults are isolated without the need for prior training using data corresponding to particular faulty system behaviors.
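The residual-based detection principle described above can be illustrated with a simplified sketch. This is not the GSMMS algorithm: it merely learns the mean and spread of modeling residuals under normal behaviour, then flags a recent window whose mean residual departs from the normal mean by more than k standard errors. The data, window sizes and threshold k are all invented.

```python
import statistics

# Illustrative residual-based anomaly detector in the spirit of the approach
# above (learn normal-residual statistics, flag significant departures).

def fit_normal_residuals(residuals):
    """Summarise residuals recorded during known-normal operation."""
    return statistics.mean(residuals), statistics.stdev(residuals)

def is_anomalous(window, mu, sigma, k=3.0):
    """Flag a window whose mean residual departs by more than k std errors."""
    m = statistics.mean(window)
    se = sigma / len(window) ** 0.5
    return abs(m - mu) > k * se

# Residuals of the normal-behaviour model during healthy operation (made up):
normal = [0.1, -0.2, 0.05, 0.0, -0.1, 0.15, -0.05, 0.02, -0.08, 0.12]
mu, sigma = fit_normal_residuals(normal)

healthy = is_anomalous([0.05, -0.1, 0.0, 0.1], mu, sigma)  # small residuals
faulty = is_anomalous([1.0, 1.2, 0.9, 1.1], mu, sigma)     # large residuals
```

Attaching one such detector per subsystem model is what allows fault isolation without training on faulty data, as the abstract notes.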
Abstract:
Because of increased competition between healthcare providers, higher customer expectations, stringent checks on insurance payments and new government regulations, it has become vital for healthcare organisations to enhance the quality of the care they provide, to increase efficiency, and to improve the cost-effectiveness of their services. Consequently, a number of quality management concepts and tools are employed in the healthcare domain to achieve the most efficient use of time, manpower, space and other resources. Emergency departments are designed to provide a high-quality medical service with immediate availability of resources to those in need of emergency care. The challenge of maintaining a smooth flow of patients in emergency departments is a global problem. This study attempts to improve patient flow in emergency departments by combining Lean techniques and Six Sigma methodology in a comprehensive conceptual framework. The proposed research develops a systematic approach that integrates Lean techniques with Six Sigma methodology to improve patient flow in emergency departments. The results reported in this paper are based on a standard questionnaire survey of 350 patients in the Emergency Department of Aseer Central Hospital in Saudi Arabia. The results of the study identified the most significant variables affecting patient satisfaction with patient flow, including waiting time during treatment in the emergency department, the effectiveness of the system when dealing with patients' complaints, and the layout of the emergency department. The proposed model will be developed into a performance evaluation metric based on these critical variables, to be evaluated in future work using fuzzy logic for continuous quality improvement.
Abstract:
Quality of experience (QoE) measures the overall perceived quality of mobile video delivery from subjective user experience and objective system performance. Current QoE computing models have two main limitations: 1) insufficient consideration of the factors influencing QoE; and 2) limited studies on QoE models for acceptability prediction. In this paper, a set of novel acceptability-based QoE models, denoted A-QoE, is proposed based on the results of comprehensive user studies on subjective quality acceptance assessments. The models are able to predict users' acceptability and pleasantness in various mobile video usage scenarios. Statistical regression analysis was used to build the models with a group of influencing factors as independent predictors, including encoding parameters and bitrate, video content characteristics, and mobile device display resolution. The performance of the proposed A-QoE models has been compared with three well-known objective video quality assessment metrics: PSNR, SSIM and VQM. The proposed A-QoE models have high prediction accuracy and usage flexibility. Future user-centred mobile video delivery systems can benefit from applying the proposed QoE-based management to optimize video coding and quality delivery decisions.
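The general shape of an acceptability model — a regression mapping influencing factors to a probability of user acceptance — can be sketched as follows. This is not the paper's fitted A-QoE model: the logistic form, the choice of predictors (bitrate and display height) and every coefficient are invented for illustration; the paper fits its models to subjective assessment data.

```python
import math

# Hedged illustration of an acceptability-style QoE predictor: a logistic
# regression from influencing factors to P(user finds the video acceptable).
# Coefficients b0, b_rate, b_res are made-up placeholders.

def acceptability(bitrate_kbps, display_height_px,
                  b0=-4.0, b_rate=0.004, b_res=0.002):
    z = b0 + b_rate * bitrate_kbps + b_res * display_height_px
    return 1.0 / (1.0 + math.exp(-z))   # logistic link keeps output in (0, 1)

low = acceptability(200, 480)     # starved bitrate -> low acceptance
high = acceptability(2000, 720)   # generous bitrate -> high acceptance
```

A delivery system could use such a curve to pick the lowest bitrate whose predicted acceptability clears a target threshold, which is the management use the abstract envisions.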
Abstract:
Commodity price modeling is normally approached in terms of structural time-series models, in which the different components (states) have a financial interpretation. The parameters of these models can be estimated using maximum likelihood. This approach results in a non-linear parameter estimation problem and thus a key issue is how to obtain reliable initial estimates. In this paper, we focus on the initial parameter estimation problem for the Schwartz-Smith two-factor model commonly used in asset valuation. We propose the use of a two-step method. The first step considers a univariate model based only on the spot price and uses a transfer function model to obtain initial estimates of the fundamental parameters. The second step uses the estimates obtained in the first step to initialize a re-parameterized state-space-innovations based estimator, which includes information related to future prices. The second step refines the estimates obtained in the first step and also gives estimates of the remaining parameters in the model. This paper is part tutorial in nature and gives an introduction to aspects of commodity price modeling and the associated parameter estimation problem.
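The two-factor model under discussion can be made concrete with a small simulation. In the Schwartz-Smith formulation, the log spot price is the sum of a mean-reverting short-term deviation and a long-term level following arithmetic Brownian motion. The Euler discretisation and all parameter values below are illustrative, not estimates produced by the paper's two-step method.

```python
import math
import random

# Sketch simulation of the Schwartz-Smith two-factor model:
#   ln S_t = chi_t + xi_t
#   chi: mean-reverting short-term deviation (Ornstein-Uhlenbeck)
#   xi:  long-term equilibrium log level (arithmetic Brownian motion)
# Parameter values are invented for illustration.

def simulate(n, dt=1 / 252, kappa=1.5, sigma_chi=0.3,
             mu=0.03, sigma_xi=0.15, chi0=0.0, xi0=math.log(60.0), seed=1):
    rng = random.Random(seed)
    chi, xi, prices = chi0, xi0, []
    for _ in range(n):
        chi += -kappa * chi * dt + sigma_chi * math.sqrt(dt) * rng.gauss(0, 1)
        xi += mu * dt + sigma_xi * math.sqrt(dt) * rng.gauss(0, 1)
        prices.append(math.exp(chi + xi))  # spot price recovered from states
    return prices

path = simulate(252)  # one simulated year of daily spot prices
```

Estimating (kappa, sigma_chi, mu, sigma_xi) from such a path is exactly the non-linear problem for which the paper's two-step initialisation is designed.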
Abstract:
The modern stage of development of automatic control and guidance systems for small reusable unmanned aerial vehicles (UAVs) is determined by advanced requirements for the autonomy, accuracy and size of these systems. The contradictory nature of these requirements dictates tight functional and algorithmic coupling of several different onboard sensors into one computational process based on methods of optimal filtering. Nowadays, data fusion of micro-electromechanical inertial measurement units, barometric pressure sensors, and signals of global navigation satellite system (GNSS) receivers is widely used in strapdown inertial navigation systems (INS). However, such systems do not fully comply with requirements for jamming immunity, fault tolerance, autonomy, and accuracy of navigation. At the same time, significant progress has recently been demonstrated by navigation systems that use the correlation-extremal principle applied to optical data flow and digital terrain maps. This article proposes a new architecture of an automatic navigation management system (ANMS) for small UAVs, which combines algorithms of strapdown INS, satellite navigation and an optical navigation system.
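The fusion principle behind the proposed architecture — an inertial prediction corrected by satellite or optical position fixes, weighted by their uncertainties — reduces, in one dimension, to the classic Kalman update. A real strapdown INS/GNSS filter is multidimensional with attitude and bias states; the sketch below shows only the scalar arithmetic, with invented numbers.

```python
# Minimal 1-D Kalman filter step illustrating INS/GNSS/optical data fusion:
# dead-reckon with an inertial increment, then correct with a position fix.

def kalman_step(x, p, u, q, z, r):
    """One predict/update cycle.
    x, p : position estimate and its variance
    u, q : inertial displacement increment and its variance (prediction)
    z, r : position fix (GNSS or optical) and its variance (measurement)
    """
    x_pred, p_pred = x + u, p + q        # INS dead-reckoning prediction
    k = p_pred / (p_pred + r)            # Kalman gain: trust ratio
    x_new = x_pred + k * (z - x_pred)    # blend prediction and measurement
    p_new = (1 - k) * p_pred             # fused estimate is more certain
    return x_new, p_new

x, p = 0.0, 1.0
x, p = kalman_step(x, p, u=1.0, q=0.1, z=1.2, r=0.4)
```

When GNSS is jammed, `r` grows large, `k` shrinks toward zero, and the filter coasts on the inertial prediction — the graceful degradation that motivates adding an optical channel as a second correction source.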
Abstract:
For sugar factories with cogeneration plants, major changes to the process stations have been undertaken to reduce the consumption of exhaust steam from the turbines and maximise the generated power. In many cases the process steam consumption has been reduced from greater than 52% on cane to ~40% on cane. The main changes have been to install additional evaporation area at the front of the set, operate the pan stages on vapour from the No 1 or No 2 effects, and undertake juice heating using vapour bleed from evaporators as far down the set as the penultimate stage. Operationally, one of the main challenges has been to develop a control system for the evaporators that addresses the objectives of juice processing rate (throughput) and steam economy, while producing syrup consistently at the required brix and providing an adequate and consistent vapour pressure for the pan stage operations. The cyclic demand for vapour by batch pans causes process disturbances through the evaporator set, and these must be regulated effectively to satisfy the above objectives for the evaporator station. The impact of the cyclic pan stage vapour demand has been modelled to define its effect on juice rate, steam economy, syrup brix and head-space pressures in the evaporators. Experiences with the control schemes used at Pioneer and Rocky Point Mills are discussed. For each factory the paper provides information on (a) the control system used, the philosophy behind it and the experiences in reaching the current system for control; (b) the performance of the control system in handling the disturbances imposed by the pan stage and operating within other constraints of the factory; and (c) deficiencies in the current system and plans for further improvements. Other processing changes to boost the performance of the evaporators are also discussed.
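The kind of disturbance described above — head-space pressure sagging when batch pans draw vapour, and a controller trimming steam supply to hold a setpoint — can be caricatured with a lumped first-order balance. This is not the mills' control scheme or a validated evaporator model; the square-wave demand, proportional law and all numbers are invented to show the effect of closing the loop.

```python
# Toy lumped model: vessel pressure integrates the imbalance between steam
# supplied and vapour drawn by the pans; a proportional controller trims
# steam to hold the pressure setpoint. All dynamics/values are illustrative.

def simulate(kp, steps=200, dt=1.0, setpoint=100.0, tau=20.0):
    p, trace = setpoint, []
    for t in range(steps):
        demand = 8.0 if (t // 50) % 2 else 2.0   # cyclic batch-pan vapour draw
        steam = 5.0 + kp * (setpoint - p)        # proportional steam trim
        p += dt / tau * (steam - demand)         # lumped pressure balance
        trace.append(p)
    return trace

open_loop = simulate(kp=0.0)     # fixed steam supply, no regulation
closed_loop = simulate(kp=10.0)  # proportional regulation

def worst_error(trace, setpoint=100.0):
    return max(abs(v - setpoint) for v in trace)
```

Even this caricature reproduces the operational point in the abstract: without regulation the cyclic pan demand swings the head-space pressure widely, while feedback holds it near setpoint at the cost of modulating the steam draw.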
Abstract:
This paper reviews the use of multi-agent systems (MAS) to model the impacts of high levels of photovoltaic (PV) system penetration in distribution networks and presents some preliminary data obtained from the Perth Solar City high-penetration PV trial. The Perth Solar City trial consists of a low voltage distribution feeder supplying 75 customers, 29 of whom have rooftop photovoltaic systems. Data is collected from smart meters at each consumer's premises, from data loggers on the low voltage (LV) side of the transformer, and from a nearby distribution network SCADA measurement point on the high voltage (HV) side of the transformer. The data will be used to progressively develop MAS models.
Abstract:
For future planetary robot missions, multi-robot systems can be considered a suitable platform to perform space missions faster and more reliably. In heterogeneous robot teams, each robot can have different abilities and sensor equipment. In this paper we describe a lunar demonstration scenario in which a team of mobile robots explores an unknown area and identifies a set of objects belonging to a lunar infrastructure. Our robot team consists of two exploring scout robots and a mobile manipulator. The mission goal is to locate the objects within a certain area, to identify them, and to transport them to a base station. The robots have different sensor setups and different capabilities. In order to classify parts of the lunar infrastructure, the robots have to share knowledge about the objects. Based on the different sensing capabilities, several information modalities have to be shared and combined by the robots. In this work we propose an approach using spatial features and fuzzy logic based reasoning for distributed object classification.
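The flavour of fuzzy-logic reasoning over shared features can be shown with a toy rule. This is not the paper's classifier: the feature choice (estimated object height from one robot, a colour-match score from another), the triangular membership shape and the single rule are all invented to illustrate how graded evidence from different robots combines.

```python
# Toy fuzzy classification rule combining features from different robots.
# Membership shape, feature names and the rule are hypothetical.

def tri(x, a, b, c):
    """Triangular membership function: 0 at a and c, peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def antenna_score(height_m, colour_match):
    """Rule: object is an 'antenna' IF height is tall AND colour matches.
    height_m comes from a scout's range sensor, colour_match in [0, 1]
    from another robot's camera; fuzzy AND is taken as the minimum."""
    tall = tri(height_m, 1.0, 2.0, 3.0)
    return min(tall, colour_match)

likely = antenna_score(2.1, 0.9)    # tall object, good colour match
unlikely = antenna_score(0.4, 0.9)  # short object, colour alone not enough
```

Because each robot contributes a membership degree rather than a hard label, a robot with weak sensing still adds graded evidence instead of vetoing the team's decision.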
Abstract:
We used event-related functional magnetic resonance imaging (fMRI) to investigate neural responses associated with the semantic interference (SI) effect in the picture-word task. Independent stage models of word production assume that the locus of the SI effect is at the conceptual processing level (Levelt et al. [1999]: Behav Brain Sci 22:1-75), whereas interactive models postulate that it occurs at phonological retrieval (Starreveld and La Heij [1996]: J Exp Psychol Learn Mem Cogn 22:896-918). In both types of model, resolution of the SI effect occurs as a result of competitive, spreading activation without the involvement of inhibitory links. These assumptions were tested by randomly presenting participants with trials from semantically related and lexical control distractor conditions and acquiring image volumes coincident with the estimated peak hemodynamic response for each trial. Overt vocalization of picture names occurred in the absence of scanner noise, allowing reaction time (RT) data to be collected. Analysis of the RT data confirmed the SI effect. Regions showing differential hemodynamic responses during the SI effect included the left mid section of the middle temporal gyrus, left posterior superior temporal gyrus, left anterior cingulate cortex, and bilateral orbitomedial prefrontal cortex. Additional responses were observed in the frontal eye fields, left inferior parietal lobule, and right anterior temporal and occipital cortex. The results are interpreted as indirectly supporting interactive models that allow spreading activation between both conceptual processing and phonological retrieval levels of word production. In addition, the data confirm that selective attention/response suppression has a role in resolving the SI effect, similar to the way in which Stroop interference is resolved.
We conclude that neuroimaging studies can provide information about the neuroanatomical organization of the lexical system that may prove useful for constraining theoretical models of word production.
Design and testing of stand-specific bucking instructions for use on modern cut-to-length harvesters
Abstract:
This study addresses three important issues in tree bucking optimization in the context of cut-to-length harvesting. (1) Would the fit between the log demand and log output distributions be better if the price and/or demand matrices controlling the bucking decisions on modern cut-to-length harvesters were adjusted to the unique conditions of each individual stand? (2) In what ways can we generate stand- and product-specific price and demand matrices? (3) What alternatives do we have to measure the fit between the log demand and log output distributions, and what would be an ideal goodness-of-fit measure? Three iterative search systems were developed for seeking stand-specific price and demand matrix sets: (1) a fuzzy logic control system for calibrating the price matrix of one log product for one stand at a time (the stand-level one-product approach); (2) a genetic algorithm system for adjusting the price matrices of one log product in parallel for several stands (the forest-level one-product approach); and (3) a genetic algorithm system for dividing the overall demand matrix of each of several log products into stand-specific sub-demands simultaneously for several stands and products (the forest-level multi-product approach). The stem material used for testing the performance of the stand-specific price and demand matrices against that of the reference matrices comprised 9 155 Norway spruce (Picea abies (L.) Karst.) sawlog stems gathered by harvesters from 15 mature spruce-dominated stands in southern Finland. The reference price and demand matrices were either direct copies or slightly modified versions of those used by two Finnish sawmilling companies. Two types of stand-specific bucking matrices were compiled for each log product: one from the harvester-collected stem profiles and the other from the pre-harvest inventory data.
Four goodness-of-fit measures were analyzed for their appropriateness in determining the similarity between the log demand and log output distributions: (1) the apportionment degree (index), (2) the chi-square statistic, (3) the Laspeyres quantity index, and (4) the price-weighted apportionment degree. The study confirmed that any improvement in the fit between the log demand and log output distributions can only be realized at the expense of log volumes produced. Stand-level pre-control of price matrices was found to be advantageous, provided the control is done with perfect stem data. Forest-level pre-control of price matrices resulted in no improvement in the cumulative apportionment degree. Cutting stands under the control of stand-specific demand matrices yielded a better total fit between the demand and output matrices at the forest level than was obtained by cutting each stand with non-stand-specific reference matrices. The theoretical and experimental analyses suggest that none of the three alternative goodness-of-fit measures clearly outperforms the traditional apportionment degree measure.
Keywords: harvesting, tree bucking optimization, simulation, fuzzy control, genetic algorithms, goodness-of-fit
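The traditional apportionment degree referred to above admits a compact computation under a common formulation (the overlap of the two distributions after normalising both to proportions); the sketch below assumes that formulation and flattens the matrices to simple lists, so it is an illustration rather than the study's exact implementation.

```python
# Apportionment degree between a log demand distribution and a log output
# distribution, under the common overlap formulation: normalise both to
# proportions and sum the element-wise minima. 1.0 means a perfect fit.
# The flattened-list representation and example numbers are illustrative.

def apportionment_degree(demand, output):
    sd, so = sum(demand), sum(output)
    return sum(min(d / sd, o / so) for d, o in zip(demand, output))

# Same proportions in every length/diameter class -> perfect fit:
perfect = apportionment_degree([10, 30, 60], [1, 3, 6])
# Output proportions reversed relative to demand -> partial fit:
partial = apportionment_degree([10, 30, 60], [60, 30, 10])
```

Because the measure compares proportions only, a harvester can score well while cutting far less volume than demanded, which is one reason the study also examines price-weighted and quantity-index alternatives.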