881 results for Interaction modeling. Model-based development. Interaction evaluation.
Abstract:
Hard real-time systems are a class of computer control systems that must react to demands of their environment by providing `correct' and timely responses. Since these systems are increasingly used in applications with safety implications, it is crucial that they are designed and developed to operate correctly. This thesis is concerned with developing formal techniques that allow the specification, verification and design of hard real-time systems. Formal techniques for hard real-time systems must be capable of capturing the system's functional and performance requirements, and previous work has proposed a number of techniques which range from the mathematically intensive to those with some mathematical content. This thesis develops formal techniques that contain both an informal and a formal component, because the informality provides ease of understanding while the formality allows precise specification and verification. Specifically, the combination of Petri nets and temporal logic is considered for the specification and verification of hard real-time systems. Approaches that combine Petri nets and temporal logic by allowing a consistent translation between the formalisms are examined. Previously, such techniques have been applied to the formal analysis of concurrent systems. This thesis adapts these techniques for use in the modelling, design and formal analysis of hard real-time systems. The techniques are applied to the problem of specifying a controller for a high-speed manufacturing system. It is shown that they can be used to prove liveness and safety properties, including qualitative aspects of system performance. The problem of verifying quantitative real-time properties is addressed by developing a further technique which combines the formalisms of timed Petri nets and real-time temporal logic. A unifying feature of these techniques is the common temporal description of the Petri net.
A common problem with Petri-net-based techniques is the complexity associated with generating the reachability graph. This thesis addresses the problem by using concurrency sets to generate a partial reachability graph pertaining to a particular state. These sets also allow each state to be checked for the presence of inconsistencies and hazards. The problem of designing a controller for the high-speed manufacturing system is also considered. The approach adopted involves the use of a model-based controller: this type of controller uses the Petri net models developed, thus preserving the properties already proven of the controller. It also contains a model of the physical system which is synchronised to the real application to provide timely responses. The various ways of forming the synchronisation between these processes are considered and the resulting nets are analysed using concurrency sets.
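The reachability-graph bottleneck described above can be made concrete with a toy sketch (my own illustration, not the thesis's concurrency-set technique): even naive breadth-first generation of the full graph for a small place/transition net shows how quickly markings are enumerated, and why a partial graph restricted to one state of interest is attractive.

```python
# Toy place/transition Petri net: exhaustive reachability-graph generation.
# (Illustrative only; the thesis uses concurrency sets to avoid building
# the full graph.)
from collections import deque

def fire(marking, pre, post):
    """Fire a transition if enabled; return the successor marking or None."""
    if all(marking.get(p, 0) >= n for p, n in pre.items()):
        new = dict(marking)
        for p, n in pre.items():
            new[p] = new.get(p, 0) - n
        for p, n in post.items():
            new[p] = new.get(p, 0) + n
        return {p: n for p, n in new.items() if n > 0}  # drop empty places
    return None

def reachability_graph(initial, transitions):
    """BFS over markings; `transitions` maps a name to (pre, post) dicts."""
    seen = {frozenset(initial.items())}
    graph, queue = [], deque([initial])
    while queue:
        m = queue.popleft()
        for name, (pre, post) in transitions.items():
            succ = fire(m, pre, post)
            if succ is not None:
                graph.append((m, name, succ))
                key = frozenset(succ.items())
                if key not in seen:
                    seen.add(key)
                    queue.append(succ)
    return graph, seen

# Hypothetical single-machine line with a two-slot output buffer.
net = {
    "load":   ({"idle": 1}, {"busy": 1}),
    "unload": ({"busy": 1, "space": 1}, {"idle": 1, "buffer": 1}),
    "ship":   ({"buffer": 1}, {"space": 1}),
}
edges, states = reachability_graph({"idle": 1, "space": 2}, net)
```

Even this three-transition net yields six distinct markings and nine graph edges; real controllers multiply this combinatorially, which is the motivation for partial-graph methods.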
Abstract:
This study highlights the variables associated with the implementation of renewable energy (RE) projects for sustainable development in India, using an interpretive structural modeling (ISM) based approach to model the interactions among the variables that impact RE adoption. These variables have been categorized as enablers that help to enhance implementation of RE projects for sustainable development. A major finding is that public awareness regarding RE for sustainable development is a very significant enabler. For successful implementation of RE projects, it has been observed that top management should focus on improving the high-driving-power enablers (leadership, strategic planning, public awareness, management commitment, availability of finance, government support, and support from interest groups).
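The ISM mechanics behind "driving power" can be sketched as follows (the enabler list and influence matrix are invented for illustration, not the study's data): the final reachability matrix is the Boolean transitive closure of the direct-influence matrix, and an enabler's driving power is its row sum.

```python
# Hedged ISM sketch: transitive closure of a direct-influence matrix,
# then driving power (row sums) to rank enablers.
def transitive_closure(m):
    """Warshall's algorithm on a 0/1 adjacency matrix (lists of lists)."""
    n = len(m)
    r = [row[:] for row in m]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if r[i][k] and r[k][j]:
                    r[i][j] = 1
    return r

enablers = ["public awareness", "government support", "finance", "RE adoption"]
direct = [
    [1, 0, 0, 1],   # public awareness directly drives RE adoption
    [0, 1, 1, 0],   # government support drives availability of finance
    [0, 0, 1, 1],   # finance drives RE adoption
    [0, 0, 0, 1],   # RE adoption (reflexive entry only)
]
final = transitive_closure(direct)
driving = {e: sum(row) for e, row in zip(enablers, final)}
```

Here government support acquires an indirect link to RE adoption through finance, raising its driving power; in ISM, high-driving-power, low-dependence variables land at the base of the hierarchy.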
Abstract:
Liquid-liquid extraction has long been known as a unit operation that plays an important role in industry. The process is well known for its complexity and sensitivity to operating conditions. This thesis presents an attempt to explore the dynamics and control of this process using a systematic approach and state-of-the-art control system design techniques. The process was first studied experimentally under carefully selected operating conditions, which resemble the ranges employed practically under stable and efficient conditions. Data were collected at steady-state conditions using adequate sampling techniques for the dispersed and continuous phases, as well as during the transients of the column, with the aid of a computer-based online data logging system and online concentration analysis. A stagewise single-stage backflow model was improved to mimic the dynamic operation of the column. The developed model accounts for the variation in hydrodynamics, mass transfer, and physical properties throughout the length of the column. End effects were treated by addition of stages at the column entrances. Two parameters were incorporated in the model, namely: a mass transfer weight factor, to correct for the assumption of no mass transfer in the settling zones at each stage, and backmixing coefficients, to handle the axial dispersion phenomena encountered in the course of column operation. The parameters were estimated by minimizing the differences between the experimental and the model-predicted concentration profiles at steady-state conditions using a non-linear optimisation technique. The estimated values were then correlated as functions of operating parameters and were incorporated in the model equations. The model equations comprise a stiff differential-algebraic system. This system was solved using the GEAR ODE solver. The calculated concentration profiles were compared to those experimentally measured.
A very good agreement of the two profiles was achieved, within a percent relative error of ±2.5%. The developed rigorous dynamic model of the extraction column was used to derive linear time-invariant reduced-order models that relate the input variables (agitator speed, solvent feed flowrate and concentration, feed concentration and flowrate) to the output variables (raffinate concentration and extract concentration) using the asymptotic method of system identification. The reduced-order models were shown to be accurate in capturing the dynamic behaviour of the process, with a maximum modelling prediction error of 1%. The simplicity and accuracy of the derived reduced-order models allow for control system design and analysis of such complicated processes. The extraction column is a typical multivariable process, with agitator speed and solvent feed flowrate considered as manipulated variables, raffinate concentration and extract concentration as controlled variables, and the feed concentration and feed flowrate as disturbance variables. The control system design of the extraction process was tackled as a multi-loop decentralised SISO (Single Input Single Output) as well as a centralised MIMO (Multi-Input Multi-Output) system, using both conventional and model-based control techniques such as IMC (Internal Model Control) and MPC (Model Predictive Control). Control performance of each control scheme was studied in terms of stability, speed of response, sensitivity to modelling errors (robustness), setpoint tracking capabilities and load rejection. For decentralised control, multiple loops were assigned to pair each manipulated variable with each controlled variable according to the interaction analysis and other pairing criteria such as the relative gain array (RGA) and singular value decomposition (SVD). The loops rotor speed-raffinate concentration and solvent flowrate-extract concentration showed weak interaction.
Multivariable MPC showed more effective performance than the conventional techniques, since it accounts for loop interactions, time delays, and constraints on the input and output variables.
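The RGA-based pairing step mentioned above can be sketched for a 2x2 process (the steady-state gains below are invented, not the thesis's identified values); a diagonal relative gain near 1 indicates that the diagonal pairing has weak loop interaction.

```python
# Relative gain array for a 2x2 steady-state gain matrix (illustrative gains).
def rga_2x2(g11, g12, g21, g22):
    """RGA of [[g11, g12], [g21, g22]]; lambda_11 near 1 favours
    the u1-y1 / u2-y2 pairing (weak interaction)."""
    lam = 1.0 / (1.0 - (g12 * g21) / (g11 * g22))
    return [[lam, 1.0 - lam], [1.0 - lam, lam]]

# Hypothetical gains: rotor speed / solvent flowrate as inputs,
# raffinate / extract concentration as outputs.
rga = rga_2x2(2.0, 0.3, 0.4, 1.5)
```

With these numbers the diagonal relative gain is about 1.04, which would support pairing rotor speed with raffinate concentration and solvent flowrate with extract concentration, consistent with the weak interaction reported above.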
Abstract:
The thesis investigates the properties of two trends or time series which formed a part of the co-citation bibliometric model "X-Ray Crystallography and Protein Determination in 1978, 1980 and 1982". This model was one of several created for the 1983 ABRC Science Policy Study, which aimed to test the utility of bibliometric models in a national science policy context. The outcome of the validation part of that study proved to be especially favourable concerning the utility of trend data, which purport to model the development of speciality areas in science over time. This assessment could have important implications for the use of such data in policy formulation. However, one possible problem with the Science Policy Study's conclusions was that insufficient time was available in the study for an in-depth analysis of the data. The thesis aims to continue the validation begun in the ABRC study by providing a detailed examination of the characteristics of the data contained in the trends numbered 11 and 44 in the model. A novel methodology for the analysis of the properties of the trends with respect to their literature content is presented. This is followed by an assessment, based on questionnaire and interview data, of the ability of Trend 44 to realistically model the historical development of the field of mobile genetic elements research over time, with respect to its scientific content and the activities of its community of researchers. The results of these various analyses are then used to evaluate the strengths and weaknesses of a trend or time series approach to the modelling of the activities of scientific fields. A critical evaluation of the origins of the discovered strengths and weaknesses in the assumptions underlying the techniques used to generate trends from co-citation data is provided. Possible improvements to the modelling techniques are discussed.
Abstract:
The research examines the deposition of airborne particles which contain heavy metals and investigates the methods that can be used to identify their sources. The research focuses on lead and cadmium because these two metals are of growing public and scientific concern on environmental health grounds. The research consists of three distinct parts. The first is the development and evaluation of a new deposition measurement instrument - the deposit cannister - designed specifically for large-scale surveys in urban areas. The deposit cannister is designed to be cheap, robust, and versatile, and therefore to permit comprehensive high-density urban surveys. The siting policy reduces contamination from locally resuspended surface dust. The second part of the research involved detailed surveys of heavy metal deposition in Walsall, West Midlands, using the new high-density measurement method. The main survey, conducted over a six-week period in November - December 1982, provided 30-day samples of deposition at 250 different sites. The results have been used to examine the magnitude and spatial variability of deposition rates in the case-study area, and to evaluate the performance of the measurement method. The third part of the research was a 'source-identification' exercise. The methods used were receptor models - factor analysis and cluster analysis - and a predictive source-based deposition model. The results indicate that there are six main source processes contributing to deposition of metals in the Walsall area: coal combustion, vehicle emissions, ironfounding, copper refining and two general industrial/urban processes. A source-based deposition model has been calibrated using factor scores for one source factor as the dependent variable, rather than metal deposition rates, thus avoiding problems traditionally encountered in calibrating models in complex multi-source areas.
Empirical evidence supports the hypothesised association of this factor with emissions of metals from the ironfoundry industry.
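The calibration idea above can be shown in a minimal sketch (all numbers invented): a model prediction is regressed against the factor scores for a single source factor by ordinary least squares, rather than against raw metal deposition rates.

```python
# Ordinary least squares for y = a*x + b, stdlib only (illustrative data).
def fit_line(x, y):
    """Return slope a and intercept b minimising squared error."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

scores = [0.1, 0.5, 0.9, 1.3]      # hypothetical ironfoundry factor scores
predicted = [0.7, 1.5, 2.3, 3.1]   # hypothetical model output at each site
a, b = fit_line(scores, predicted)
```

Using one factor's scores as the dependent variable isolates a single source process, which is what sidesteps the collinearity of many overlapping sources in a multi-source urban area.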
Abstract:
Prior to the development of a production-standard control system for ML Aviation's plan-symmetric remotely piloted helicopter system, SPRITE, optimum solutions to technical requirements had yet to be found for some aspects of the work. This thesis describes an industrial project where solutions to real problems have been provided within strict timescale constraints. Use has been made of published material wherever appropriate; new solutions have been contributed where none existed previously. A lack of clearly defined user requirements from potential Remotely Piloted Air Vehicle (RPAV) system users is identified. A simulation package is defined to enable the RPAV designer to progress with air vehicle and control system design, development and evaluation studies, and to assist the user to investigate his applications. The theoretical basis of this simulation package is developed, including Co-axial Contra-rotating Twin Rotor (CCTR), six degrees of freedom motion, fuselage aerodynamics and sensor and control system models. A compatible system of equations is derived for modelling a miniature plan-symmetric helicopter. Rigorous searches revealed a lack of CCTR models, based on closed form expressions to obviate integration along the rotor blade, for stabilisation and navigation studies through simulation. An economic CCTR simulation model is developed and validated by comparison with published work and practical tests. Confusion in published work between attitude and Euler angles is clarified. The implementation of the theory into a high-integrity software package is discussed. Use is made of a novel technique basing the dynamic adjustment of the integration time step size on error assessment. Simulation output is presented for studies of control system stability verification, cross-coupling of motion between control channels, and air vehicle response to demands and horizontal wind gusts.
Keywords: Contra-Rotating Twin Rotor; Flight Control System; Remotely Piloted; Plan-Symmetric Helicopter; Simulation; Six Degrees of Freedom Motion
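The error-based time-step adjustment mentioned in the abstract can be illustrated with a generic sketch (my own minimal version using Euler step doubling, not the thesis's algorithm): the local error is estimated by comparing one full step with two half steps, and the step size shrinks or grows accordingly.

```python
# Adaptive step-size integration via step doubling (illustrative sketch).
def euler_step(f, t, x, h):
    """One explicit Euler step for dx/dt = f(t, x)."""
    return x + h * f(t, x)

def adaptive_integrate(f, t0, x0, t_end, h=0.1, tol=1e-4):
    """Integrate to t_end, halving h when the step-doubling error
    estimate exceeds tol and growing h when it is comfortably small."""
    t, x = t0, x0
    while t < t_end:
        h = min(h, t_end - t)
        full = euler_step(f, t, x, h)
        half = euler_step(f, t + h / 2, euler_step(f, t, x, h / 2), h / 2)
        err = abs(half - full)          # local error estimate
        if err > tol:
            h /= 2                      # reject the step and shrink
            continue
        t, x = t + h, half              # accept the more accurate value
        if err < tol / 4:
            h *= 2                      # error comfortably small: grow
    return x

x_final = adaptive_integrate(lambda t, x: -x, 0.0, 1.0, 1.0)
```

For dx/dt = -x from x(0) = 1, the result approximates e^(-1) while spending small steps only where the error estimate demands them.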
Abstract:
The diffusion and convection of a solute suspended in a fluid across porous membranes are known to be reduced compared to those in a bulk solution, owing to the fluid mechanical interaction between the solute and the pore wall as well as steric restriction. If the solute and the pore wall are electrically charged, the electrostatic interaction between them could affect the hindrance to diffusion and convection. In this study, the transport of charged spherical solutes through charged circular cylindrical pores filled with an electrolyte solution containing small ions was studied numerically by using a fluid mechanical and electrostatic model. Based on a mean field theory, the electrostatic interaction energy between the solute and the pore wall was estimated from the Poisson-Boltzmann equation, and the charge effect on the solute transport was examined for a solute and pore wall of like charge. The results were compared with those obtained from the linearized form of the Poisson-Boltzmann equation, i.e. the Debye-Hückel equation. © 2012 The Japan Society of Fluid Mechanics and IOP Publishing Ltd.
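The linearization referred to above is the standard small-potential approximation: writing $\psi$ for the electrostatic potential scaled by $k_B T / z e$ and $\kappa$ for the inverse Debye length, the Poisson-Boltzmann equation for a symmetric electrolyte reduces to the Debye-Hückel equation when $|\psi| \ll 1$, since then $\sinh\psi \approx \psi$:

```latex
\nabla^{2}\psi = \kappa^{2}\sinh\psi
\qquad\longrightarrow\qquad
\nabla^{2}\psi \approx \kappa^{2}\psi
\quad (|\psi| \ll 1)
```

Comparing solutions of the two forms, as the study does, quantifies where the linearized theory under- or over-estimates the interaction energy for like-charged solute and pore wall.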
Abstract:
Sentiment analysis, or opinion mining, aims to use automated tools to detect subjective information such as opinions, attitudes, and feelings expressed in text. This paper proposes a novel probabilistic modeling framework based on Latent Dirichlet Allocation (LDA), called the joint sentiment/topic model (JST), which detects sentiment and topic simultaneously from text. Unlike other machine learning approaches to sentiment classification, which often require labeled corpora for classifier training, the proposed JST model is fully unsupervised. The model has been evaluated on the movie review dataset to classify review sentiment polarity, and minimal prior information has also been explored to further improve the sentiment classification accuracy. Preliminary experiments have shown promising results achieved by JST.
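A toy sketch of the JST generative story (distributions invented; the real model places Dirichlet priors on them and is fitted by Gibbs sampling, which is omitted here): each word is generated by first drawing a sentiment label, then a topic conditioned on that sentiment, then a word from the sentiment-topic word distribution; this extra sentiment layer is what distinguishes JST from plain LDA.

```python
# Toy JST-style generative draw: sentiment -> topic -> word (illustrative).
import random

def generate_word(pi, theta, phi, rng):
    """Draw (sentiment, topic, word) ids: s ~ pi, z ~ theta[s],
    w ~ phi[s][z]."""
    s = rng.choices(range(len(pi)), weights=pi)[0]
    z = rng.choices(range(len(theta[s])), weights=theta[s])[0]
    w = rng.choices(range(len(phi[s][z])), weights=phi[s][z])[0]
    return s, z, w

rng = random.Random(0)
pi = [0.6, 0.4]                       # P(sentiment) for one document
theta = [[0.7, 0.3], [0.2, 0.8]]      # P(topic | sentiment)
phi = [[[0.8, 0.2], [0.5, 0.5]],      # P(word | sentiment, topic)
       [[0.1, 0.9], [0.4, 0.6]]]
s, z, w = generate_word(pi, theta, phi, rng)
```

Inverting this process from observed words alone, with no sentiment labels, is what makes the model fully unsupervised.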
Abstract:
The Teallach project has adapted model-based user-interface development techniques to the systematic creation of user-interfaces for object-oriented database applications. Model-based approaches aim to provide designers with a more principled approach to user-interface development using a variety of underlying models, and tools which manipulate these models. Here we present the results of the Teallach project, describing the tools developed and the flexible design method supported. Distinctive features of the Teallach system include provision of database-specific constructs, comprehensive facilities for relating the different models, and support for a flexible design method in which models can be constructed and related by designers in different orders and in different ways, to suit their particular design rationales. The system then creates the desired user-interface as an independent, fully functional Java application, with automatically generated help facilities.
Abstract:
Background: Coronary heart disease (CHD) is a public health priority in the UK. The National Service Framework (NSF) has set standards for the prevention, diagnosis and treatment of CHD, which include the use of cholesterol-lowering agents aimed at achieving targets of blood total cholesterol (TC) < 5.0 mmol/L and low density lipoprotein-cholesterol (LDL-C) < 3.0 mmol/L. In order to achieve these targets cost effectively, prescribers need to make an informed choice from the range of statins available. Aim: To estimate the average and relative cost effectiveness of atorvastatin, fluvastatin, pravastatin and simvastatin in achieving the NSF LDL-C and TC targets. Design: Model-based economic evaluation. Methods: An economic model was constructed to estimate the number of patients achieving the NSF targets for LDL-C and TC at each dose of statin, and to calculate the average drug cost and incremental drug cost per patient achieving the target levels. The population baseline LDL-C and TC, and drug efficacy and drug costs were taken from previously published data. Estimates of the distribution of patients receiving each dose of statin were derived from the UK national DIN-LINK database. Results: The estimated annual drug cost per 1000 patients treated was £289 000 with atorvastatin, £315 000 with simvastatin, £333 000 with pravastatin and £167 000 with fluvastatin. The percentages of patients achieving target were 74.4%, 46.4%, 28.4% and 13.2% for atorvastatin, simvastatin, pravastatin and fluvastatin, respectively. Incremental drug costs per extra patient treated to LDL-C and TC targets compared with fluvastatin were £198 and £226 for atorvastatin, £443 and £567 for simvastatin, and £1089 and £2298 for pravastatin, using 2002 drug costs. Conclusions: As a result of its superior efficacy, atorvastatin generates a favourable cost-effectiveness profile as measured by drug cost per patient treated to LDL-C and TC targets.
For a given drug budget, more patients would achieve NSF LDL-C and TC targets with atorvastatin than with any of the other statins examined.
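The incremental figures above can be approximately reproduced from the abstract's rounded per-1000-patient costs and success percentages (the small difference from the reported £198 presumably reflects rounding of the published inputs):

```python
# Incremental drug cost per extra patient reaching target, drug A vs drug B.
def incremental_cost_per_extra_patient(cost_a, pct_a, cost_b, pct_b, n=1000):
    """Extra annual drug cost divided by extra patients reaching target,
    per n patients treated."""
    extra_cost = cost_a - cost_b
    extra_patients_to_target = (pct_a - pct_b) / 100.0 * n
    return extra_cost / extra_patients_to_target

# Atorvastatin vs fluvastatin, LDL-C target (figures from the abstract).
icer = incremental_cost_per_extra_patient(289_000, 74.4, 167_000, 13.2)
# roughly GBP 199 per extra patient to target (abstract reports GBP 198)
```

The same calculation with simvastatin's or pravastatin's figures reproduces the ordering reported above, which is why atorvastatin dominates on this measure despite not being the cheapest drug.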
Abstract:
Our aim is to provide a molecular understanding of the mechanisms underlying (i) the interaction between the two nucleotide binding domains (NBDs) and (ii) the coupling between NBDs and transmembrane domains within P-glycoprotein (Pgp) during a transport cycle. To facilitate this, we have introduced a number of unique cysteine residues at surface-exposed positions (E393C, S452C, I500C, N508C, and K578C) in the N-terminal NBD of Pgp, which had previously been engineered to remove endogenous cysteines. Positions of the mutations were designed using a model based on crystallographic features of prokaryotic NBDs. The single-cysteine mutants were expressed in insect cells using recombinant baculovirus and the proteins purified by metal affinity chromatography by virtue of a polyhistidine tag. None of the introduced cysteine residues perturbed the function of Pgp, as judged by the characteristics of drug-stimulated ATP hydrolysis. The role of residues at each of the introduced sites in the catalytic cycle of Pgp was investigated by the effect of covalent conjugation with N-ethylmaleimide (NEM). All but one of the mutations (K578C) were accessible to labeling with [3H]-NEM. However, perturbation of ATPase activity was only observed for the derivatized N508C isoform. The principal functional manifestation was a marked inhibition of the "basal" rate of ATP hydrolysis. Neither the extent nor the potency with which a range of drugs could affect the ATPase activity was altered in the NEM-conjugated N508C isoform. The results imply that the accessibility of residue 508, located in the alpha-helical subdomain of NBD1 in Pgp, is altered by the conformational changes that occur during ATP hydrolysis.
Abstract:
The notion of a model of development and distribution of software (MDDS) is introduced, and its role in the efficiency of software products is stressed. Two classical MDDS are presented, and some attempts to adapt them to contemporary trends in web-based software design are described. Advantages and shortcomings of the resulting models are outlined. In conclusion, the desired features of a better MDDS for web-based solutions are given.
Abstract:
The purpose of the current paper is to present a methodology for viable-model-based enterprise management, which modern enterprises need in order to survive and grow in the information age. The approach is based on Beer's viable system model and uses it as a basis for information technology implementation and development. The enterprise is viewed as a cybernetic system whose functioning is governed by the same rules as those of any living system.
Abstract:
* The presented work was discussed at KDS-2003. It has been corrected in compliance with the remarks and requests of the participants.
Abstract:
In the current paper we first give a short introduction to e-learning platforms and review the case of the e-class open e-learning platform used by the Greek tertiary education sector. Our analysis includes strategic selection issues and outcomes in general, and operational and adoption issues in the case of the Technological Educational Institute (TEI) of Larissa, Greece. The methodology is based on qualitative analysis of interviews with key actors using the platform, and statistical analysis of quantitative data related to adoption and usage in the relevant populations. The author has been a key actor in all stages and describes his insights as an early adopter, diffuser and innovative user. We try to explain the issues under consideration using existing research outcomes, and we arrive at some conclusions and points for further research.