960 results for IBM PC
Abstract:
No. 467 - accompanied by a 5 1/4 in. computer disk.
Abstract:
Operators can become confused while diagnosing faults in a process plant that is in operation, and this may prevent remedial actions from being taken before hazardous consequences occur. The work in this thesis proposes a method to aid plant operators in systematically finding the causes of any fault in the process plant. A computer-aided fault diagnosis package has been developed for use on the widely available IBM PC compatible microcomputer. The program displays a coloured diagram of a fault tree on the VDU of the microcomputer, so that the operator can see the link between the fault and its causes. The consequences and the causes of the fault are also shown, to provide a warning of what may happen if the fault is not remedied. The cause and effect data needed by the package are obtained from a hazard and operability (HAZOP) study on the process plant. The result of the HAZOP study is recorded as cause and symptom equations, which are translated into a data structure and stored in the computer as a file for the package to access. Probability values are assigned to the events that constitute the basic causes of any deviation. From these probability values, the a priori probabilities of occurrence of the other events are evaluated. A top-down recursive algorithm, called TDRA, has been developed for evaluating the probability of every event in a fault tree. From the a priori probabilities, the conditional probabilities of the causes of the fault are then evaluated using Bayes' conditional probability theorem. The posterior probability values can then be used by the operators to check the causes of the fault in an orderly manner. The package has been tested using the results of a HAZOP study on a pilot distillation plant. The results of the test show how easy it is to trace the chain of events that leads to the primary cause of a fault. The method could be applied in a real process environment.
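The idea of the abstract above can be sketched in a few lines: evaluate event probabilities top-down through a fault tree of AND/OR gates, then apply Bayes' theorem to rank candidate causes given that the top-level fault has occurred. This is a minimal illustration assuming independent basic events; the gate structure, event names and probability values are invented for the example and are not from the thesis.

```python
# Hypothetical sketch of top-down recursive fault-tree evaluation (in the
# spirit of TDRA) plus Bayesian ranking of causes. Basic events are assumed
# independent; the tree below is illustrative only.

def prob(node, basic):
    """Recursively evaluate the probability of a fault-tree node.

    node is a basic-event name (str) or a tuple ("AND"|"OR", [children]);
    basic maps event names to their a priori probabilities.
    """
    if isinstance(node, str):
        return basic[node]
    gate, children = node
    ps = [prob(c, basic) for c in children]
    if gate == "AND":                       # all causes must occur
        p = 1.0
        for q in ps:
            p *= q
        return p
    p = 1.0                                 # OR: at least one cause occurs
    for q in ps:
        p *= (1.0 - q)
    return 1.0 - p

def posterior(cause, tree, basic):
    """P(cause | top event) via Bayes: P(top | cause) P(cause) / P(top)."""
    p_top = prob(tree, basic)
    forced = dict(basic, **{cause: 1.0})    # condition on the cause occurring
    return prob(tree, forced) * basic[cause] / p_top

# Example deviation: pump failure OR (valve stuck AND sensor fault)
tree = ("OR", ["pump", ("AND", ["valve", "sensor"])])
basic = {"pump": 0.01, "valve": 0.05, "sensor": 0.02}
print(round(prob(tree, basic), 6))          # a priori P(top event)
print(round(posterior("pump", tree, basic), 4))
```

An operator would check the cause with the highest posterior first; here the pump dominates even though its prior is small, because the AND branch is far less likely.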
Abstract:
This thesis describes the design and implementation of an interactive dynamic simulator called DASPII. The starting point of this research was an existing dynamic simulation package, DASP. DASPII is written in standard FORTRAN 77 and is implemented on universally available IBM PC or compatible machines. It provides a means for the analysis and design of chemical processes. Industrial interest in dynamic simulation has increased due to the recent increase in concern over plant operability, resiliency and safety. DASPII is an equation-oriented simulation package which allows the solution of dynamic and steady-state equations; the steady state can be used to initialise the dynamic simulation. A robust nonlinear algebraic equation solver has been implemented for the steady-state solution, which has increased the general robustness of DASPII compared to DASP. A graphical front end is used to generate the process flowsheet topology from a user-constructed diagram of the process. A conversational interface is used to interrogate the user, with the aid of a database, to complete the topological information. An original modelling strategy implemented in DASPII provides a simple mechanism for parameter switching, which creates a more flexible simulation environment. The problem description is generated by a further conversational procedure using a database. The model format used allows the same model equations to be used for dynamic and steady-state solution. All the useful features of DASP are retained in DASPII. The program has been demonstrated and verified using a number of example problems, and significant improvements using the new NLAE solver have been shown. Topics requiring further research are described. The benefits of variable switching in models have been demonstrated with a literature problem.
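The role of the nonlinear algebraic equation (NLAE) solver mentioned above can be illustrated with a generic damped Newton iteration: the steady-state equations f(x) = 0 are solved once, and the result initialises the dynamic run. This is a sketch of the standard technique only, not DASPII's actual solver; the two-equation test system is invented for the example.

```python
import numpy as np

# Generic damped Newton iteration for a steady-state system f(x) = 0,
# of the kind used to initialise an equation-oriented dynamic simulation.

def newton_solve(f, jac, x0, tol=1e-10, max_iter=50):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        fx = f(x)
        if np.linalg.norm(fx) < tol:
            return x
        dx = np.linalg.solve(jac(x), -fx)   # full Newton step
        lam = 1.0
        # damping: halve the step until the residual actually decreases,
        # which is what makes the iteration robust far from the solution
        while lam > 1e-4 and np.linalg.norm(f(x + lam * dx)) >= np.linalg.norm(fx):
            lam *= 0.5
        x = x + lam * dx
    return x

# Toy steady state: x0^2 + x1^2 = 2 and x0 = x1, whose positive root is (1, 1)
f = lambda x: np.array([x[0]**2 + x[1]**2 - 2.0, x[0] - x[1]])
jac = lambda x: np.array([[2 * x[0], 2 * x[1]], [1.0, -1.0]])
print(newton_solve(f, jac, [2.0, 0.5]))
```

The converged vector would then serve as the consistent initial condition for the dynamic equations.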
Abstract:
A method has been constructed for the solution of a wide range of chemical plant simulation models, including differential equations and optimization. Double orthogonal collocation on finite elements is applied to convert the model into an NLP problem that is solved either by the VF13AD package, based on successive quadratic programming, or by the GRG2 package, based on the generalized reduced gradient method. This approach is termed the simultaneous optimization and solution strategy. The objective functional can contain integral terms, and the state and control variables can have time delays. Equalities and inequalities containing state and control variables can be included in the model, as well as algebraic equations and inequalities. The maximum number of independent variables is 2; problems containing 3 independent variables can be transformed into problems with 2 independent variables using finite differencing. The maximum number of NLP variables and constraints is 1500. The method is also suitable for solving ordinary and partial differential equations. The state functions are approximated by a linear combination of Lagrange interpolation polynomials. The control function can be approximated either by a linear combination of Lagrange interpolation polynomials or by a piecewise constant function over the finite elements. The number of internal collocation points can vary by finite element. The residual error is evaluated at arbitrarily chosen equidistant grid points, enabling the user to check the accuracy of the solution between the collocation points, where the solution is exact. The solution functions can be tabulated. There is an option to use control vector parameterization to solve optimization problems containing initial-value ordinary differential equations; this approach should be used when there are many differential equations or when the upper integration limit is to be selected optimally.
The portability of the package has been addressed by converting it from VAX FORTRAN 77 into IBM PC FORTRAN 77 and into SUN SPARC 2000 FORTRAN 77. Computer runs have shown that the method can reproduce optimization problems published in the literature. The GRG2 and VF13AD packages, integrated into the optimization package, proved to be robust and reliable. The package contains an executive module, a module performing control vector parameterization and two nonlinear problem solver modules, GRG2 and VF13AD. There is also a stand-alone module that converts the differential-algebraic optimization problem into a nonlinear programming problem.
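The core trick described above, turning a differential equation into algebraic equations by collocating a Lagrange interpolation polynomial, can be shown on a single element. This is a minimal sketch of the general technique, not the thesis's double-collocation scheme; the test problem y' = -y, y(0) = 1 and the node placement are illustrative choices.

```python
import numpy as np

# Collocation sketch: approximate y' = -y, y(0) = 1 on [0, 1] by one
# Lagrange polynomial through a few nodes, turning the ODE into a set of
# algebraic equations (the heart of the "simultaneous" strategy).

nodes = np.array([0.0, 0.2, 0.5, 0.8, 1.0])   # illustrative grid
n = len(nodes)

# Differentiation matrix: D[i, j] = L_j'(nodes[i]) for Lagrange basis L_j
D = np.zeros((n, n))
for j in range(n):
    for i in range(n):
        if i == j:
            D[i, j] = sum(1.0 / (nodes[i] - nodes[k])
                          for k in range(n) if k != j)
        else:
            num = 1.0
            for k in range(n):
                if k != j and k != i:
                    num *= nodes[i] - nodes[k]
            den = 1.0
            for k in range(n):
                if k != j:
                    den *= nodes[j] - nodes[k]
            D[i, j] = num / den

# Collocation equations y'(t_i) + y(t_i) = 0 at the interior and end nodes,
# with the first row replaced by the initial condition y(0) = 1.
A = D + np.eye(n)
A[0, :] = 0.0
A[0, 0] = 1.0
b = np.zeros(n)
b[0] = 1.0
y = np.linalg.solve(A, b)
print(y[-1], np.exp(-1.0))   # collocation value at t = 1 vs exact e^-1
```

In the full method the same construction is applied per finite element and the resulting algebraic system is handed, together with the objective and constraints, to the NLP solver.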
Abstract:
In his discussion, "Database As A Tool For Hospitality Management," William O'Brien, Assistant Professor, School of Hospitality Management at Florida International University, offers at the outset: "Database systems offer sweeping possibilities for better management of information in the hospitality industry. The author discusses what such systems are capable of accomplishing." The author opens with a bit of background on database system development, which also lends an impression as to the complexion of the rest of the article; uh, it's a shade technical. "In early 1981, Ashton-Tate introduced dBase II. It was the first microcomputer database management processor to offer relational capabilities and a user-friendly query system combined with a fast, convenient report writer," O'Brien informs. "When 16-bit microcomputers such as the IBM PC series were introduced late the following year, more powerful database products followed: dBase III, Friday!, and Framework. The effect on the entire business community, and the hospitality industry in particular, has been remarkable," he further offers with his informed outlook. Professor O'Brien presents a few anecdotal situations to illustrate how much a comprehensive database system means to a hospitality operation, especially when billing is involved. Although attitudes about computer systems, as well as the systems themselves, have changed since this article was written, there is pertinent, fundamental information to be gleaned.
In regard to the loss of the personal touch when a customer is engaged with a computer system, O'Brien says, "A modern data processing system should not force an employee to treat valued customers as numbers…" He also cautions, "Any computer system that decreases the availability of the personal touch is simply unacceptable." On a system's ability to process information, O'Brien suggests that in the past businesses were so enamored with just having an automated system that they failed to take full advantage of its capabilities. O'Brien says that a lot of savings, in time and money, went unnoticed and/or under-appreciated. Today, everyone has an integrated system, and the wise business manager is the one who takes full advantage of all his resources. O'Brien invokes the 80/20 rule and offers, "…the last 20 percent of results costs 80 percent of the effort. But times have changed. Everyone is automating data management, so that last 20 percent that could be ignored a short time ago represents a significant competitive differential." The evolution of data systems takes center stage for much of the article; pitfalls also emerge.
Abstract:
Image processing offers unparalleled potential for traffic monitoring and control. For many years engineers have attempted to perfect the art of automatic data abstraction from sequences of video images. This paper outlines a research project undertaken by the authors at Napier University in the field of image processing for automatic traffic analysis. A software-based system implementing TRIP algorithms to count cars and measure vehicle speed has been developed by members of the Transport Engineering Research Unit (TERU) at the University. The TRIP algorithm has been ported and evaluated on an IBM PC platform with a view to hardware implementation of the pre-processing routines required for vehicle detection. Results show that a software-based traffic counting system is realisable for single-window processing. Because of the high volume of data that must be processed for full frames or multiple lanes, real-time operation is limited, so dedicated hardware must be designed. The paper outlines a hardware design for the implementation of inter-frame and background differencing, background updating and shadow removal techniques. Preliminary results showing the processing time and counting accuracy for the routines implemented in software are presented, and a real-time hardware pre-processing architecture is described.
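The background-differencing and background-updating steps named above follow a well-known pattern: subtract a slowly adapting background estimate from each frame and threshold the difference to flag vehicle pixels. The sketch below shows that pattern with a running-average update; the parameters and the toy 4x4 "road" are illustrative and are not taken from the TRIP implementation.

```python
import numpy as np

# Background differencing with a running-average background update,
# a generic sketch of the pre-processing stage described in the paper.

def detect_motion(frames, alpha=0.1, thresh=30.0):
    """Return one boolean motion mask per frame after the first."""
    background = frames[0].astype(float)
    masks = []
    for frame in frames[1:]:
        f = frame.astype(float)
        diff = np.abs(f - background)        # background differencing
        mask = diff > thresh                 # candidate vehicle pixels
        # update the background only where no motion was detected, so
        # moving vehicles are not absorbed into the background estimate
        background = np.where(mask, background,
                              (1 - alpha) * background + alpha * f)
        masks.append(mask)
    return masks

# Toy sequence: a static road, then a bright 2x2 "vehicle" enters
road = np.full((4, 4), 50, dtype=np.uint8)
car = road.copy()
car[1:3, 1:3] = 200
masks = detect_motion([road, road, car])
print(masks[0].any(), int(masks[1].sum()))
```

In hardware the same per-pixel subtract/threshold/update pipeline maps naturally onto the inter-frame and background differencing units the paper proposes.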
Abstract:
Parallel combinatory orthogonal frequency division multiplexing (PC-OFDM) yields a lower maximum peak-to-average power ratio (PAR), higher bandwidth efficiency and a lower bit error rate (BER) on Gaussian channels compared to OFDM systems. However, PC-OFDM does not significantly improve the statistics of the PAR. In this chapter, the use of a set of fixed permutations to improve the statistics of the PAR of a PC-OFDM signal is presented. In this technique, interleavers are used to produce K-1 permuted sequences from the same information sequence, and the sequence with the lowest PAR among the K sequences is chosen for transmission. The PAR of a PC-OFDM signal can be further reduced by 3-4 dB by this technique. Mathematical expressions for the complementary cumulative density function (CCDF) of the PAR of a PC-OFDM signal and an interleaved PC-OFDM signal are also presented.
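The selection step described above is simple to state in code: form K candidate sequences with fixed interleavers, compute the PAR of each time-domain signal, and transmit the best one. The sketch below uses plain QPSK symbols on N = 64 subcarriers rather than a full PC-OFDM mapper, so the numbers are illustrative of the mechanism only.

```python
import numpy as np

# Sketch of PAR reduction by fixed interleavers: K-1 permutations of the
# same symbol sequence are tried and the lowest-PAR candidate is kept.
# QPSK and N = 64 are illustrative choices, not the chapter's parameters.

rng = np.random.default_rng(0)
N, K = 64, 4
symbols = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)
# identity first, then K-1 fixed interleavers (known at both ends)
perms = [np.arange(N)] + [rng.permutation(N) for _ in range(K - 1)]

def par_db(seq):
    """PAR of the time-domain OFDM signal for one subcarrier sequence, in dB."""
    x = np.fft.ifft(seq)
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

pars = [par_db(symbols[p]) for p in perms]
best = int(np.argmin(pars))
print(f"original PAR {pars[0]:.2f} dB, best of {K}: {pars[best]:.2f} dB")
```

The receiver only needs the index of the chosen interleaver (log2 K bits of side information) to undo the permutation.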
Abstract:
In just under three months, worldwide sales of Apple's iPad tablet device stood at over 3 million units. The iPad, along with rival products, signifies a shift in the way print and other media products are purchased and consumed by users. While the device faced initial skepticism about its uptake, numerous industries have been quick to adapt it to their specific needs. Based on a newly developed six-point typology of "post-PC" device utility, this project undertook a significant review of publicly available material to identify worldwide trends in iPad adoption and use within the tertiary sector.
Abstract:
We have developed a digital image registration program for an MC68000-based fundus image processing system (FIPS). Not only is FIPS capable of executing typical image processing algorithms in the spatial as well as the Fourier domain; the execution time of many operations has also been made much shorter by using a hybrid of "C", Fortran and MC68000 assembly languages.
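One standard way to register two images in the Fourier domain, as in the system described above, is phase correlation: the normalized cross-power spectrum of two translated images has a sharp inverse-transform peak at the offset between them. The sketch below demonstrates the general technique on a synthetic 32x32 image; it is not the FIPS program itself, and the sub-pixel and rotational refinements a fundus system would need are omitted.

```python
import numpy as np

# Phase correlation: recover the integer translation between two images
# from the phase of their cross-power spectrum. Generic sketch only.

def phase_correlate(a, b):
    """Return the (row, col) shift that maps image b onto image a."""
    Fa, Fb = np.fft.fft2(a), np.fft.fft2(b)
    cross = Fa * np.conj(Fb)
    cross /= np.abs(cross) + 1e-12          # keep only the phase
    corr = np.fft.ifft2(cross).real         # delta peak at the shift
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap shifts beyond half the image size into negative offsets
    return tuple(p - s if p > s // 2 else p
                 for p, s in zip(peak, corr.shape))

rng = np.random.default_rng(1)
img = rng.random((32, 32))
shifted = np.roll(img, (5, -3), axis=(0, 1))  # simulate a displaced frame
print(phase_correlate(shifted, img))
```

Because the work is two FFTs and an inverse FFT, this is exactly the kind of Fourier-domain operation a mixed C/Fortran/assembly implementation would accelerate.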
Abstract:
Needs assessment strategies can facilitate the prioritisation of resources. The aim of this study was to develop a needs assessment tool for use with advanced cancer patients and caregivers, to prompt early intervention. A convenience sample of 103 health professionals viewed three videotaped consultations involving a simulated patient, his/her caregiver and a health professional, completed the Palliative Care Needs Assessment Tool (PC-NAT) and provided feedback on the clarity, content and acceptability of the PC-NAT. Face and content validity, acceptability and feasibility of the PC-NAT were confirmed. Kappa scores indicated adequate inter-rater reliability for the majority of domains; the patient spirituality domain and the caregiver physical and family-and-relationship domains had low reliability. The PC-NAT can be used by health professionals with a range of clinical expertise to identify individuals' needs, thereby enabling early intervention. Further psychometric testing and an evaluation assessing the impact of systematic use of the PC-NAT on quality of life, unmet needs and service utilisation of patients and caregivers are underway.
Abstract:
Background Quality of life (QOL) measures are an important patient-relevant outcome measure for clinical studies. Currently there is no fully validated cough-specific QOL measure for paediatrics. The objective of this study was to validate a cough-specific QOL questionnaire for paediatric use. Method 43 children (28 males, 15 females; median age 29 months, IQR 20–41 months) newly referred for chronic cough participated. One parent of each child completed the 27-item Parent Cough-Specific QOL questionnaire (PC-QOL), and the generic child (Pediatric QOL Inventory 4.0 (PedsQL)) and parent QOL questionnaires (SF-12) and two cough-related measures (visual analogue score and verbal category descriptive score) on two occasions separated by 2–3 weeks. Cough counts were also objectively measured on both occasions. Results Internal consistency for both the domains and total PC-QOL at both test times was excellent (Cronbach alpha range 0.70–0.97). Evidence for repeatability and criterion validity was established, with significant correlations over time and significant relationships with the cough measures. The PC-QOL was sensitive to change across the test times and these changes were significantly related to changes in cough measures (PC-QOL with: verbal category descriptive score, rs=−0.37, p=0.016; visual analogue score, rs=−0.47, p=0.003). Significant correlations of the difference scores for the social domain of the PC-QOL and the domain and total scores of the PedsQL were also noted (rs=0.46, p=0.034). Conclusion The PC-QOL is a reliable and valid outcome measure that assesses QOL related to childhood cough at a given time point and measures changes in cough-specific QOL over time.
Abstract:
Objective To assess the usability and validity of the Primary Care Practice Improvement Tool (PC-PIT), a practice performance improvement tool based on 13 key elements identified by a systematic review. It was co-created with a range of partners and designed specifically for primary health care. Design This pilot study examined the PC-PIT using a formative assessment framework and mixed-methods research design. Setting and participants Six high-functioning general practices in Queensland, Australia, between February and July 2013. A total of 28 staff participated — 10 general practitioners, six practice or community nurses, 12 administrators (four practice managers; one business manager and eight reception or general administrative staff). Main outcome measures Readability, content validity and staff perceptions of the PC-PIT. Results The PC-PIT offers an appropriate and acceptable approach to internal quality improvement in general practice. Quantitative assessment scores and qualitative data from all staff identified two areas in which the PC-PIT required modification: a reduction in the indicative reading age, and simplification of governance-related terms and concepts. Conclusion The PC-PIT provides an innovative approach to address the complexity of organisational improvement in general practice and primary health care. This initial validation will be used to develop a suite of supporting, high-quality and free-to-access resources to enhance the use of the PC-PIT in general practice. Based on these findings, a national trial is now underway.