929 results for "Personal uses of computer"
Abstract:
Objectives: To validate the WOMAC 3.1 in a touch-screen computer format, which presents each question as a cartoon with written and spoken text (QUALITOUCH method), and to assess patient acceptance of the computer touch-screen version. Methods: The paper and computer formats of WOMAC 3.1 were administered in random order to 53 subjects with hip or knee osteoarthritis. The mean age of the subjects was 64 years (range 45 to 83), 60% were male, 53% were 65 years or older, and 53% used computers at home or at work. Agreement between formats was assessed by intraclass correlation coefficients (ICCs). Preferences were assessed with a supplementary questionnaire. Results: ICCs between formats were 0.92 (95% confidence interval, 0.87 to 0.96) for pain, 0.94 (0.90 to 0.97) for stiffness, and 0.96 (0.94 to 0.98) for function. ICCs were similar in men and women, in subjects with or without previous computer experience, and in subjects below or above age 65. The computer format was found easier to use by 26% of the subjects, the paper format by 8%, and 66% were undecided. Overall, 53% of subjects preferred the computer format, while 9% preferred the paper format and 38% were undecided. Conclusion: The computer format of the WOMAC 3.1 is a reliable assessment tool. Agreement between computer and paper formats was independent of computer experience, age, or sex. Thus the computer format may help improve patient follow-up by meeting patients' preferences and providing immediate results.
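Agreement here rests on the intraclass correlation coefficient. As a minimal sketch of how such an agreement statistic is computed for paired paper/computer scores, the snippet below implements a two-way random-effects, absolute-agreement ICC(2,1); the abstract does not state which ICC variant the study used, so the variant, the simulated data, and all numbers below are illustrative assumptions.

```python
import numpy as np

def icc_2_1(scores: np.ndarray) -> float:
    """Two-way random-effects, absolute-agreement ICC(2,1).

    scores: (n_subjects, k_raters) array; here k = 2
    (paper format, computer format).
    """
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)
    col_means = scores.mean(axis=0)

    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_err = ((scores - grand) ** 2).sum() - ss_rows - ss_cols

    ms_r = ss_rows / (n - 1)                # between-subject mean square
    ms_c = ss_cols / (k - 1)                # between-format mean square
    ms_e = ss_err / ((n - 1) * (k - 1))     # residual mean square

    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Hypothetical paired WOMAC scores: column 0 = paper, column 1 = computer.
rng = np.random.default_rng(0)
true_score = rng.uniform(0, 20, size=53)
paired = np.column_stack([true_score + rng.normal(0, 1, 53),
                          true_score + rng.normal(0, 1, 53)])
print(f"ICC(2,1) = {icc_2_1(paired):.2f}")
```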
Abstract:
Denial is a commonly used strategy to rebut a false rumor. However, there is a dearth of empirical research on the effectiveness of denials in combating rumors. Treating denials as persuasive messages, we conducted 3 laboratory-based simulation studies testing the overall effectiveness of denials in reducing belief and anxiety associated with an e-mail virus rumor. Under the framework of the elaboration likelihood model, we also tested the effects of denial message quality and source credibility, and the moderating effects of personal relevance. Overall, the results provided some support for the effectiveness of denials with strong arguments and an anxiety-alleviating tone in reducing rumor-related belief and anxiety. The effects of denial wording and source credibility were visible for participants who perceived high personal relevance of the topic. Limitations of the current research and future research directions are discussed.
Abstract:
Hard real-time systems are a class of computer control systems that must react to demands of their environment by providing 'correct' and timely responses. Since these systems are increasingly being used in applications with safety implications, it is crucial that they are designed and developed to operate correctly. This thesis is concerned with developing formal techniques that allow the specification, verification and design of hard real-time systems. Formal techniques for hard real-time systems must be capable of capturing the system's functional and performance requirements, and previous work has proposed a number of techniques ranging from the mathematically intensive to those with only some mathematical content. This thesis develops formal techniques that contain both an informal and a formal component, because the informality provides ease of understanding while the formality allows precise specification and verification. Specifically, the combination of Petri nets and temporal logic is considered for the specification and verification of hard real-time systems. Approaches that combine Petri nets and temporal logic by allowing a consistent translation between the formalisms are examined. Previously, such techniques have been applied to the formal analysis of concurrent systems. This thesis adapts these techniques for use in the modelling, design and formal analysis of hard real-time systems. The techniques are applied to the problem of specifying a controller for a high-speed manufacturing system. It is shown that they can be used to prove liveness and safety properties, including qualitative aspects of system performance. The problem of verifying quantitative real-time properties is addressed by developing a further technique which combines the formalisms of timed Petri nets and real-time temporal logic. A unifying feature of these techniques is the common temporal description of the Petri net. A common problem with Petri-net-based techniques is the computational complexity of generating the reachability graph. This thesis addresses this problem by using concurrency sets to generate a partial reachability graph pertaining to a particular state. These sets also allow each state to be checked for the presence of inconsistencies and hazards. The problem of designing a controller for the high-speed manufacturing system is also considered. The approach adopted involves the use of a model-based controller: this type of controller uses the Petri net models developed, thus preserving the properties already proven of the controller. It also contains a model of the physical system which is synchronised to the real application to provide timely responses. The various ways of forming the synchronisation between these processes are considered and the resulting nets are analysed using concurrency sets.
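To make the reachability-graph idea concrete, here is a minimal sketch: a toy place/transition net of two machines sharing one resource, a breadth-first construction of its reachability set, and a check of a simple safety property (mutual exclusion). The net and the property are invented for illustration; the thesis's concurrency-set technique for building only a partial graph is not reproduced here.

```python
from collections import deque

# Toy place/transition net: two machines sharing one robot.
# Each transition is (preconditions, postconditions) as token counts per place.
TRANSITIONS = {
    "m1_start": ({"m1_idle": 1, "robot": 1}, {"m1_busy": 1}),
    "m1_end":   ({"m1_busy": 1},             {"m1_idle": 1, "robot": 1}),
    "m2_start": ({"m2_idle": 1, "robot": 1}, {"m2_busy": 1}),
    "m2_end":   ({"m2_busy": 1},             {"m2_idle": 1, "robot": 1}),
}

def enabled(marking, pre):
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return {p: n for p, n in m.items() if n > 0}

def reachable_markings(initial):
    """Breadth-first construction of the full reachability set."""
    seen = {frozenset(initial.items())}
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        yield m
        for pre, post in TRANSITIONS.values():
            if enabled(m, pre):
                m2 = fire(m, pre, post)
                key = frozenset(m2.items())
                if key not in seen:
                    seen.add(key)
                    queue.append(m2)

initial = {"m1_idle": 1, "m2_idle": 1, "robot": 1}
for m in reachable_markings(initial):
    # Safety property: the two machines are never busy simultaneously.
    assert not (m.get("m1_busy") and m.get("m2_busy")), f"hazard in {m}"
    print(m)
```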
Abstract:
The thesis describes an investigation into methods for the specification, design and implementation of computer control systems for flexible manufacturing machines comprising multiple, independent, electromechanically driven mechanisms. An analysis is made of the elements of conventional mechanically coupled machines in order to identify the operational functions of these elements. This analysis is used to define the scope of requirements necessary to specify the format, function and operation of a flexible, independently driven mechanism (IDM) machine. A discussion of how this type of machine can accommodate the modern manufacturing needs of high speed and flexibility is presented. A sequential method of capturing requirements for such machines is detailed, based on a hierarchical partitioning of machine requirements from product down to individual drive mechanism. A classification of mechanisms using notations including data-flow diagrams and Petri nets is described, which supports capture and allows validation of requirements. A generic design for a modular IDM machine controller is derived, based upon the hierarchy of control identified in these machines. A two-mechanism experimental machine is detailed, which is used to demonstrate the application of the specification, design and implementation techniques. A computer controller prototype and a fully flexible implementation for the IDM machine, based on Petri-net models described using the concurrent programming language Occam, are detailed. The ability of this modular computer controller to support flexible, safe and fault-tolerant operation of the two intermittent-motion, discretely synchronised independent drive mechanisms is presented. The application of the machine development methodology to industrial projects is established.
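As a loose illustration of discrete synchronisation between independent drive mechanisms, the sketch below runs two simulated mechanisms as concurrent threads that rendezvous at a shared synchronisation point, roughly analogous to channel communication in Occam. The mechanism names, cycle counts and timings are invented; this is not the thesis's controller design.

```python
import threading
import time

# Both drives must arrive at the sync point before either proceeds.
sync_point = threading.Barrier(2)

def drive_mechanism(name: str, move_time: float, cycles: int = 3) -> None:
    for cycle in range(cycles):
        time.sleep(move_time)     # intermittent motion phase (simulated)
        print(f"{name}: cycle {cycle} motion complete, waiting at sync point")
        sync_point.wait()         # discrete synchronisation with the other drive

feeder = threading.Thread(target=drive_mechanism, args=("feeder", 0.10))
sealer = threading.Thread(target=drive_mechanism, args=("sealer", 0.25))
feeder.start(); sealer.start()
feeder.join();  sealer.join()
```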
Abstract:
This preliminary report describes work carried out as part of work package 1.2 of the MUCM research project. The report is split into two parts: the first part (Sections 1 and 2) summarises the state of the art in emulation of computer models, while the second presents some initial work on the emulation of dynamic models. In the first part, we describe the basics of emulation, introduce the notation, and put together the key results for the emulation of models with single and multiple outputs, with or without the use of a mean function. In the second part, we present preliminary results on the chaotic Lorenz 63 model. We look at emulation of a single time step, and at repeated application of the emulator for sequential prediction. After some design considerations, the emulator is compared with the exact simulator on a number of runs to assess its performance. Several general issues related to emulating dynamic models are raised and discussed. Current work on the larger Lorenz 96 model (40 variables) is presented in the context of dimension reduction, with results to be provided in a follow-up report. The notation used in this report is summarised in the appendix.
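A minimal sketch of single-step emulation and sequential prediction on Lorenz 63, assuming a Gaussian-process emulator in the spirit of the report: the code below trains scikit-learn's GaussianProcessRegressor on (state, next state) pairs from an exact Runge-Kutta simulator and then iterates the emulator's mean prediction. The design, kernel, step size and training size are illustrative choices, not MUCM's, and predictive uncertainty (central to emulation proper) is not propagated here.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0
DT = 0.01

def lorenz_rhs(s):
    x, y, z = s
    return np.array([SIGMA * (y - x), x * (RHO - z) - y, x * y - BETA * z])

def step(s, dt=DT):
    """One fourth-order Runge-Kutta step of the exact simulator."""
    k1 = lorenz_rhs(s)
    k2 = lorenz_rhs(s + 0.5 * dt * k1)
    k3 = lorenz_rhs(s + 0.5 * dt * k2)
    k4 = lorenz_rhs(s + dt * k3)
    return s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Training design: 300 states along one trajectory mapped to their
# successors (a space-filling design would normally be preferred).
s = np.array([1.0, 1.0, 1.0])
X, Y = [], []
for _ in range(300):
    X.append(s)
    s = step(s)
    Y.append(s)
X, Y = np.array(X), np.array(Y)

# Single-step emulator: GP regression from the state at t to the state at t + DT.
gp = GaussianProcessRegressor(
    kernel=ConstantKernel() * RBF(length_scale=[5.0, 5.0, 5.0]),
    normalize_y=True,
).fit(X, Y)

# Sequential prediction: feed the emulator's mean output back as its input.
s_true = s_emul = X[0].copy()
for _ in range(50):
    s_true = step(s_true)
    s_emul = gp.predict(s_emul.reshape(1, -1))[0]
print("simulator:", np.round(s_true, 2), " emulator:", np.round(s_emul, 2))
```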
Abstract:
A recent method for phase equilibria, the AGAPE method, has been used to predict activity coefficients and excess Gibbs energies for binary mixtures with good accuracy. The theory, based on a generalised London potential (GLP), accounts for intermolecular attractive forces. Unlike existing prediction methods, for example UNIFAC, the AGAPE method uses only information derived from accessible experimental data and molecular information for pure components. Presently, the AGAPE method has some limitations, namely that the mixtures must consist of small, non-polar compounds with no hydrogen bonding, at low to moderate pressures and at conditions below the critical conditions of the components. The distinction between vapour-liquid equilibria and gas-liquid solubility is rather arbitrary, and it seems reasonable to extend these ideas to solubility. The AGAPE model uses a molecular lattice-based mixing rule. By judicious use of computer programs, a methodology was created to examine a body of experimental gas-liquid solubility data for gases such as carbon dioxide, propane, n-butane and sulphur hexafluoride, which all have critical temperatures a little above 298 K, dissolved in benzene, cyclohexane and methanol. Within this methodology the value of the GLP as an ab initio combining rule for such solutes in very dilute solutions in a variety of liquids has been tested. Using the GLP as a mixing rule involves the computation of rotationally averaged interactions between the constituent atoms, and new calculations have had to be made to discover the magnitude of the unlike-pair interactions. These numbers are significant in their own right in the context of the behaviour of infinitely dilute solutions. A method for extending this treatment to "permanent" gases has also been developed. The findings from the GLP method and from the more general AGAPE approach have been examined in the context of other models for gas-liquid solubility, both "classical" and contemporary, in particular those derived from equation-of-state methods and from reference-solvent methods.
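The GLP itself is specific to the AGAPE work, but the flavour of an unlike-pair combining rule can be shown with the classical London formula for the dispersion coefficient between two different molecules, C6 = (3/2) a1 a2 I1 I2 / (I1 + I2). The sketch below evaluates it for a carbon dioxide-benzene pair; the polarisabilities and ionisation energies are rough literature-order values, for illustration only.

```python
def london_c6(alpha1, alpha2, i1, i2):
    """Unlike-pair dispersion coefficient C6 from London's approximation.

    alpha: dipole polarisabilities in A^3; i: ionisation energies in eV.
    Result in eV*A^6, for a pair potential u(r) = -C6 / r**6.
    """
    return 1.5 * alpha1 * alpha2 * i1 * i2 / (i1 + i2)

# Rough literature-order values, for illustration only.
alpha_co2, i_co2 = 2.9, 13.8      # carbon dioxide
alpha_c6h6, i_c6h6 = 10.4, 9.2    # benzene

c6 = london_c6(alpha_co2, alpha_c6h6, i_co2, i_c6h6)
print(f"C6(CO2-benzene) ~ {c6:.0f} eV*A^6")
```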
Abstract:
Collaborative working with the aid of computers is increasing rapidly due to the widespread use of computer networks, the geographic mobility of people, and small powerful personal computers. For the past ten years, research has been conducted into this use of computing technology from a wide variety of perspectives and for a wide range of uses. This thesis adds to that previous work by examining the area of collaborative writing amongst groups of people. The research brings together a number of disciplines: sociology for examining group dynamics, psychology for understanding individual writing and learning processes, and computer science for database, networking, and programming theory. The project initially looks at groups and how they form, communicate, and work together, progressing on to writing and the cognitive processes it entails for both composition and retrieval. The thesis then details a set of issues which need to be addressed in a collaborative writing system. These issues are then addressed by developing a model for collaborative writing, detailing an iterative process of co-ordination, writing and annotation, consolidation, and negotiation, based on a structured but extensible document model. Implementation issues for a collaborative application are then described, along with various methods of overcoming them. Finally, the design and implementation of a collaborative writing system, named Collaborwriter, is described in detail, concluding with some preliminary results from initial user trials and testing.
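As a sketch of what a structured but extensible document model with co-ordination and annotation might look like, the fragment below models a document as nested sections with per-section write locks and annotations. All class and field names are hypothetical; this is not Collaborwriter's actual design.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Annotation:
    author: str
    text: str
    resolved: bool = False

@dataclass
class Section:
    title: str
    body: str = ""
    lock_holder: Optional[str] = None   # co-ordination: one writer at a time
    annotations: list[Annotation] = field(default_factory=list)
    subsections: list["Section"] = field(default_factory=list)

def acquire(section: Section, author: str) -> bool:
    """Co-ordination phase: claim a section for exclusive writing."""
    if section.lock_holder is None:
        section.lock_holder = author
        return True
    return False

def release(section: Section) -> None:
    """Consolidation phase: make the section available again."""
    section.lock_holder = None

doc = Section("Collaborative report", subsections=[Section("Introduction")])
intro = doc.subsections[0]
if acquire(intro, "alice"):
    intro.body = "First draft of the introduction."
    intro.annotations.append(Annotation("bob", "Cite the motivating study here?"))
    release(intro)
```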
Abstract:
The majority of the literature about computer-based monitoring (CBM) is American in origin, and (inter alia) notes that there were differing uses of similar technology, indicating that context has an important role to play in the use of CBM. The literature maps the psychological effects of CBM in considerable detail, but only two published studies examine the context of CBM. These grounded results provide scant support for any systematic, quantitative, large-scale analysis of computer-based monitoring in the UK context. This thesis thus aims to examine the context of CBM systematically using discourse analysis. Forty-four interviewees were theoretically sampled using a structured sampling technique in four organizations, all of which were national or multinational enterprises. The interviews were semi-structured in nature and divided into three sections. The first addressed the respondents' thoughts and perceptions about CBM, the second elicited talk about the departmental context (focusing on the management-worker relationship), and the final section addressed the organizational context. The cases demonstrated variation in the use of CBM, measured according to the criteria of Westin (1987, 1988) and according to the interpretive repertoires used by the respondents in each case. Seven analytical categories of talk emerged from the data: three at the organizational level and four at the departmental level of analysis. Discourse analysis revealed two discrete interpretive repertoires - the procedural and the substantive - in respondents' talk, whose main variation occurred at the departmental level of analysis. Furthermore, patterns were found in the use of these repertoires within cases and between categories. Between the cases, variation in the use of the repertoires matched the between-case variation according to the criteria of Westin. It would thus appear that the source of variation in the use of CBM lies in its context, more specifically in the relative emphasis of humanistic, interpersonal and idiosyncratic values within the management-worker relationship.
Abstract:
As systems for the computer-aided design and production of mechanical parts have developed, there has arisen a need for techniques for the comprehensive description of the desired part, including its 3-D shape. The creation and manipulation of shapes is generally known as geometric modelling. It is desirable that links be established between geometric modellers and machining programs. Currently, unbounded APT and some bounded-geometry systems are widely used in manufacturing industry for machining operations such as milling, drilling, boring and turning, applied mainly to engineering parts. APT systems, however, are presently linked only to wire-frame drafting systems. The combination of a geometric modeller and APT will provide a powerful manufacturing system for industry, from the initial design right through to part manufacture using NC machines. This thesis describes a recently developed interface (ROMAPT) between a bounded geometry modeller (ROMULUS) and an unbounded NC processor (APT). A new set of theoretical functions and practical algorithms for the computer-aided manufacturing of 3-D solid geometric models has been investigated. This work has led to the development of a sophisticated computer program, ROMAPT, which provides a new link between CAD (in the form of the geometric modeller ROMULUS) and CAM (in the form of the APT NC system). ROMAPT has been used to machine some engineering prototypes successfully, both in soft foam material and in aluminium. It has thus been demonstrated that the theory and algorithms developed by the author for the computer-aided manufacturing of 3-D solid models are both valid and applicable. ROMAPT allows the full potential of a solid geometric modeller (ROMULUS) to be further exploited for NC applications without requiring major investment in a new NC processor. ROMAPT supports output in the APT-AC, APT4 and CAM-I SSRI NC languages.
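To illustrate the CAD-to-APT direction of such a link, the fragment below emits a small unbounded-APT motion sequence (FROM/GOTO statements) from a list of cutter-location points, the kind of output a modeller-to-APT interface ultimately produces. The part program, the points and the formatting are invented for illustration and do not represent ROMAPT's actual output.

```python
def emit_apt(points, cutter_diameter=10.0):
    """Emit an unbounded-APT motion sequence from cutter-location points."""
    lines = ["PARTNO DEMO PART", f"CUTTER/{cutter_diameter:.1f}"]
    x, y, z = points[0]
    lines.append(f"FROM/{x:.3f},{y:.3f},{z:.3f}")      # initial tool position
    for x, y, z in points[1:]:
        lines.append(f"GOTO/{x:.3f},{y:.3f},{z:.3f}")  # point-to-point motion
    lines.append("FINI")                               # end of part program
    return "\n".join(lines)

# Hypothetical cutter-location points for one milling pass:
# approach, plunge, cut, retract.
path = [(0.0, 0.0, 5.0), (0.0, 0.0, -2.0), (50.0, 0.0, -2.0), (50.0, 0.0, 5.0)]
print(emit_apt(path))
```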