932 results for GENESIS (Computer system)
Abstract:
Bibliography: p. 35.
Abstract:
Typescript.
Abstract:
Bibliography: p. 62.
Abstract:
Objective: This study (a) evaluated the reading ability of patients following stroke and their carers and the reading level and content and design characteristics of the written information provided to them, (b) explored the influence of sociodemographic and clinical characteristics on patients' reading ability, and (c) described an education package that provides well-designed information tailored to patients' and carers' informational needs. Methods: Fifty-seven patients and 12 carers were interviewed about their informational needs in an acute stroke unit. Their reading ability was assessed using the Rapid Estimate of Adult Literacy in Medicine (REALM). The written information provided to them in the acute stroke unit was analysed using the SMOG readability formula and the Suitability Assessment of Materials (SAM). Results: Thirteen (22.8%) patients and 5 (41.7%) carers had received written stroke information. The mean reading level of the materials analysed was 11th grade, while patients read at a mean of 7-8th grade. Most materials (89%) scored as only adequate in content and design. Patients with combined aphasia read at a significantly lower level (4-6th grade) than other patients (p = 0.001). Conclusion: Only a small proportion of patients and carers received written materials about stroke, and the readability level and content and design characteristics of most materials required improvement. Practice implications: When developing and distributing written materials about stroke, health professionals should consider the reading ability and informational needs of the recipients, and the reading level and content and design characteristics of the written materials. A computer system can be used to generate written materials tailored to the informational needs and literacy skills of patients and carers. (c) 2005 Elsevier Ireland Ltd. All rights reserved.
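The SMOG formula used in the study estimates the school grade needed to comprehend a text from its count of polysyllabic (three-or-more-syllable) words. A minimal sketch of the published formula (the function name and sample counts below are illustrative):

```python
import math

def smog_grade(polysyllable_count: int, sentence_count: int) -> float:
    """Estimate reading grade with the SMOG formula (McLaughlin, 1969).
    The polysyllable count is scaled to a 30-sentence sample before
    taking the square root."""
    scaled = polysyllable_count * (30 / sentence_count)
    return 3.1291 + 1.0430 * math.sqrt(scaled)

# 25 polysyllabic words across 30 sentences -> roughly 8th-grade material
grade = smog_grade(25, 30)
```

A leaflet scoring at 11th grade against patients reading at 7-8th grade, as found above, is exactly the kind of mismatch this measure exposes.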
Abstract:
Document classification is a supervised machine learning process in which predefined category labels are assigned to documents based on a hypothesis derived from a training set of labelled documents. Documents cannot be directly interpreted by a computer system unless they have been modelled as a collection of computable features. Rogati and Yang [M. Rogati and Y. Yang, Resource selection for domain-specific cross-lingual IR, in SIGIR 2004: Proceedings of the 27th annual international conference on Research and Development in Information Retrieval, ACM Press, Sheffield, United Kingdom, pp. 154-161.] pointed out that the effectiveness of a document classification system may vary across domains. This implies that the quality of the document model contributes to the effectiveness of document classification. Conventionally, model evaluation is accomplished by comparing the effectiveness scores of classifiers on model candidates. However, this kind of evaluation method may encounter either under-fitting or over-fitting problems, because the effectiveness scores are restricted by the learning capacities of the classifiers. We propose a model fitness evaluation method to determine whether a model is sufficient to distinguish positive and negative instances while still competent to provide satisfactory effectiveness with a small feature subset. Our experiments demonstrate how the fitness of models is assessed. The results of our work contribute to research on feature selection, dimensionality reduction and document classification.
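The two points above, that documents must first be modelled as computable features and that a small feature subset may suffice to separate the classes, can be sketched with a bag-of-words model and a classifier-independent separability score (the scoring rule and all names here are illustrative, not the authors' method):

```python
from collections import Counter

def bag_of_words(doc: str) -> Counter:
    """Model a document as computable features: lower-cased word counts."""
    return Counter(doc.lower().split())

def separability(pos_docs, neg_docs):
    """Score each term by the absolute difference of its mean frequency in
    the positive versus the negative class; a high score means the feature
    by itself helps distinguish the two kinds of instances."""
    pos = [bag_of_words(d) for d in pos_docs]
    neg = [bag_of_words(d) for d in neg_docs]
    vocab = set().union(*pos, *neg)
    return {t: abs(sum(b[t] for b in pos) / len(pos)
                   - sum(b[t] for b in neg) / len(neg))
            for t in vocab}

pos = ["stock price market up", "market rally price gains"]
neg = ["football match final score", "match score late goal"]
scores = separability(pos, neg)
top4 = sorted(scores, key=scores.get, reverse=True)[:4]  # best 4 features
```

Ranking features this way does not depend on any classifier's learning capacity, which is the motivation the abstract gives for moving away from classifier-based model evaluation.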
Abstract:
The proliferation of visual display terminals (VDTs) in offices is an international phenomenon. Numerous studies have investigated the health implications, which can be categorised into visual problems, symptoms of musculo-skeletal discomfort, or psychosocial effects. The psychosocial effects are broader, and the evidence in this area is mixed. The inconsistent results from the studies of VDT work so far undertaken may reflect several methodological shortcomings. In an attempt to overcome these deficiencies and to broaden the model of inter-relationships, a model was developed to investigate these interactions and the outputs of job satisfaction, stress and ill health. The study was a two-stage, long-term investigation with measures taken before the VDTs were introduced and the same measures taken 12 months after the 'go-live' date. The research was conducted in four offices of the Department of Social Security. The data were analysed for each individual site and, in addition, the total data were used in a path analysis model. Significant positive relationships were found at the pre-implementation stage between musculo-skeletal discomfort, psychosomatic ailments, visual complaints and stress. Job satisfaction was negatively related to visual complaints and musculo-skeletal discomfort. Direct paths were found for age and job level with variety found in the job, and for age with job satisfaction, together with a negative relationship with the office environment. The only job characteristic which had a direct path to stress was 'dealing with others'. Similar inter-relationships were found in the post-implementation data. In addition, however, attributes of the computer system, such as screen brightness and glare, were related positively with stress and negatively with job satisfaction.
The comparison of the data at the two stages found that there had been no significant changes in the users' perceptions of their job characteristics and job satisfaction but there was a small and significant reduction in the stress measure.
Abstract:
One way of describing this thesis is to say that it attempts to explicate the context within which an application of Stafford Beer's Viable System Model (VSM) makes cybernetic sense. The thesis will attempt to explain how such a context is presently not clearly enunciated, and why this lack hinders communication of the model and its consequent effective take-up by the student or practitioner. The epistemological grounding of the VSM will be described as concerning the ontology of the individuals who apply it and give witness to its application. In describing a particular grounding for the Viable System Model, I am instantiating a methodology which I call a 'hermeneutics of distinction'. The final two chapters explicate this methodology and consider the implications for the design of a computer system. This thesis is grounded in contemporary insights into the nervous system, and in research into the biology of language and cognition. Its conclusions emerge from a synthesis of the twin discourses of Stafford Beer and Humberto Maturana.
Abstract:
Over the past decade, several experienced Operational Researchers have advanced the view that the theoretical aspects of model building have raced ahead of the ability of people to use them. Consequently, the impact of Operational Research on commercial organisations and the public sector is limited, and many systems fail to achieve their anticipated benefits in full. The primary objective of this study is to examine a complex interactive Stock Control system, and identify the reasons for the differences between the theoretical expectations and the operational performance. The methodology used is to hypothesise all the possible factors which could cause a divergence between theory and practice, and to evaluate numerically the effect each of these factors has on two main control indices - Service Level and Average Stock Value. Both analytical and empirical methods are used, and simulation is employed extensively. The factors are divided into two main categories for analysis - theoretical imperfections in the model, and the usage of the system by Buyers. No evidence could be found in the literature of any previous attempts to place the differences between theory and practice in a system in quantitative perspective nor, more specifically, to study the effects of Buyer/computer interaction in a Stock Control system. The study reveals that, in general, the human factors influencing performance are of a much higher order of magnitude than the theoretical factors, thus providing objective evidence to support the original premise. The most important finding is that, by judicious intervention into an automatic stock control algorithm, it is possible for Buyers to produce results which not only attain but surpass the algorithmic predictions. However, the complexity and behavioural recalcitrance of these systems are such that an innately numerate, enquiring type of Buyer needs to be inducted to realise the performance potential of the overall man/computer system.
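The two control indices named in the abstract, Service Level and Average Stock Value, can be reproduced in a toy Monte Carlo model of a reorder-point policy. Everything here (demand distribution, lead time, policy parameters) is an illustrative assumption, not the Stock Control system examined in the study:

```python
import random

def simulate_policy(reorder_point, order_qty, days=2000, lead_time=3,
                    unit_cost=1.0, seed=42):
    """Simulate a simple reorder-point stock control policy and return the
    study's two control indices: Service Level (fraction of demand met
    from stock) and Average Stock Value."""
    random.seed(seed)
    stock, pipeline = 50, []          # pipeline holds (arrival_day, qty)
    served = demanded = 0
    stock_value_sum = 0.0
    for day in range(days):
        stock += sum(q for d, q in pipeline if d <= day)  # receive orders
        pipeline = [(d, q) for d, q in pipeline if d > day]
        demand = random.randint(0, 10)
        demanded += demand
        served += min(demand, stock)
        stock = max(stock - demand, 0)                    # lost-sales model
        on_order = sum(q for _, q in pipeline)
        if stock + on_order <= reorder_point:             # inventory-position rule
            pipeline.append((day + lead_time, order_qty))
        stock_value_sum += stock * unit_cost
    return served / demanded, stock_value_sum / days

service_level, avg_stock_value = simulate_policy(40, 60)
```

Re-running with different reorder points exposes the trade-off between the two indices that a Buyer's intervention can manipulate, which is the kind of Buyer/computer interaction the study quantifies.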
Abstract:
This thesis describes an investigation of the effect of elevated temperatures upon the properties of plain concrete containing a siliceous aggregate. A complete stress-strain relationship and creep behaviour are studied. Transient (non-steady-state) effects are also examined in order to simulate more realistic conditions. A temperature range of 20-700°C is used, corresponding to the temperatures generally attained during an actual fire. In order to carry out the requisite tests, a stiff compression testing machine has been designed and built. The overall control of the test rig is provided by a logger/computer system with appropriate software developed for it, enabling the load to be held constant for any period of time. Before outlining any details of the development of the testing apparatus, which includes an electric furnace and the associated instrumentation, previous work on the properties of both concrete and steel at elevated temperatures is reviewed. The test programme comprises four series of tests: stress-strain tests (with and without pre-load), transient tests (heating to failure under constant stress) and creep tests (constant stress and constant temperature). Three stress levels are examined: 0.2, 0.4 and 0.6 fc. The experimental results show that the properties of concrete are significantly affected by temperature and the magnitude of the load. The slope of the descending branch of the stress-strain curves (strain softening) is found to be temperature dependent. After normalizing the data, the stress-strain curves for different temperatures are represented by a single curve. The creep results are analysed using an approach involving the activation energy, which is found to be constant. The analysis shows that the time-dependent deformation is sensibly linear with the applied stress. The total strain concept is shown to hold for the test data within limits.
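The activation-energy analysis of the creep results follows an Arrhenius-type relation, r = A·exp(−Q/(R·T)). A minimal sketch of recovering Q from creep rates measured at two temperatures (the rates and temperatures below are illustrative, not the thesis's data):

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def activation_energy(r1, T1, r2, T2):
    """Activation energy Q from creep rates r1, r2 (same units) at absolute
    temperatures T1, T2 (K), assuming r = A * exp(-Q / (R * T)):
    Q = R * ln(r1/r2) / (1/T2 - 1/T1)."""
    return R * math.log(r1 / r2) / (1 / T2 - 1 / T1)

# Illustrative: creep rate rising from 1e-6 to 1e-4 (per hour) between
# 300 C and 500 C (573 K and 773 K) gives Q of roughly 85 kJ/mol.
Q = activation_energy(1e-4, 773.0, 1e-6, 573.0)
```

A constant Q across the tested stress levels, as the thesis finds, means a single Arrhenius fit describes the whole creep data set.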
Abstract:
In 1974 Dr D M Bramwell published his research work at the University of Aston, part of which was the establishment of an elemental work study database covering drainage construction. The Transport and Road Research Laboratory decided to extend that work as part of their continuing research programme into the design and construction of buried pipelines by placing a research contract with Bryant Construction. This research may be considered under two broad categories. In the first, site studies were undertaken to validate and extend the database. The studies showed good agreement with the existing data, with the exception of the excavation, trench shoring and pipelaying data, which were amended to incorporate new construction plant and methods. An interactive on-line computer system for drainage estimating was developed. This system stores the elemental data, synthesizes the standard time of each drainage operation and is used to determine the required resources and construction method of the total drainage activity. The remainder of the research was into the general topic of construction efficiency. An on-line command-driven computer system was produced. This system uses a stochastic simulation technique, based on distributions of site efficiency measurements, to evaluate the effects of varying performance levels. The analysis of this performance data quantifies the variability inherent in construction and demonstrates how some of this variability can be reconciled by considering the characteristics of a contract. A long-term trend of decreasing efficiency with contract duration was also identified. The results obtained from the simulation suite were compared to site records collected from current contracts. This showed that this approach will give comparable answers, but these are greatly affected by the site performance parameters.
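The stochastic simulation idea, sampling site efficiency from a measured distribution to evaluate varying performance levels, can be sketched in a few lines (the efficiency figures are hypothetical, not the TRRL site measurements):

```python
import random
import statistics

def simulate_contract(standard_hours, efficiency_samples, runs=5000, seed=1):
    """Monte Carlo sketch: for each run, draw a site efficiency ratio from
    the observed distribution and scale the synthesized standard time by it,
    yielding a distribution of likely contract durations."""
    random.seed(seed)
    durations = [standard_hours / random.choice(efficiency_samples)
                 for _ in range(runs)]
    return statistics.mean(durations), statistics.stdev(durations)

observed_efficiency = [0.7, 0.8, 0.85, 0.9, 1.0, 1.1]  # hypothetical measurements
mean_h, sd_h = simulate_contract(1000, observed_efficiency)
```

The spread of the simulated durations is what quantifies the variability inherent in construction that the abstract refers to.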
Abstract:
This dissertation investigates the very important and current problem of modelling human expertise. This issue arises in any computer system emulating human decision making. It is prominent in Clinical Decision Support Systems (CDSS) due to the complexity of the induction process and the vast number of parameters in most cases. Other issues such as human error and missing or incomplete data present further challenges. In this thesis, the Galatean Risk Screening Tool (GRiST) is used as an example of modelling clinical expertise and parameter elicitation. The tool is a mental health clinical record management system with a top layer of decision support capabilities. It is currently being deployed by several NHS mental health trusts across the UK. The aim of the research is to investigate the problem of parameter elicitation by inducing parameters from real clinical data rather than from the human experts who provided the decision model. The induced parameters provide an insight into both the data relationships and how experts make decisions themselves. The outcomes help further the understanding of human decision making and, in particular, help GRiST provide more accurate emulations of risk judgements. Although the algorithms and methods presented in this dissertation are applied to GRiST, they can be adopted for other human knowledge engineering domains.
Abstract:
The paper describes the architecture of SCIT, a supercomputer system of cluster type, and the base architectural features used during this research project. The system has been in research operation at the Glushkov Institute of Cybernetics, NAS of Ukraine, since early 2006. The paper may be useful to scientists and engineers practically engaged in the design, integration and servicing of cluster supercomputer systems.
Abstract:
The paper describes the cluster management software and hardware of the SCIT supercomputer clusters built at the Glushkov Institute of Cybernetics, NAS of Ukraine. It presents the performance results obtained on the systems that were built and the specific means used to increase performance. It should be useful to scientists and engineers practically engaged in the design, integration and servicing of cluster supercomputer systems.
Abstract:
* This paper was prepared under the programme of fundamental scientific research of the Presidium of the Russian Academy of Sciences "Mathematical simulation and intellectual systems", project "Theoretical foundations of intellectual systems based on ontologies for intellectual support of scientific research".
Abstract:
We propose a method for detecting and analyzing so-called replay attacks in intrusion detection systems, in which an intruder inserts a small number of hostile actions into a recorded session of a legitimate user or process and replays the session back to the system. The proposed approach can be applied if an automata-based model is used to describe the behavior of active entities in a computer system.
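A minimal sketch of the automata-based idea: legitimate behaviour is a finite automaton over action symbols, and injected hostile actions show up as transitions the automaton does not allow. The states, actions and transition table here are invented for illustration, not the paper's model:

```python
# Legitimate user behaviour as a finite automaton: (state, action) -> next state.
LEGIT = {
    ("logged_out", "login"): "shell",
    ("shell", "read_file"): "shell",
    ("shell", "write_file"): "shell",
    ("shell", "logout"): "logged_out",
}

def flag_injected_actions(session, start="logged_out"):
    """Return the indices of actions that deviate from the behaviour model;
    in a replayed session these are candidates for injected hostile actions."""
    state, flagged = start, []
    for i, action in enumerate(session):
        nxt = LEGIT.get((state, action))
        if nxt is None:
            flagged.append(i)       # impossible in the current state: suspect
        else:
            state = nxt
    return flagged

recorded = ["login", "read_file", "logout"]                     # clean session
replayed = ["login", "read_file", "spawn_root_shell", "logout"]  # one injection
```

The recorded session traverses the automaton without deviation, while the replayed session flags the injected action's position.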