832 results for TWT (Electronic computer system)
Abstract:
One way of describing this thesis is to state that it attempts to explicate the context within which an application of Stafford Beer's Viable System Model (VSM) makes cybernetic sense. The thesis attempts to explain how such a context is presently not clearly enunciated, and why this lack hinders communication of the model and its consequent effective take-up by the student or practitioner. The epistemological grounding of the VSM will be described as concerning the ontology of the individuals who apply it and give witness to its application. In describing a particular grounding for the Viable System Model, I am instantiating a methodology which I call a 'hermeneutics of distinction'. The final two chapters explicate this methodology and consider the implications for the design of a computer system. This thesis is grounded in contemporary insights into the nervous system, and in research into the biology of language and cognition. Its conclusions emerge from a synthesis of the twin discourses of Stafford Beer and Humberto Maturana.
Abstract:
Over the past decade, several experienced Operational Researchers have advanced the view that the theoretical aspects of model building have raced ahead of the ability of people to use them. Consequently, the impact of Operational Research on commercial organisations and the public sector is limited, and many systems fail to achieve their anticipated benefits in full. The primary objective of this study is to examine a complex interactive Stock Control system, and identify the reasons for the differences between the theoretical expectations and the operational performance. The methodology used is to hypothesise all the possible factors which could cause a divergence between theory and practice, and to evaluate numerically the effect each of these factors has on two main control indices - Service Level and Average Stock Value. Both analytical and empirical methods are used, and simulation is employed extensively. The factors are divided into two main categories for analysis - theoretical imperfections in the model, and the usage of the system by Buyers. No evidence could be found in the literature of any previous attempts to place the differences between theory and practice in a system in quantitative perspective nor, more specifically, to study the effects of Buyer/computer interaction in a Stock Control system. The study reveals that, in general, the human factors influencing performance are of a much higher order of magnitude than the theoretical factors, thus providing objective evidence to support the original premise. The most important finding is that, by judicious intervention into an automatic stock control algorithm, it is possible for Buyers to produce results which not only attain but surpass the algorithmic predictions. However, the complexity and behavioural recalcitrance of these systems are such that an innately numerate, enquiring type of Buyer needs to be inducted to realise the performance potential of the overall man/computer system.
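A minimal sketch of the kind of simulation described above, assuming a simple reorder-point policy with Poisson demand; the policy, parameters and function names are illustrative and not taken from the thesis. It shows how the two control indices, Service Level and Average Stock Value, can be computed from a simulated run:

```python
import math
import random

def poisson(rng, lam):
    """Sample a Poisson variate (Knuth's method)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def simulate(reorder_point, order_qty, unit_cost, demand_mean,
             lead_time=3, periods=2000, seed=1):
    """Simulate a reorder-point stock control policy and report the two
    control indices discussed above: Service Level and Average Stock Value."""
    rng = random.Random(seed)
    stock, pipeline = order_qty, []        # on-hand stock, outstanding orders
    served = demanded = stock_sum = 0
    for t in range(periods):
        stock += sum(q for due, q in pipeline if due == t)   # receive deliveries
        pipeline = [(due, q) for due, q in pipeline if due > t]
        d = poisson(rng, demand_mean)
        demanded += d
        sold = min(stock, d)               # unmet demand is lost
        served += sold
        stock -= sold
        position = stock + sum(q for _, q in pipeline)
        if position <= reorder_point:      # the automatic algorithm reorders
            pipeline.append((t + lead_time, order_qty))
        stock_sum += stock
    service_level = served / demanded if demanded else 1.0
    return service_level, unit_cost * stock_sum / periods

print(simulate(reorder_point=30, order_qty=60, unit_cost=2.5, demand_mean=8))
```

A Buyer's intervention can then be modelled as an override of the reorder decision, and its effect read directly off the same two indices.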
Abstract:
This thesis describes an investigation of the effect of elevated temperatures upon the properties of plain concrete containing a siliceous aggregate. A complete stress-strain relationship and creep behaviour are studied. Transient (non-steady-state) effects are also examined in order to simulate more realistic conditions. A temperature range of 20-700ºC is used, corresponding to the temperatures generally attained during an actual fire. In order to carry out the requisite tests, a stiff compression testing machine has been designed and built. Overall control of the test rig is provided by a logger/computer system running purpose-written software, enabling the load to be held constant for any period of time. Before the development of the testing apparatus, which includes an electric furnace and the associated instrumentation, is outlined, previous work on the properties of both concrete and steel at elevated temperatures is reviewed. The test programme comprises four series of tests: stress-strain tests (with and without pre-load), transient tests (heating to failure under constant stress) and creep tests (constant stress and constant temperature). Three stress levels are examined: 0.2, 0.4 and 0.6 fc. The experimental results show that the properties of concrete are significantly affected by temperature and the magnitude of the load. The slope of the descending branch of the stress-strain curves (strain softening) is found to be temperature dependent. After normalizing the data, the stress-strain curves for different temperatures are represented by a single curve. The creep results are analysed using an approach involving the activation energy, which is found to be constant. The analysis shows that the time-dependent deformation is sensibly linear with the applied stress. The total strain concept is shown to hold for the test data within limits.
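For the activation-energy analysis mentioned above, a standard Arrhenius-type treatment is the usual approach; the sketch below assumes that form, with illustrative (not measured) creep rates:

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def activation_energy(rate1, T1, rate2, T2):
    """Estimate the activation energy Q from steady-state creep rates at two
    absolute temperatures, assuming an Arrhenius law rate = A*exp(-Q/(R*T))."""
    return R * math.log(rate2 / rate1) / (1.0 / T1 - 1.0 / T2)

# Illustrative (not measured) creep rates at 200 and 400 degrees C:
Q = activation_energy(rate1=1.2e-6, T1=473.15, rate2=9.5e-6, T2=673.15)
print(f"Q = {Q / 1000:.1f} kJ/mol")
```

A constant Q, as the abstract reports, means the same value is recovered from any pair of test temperatures.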
Abstract:
In 1974 Dr D M Bramwell published his research work at the University of Aston, a part of which was the establishment of an elemental work study data base covering drainage construction. The Transport and Road Research Laboratory decided to extend that work as part of their continuing research programme into the design and construction of buried pipelines by placing a research contract with Bryant Construction. This research may be considered under two broad categories. In the first, site studies were undertaken to validate and extend the data base. The studies showed good agreement with the existing data, with the exception of the excavation, trench shoring and pipelaying data, which were amended to incorporate new construction plant and methods. An interactive on-line computer system for drainage estimating was developed. This system stores the elemental data, synthesizes the standard time of each drainage operation, and is used to determine the required resources and construction method of the total drainage activity. The remainder of the research was into the general topic of construction efficiency. An on-line command-driven computer system was produced. This system uses a stochastic simulation technique, based on distributions of site efficiency measurements, to evaluate the effects of varying performance levels. The analysis of this performance data quantifies the variability inherent in construction and demonstrates how some of this variability can be reconciled by considering the characteristics of a contract. A long-term trend of decreasing efficiency with contract duration was also identified. The results obtained from the simulation suite were compared to site records collected from current contracts. This showed that the approach will give comparable answers, but these are greatly affected by the site performance parameters.
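A hedged sketch of the stochastic simulation idea described above: sample a site-efficiency factor from an assumed distribution and scale the elemental standard time of an operation by it. The clamped normal distribution and all parameters are illustrative stand-ins for the measured site data:

```python
import random
import statistics

def simulate_durations(standard_time_hours, n_runs=10_000,
                       eff_mean=0.85, eff_sd=0.12, seed=7):
    """Monte Carlo sketch: sample a site-efficiency factor for each run and
    scale the elemental standard time by it.  A clamped normal distribution
    stands in for the measured site-efficiency distribution."""
    rng = random.Random(seed)
    durations = []
    for _ in range(n_runs):
        eff = min(max(rng.gauss(eff_mean, eff_sd), 0.3), 1.2)  # plausible range
        durations.append(standard_time_hours / eff)
    return statistics.mean(durations), statistics.stdev(durations)

mean_h, sd_h = simulate_durations(standard_time_hours=40.0)
print(f"expected duration {mean_h:.1f} h, spread {sd_h:.1f} h")
```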
Abstract:
This dissertation investigates the very important and current problem of modelling human expertise, an issue that arises in any computer system emulating human decision making. It is prominent in Clinical Decision Support Systems (CDSS) because of the complexity of the induction process and, in most cases, the vast number of parameters. Other issues, such as human error and missing or incomplete data, present further challenges. In this thesis, the Galatean Risk Screening Tool (GRiST) is used as an example of modelling clinical expertise and parameter elicitation. The tool is a mental health clinical record management system with a top layer of decision support capabilities. It is currently being deployed by several NHS mental health trusts across the UK. The aim of the research is to investigate the problem of parameter elicitation by inducing the parameters from real clinical data rather than eliciting them from the human experts who provided the decision model. The induced parameters provide an insight both into the data relationships and into how experts make decisions themselves. The outcomes help further the understanding of human decision making and, in particular, help GRiST provide more accurate emulations of risk judgements. Although the algorithms and methods presented in this dissertation are applied to GRiST, they can be adopted for other human knowledge engineering domains.
Abstract:
The paper describes the architecture of SCIT, a supercomputer system of cluster type, and the base architectural features used during this research project. This supercomputer system has been in research operation at the Glushkov Institute of Cybernetics, NAS of Ukraine, since early 2006. The paper may be useful for those scientists and engineers who are practically engaged in cluster supercomputer system design, integration and services.
Abstract:
The paper describes the cluster management software and hardware of the SCIT supercomputer clusters built at the Glushkov Institute of Cybernetics, NAS of Ukraine. The paper shows the performance results obtained on the systems that were built and the specific means used to achieve the goal of increased performance. It should be useful for those scientists and engineers who are practically engaged in cluster supercomputer system design, integration and services.
Abstract:
* This paper was prepared under the program of fundamental scientific research of the Presidium of the Russian Academy of Sciences «Mathematical simulation and intellectual systems», within the project "Theoretical foundation of the intellectual systems based on ontologies for intellectual support of scientific researches".
Abstract:
We propose a method for detecting and analyzing so-called replay attacks in intrusion detection systems, in which an intruder adds a small number of hostile actions to a recorded session of a legitimate user or process and replays this session back to the system. The proposed approach can be applied if an automata-based model is used to describe the behavior of active entities in a computer system.
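A minimal sketch of what such an automata-based check might look like, with an illustrative transition relation standing in for a real behavior model; the states and action names are hypothetical:

```python
# Toy automaton describing legitimate behavior: (state, action) -> next state.
# The alphabet, states and transitions here are illustrative only.
LEGIT_TRANSITIONS = {
    ("logged_out", "login"):  "shell",
    ("shell", "read_mail"):   "shell",
    ("shell", "edit_file"):   "shell",
    ("shell", "logout"):      "logged_out",
}

def check_session(actions, start="logged_out"):
    """Replay a recorded session against the automaton and report the first
    action that the model of legitimate behavior cannot explain."""
    state = start
    for i, action in enumerate(actions):
        nxt = LEGIT_TRANSITIONS.get((state, action))
        if nxt is None:
            return f"suspicious action {action!r} at position {i} (state {state!r})"
        state = nxt
    return "session consistent with the behavior model"

# A legitimate session with a hostile action spliced into the middle:
replayed = ["login", "read_mail", "spawn_root_shell", "edit_file", "logout"]
print(check_session(replayed))
```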
Abstract:
Objective: The aim of this study was to provide an initial insight into current UK paediatric prescribing practice. Methods: In 2012 focus groups were conducted at Birmingham Children's Hospital (a UK specialist hospital) with both medical and non-medical prescribers and analysed using thematic analysis. Key findings: Both sets of prescribers used a wide range of resources to support their prescribing decisions. Dosing information was most commonly checked, and a lack of specialist paediatric information was reported in existing resources. All groups had high expectations of the support functions that should be included in an electronic prescribing system and could see many potential benefits. Participants agreed that all staff should see the same drug alerts. The overwhelming concern was whether the current information technology infrastructure would support electronic prescribing. Conclusions: Prescribers had high expectations of electronic prescribing but lacked confidence in its delivery. Prescribers use a wide range of resources to support their decision making when prescribing in paediatrics.
Abstract:
Mikhail Konstantinov, Vesela Pasheva, Petko Petkov - The paper considers some numerical problems arising in the use of the computer system MATLAB in teaching: the evaluation of trigonometric functions, raising a matrix to a power, the spectral analysis of low-order integer matrices, and the computation of the roots of algebraic equations. The numerical difficulties encountered can be explained by the peculiarities of the underlying binary floating-point arithmetic.
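The effects described can be reproduced in any binary floating-point environment, not only MATLAB. A short NumPy sketch illustrating two of them, polynomial root-finding and integer matrix powers, follows; the matrices and degrees are illustrative:

```python
import numpy as np

# Roots of p(x) = (x - 1)(x - 2)...(x - 10), recovered from the expanded
# coefficients.  The exact roots are the integers 1..10, but rounding the
# coefficients to binary floating point perturbs the computed roots.
exact = np.arange(1, 11)
recovered = np.sort(np.roots(np.poly(exact)).real)
print("max root error:", np.max(np.abs(recovered - exact)))

def int_matpow(A, n):
    """Exact matrix power using Python's arbitrary-precision integers."""
    size = len(A)
    R = [[int(i == j) for j in range(size)] for i in range(size)]
    for _ in range(n):
        R = [[sum(R[i][k] * A[k][j] for k in range(size))
              for j in range(size)] for i in range(size)]
    return R

# Raising an integer matrix to a power: the float result silently loses
# low-order digits once entries exceed 2**53.
A = [[2, 1], [1, 1]]
print("exact:", int_matpow(A, 60)[0][0])
print("float:", np.linalg.matrix_power(np.array(A, dtype=float), 60)[0][0])
```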
Abstract:
In his discussion - Database As A Tool For Hospitality Management - William O'Brien, Assistant Professor, School of Hospitality Management at Florida International University, offers at the outset, “Database systems offer sweeping possibilities for better management of information in the hospitality industry. The author discusses what such systems are capable of accomplishing.” The author opens with a bit of background on database system development, which also lends an impression as to the complexion of the rest of the article; uh, it’s a shade technical. “In early 1981, Ashton-Tate introduced dBase II. It was the first microcomputer database management processor to offer relational capabilities and a user-friendly query system combined with a fast, convenient report writer,” O’Brien informs. “When 16-bit microcomputers such as the IBM PC series were introduced late the following year, more powerful database products followed: dBase III, Friday!, and Framework. The effect on the entire business community, and the hospitality industry in particular, has been remarkable,” he further offers with his informed outlook. Professor O’Brien offers a few anecdotal situations to illustrate how much a comprehensive database system means to a hospitality operation, especially when billing is involved. Although attitudes about computer systems, as well as the systems themselves, have changed since this article was written, there is pertinent, fundamental information to be gleaned. In regard to the loss of the personal touch when a customer is engaged with a computer system, O’Brien says, “A modern data processing system should not force an employee to treat valued customers as numbers…” He also cautions, “Any computer system that decreases the availability of the personal touch is simply unacceptable.” On a system’s ability to process information, O’Brien suggests that in the past businesses were so enamored with just having an automated system that they failed to take full advantage of its capabilities. O’Brien says that a lot of savings, in time and money, went unnoticed and/or under-appreciated. Today, everyone has an integrated system, and the wise business manager is the one who takes full advantage of all his resources. O’Brien invokes the 80/20 rule, and offers, “…the last 20 percent of results costs 80 percent of the effort. But times have changed. Everyone is automating data management, so that last 20 percent that could be ignored a short time ago represents a significant competitive differential.” The evolution of data systems takes center stage for much of the article; pitfalls also emerge.
Abstract:
In this work, mathematical solutions were developed, taking maximum permissible intensity values as parameters, for the analysis of interference from electric and magnetic fields, and two virtual computer systems supporting the CDMA and WCDMA families of technologies were produced. For the first family, computational resources were developed to solve electric and magnetic field calculations and power densities at radio base stations using CDMA technology in the 800 MHz band, taking into account the permissible values referenced by the International Commission on Non-Ionizing Radiation Protection (ICNIRP). The first family is divided into two calculation segments carried out in virtual operation. The first segment computes the interference field radiated by the base station from input information such as radio channel power, antenna gain, number of radio channels, operating frequency, cable losses, directional attenuation, minimum distance and reflections. This computing system makes it possible, quickly and without deploying measurement instruments, to obtain the following calculated values: effective radiated power; sector power density; electric field in the sector; magnetic field in the sector; magnetic flux density; and the point of maximum permissible exposure for electric field and power density. The results are shown in charts for clarity of viewing the power density in the sector, as well as for defining the coverage area. The computing module also includes specification folders for the antennas, cables and towers used in cellular telephony from the following manufacturers: RFS World, Andrew, Kathrein and BRASILSAT. Internet links are provided to supplement the specifications of cables, antennas, etc. The second segment of the first family works with more variables, seeking to perform calculations quickly and safely to assist in obtaining the radio signal losses produced by the radio base station (ERB). This module displays screens representing two propagation systems, denominated "A" and "B". With propagation system "A", radio signal attenuation calculations are obtained for urban, dense urban, suburban and open rural area models. The reflection calculations include the reflection coefficients, the standing wave ratio, the return loss, the reflected power ratio, and the signal loss due to impedance mismatch. With propagation system "B", radio signal losses are obtained for line-of-sight and non-line-of-sight surveys, along with the effective area, the power density, the received power, the coverage radius, the conversion levels and the gain of radiating conversion systems. The second family of the virtual computing system consists of 7 modules, of which 5 are geared towards WCDMA design and 2 towards the calculation of telephone traffic serving CDMA and WCDMA. It includes a portfolio of the radiating systems used on site. In virtual operation, module 1 computes: frequency reuse distance, channel capacity with and without noise, Doppler frequency, modulation rate and channel efficiency. Module 2 computes the cell area, thermal noise, noise power (dB), noise figure, signal-to-noise ratio and bit power (dBm). Module 3 calculates: breakpoint, processing gain (dB), path loss from the BTS, noise power (W), chip period and frequency reuse factor. Module 4 computes effective radiated power, sectorization gain, voice activity and load effect. Module 5 calculates the processing gain (Hz/bps), bit time and bit energy (Ws).
Module 6 deals with telephone traffic and computes: traffic volume, occupancy intensity, average occupancy time, traffic intensity, completed calls and congestion. Module 7 also deals with telephone traffic and allows the calculation of completed and uncompleted calls in the busy hour (HMM). Field tests of mobile network performance were performed for the calculation of data relating to: CINP, CPI, RSRP, RSRQ, EARFCN, Drop Call, Block Call, Pilot, Data BLER, RSCP, Short Call, Long Call and Data Call; Ec/Io for Short Call and Long Call, and Data Call Throughput. Surveys of the electric and magnetic field at an ERB were also conducted, seeking to observe the degree of exposure to non-ionizing radiation to which the general public and occupational workers are exposed. The results were compared with the permissible values for health endorsed by the ICNIRP and CENELEC.
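The sector calculations of the first segment presumably rest on the standard far-field relations; a minimal sketch under that assumption, with illustrative (not measured) parameters for an 800 MHz CDMA sector, is:

```python
import math

Z0 = 377.0  # impedance of free space, ohms

def sector_fields(p_tx_w, gain_dbi, cable_loss_db, distance_m):
    """Far-field estimates for one sector, using the standard free-space
    relations assumed here:
        EIRP = P_tx * 10**((G - L) / 10)
        S    = EIRP / (4 * pi * d**2)      power density, W/m^2
        E    = sqrt(S * Z0)                electric field, V/m
        H    = E / Z0                      magnetic field, A/m
    """
    eirp = p_tx_w * 10 ** ((gain_dbi - cable_loss_db) / 10.0)
    s = eirp / (4.0 * math.pi * distance_m ** 2)
    e = math.sqrt(s * Z0)
    return s, e, e / Z0

# Illustrative 800 MHz sector: 20 W radio channel, 17 dBi antenna gain,
# 2 dB cable loss, evaluated 50 m from the antenna.
s, e, h = sector_fields(20.0, 17.0, 2.0, 50.0)
print(f"S = {s * 1000:.2f} mW/m^2, E = {e:.2f} V/m, H = {h * 1000:.2f} mA/m")
```

The computed S and E can then be compared against the ICNIRP reference levels for the frequency band in question.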
Abstract:
This work aims to report the preliminary results of a curricular integration experiment, under way at the School of Informatics of UCPel, in the area of the mathematical foundations of Computer Science. The curricular conception of the experiment is based on Basil Bernstein's ideas on collection and/or integration curricula, on the idea of autonomous student development, and on the organization of teaching in a blended form (with Internet support) and cooperatively (with the support of mathematical software).
Abstract:
As the amount of material on the World Wide Web continues to grow, users are discovering that the Web's embedded, hard-coded links are difficult to maintain and update. Hyperlinks need a degree of abstraction in the way they are specified, together with a sound underlying document structure and the property of separability from the documents they link. The case is made by studying the advantages of program/data separation in computer system architectures and also by re-examining some selected hypermedia systems that have already implemented separability. The prospects for introducing more abstract links into future versions of HTML and PDF, via emerging standards such as XPath, XPointer, XLink and URN, are briefly discussed.
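A toy sketch of the separability property being argued for: link specifications live in an external linkbase, keyed by document and anchor, and are resolved at render time. All names, anchors and URLs here are illustrative only:

```python
# A toy external linkbase: link specifications live outside the documents
# they connect, keyed by (document, anchor).  Entries are illustrative.
LINKBASE = {
    ("guide.html", "hypermedia-intro"): "https://example.org/hypermedia/foundations",
    ("guide.html", "xpointer-spec"):    "https://example.org/standards/xpointer",
}

def resolve(document, anchor):
    """Resolve an abstract link at render time; a broken target can then be
    repaired once, in the linkbase, rather than in every document."""
    target = LINKBASE.get((document, anchor))
    if target is None:
        raise KeyError(f"no link registered for {anchor!r} in {document!r}")
    return target

print(resolve("guide.html", "xpointer-spec"))
```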