45 results for Secure operating system

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

Communication and portability are the two main problems facing the user. An operating system, called PORTOS, was developed to solve these problems for users on dedicated microcomputer systems. Firstly, an interface language was defined, according to the anticipated requirements and behaviour of its potential users. Secondly, the PORTOS operating system was developed as a processor for this language. The system is currently running on two minicomputers of highly different architectures. PORTOS achieves its portability through its high-level design and implementation in CORAL66. The interface language consists of a set of user commands and system responses. Although only a subset has been implemented, owing to time and manpower constraints, promising results were achieved regarding the usability of the language and its portability.
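As an illustration of the general idea of an interface-language processor layered over a small machine-dependent kernel, the following is a minimal sketch in Python. PORTOS itself was written in CORAL66 and its actual command set is not given above, so the commands, responses and layering shown here are hypothetical.

```python
# A minimal sketch (not PORTOS itself) of a portable command-language
# processor split into a machine-dependent layer and a machine-independent
# interpreter. Command names and responses are invented for illustration.

# Machine-dependent layer: only these primitives would need re-implementing
# when moving to a different architecture.
def read_line():
    return input("> ")

def write_line(text):
    print(text)

# Machine-independent layer: parses user commands and issues system responses.
COMMANDS = {}

def command(name):
    def register(fn):
        COMMANDS[name] = fn
        return fn
    return register

@command("LIST")
def do_list(args):
    return "NO FILES PRESENT"          # placeholder response

@command("HELP")
def do_help(args):
    return "COMMANDS: " + ", ".join(sorted(COMMANDS))

def interpreter():
    while True:
        words = read_line().split()
        if not words:
            continue
        name, args = words[0].upper(), words[1:]
        if name == "EXIT":
            break
        handler = COMMANDS.get(name)
        write_line(handler(args) if handler else f"UNKNOWN COMMAND: {name}")

if __name__ == "__main__":
    interpreter()
```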

Relevance:

80.00%

Publisher:

Abstract:

Emerging markets have recently experienced a dramatic increase in the number of mobile phones per capita. M-government has hence been heralded as an opportunity to leap-frog the technology cycle and provide cheaper and more inclusive services to all. This chapter explores, within an emerging-market context, the legitimacy and resistance facing civil servants at the engagement stage with m-government activities, and the direct implications for resource management. Thirty in-depth interviews with key ICT civil servants in local organizations in Turkey are drawn upon. The findings show that three types of resources are perceived as central, namely: (i) diffusion of information management, (ii) operating system resource management and (iii) human resource management. The main evidence suggests that legitimacy for each resource management activity, at the local level, is an ongoing struggle in which all groups deploy multiple forms of resistance. Overall, greater attention in the resource management strategy for m-government applications needs to be devoted to enablers such as civil servants rather than to the final consumers or citizens.

Relevance:

80.00%

Publisher:

Abstract:

With the advent of distributed computer systems with a largely transparent user interface, new questions have arisen regarding the management of such an environment by an operating system. One fertile area of research is that of load balancing, which attempts to improve system performance by redistributing the workload submitted to the system by the users. Early work in this field concentrated on static placement of computational objects to improve performance, given prior knowledge of process behaviour. More recently this has evolved into studying dynamic load balancing with process migration, thus allowing the system to adapt to varying loads. In this thesis, we describe a simulated system which facilitates experimentation with various load balancing algorithms. The system runs under UNIX and provides functions for user processes to communicate through software ports; processes reside on simulated homogeneous processors, connected by a user-specified topology, and a mechanism is included to allow migration of a process from one processor to another. We present the results of a study of adaptive load balancing algorithms, conducted using the aforementioned simulated system, under varying conditions; these results show the relative merits of different approaches to the load balancing problem, and we analyse the trade-offs between them. Following from this study, we present further novel modifications to suggested algorithms, and show their effects on system performance.
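As a flavour of the kind of adaptive policy such a simulation lets one experiment with, here is a minimal sketch of a sender-initiated, threshold-based balancing step with process migration between simulated homogeneous processors. The thresholds, probe limit and data structures are illustrative assumptions, not the algorithms actually studied in the thesis.

```python
# Illustrative sketch only: one sender-initiated, threshold-based load
# balancing step of the kind such simulations typically compare.
import random
from dataclasses import dataclass, field

THRESHOLD = 4      # queue length above which a processor tries to offload
PROBE_LIMIT = 3    # neighbours probed before giving up

@dataclass
class Processor:
    name: str
    run_queue: list = field(default_factory=list)

def balance_step(node, neighbours):
    """Sender-initiated step: if this node is overloaded, probe a few
    neighbours and migrate one waiting process to the least loaded found."""
    if len(node.run_queue) <= THRESHOLD:
        return
    probed = random.sample(neighbours, min(PROBE_LIMIT, len(neighbours)))
    target = min(probed, key=lambda n: len(n.run_queue))
    if len(target.run_queue) < len(node.run_queue) - 1:
        target.run_queue.append(node.run_queue.pop())   # simulated migration

# Toy demonstration with four simulated processors of unequal load.
nodes = [Processor(f"p{i}", ["job"] * load) for i, load in enumerate([9, 1, 2, 3])]
balance_step(nodes[0], nodes[1:])
print([(n.name, len(n.run_queue)) for n in nodes])
```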

Relevance:

80.00%

Publisher:

Abstract:

The computer systems of today are characterised by data and program control that are distributed functionally and geographically across a network. A major issue of concern in this environment is the operating system activity of resource management for the different processors in the network. To ensure equity in load distribution and improved system performance, load balancing is often undertaken. The research conducted in this field so far has been primarily concerned with a small set of algorithms operating on tightly-coupled distributed systems. More recent studies have investigated the performance of such algorithms in loosely-coupled architectures, but using a small set of processors. This thesis describes a simulation model developed to study the behaviour and general performance characteristics of a range of dynamic load balancing algorithms. Further, the scalability of these algorithms is discussed and a range of regionalised load balancing algorithms is developed. In particular, we examine the impact of network diameter and delay on the performance of such algorithms across a range of system workloads. The results produced seem to suggest that the performance of simple dynamic policies is scalable, but that they lack the load stability of more complex global average algorithms.
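To make the contrast concrete, the following sketch places the two policy families side by side: a simple fixed-threshold rule and a global/regional-average rule. The parameter values and the averaging mechanism are illustrative assumptions only.

```python
# A sketch, under assumptions, of the two policy families contrasted above.

def simple_dynamic(own_load, threshold=5):
    """Offload whenever the local queue exceeds a fixed threshold."""
    return own_load > threshold

def global_average(own_load, region_loads, tolerance=1.5):
    """Offload when the local queue is well above the average load of the
    region; in a regionalised scheme the average is computed only over a
    neighbourhood, which keeps the exchange of load information manageable
    as the network diameter grows."""
    average = sum(region_loads) / len(region_loads)
    return own_load > average + tolerance

print(simple_dynamic(7))                       # True
print(global_average(7, [2, 3, 7, 4, 3]))      # True: 7 > 3.8 + 1.5
```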

Relevance:

80.00%

Publisher:

Abstract:

A small lathe has been modified to work under microprocessor control to enhance the facilities which the lathe offers and to provide a wider operating range with relevant economic gains. The result of these modifications is better operating system characteristics. A system of electronic circuits has been developed, utilising the latest technology, to replace the pegboard and its associated obsolete electrical components. Software for the system includes control programmes for the implementation of the original pegboard operation, and several sample machine-code programmes are included, covering a wide spectrum of applications, including diagnostic testing of the control system. It is concluded that it is possible to carry out a low-cost retrofit on existing machine tools to enhance their range of capabilities.

Relevance:

80.00%

Publisher:

Abstract:

This study is concerned with several proposals concerning multiprocessor systems and with the various possible methods of evaluating such proposals. After a discussion of the advantages and disadvantages of several performance evaluation tools, the author decides that simulation is the only tool powerful enough to develop a model which would be of practical use in the design, comparison and extension of systems. The main aims of the simulation package developed as part of this study are cost-effectiveness, ease of use and generality. The methodology on which the simulation package is based is described in detail. The fundamental principles are that model design should reflect actual system design, that measuring procedures should be carried out alongside design, that models should be well documented and easily adaptable, and that models should be dynamic. The simulation package itself is modular, and in this way reflects current design trends; this approach also aids documentation and ensures that the model is easily adaptable. It contains a skeleton structure and a library of segments which can be added to, or directly swapped with, segments of the skeleton structure to form a model which fits a user's requirements. The study also contains the results of some experimental work carried out using the model, the first part of which tests the model's capabilities by simulating a large operating system, the ICL George 3 system; the second part deals with general questions and some of the many proposals concerning multiprocessor systems.
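The modular "skeleton plus library of segments" idea can be sketched as follows; the segment names and workload below are invented for illustration and do not reproduce the original package.

```python
# Minimal sketch of a skeleton event-driven simulator into which library
# "segments" (here, ordinary Python callables) can be slotted or swapped.
import heapq

class Skeleton:
    def __init__(self):
        self.clock = 0.0
        self.events = []            # (time, sequence, segment name, payload)
        self.segments = {}          # swappable model components
        self._seq = 0

    def plug(self, name, segment):
        """Add or replace a segment of the model."""
        self.segments[name] = segment

    def schedule(self, delay, name, payload=None):
        heapq.heappush(self.events, (self.clock + delay, self._seq, name, payload))
        self._seq += 1

    def run(self, until=10.0):
        while self.events and self.clock < until:
            self.clock, _, name, payload = heapq.heappop(self.events)
            self.segments[name](self, payload)

# Example segments: a job source and a single processor.
def source(sim, _):
    sim.schedule(0.0, "cpu", "job")
    sim.schedule(1.0, "source")          # next arrival

def cpu(sim, job):
    print(f"t={sim.clock:.1f}: {job} serviced")

sim = Skeleton()
sim.plug("source", source)
sim.plug("cpu", cpu)
sim.schedule(0.0, "source")
sim.run(until=3.0)
```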

Relevance:

80.00%

Publisher:

Abstract:

Many planning and control tools, especially network analysis, have been developed in the last four decades. The majority of them were created in military organizations to solve the problem of planning and controlling research and development projects. The original version of the network model (i.e. C.P.M./PERT) was transplanted to the construction industry without consideration of the special nature and environment of construction projects. It suited the purpose of setting up targets and defining objectives, but it failed to satisfy the requirements of detailed planning and control at the site level. Several analytical and heuristic rule-based methods were designed and combined with the structure of C.P.M. to eliminate its deficiencies, but none of them provides a complete solution to the problem of resource, time and cost control. VERT was designed to deal with new ventures and is suitable for project evaluation at the development stage. CYCLONE, on the other hand, is concerned with the design and micro-analysis of the production process. This work introduces an extensive critical review of the available planning techniques and addresses the problem of planning for site operation and control. Based on an outline of the nature of site control, this research developed a simulation-based network model which combines part of the logic of both VERT and CYCLONE. Several new nodes were designed to model the availability and flow of resources and the overhead and operating costs, together with special nodes for evaluating time and cost. A large software package was written to handle the input, the simulation process and the output of the model. This package is designed to be used on any microcomputer running the MS-DOS operating system. Data from real-life projects were used to demonstrate the capability of the technique. Finally, a set of conclusions is drawn regarding the features and limitations of the proposed model, and recommendations for future work are outlined at the end of this thesis.

Relevance:

80.00%

Publisher:

Abstract:

To exploit the popularity of TCP, still the dominant protocol of choice for transporting data reliably across the heterogeneous Internet, this thesis explores end-to-end performance issues and the behaviour of TCP senders when transferring data to wireless end-users. The focus throughout is on end-users located within 802.11 WLANs at the edges of the Internet, a largely untapped area of work. To serve the interests of researchers wanting to study the performance of TCP accurately under heterogeneous conditions, this thesis proposes a flexible wired-to-wireless experimental testbed that better reflects conditions in the real world. To exploit the transparent interworking between TCP in the wired domain and the IEEE 802.11 WLAN protocols, this thesis proposes a more accurate methodology for gauging the transmission and error characteristics of real-world 802.11 WLANs, and aims to correlate any findings with the behaviour of fixed TCP senders. To exploit the popularity of Linux as the operating system of many of the Internet's data servers, this thesis studies and evaluates various sender-side TCP congestion control implementations within the recent Linux v2.6. A selection of the implementations is put under systematic testing using real-world wired-to-wireless conditions in order to screen and present viable candidates for further development and use in the modern-day heterogeneous Internet. Overall, this thesis comprises a set of systematic evaluations of TCP senders over 802.11 WLANs, incorporating measurements in the form of simulations, emulations, and a real-world-like experimental testbed. The goal of the work is to ensure that all aspects concerned are comprehensively investigated in order to establish rules that can help to decide under which circumstances the deployment of TCP is optimal, i.e. a set of paradigms for advancing the state of the art in data transport across the Internet.
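For readers unfamiliar with what sender-side congestion control does, the following is a textbook-style sketch of the additive-increase/multiplicative-decrease (AIMD) rule underlying Reno-style TCP senders. It is not taken from the Linux v2.6 sources evaluated in the thesis; it simply illustrates why losses on a wireless hop and genuine congestion are treated identically by the sender.

```python
# Textbook AIMD illustration (not the thesis's evaluated implementations).
# A loss indication over a lossy 802.11 hop triggers the same window
# reduction as genuine congestion, one motivation for careful
# wired-to-wireless measurement.

def aimd_update(cwnd, ssthresh, loss=False, mss=1):
    """Return (cwnd, ssthresh) after one ACK or one loss indication,
    window expressed in units of MSS."""
    if loss:
        ssthresh = max(cwnd / 2, 2 * mss)   # multiplicative decrease
        return ssthresh, ssthresh           # fast-recovery style restart
    if cwnd < ssthresh:
        cwnd += mss                         # slow start: doubles per RTT
    else:
        cwnd += mss * mss / cwnd            # congestion avoidance: ~1 MSS per RTT
    return cwnd, ssthresh

cwnd, ssthresh = 1.0, 8.0
for _ in range(12):
    cwnd, ssthresh = aimd_update(cwnd, ssthresh)
print(round(cwnd, 2), ssthresh)   # window grows past ssthresh, then linearly
```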

Relevance:

80.00%

Publisher:

Abstract:

A view has emerged within manufacturing and service organizations that the operations management function can hold the key to achieving competitive edge. This has recently been emphasized by the demands for greater variety and higher quality, which must be set against a background of increasing cost of resources. As nations' trade barriers are progressively lowered and removed, producers of goods and service products are becoming more exposed to competition that may come from virtually anywhere around the world. Simply to survive in this climate, many organizations have found it necessary to improve their manufacturing or service delivery systems. To become real "winners" some have adopted a strategic approach to operations and completely reviewed and restructured their approach to production system design and operations planning and control.

The articles in this issue of the International Journal of Operations & Production Management have been selected to illustrate current thinking and practice in relation to this situation. They are all based on papers presented to the Sixth International Conference of the Operations Management Association-UK, which was held at Aston University in June 1991. The theme of the conference was "Achieving Competitive Edge" and authors from 15 countries around the world contributed more than 80 presented papers. Within this special issue five topic areas are addressed, with two articles relating to each. The topics are: strategic management of operations; managing change; production system design; production control; and service operations.

Under strategic management of operations, De Toni, Filippini and Forza propose a conceptual model which considers the performance of an operating system as a source of competitive advantage through the "operation value chain" of design, purchasing, production and distribution. Their model is set within the context of the tendency towards globalization. New's article is somewhat in contrast to the more fashionable literature on operations strategy. It challenges the validity of the current idea of "world-class manufacturing" and, instead, urges a reconsideration of the view that strategic "trade-offs" are necessary to achieve a competitive edge.

The importance of managing change has for some time been recognized within the field of organization studies, but its relevance to operations management is now being realized. Berger considers the use of "organization design", "sociotechnical systems" and change strategies, and contrasts these with the more recent idea of the "dialogue perspective". A tentative model is suggested to improve the analysis of different strategies in a situation-specific context. Neely and Wilson look at an essential prerequisite if change is to be effected in an efficient way, namely product goal congruence. Using a case study as its basis, their article suggests a method of measuring goal congruence as a means of identifying the extent to which key performance criteria relating to quality, time, cost and flexibility are understood within an organization.

The two articles on production system design represent important contributions to the debate on flexible production organization and autonomous group working. Rosander uses the results from cases to test the applicability of "flow groups" as the optimal way of organizing batch production. Schuring also examines cases to determine the reasons behind the adoption of "autonomous work groups" in The Netherlands and Sweden. Both these contributions help to provide a greater understanding of the production philosophies which have emerged as alternatives to more conventional systems for intermittent and continuous production.

The production control articles are both concerned with the concepts of "push" and "pull", which are the two broad approaches to material planning and control. Hirakawa, Hoshino and Katayama have developed a hybrid model, suitable for multistage manufacturing processes, which combines the benefits of both systems. They discuss the theoretical arguments in support of the system and illustrate its performance with numerical studies. Slack and Correa's concern is with the flexibility characteristics of push and pull material planning and control systems. They use the case of two plants using the different systems to compare their performance within a number of predefined flexibility types.

The two final contributions on service operations are complementary. The article by Voss really relates to manufacturing but examines the application of service industry concepts within the UK manufacturing sector. His studies in a number of companies support the idea of the "service factory" and offer a new perspective for manufacturing. Harvey's contribution, by contrast, is concerned with the application of operations management principles in the delivery of professional services. Using the case of social-service provision in Canada, it demonstrates how concepts such as "just-in-time" can be used to improve service performance.

The ten articles in this special issue of the journal address a wide range of issues and situations. Their common aspect is that, together, they demonstrate the extent to which competitiveness can be improved via the application of operations management concepts and techniques.

Relevance:

80.00%

Publisher:

Abstract:

Background: Remote, non-invasive and objective tests that can be used to support expert diagnosis of Parkinson's disease (PD) are lacking. Methods: Participants underwent baseline in-clinic assessments, including the Unified Parkinson's Disease Rating Scale (UPDRS), and were provided with smartphones running the Android operating system, containing an application that assessed voice, posture, gait, finger tapping, and response time. Participants then took the smartphones home to perform the five tasks four times a day for a month. Once a week, participants had a remote (telemedicine) visit with a Parkinson's disease specialist in which a modified UPDRS (excluding assessments of rigidity and balance) was performed. Using statistical analyses of the five tasks recorded on the smartphone from 10 individuals with PD and 10 controls, we sought to: (1) discriminate whether the participant had PD and (2) predict the modified motor portion of the UPDRS. Results: Twenty participants performed an average of 2.7 tests per day (68.9% adherence) for the study duration (average of 34.4 days) in a home and community setting. The results of the five tasks differed between those with PD and those without. In discriminating participants with PD from controls, the mean sensitivity was 96.2% (SD 2%) and the mean specificity was 96.9% (SD 1.9%). The mean error in predicting the modified motor component of the UPDRS (range 11-34) was 1.26 UPDRS points (SD 0.16). Conclusion: Measuring PD symptoms via a smartphone is feasible and has potential value as a diagnostic support tool.
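The headline accuracy figures are conventional sensitivity and specificity. The sketch below shows how they are defined and computed, using invented predictions rather than the study's data.

```python
# Sensitivity: fraction of PD participants correctly flagged.
# Specificity: fraction of controls correctly cleared.
# The labels below are hypothetical, for illustration only.

def sensitivity_specificity(y_true, y_pred):
    """y_true/y_pred: 1 = PD, 0 = control."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical classifier output for 10 PD participants and 10 controls:
truth      = [1] * 10 + [0] * 10
prediction = [1] * 9 + [0] + [0] * 9 + [1]
print(sensitivity_specificity(truth, prediction))   # (0.9, 0.9)
```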

Relevance:

80.00%

Publisher:

Abstract:

We report on the generation of an orthogonally polarized bright–dark pulse pair in a passively mode-locked fiber laser with a large-angle tilted fiber grating (LA-TFG). The unique polarization properties of the LA-TFG, i.e., polarization-dependent loss and polarization-mode splitting, enable dual-wavelength mode-locking operation. Besides dual-wavelength bright pulses with uniform polarization at two different wavelengths, a bright–dark pulse pair has also been achieved. It is found that the bright–dark pulse pair is formed by the nonlinear couplings between light at two orthogonal polarizations and two different wavelengths. Furthermore, harmonic mode-locking of the bright–dark pulse pair has been observed. The obtained bright–dark pulse pair could find potential use in secure communication systems. It also paves the way to manipulating the generation of dark pulses in terms of wavelength and polarization, using specially designed fiber gratings for mode-locking.

Relevance:

40.00%

Publisher:

Abstract:

This study investigates the degree of global standardisation of corporate visual identity systems (CVIS) in multinational operations, with special emphasis on UK companies operating in Malaysia. In particular, the study seeks to reveal the reasons for developing a standardised CVIS, the behavioural issues associated with CVIS, and the determinants of graphic design agency selection. The findings reveal that multinational corporations in an increasingly corporate environment adopted a standardised CVIS for several reasons, including aiding the sale of products and services, creating an attractive environment for hiring employees, and increasing the company's stature and presence. Further findings show that interest in global identity was stimulated by global restructuring, merger or acquisition. These trends help explain why increased focus has been accorded to CVIS over the past five years by many UK companies operating in Malaysia. Additional findings reveal that both UK design agencies and in-house design departments are used in the development of the firms' CVIS.

Relevance:

30.00%

Publisher:

Abstract:

In order to survive in an increasingly customer-oriented marketplace, continuous quality improvement marks the success of the fastest-growing quality organizations. In recent years, attention has been focused on intelligent systems, which have shown great promise in supporting quality control. However, only a small number of the currently used systems are reported to be operating effectively, because they are designed to maintain a quality level within a specified process rather than to focus on cooperation within the production workflow. This paper proposes an intelligent system with a newly designed algorithm and a universal process data exchange standard to overcome the challenges of demanding customers who seek high-quality and low-cost products. The intelligent quality management system is equipped with a "distributed process mining" feature to provide employees at all levels with the ability to understand the relationships between processes, especially when any aspect of a process is about to degrade or fail. An example of generalized fuzzy association rules is applied in the manufacturing sector to demonstrate how the proposed iterative process mining algorithm finds the relationships between distributed process parameters and the presence of quality problems.
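As a simplified illustration of the rule-mining step (crisp rather than fuzzy, single-site rather than distributed, and with invented records), support and confidence for a candidate rule linking a process parameter to a quality problem can be computed as follows.

```python
# Simplified association-rule sketch; the real system uses generalized
# *fuzzy* rules mined iteratively across distributed process data.

records = [  # one hypothetical record per produced item
    {"oven_temp": "high", "line": "A", "defect": True},
    {"oven_temp": "high", "line": "B", "defect": True},
    {"oven_temp": "normal", "line": "A", "defect": False},
    {"oven_temp": "normal", "line": "B", "defect": False},
    {"oven_temp": "high", "line": "A", "defect": False},
]

def support_confidence(records, antecedent, consequent):
    """Rule: antecedent => consequent, each given as a dict of field values."""
    matches_a = [r for r in records if all(r[k] == v for k, v in antecedent.items())]
    matches_both = [r for r in matches_a if all(r[k] == v for k, v in consequent.items())]
    support = len(matches_both) / len(records)
    confidence = len(matches_both) / len(matches_a) if matches_a else 0.0
    return support, confidence

# "High oven temperature => defect" holds in 2 of 5 records overall, and in
# 2 of the 3 records where the temperature was high:
print(support_confidence(records, {"oven_temp": "high"}, {"defect": True}))
# (0.4, 0.666...)
```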

Relevance:

30.00%

Publisher:

Abstract:

In data envelopment analysis (DEA), operating units are compared on their outputs relative to their inputs. The identification of an appropriate input-output set is of decisive significance if assessment of the relative performance of the units is not to be biased. This paper reports on a novel approach used for identifying a suitable input-output set for assessing central administrative services at universities. A computer-supported group support system was used with an advisory board to enable the analysts to extract information pertaining to the boundaries of the unit of assessment and the corresponding input-output variables. The approach provides for a more comprehensive and less inhibited discussion of input-output variables to inform the DEA model.
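For context, DEA scores of this kind are commonly obtained from the standard input-oriented CCR envelopment model. The sketch below solves that model with SciPy's linear-programming routine for invented units, inputs and outputs; choosing the right input-output set is precisely the problem the paper's group-support approach addresses, and nothing here reproduces the paper's actual model.

```python
# Standard input-oriented CCR envelopment model, solved per unit.
import numpy as np
from scipy.optimize import linprog

# Columns = units (e.g. central administrative services); values are invented.
X = np.array([[20.0, 30.0, 25.0, 40.0]])        # one input:  staff cost
Y = np.array([[100.0, 120.0, 90.0, 150.0]])     # one output: transactions handled

def ccr_efficiency(X, Y, o):
    """Efficiency of unit o; decision variables are [theta, lambda_1..n]."""
    m, n = X.shape      # m inputs, n units
    s, _ = Y.shape      # s outputs
    c = np.r_[1.0, np.zeros(n)]                       # minimise theta
    A_in = np.hstack([-X[:, [o]], X])                 # sum lam*x <= theta*x_o
    A_out = np.hstack([np.zeros((s, 1)), -Y])         # sum lam*y >= y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

for o in range(X.shape[1]):
    print(f"unit {o}: efficiency = {ccr_efficiency(X, Y, o):.3f}")
```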

Relevance:

30.00%

Publisher:

Abstract:

A novel quasi-distributed in-fiber Bragg grating (FBG) temperature sensor system has been developed for temperature profiling in vivo in the human body for medical applications, e.g., hyperthermia treatment. This paper describes the operating principle of FBG temperature sensors and the design of the sensor head. High-resolution detection of the wavelength shifts induced by temperature changes is achieved using drift-compensated interferometric detection, while the return signals from the FBG sensor array are demultiplexed with a simple monochromator which offers crosstalk-free wavelength-division multiplexing (WDM). A "strain-free" probe is designed by enclosing the FBG sensor array in a protection sleeve. A four-FBG sensor system is demonstrated, and the experimental results are in good agreement with those obtained by traditional electrical thermocouple sensors. A resolution of 0.1°C and an accuracy of ±0.2°C over a temperature range of 30-60°C have been achieved, which meet established medical requirements.
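For a sense of the scale of the quantity being detected, the sketch below uses typical textbook silica coefficients (not the calibration of the sensor reported here, and an assumed 1550 nm Bragg wavelength) to estimate the wavelength shift corresponding to a 0.1°C change, which indicates why picometre-level interferometric detection is attractive.

```python
# Back-of-envelope sketch: a silica FBG's Bragg wavelength shifts with
# temperature roughly as d(lambda_B)/dT ~ lambda_B * (alpha + xi),
# where alpha is the thermal-expansion and xi the thermo-optic coefficient.
# Coefficient values are typical textbook figures, not this sensor's.

ALPHA = 0.55e-6     # /degC, thermal expansion of silica (typical)
XI = 8.6e-6         # /degC, thermo-optic coefficient (typical)

def wavelength_shift_pm(bragg_nm, delta_T):
    """Approximate Bragg wavelength shift in picometres for a temperature change."""
    return bragg_nm * 1e3 * (ALPHA + XI) * delta_T

# Around 1550 nm the sensitivity is on the order of 14 pm/degC, so resolving
# 0.1 degC means resolving a wavelength shift of roughly a picometre.
print(wavelength_shift_pm(1550.0, 0.1))   # ~1.4 pm
```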