895 results for Ease of Programming
Abstract:
Due to its wide applicability and ease of use, the analytic hierarchy process (AHP) has been studied extensively for the last 20 years. Recently, the focus has shifted from the stand-alone AHP to applications of integrated AHPs. The five tools most commonly combined with the AHP are mathematical programming, quality function deployment (QFD), meta-heuristics, SWOT analysis, and data envelopment analysis (DEA). This paper reviews the literature on applications of the integrated AHPs. Related articles appearing in international journals from 1997 to 2006 are gathered and analyzed to answer three questions: (i) which type of integrated AHP has received the most attention? (ii) in which areas have the integrated AHPs been most commonly applied? (iii) are there any inadequacies in the approaches? Where inadequacies are found, improvements and possible future work are recommended. This research not only provides evidence that the integrated AHPs outperform the stand-alone AHP, but also helps researchers and decision makers apply the integrated AHPs effectively.
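To make the stand-alone method concrete, the sketch below derives priority weights from a pairwise comparison matrix via the principal eigenvector, the core AHP step that the integrated approaches build on. It is an illustrative assumption added to this listing, not code from the paper; the matrix values and the use of NumPy are hypothetical.

```python
import numpy as np

def ahp_weights(A):
    """Priority weights and consistency ratio (CR) for a pairwise matrix A.

    Classic eigenvector method: the principal eigenvector, normalised to
    sum to 1, gives the criterion weights; CR < 0.1 is conventionally
    taken as acceptable consistency.
    """
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)              # index of principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                             # normalise to priorities
    ci = (eigvals[k].real - n) / (n - 1)     # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # Saaty's random index (n = 3..5)
    return w, ci / ri

# Three criteria compared on Saaty's 1-9 scale (illustrative values).
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w, cr = ahp_weights(A)
print(w.round(2), round(cr, 3))   # roughly [0.65 0.23 0.12], CR ~ 0.003
```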
Abstract:
Globalization has become one of the most important topics within politics and economics. This new title explains some of the related terminology, summarizes the surrounding theories and examines the international organizations involved. With the proliferation of communications and the rise of the multi-national corporation, the concept of globalization is vitally important to the modern political environment. The structure of the modern economy, based on information production and diffusion, has made national boundaries largely irrelevant. A Dictionary of Globalization explains theories, philosophies and ideologies, and includes short biographies of leading activists, theorists and thinkers such as Noam Chomsky, Karl Marx and José Bové. Concepts, issues and terms key to the understanding of globalization also have clear and concise definitions, including democracy, civil society, non-governmental organizations and ethnicity. Cross-referenced for ease of use, this title aims to be of great benefit to anyone studying politics or sociology. It will prove essential to public and academic libraries, as well as to businesses, government departments, embassies and journalists.
Abstract:
2-(2-pyridyl)phenyl(p-ethoxyphenyl)tellurium(II), (RR1Te), reacts with HgCl2 at room temperature to give white HgCl2.RR1Te. On setting aside, or on warming the reaction mixture, a yellow material, [R1HgCl.(RTeCl)2], is formed. Multinuclear NMR (125Te, 199Hg, 1H) and mass spectrometry confirm the formulation, and confirm the ease of transfer of the p-ethoxyphenyl group (R1) between the metal centres. The crystal structure of the yellow material consists of two discrete RTeCl molecules together with an R1HgCl molecule. There is no dative bond formation between these species, hence the preferred description of the product as an inclusion complex. The reaction of RR1Te with copper(I) chloride in the cold gives an air-sensitive yellow product, Cu3Cl3(RR1Te)2(0.5CH3CN), which under reflux in air changes to the green Cu2Cl(RR1Te)(0.5EtOH). By contrast, the reaction of RR1Te with acetonitrile solutions of copper(II) salts under mild conditions affords the white materials CuCl(RR1Te) and CuBr(RR1Te)H2O. RR1Te reacts with PdCl2 and PtCl2 to give materials which, albeit not well defined, can be seen as intermediates in the synthesis of inorganic phases of the type M3xTe2xCl2x. Paramagnetism is associated with some of the palladium and platinum products. 195Pt NMR measurements in DMSO establish the presence of six platinum species, which are assigned to Pt(IV), Pt(III) or Pt(II). The reactions show that in the presence of PdCl2 or PtCl2 both R and R1 are very labile. The reaction of RHgCl (R = 2-(2-pyridyl)phenyl) with SeX4 (X = Cl, Br) gives compounds which suggest that both trans-metallation and redox processes are involved. By varying reaction conditions, materials which appear to be intermediates in the trans-metallation process are isolated. Potentially bidentate tellurium ligands of molecular formula RTe(CH2)nTeR, Ln (R = Ph, (t-Bu)C6H4; n = 5, 10), are prepared, together with palladium and platinum complexes containing these ligands. The complex Ph3SnClL (L = p-EtO.C6H4) is also prepared.
Abstract:
Hard real-time systems are a class of computer control systems that must react to demands of their environment by providing 'correct' and timely responses. Since these systems are increasingly being used in systems with safety implications, it is crucial that they are designed and developed to operate in a correct manner. This thesis is concerned with developing formal techniques that allow the specification, verification and design of hard real-time systems. Formal techniques for hard real-time systems must be capable of capturing the system's functional and performance requirements, and previous work has proposed a number of techniques which range from the mathematically intensive to those with some mathematical content. This thesis develops formal techniques that contain both an informal and a formal component, because the informality provides ease of understanding while the formality allows precise specification and verification. Specifically, the combination of Petri nets and temporal logic is considered for the specification and verification of hard real-time systems. Approaches that combine Petri nets and temporal logic by allowing a consistent translation between each formalism are examined. Previously, such techniques have been applied to the formal analysis of concurrent systems. This thesis adapts these techniques for use in the modelling, design and formal analysis of hard real-time systems. The techniques are applied to the problem of specifying a controller for a high-speed manufacturing system. It is shown that they can be used to prove liveness and safety properties, including qualitative aspects of system performance. The problem of verifying quantitative real-time properties is addressed by developing a further technique which combines the formalisms of timed Petri nets and real-time temporal logic. A unifying feature of these techniques is the common temporal description of the Petri net. A common problem with Petri-net-based techniques is the complexity associated with generating the reachability graph. This thesis addresses this problem by using concurrency sets to generate a partial reachability graph pertaining to a particular state. These sets also allow each state to be checked for the presence of inconsistencies and hazards. The problem of designing a controller for the high-speed manufacturing system is also considered. The approach adopted involves the use of a model-based controller: this type of controller uses the Petri net models developed, thus preserving the properties already proven of the controller. It also contains a model of the physical system which is synchronised to the real application to provide timely responses. The various ways of forming the synchronisation between these processes are considered and the resulting nets are analysed using concurrency sets.
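The reachability-graph bottleneck mentioned above is easy to illustrate. The sketch below is an assumed, minimal place/transition net explored by breadth-first search; it is not the thesis's formalism or tooling, and every name in it is hypothetical. Exhaustive enumeration of this kind grows combinatorially with concurrent tokens, which is what motivates partial graphs built from concurrency sets.

```python
from collections import deque

# Transition name -> (pre: tokens consumed, post: tokens produced).
TRANSITIONS = {
    "load":   ({"idle": 1, "part": 1}, {"working": 1}),
    "unload": ({"working": 1},         {"idle": 1, "part": 1}),
}

def enabled(marking, pre):
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return {p: n for p, n in m.items() if n}    # drop empty places

def reachability_graph(initial):
    """Breadth-first enumeration of all reachable markings (bounded nets only)."""
    key = lambda m: frozenset(m.items())
    seen, edges, queue = {key(initial)}, [], deque([initial])
    while queue:
        m = queue.popleft()
        for name, (pre, post) in TRANSITIONS.items():
            if enabled(m, pre):
                m2 = fire(m, pre, post)
                edges.append((m, name, m2))
                if key(m2) not in seen:
                    seen.add(key(m2))
                    queue.append(m2)
    return edges

for src, t, dst in reachability_graph({"idle": 1, "part": 1}):
    print(src, f"--{t}-->", dst)
```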
Spatial pattern analysis of beta-amyloid (Aβ) deposits in Alzheimer disease by linear regression
Abstract:
The spatial patterns of discrete beta-amyloid (Aβ) deposits in brain tissue from patients with Alzheimer disease (AD) were studied using a statistical method based on linear regression, the results being compared with the more conventional variance/mean (V/M) method. Both methods suggested that Aβ deposits occurred in clusters (400 to <12,800 µm in diameter) in all but 1 of the 42 tissues examined. In many tissues, a regular periodicity of the Aβ deposit clusters parallel to the tissue boundary was observed. In 23 of 42 (55%) tissues, the two methods revealed essentially the same spatial patterns of Aβ deposits; in 15 of 42 (36%), the regression method indicated the presence of clusters at a scale not revealed by the V/M method; and in 4 of 42 (9%), there was no agreement between the two methods. Perceived advantages of the regression method are that there is a greater probability of detecting clustering at multiple scales, the dimension of larger Aβ clusters can be estimated more accurately, and the spacing between the clusters may be estimated. However, both methods may be useful, with the regression method providing greater resolution and the V/M method providing greater simplicity and ease of interpretation. Estimates of the distance between regularly spaced Aβ clusters were in the range 2,200-11,800 µm, depending on tissue and cluster size. The regular periodicity of Aβ deposit clusters in many tissues would be consistent with their development in relation to clusters of neurons that give rise to specific neuronal projections.
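For readers unfamiliar with the V/M method, the sketch below shows the usual form of the calculation: deposit counts in contiguous fields are pooled into blocks of increasing size, and a variance/mean ratio near 1 suggests spatial randomness while values above 1 suggest clustering at that block scale. The data and implementation details are assumed for illustration and are not taken from the study.

```python
import numpy as np

def variance_mean_ratio(counts, block):
    """V/M ratio of field counts pooled into non-overlapping blocks."""
    n = (len(counts) // block) * block              # drop the ragged tail
    pooled = np.asarray(counts[:n]).reshape(-1, block).sum(axis=1)
    return pooled.var(ddof=1) / pooled.mean()

rng = np.random.default_rng(0)
counts = rng.poisson(2, 256)            # spatially random deposits: V/M ~ 1
counts[64:96] += rng.poisson(6, 32)     # inject one cluster spanning ~32 fields

# V/M rises above 1 as the block size approaches the cluster scale.
for block in (1, 2, 4, 8, 16, 32):
    print(f"block={block:2d}  V/M={variance_mean_ratio(counts, block):.2f}")
```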
Abstract:
This research has two focal points: experiences of stigma and experiences of formal support services among teenage mothers. Twenty teenage mothers were interviewed in depth, ten from a one-to-one support service and ten from a group-based support service. The contributions to knowledge are as follows. First, regarding experiences of stigma, this research integrated concepts from the social psychology literature, established the effects of stigma experienced by teenage mothers, and offered reasons for them. Additionally, further coping mechanisms in response to being stigmatized were discovered and grouped under two new headings: active and passive coping mechanisms. It is acknowledged that for a minority of participants stigma does have negative effects; the majority, however, experiences no such serious negative effects. Secondly, regarding experiences of support services, this research was able to directly compare one-to-one with group-based support for teenage mothers. Knowledge was gained about the factors influencing the selection of a mode of support and about the functions of each mode of support, which were categorised under headings for ease of comparison. It was established that there is indeed a link between these two research foci, in that both the one-to-one and group-based support services fulfil a stigma-management function, in which teenage mothers discuss the phenomenon, share experiences and offer advice to others. However, it was also established that this function is of minor importance compared to the other functions fulfilled by the support services.
Abstract:
This thesis is an evaluation of practices to control antibiotic prescribing in UK NHS hospitals. Within the past ten years there has been increasing international concern about escalating antibiotic resistance, and the UK has issued several policy documents for prudent antibiotic prescribing. Chief Pharmacists in 253 UK NHS hospitals were surveyed about the availability and nature of documents to control antibiotic prescribing (formularies, policies and guidelines), and the role of pharmacists and medical microbiologists in monitoring prescribers' compliance with the recommendations of such documents. Although 235 hospitals had at least one document, only 60% had both an antibiotic formulary and guidelines, and only about one-half planned an annual revision of their document(s). Pharmacists were reported as mostly checking antibiotic prescribing on every ward, whilst medical microbiologists mostly visited selected units only. Responses to a similar questionnaire were obtained from the Chief Medical Microbiologists in 131 UK NHS hospitals. Comparison of the questionnaires indicated areas of apparent disagreement about the roles of pharmacists and medical microbiologists. Eighty-three paired responses received from pharmacists and medical microbiologists in the same hospital revealed poor agreement and awareness about controls. A total of 205 institutional prescribing guidelines were analysed for recommendations on empirical antibiotic prescribing for Community-Acquired Pneumonia (CAP). Variation was observed in the recommendations and in their agreement with national guidance from the British Thoracic Society (BTS). A questionnaire was subsequently sent to 235 Chief Pharmacists to investigate their awareness of this new guidance from the BTS and subsequent revision of institutional guidelines. Documents had been revised in only about one-half of the hospitals where pharmacists were aware of the new guidance. An audit of empirical antibiotic prescribing practices for CAP was performed at one hospital. Problems were experienced with retrieval of medical records, diagnostic criteria were poorly recorded, and only 57% of prescribing for non-severe CAP was compliant with institutional guidelines. A survey of clinicians at the same hospital identified that almost one-half used the institutional guidelines and most found them useful. However, areas for improvement concerning awareness of the guidelines and ease of access were identified. It is important that hospitals are equipped to react to changes in the hospital environment, including frequent movement of junior doctors between institutions, the employment of specialist "infectious diseases pharmacists" and the increasing benefits offered by information technology. Recommendations for policy have been suggested.
Abstract:
Computerised production control developments have concentrated on Manufacturing Resources Planning (MRP II) systems. The literature suggests, however, that despite the massive investment in hardware, software and management education, successful implementation of such systems in manufacturing industries has proved difficult. This thesis reviews the development of production planning and control systems and, in particular, investigates the causes of failures in implementing MRP/MRP II systems in industrial environments, arguing that the centralised and top-down planning structure, as well as the routine operational methodology of such systems, is inherently prone to failure. The thesis reviews the control benefits of cellular manufacturing systems but concludes that in more dynamic manufacturing environments, techniques such as Kanban are inappropriate. The basic shortcomings of MRP II systems are highlighted and a new enhanced operational methodology based on distributed planning and control principles is introduced. Distributed Manufacturing Resources Planning (DMRP) was developed as a capacity-sensitive production planning and control solution for cellular manufacturing environments. The system utilises cell-based, independently operated MRP II systems, integrated into a plant-wide control system through a Local Area Network. The potential benefits of adopting the system in industrial environments are discussed, and the results of computer simulation experiments comparing the performance of the DMRP system against conventional MRP II systems are presented. The DMRP methodology is shown to offer significant potential advantages, which include ease of implementation, cost effectiveness, capacity sensitivity, shorter manufacturing lead times, lower work-in-progress levels and improved customer service.
Abstract:
Over recent years, evidence has been accumulating in favour of the importance of long-term information as a variable which can affect the success of short-term recall. Lexicality, word frequency, imagery and meaning have all been shown to augment short-term recall performance. Two competing theories as to the causes of this long-term memory influence are outlined and tested in this thesis. The first approach is the order-encoding account, which ascribes the effect to the usage of resources at encoding, hypothesising that word lists which require less effort to process will benefit from increased levels of order encoding, in turn enhancing recall success. The alternative view, trace redintegration theory, suggests that order is automatically encoded phonologically, and that long-term information can only influence the interpretation of the resultant memory trace. The free recall experiments reported here attempted to determine the importance of order encoding as a facilitatory framework and to determine the locus of the effects of long-term information in free recall. Experiments 1 and 2 examined the effects of word frequency and semantic categorisation over a filled delay, and Experiments 3 and 4 did the same for immediate recall. Free recall was improved by both long-term factors tested. Order information was not used over a short filled delay, but was evident in immediate recall. Furthermore, it was found that both long-term factors increased the amount of order information retained. Experiment 5 induced an order-encoding effect over a filled delay, leaving a picture of short-term processes which are closely associated with long-term processes, and which fit conceptions of short-term memory as part of language processing rather better than either the encoding or the retrieval-based models. Experiments 6 and 7 aimed to determine to what extent phonological processes were responsible for the pattern of results observed. Articulatory suppression affected the encoding of order information where speech rate had no direct influence, suggesting that it is ease of lexical access which is the most important factor in the influence of long-term memory on immediate recall tasks. The evidence presented in this thesis does not offer complete support for either the retrieval-based account or the order-encoding account of long-term influence. Instead, the evidence sits best with models that are based upon language processing. The path urged for future research is to find ways in which this diffuse model can be better specified, and which can take account of the versatility of the human brain.
Abstract:
This study is concerned with several proposals concerning multiprocessor systems and with the various possible methods of evaluating such proposals. After a discussion of the advantages and disadvantages of several performance evaluation tools, the author concludes that simulation is the only tool powerful enough to develop a model which would be of practical use in the design, comparison and extension of systems. The main aims of the simulation package developed as part of this study are cost effectiveness, ease of use and generality. The methodology on which the simulation package is based is described in detail. The fundamental principles are that model design should reflect actual systems design, that measuring procedures should be carried out alongside design, that models should be well documented and easily adaptable, and that models should be dynamic. The simulation package itself is modular, and in this way reflects current design trends. This approach also aids documentation and ensures that the model is easily adaptable. It contains a skeleton structure and a library of segments which can be added to, or directly swapped with, segments of the skeleton structure to form a model which fits a user's requirements. The study also contains the results of some experimental work carried out using the model, the first part of which tests the model's capabilities by simulating a large operating system, the ICL George 3 system; the second part deals with general questions and some of the many proposals concerning multiprocessor systems.
Abstract:
Manufacturing firms are driven by competitive pressures to continually improve the effectiveness and efficiency of their organisations. For this reason, manufacturing engineers often implement changes to existing processes, or design new production facilities, with the expectation of making further gains in manufacturing system performance. This thesis relates to how the likely outcome of this type of decision should be predicted prior to its implementation. The thesis argues that since manufacturing systems must also interact with many other parts of an organisation, the expected performance improvements can often be significantly hampered by constraints that arise elsewhere in the business. As a result, decision-makers should attempt to predict just how well a proposed design will perform when these other factors, or 'support departments', are taken into consideration. However, the thesis also demonstrates that, in practice, where quantitative analysis is used to evaluate design decisions, the analysis model invariably ignores the potential impact of support functions on a system's overall performance. A more comprehensive modelling approach is therefore required. A study of how various business functions interact establishes that, to properly represent the kind of delays that give rise to support department constraints, a model should portray the dynamic and stochastic behaviour of entities in both the manufacturing and non-manufacturing aspects of a business. This implies that computer simulation be used to model design decisions, but current simulation software does not provide a sufficient range of functionality to enable the behaviour of all of these entities to be represented in this way. The main objective of the research has therefore been the development of a new simulator that overcomes the limitations of existing software and so enables decision-makers to conduct a more holistic evaluation of design decisions. It is argued that the application of object-oriented techniques offers a potentially better way of meeting both the functional and ease-of-use requirements of the new simulator. An object-oriented analysis and design of the system, called WBS/Office, is therefore presented, extending to the modelling of a firm's administrative and other support activities in the context of the manufacturing system design process. A particularly novel feature of the design is the ability for decision-makers to model how a firm's specific information and document processing requirements might hamper shop-floor performance. The simulator is primarily intended for modelling make-to-order batch manufacturing systems, and the thesis presents example models created using a working version of WBS/Office that demonstrate the feasibility of using the system to analyse manufacturing system designs in this way.
Abstract:
A description of the background to testing friction materials for automotive brakes explains the need for a rapid, inexpensive means of assessing their behaviour in a way which is both accurate and meaningful. Various methods of controlling inertia dynamometers to simulate road vehicles are rejected in favour of programming by means of a commercially available XY plotter. Investigation of brake service conditions is used to set up test schedules, and a dynamometer programming unit is built to enable service conditions on vehicles to be simulated on a full-scale dynamometer. A technique is developed by which accelerated testing can be achieved without operating under overload conditions, saving time and cost without sacrificing validity. The development of programming by XY plotter is described, with a method of operating one XY plotter to programme the machine, monitor its own behaviour, and plot its own results in logical sequence. Commissioning trials are described and the generation of reproducible results in frictional behaviour and material durability is discussed. Techniques are developed to cross-check the operation of the machine in retrospect, and to retrospectively correct results in the event of malfunctions. Sensitivity errors in the measuring circuits are displayed between calibrations, whilst leaving the recorded results almost unaffected by error. Typical results of brake lining tests are used to demonstrate the range of performance parameters which can be studied by use of the machine. Successful test investigations completed on the machine are reported, including comments on the behaviour of cast iron drums and discs. The machine shows that materials can repeat their complex friction/temperature/speed/pressure relationships with a reproducibility of the order of ±0.003 μ and ±0.0002 in. thickness loss during wear tests. Discussion of practical and academic implications completes the report, with recommendations for further work in both fields.
Abstract:
Lithofacies distribution indicates that the Much Wenlock Limestone Formation of England and South Wales was deposited on a shelf which was flat and gently subsiding in the north, but topographically variable in the south. Limestone deposition in the north began with 12 m of alga-rich limestone, which formed an upward-shoaling sequence. Deepening then led to deposition of calcareous silty mudstones on the northern shelf. The remainder of the formation in this area formed during a shelf-wide regression, culminating in the production of an E to W younging sandbody. Lithofacies distribution on the southern shelf was primarily controlled by local subsidence. Six bedded lithofacies are recognised which contain 14 brachiopod/bryozoan-dominated assemblages, of which 11 are in situ and three consist of reworked fossils. Microfacies analysis is necessary to distinguish assemblages which reflect original communities from those which reflect sedimentary processes. Turbulence, substrate type, ease of feeding and other organisms in the environment controlled faunal distribution. Reefs were built dominantly by corals, stromatoporoids, algae and crinoids. Coral/stromatoporoid (Type A) reefs are common, particularly on the northern shelf, where they formed in response to shallowing, ultimately growing in front of the advancing carbonate sandbody. Algae dominate Type B and Type C reefs, reflecting growth in areas of poor water circulation. Lithification of the formation began in the marine-phreatic environment with precipitation of aragonite and high-Mg calcite, which was subsequently altered to turbid low-Mg calcite. Younger clear spars post-date secondary void formation. The pre-compactional clear spars have features which resemble the products of meteoric-water diagenesis, but freshwater did not enter the formation at this time. The pre-compactional spars were precipitated by waters forced from the surrounding silty mudstones at shallow burial depths. Late diagenetic products are stylolites, compaction fractures and burial cements.
Abstract:
In recent years, freshwater fish farmers have come under increasing pressure from the Water Authorities to control the quality of their farm effluents. This project aimed to investigate methods of treating aquacultural effluent in an efficient and cost-effective manner, and to incorporate the knowledge gained into an Expert System which could then be used in an advice service to farmers. From the results of this research it was established that sedimentation and the use of low pollution diets are the only cost effective methods of controlling the quality of fish farm effluents. Settlement has been extensively investigated and it was found that the removal of suspended solids in a settlement pond is only likely to be effective if the inlet solids concentration is in excess of 8 mg/litre. The probability of good settlement can be enhanced by keeping the ratio of length/retention time (a form of mean fluid velocity) below 4.0 metres/minute. The removal of BOD requires inlet solids concentrations in excess of 20 mg/litre to be effective, and this is seldom attained on commercial fish farms. Settlement, generally, does not remove appreciable quantities of ammonia from effluents, but algae can absorb ammonia by nutrient uptake under certain conditions. The use of low pollution, high performance diets gives pollutant yields which are low when compared with published figures obtained by many previous workers. Two Expert Systems were constructed, both of which diagnose possible causes of poor effluent quality on fish farms and suggest solutions. The first system uses knowledge gained from a literature review and the second employs the knowledge obtained from this project's experimental work. Consent details for over 100 fish farms were obtained from the public registers kept by the Water Authorities. Large variations in policy from one Authority to the next were found. These data have been compiled in a computer file for ease of comparison.
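The settlement thresholds reported above lend themselves to simple rules of the kind an expert system encodes. The sketch below is a hypothetical illustration using only the figures quoted in the abstract; it does not reproduce either of the thesis's Expert Systems, and all names are assumptions.

```python
def diagnose(inlet_solids_mg_l, length_m, retention_min):
    """Flag settlement-pond problems using the thresholds quoted above."""
    advice = []
    if inlet_solids_mg_l <= 8:
        # Solids removal effective only above 8 mg/litre inlet concentration.
        advice.append("Inlet solids at or below 8 mg/litre: settlement is "
                      "unlikely to remove suspended solids effectively.")
    if length_m / retention_min >= 4.0:
        # Length/retention time acts as a proxy for mean fluid velocity.
        advice.append("Length/retention time at or above 4.0 metres/minute: "
                      "reduce mean fluid velocity to improve settlement.")
    if inlet_solids_mg_l <= 20:
        # BOD removal needs inlet solids in excess of 20 mg/litre.
        advice.append("Inlet solids at or below 20 mg/litre: settlement is "
                      "unlikely to remove appreciable BOD.")
    return advice or ["No settlement problems flagged by these rules."]

print(*diagnose(inlet_solids_mg_l=6, length_m=50, retention_min=10), sep="\n")
```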
Abstract:
Differential perception of innovation is a research area which has been advocated as a suitable topic for study in recent years. It developed from the problems encountered within earlier perception-of-innovation studies, which sought to establish what characteristics of an innovation affected the ease of its adoption. While some success was achieved in relating perception of innovation to adoption behaviour, the variability encountered within groups expected to perceive innovation similarly suggested that the needs and experiences of the potential adopter were significantly affecting the research findings, an analysis supported by both sociological and psychological perceptual research. The present study sought to identify the presence of differential perception of innovation and to explore the nature of the process. It was decided to base the research in an organisational context and to concentrate upon manufacturing innovation. It has been recognised that such adoption of technological innovation is commonly the product of a collective decision-making process, involving individuals from a variety of occupational backgrounds, both in terms of occupational speciality and level within the hierarchy. Such roles appeared likely to significantly influence perception of technological innovation, as gathered through an appropriate measure, and were readily identifiable. Data was collected by means of a face-to-face card presentation technique, a questionnaire and case study material. Differential perception of innovation effects were apparent in the results, with many similarities and differences of perception being related to the needs and experiences of the individuals studied. Phenomenological analysis, which recognises the total nature of experience in influencing behaviour, offered the best means of explaining the findings. It was also clear that the bureaucratic model of role definition was not applicable to the area studied, it seeming likely that such definitions are weaker under conditions of uncertainty, such as those encountered in innovative decision-making.