40 results for Ease of Programming

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

In analysing manufacturing systems, for either design or operational reasons, failure to account for the potentially significant dynamics could produce invalid results. There are many analysis techniques that can be used; however, simulation is unique in its ability to assess detailed, dynamic behaviour. The use of simulation to analyse manufacturing systems would therefore seem appropriate if not essential. Many simulation software products are available, but their ease of use and scope of application vary greatly. This is illustrated at one extreme by simulators, which offer rapid but limited application, and at the other by simulation languages, which are extremely flexible but tedious to code. Given that a typical manufacturing engineer does not possess in-depth programming and simulation skills, the use of simulators over simulation languages would seem the more appropriate choice. Whilst simulators offer ease of use, their limited functionality may preclude their use in many applications. The construction of current simulators makes it difficult to amend or extend the functionality of the system to meet new challenges. Some simulators could even become obsolete as users demand modelling functionality that reflects the latest manufacturing system design and operation concepts. This thesis examines the deficiencies in current simulation tools and considers whether they can be overcome by the application of object-oriented principles. Object-oriented techniques have gained in popularity in recent years and are seen as having the potential to overcome many of the problems traditionally associated with software construction. There are a number of key concepts that are exploited in the work described in this thesis: the use of object-oriented techniques to act as a framework for abstracting engineering concepts into a simulation tool, and the ability to reuse and extend object-oriented software.
It is argued that current object-oriented simulation tools are deficient and that, in designing such tools, object-oriented techniques should be used not just for the creation of individual simulation objects but for the creation of the complete software. This results in the ability to construct an easy-to-use simulator that is not limited by its initial functionality. The thesis presents the design of an object-oriented data-driven simulator which can be freely extended. Discussion and work is focused on discrete parts manufacture. The system developed retains the ease of use typical of data-driven simulators, whilst removing any limitation on its potential range of applications. Reference is given to additions made to the simulator by other developers not involved in the original software development. Particular emphasis is put on the requirements of the manufacturing engineer and the need for the engineer to carry out dynamic evaluations.
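The extension-by-subclassing idea at the heart of this argument can be sketched in miniature. The classes, cycle times and names below are purely illustrative assumptions, not the thesis's actual simulator design:

```python
# Illustrative sketch: a tiny object-oriented discrete-event core whose
# behaviour is extended by subclassing rather than by modifying the core.
import heapq

class Simulator:
    def __init__(self):
        self.clock, self._events, self._seq = 0.0, [], 0

    def schedule(self, delay, action):
        # seq breaks ties so callables are never compared by heapq
        heapq.heappush(self._events, (self.clock + delay, self._seq, action))
        self._seq += 1

    def run(self, until):
        while self._events and self._events[0][0] <= until:
            self.clock, _, action = heapq.heappop(self._events)
            action()

class Machine:
    """Base resource; subclass and override cycle_time to extend behaviour."""
    def __init__(self, sim, cycle=2.0):
        self.sim, self.cycle, self.completed = sim, cycle, 0

    def cycle_time(self):
        return self.cycle

    def start(self):
        self.sim.schedule(self.cycle_time(), self.finish)

    def finish(self):
        self.completed += 1
        self.start()   # immediately begin the next part

class SlowWarmupMachine(Machine):
    """Extension by subclassing: the first part takes twice as long."""
    def cycle_time(self):
        return self.cycle * (2.0 if self.completed == 0 else 1.0)

sim = Simulator()
m = SlowWarmupMachine(sim, cycle=2.0)
m.start()
sim.run(until=10.0)
print(m.completed)  # parts finished at t = 4, 6, 8, 10
```

The point mirrors the abstract: the core `Simulator` is never edited; new shop-floor behaviour arrives as new subclasses, so initial functionality does not cap the tool.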

Relevance:

100.00%

Publisher:

Abstract:

Projects exposed to an uncertain environment must be adapted to deal with the effective integration of various planning elements and the optimization of project parameters. Time, cost, and quality are the prime objectives of a project that need to be optimized to fulfill the owner's goal. In an uncertain environment, there exist many other conflicting objectives that may also need to be optimized. These objectives are characterized by varying degrees of conflict. Moreover, an uncertain environment also causes several changes in the project plan throughout its life, demanding that the project plan be totally flexible. Goal programming (GP), a multiple criteria decision making technique, offers a good solution for this project planning problem. Here the planning problem is considered from the owner's perspective, which leads to classifying the project up to the activity level. GP is applied separately at each level, and the formulated models are integrated through information flow. The flexibility and adaptability of the models lie in the ease of updating the model parameters at the required level through changing priorities and/or constraints and transmitting the information to other levels. The hierarchical model automatically provides integration among the various elements of planning. The proposed methodology is applied in this paper to plan a petroleum pipeline construction project, and its effectiveness is demonstrated.
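As a hedged illustration of the goal-programming idea (not the paper's pipeline model), a single decision variable with time and cost targets can be handled by penalising only the unwanted deviations from each goal; every number below is invented:

```python
# Minimal weighted goal-programming sketch (illustrative numbers only).
# One decision variable x ("crashing effort"); two goals with targets:
#   duration(x) = 30 - 2x, target 24 weeks
#   cost(x)     = 100 + 5x, target 110 (thousand)
# Only overshooting a target is penalised (the positive deviation variables
# of a GP model), with priority weights reflecting the owner's preferences.

def deviations(x):
    time_over = max(0.0, (30 - 2 * x) - 24)    # weeks over the time target
    cost_over = max(0.0, (100 + 5 * x) - 110)  # cost over the cost target
    return time_over, cost_over

def objective(x, w_time=3.0, w_cost=1.0):
    t, c = deviations(x)
    return w_time * t + w_cost * c

# A coarse grid search stands in for the LP solver a real GP model would use.
best = min((x / 100 for x in range(0, 1001)), key=objective)
print(best, objective(best))  # x = 3.0 balances the two conflicting goals
```

Changing the weights (the "priorities" the abstract mentions) shifts the optimum between the conflicting goals, which is how the hierarchical model stays flexible as the plan changes.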

Relevance:

90.00%

Publisher:

Abstract:

The naturally occurring reactive electrophilic species 12-oxo-phytodienoic acid (12-oxo-PDA) is a potent antifungal agent, whereas the plant growth regulator jasmonic acid, which is synthesized from 12-oxo-PDA, is ineffective. To address what structural features of the molecule endow it with antifungal activity, we synthesized a series of molecular mimics of 12-oxo-PDA varying in the length of the alkyl chain at its C-4 ring position. The octyl analogue (4-octyl cyclopentenone) was the most effective at suppressing spore germination and subsequent mycelial growth of a range of fungal pathogens and was particularly effective against Cladosporium herbarum and Botrytis cinerea, with minimum fungicidal concentrations in the range 100-200 µM. Introduction of a carboxyl group to the end of the chain, mimicking natural fatty acids, markedly reduced antifungal efficacy. Electrolyte leakage, indicative of membrane perturbation, was evident in both C. herbarum and B. cinerea exposed to 4-octyl cyclopentenone. Lipid composition analysis of the fungal spores revealed that those species with a high oil content, namely Fusarium oxysporum and Alternaria brassicicola, were less sensitive to 4-octyl cyclopentenone. The comparable hydrophobicity of 4-octyl cyclopentenone and 12-oxo-PDA accounts for the similar spore suppression activity of these two compounds. The relative ease of synthesis of 4-octyl cyclopentenone makes it an attractive compound for potential use as an antifungal agent. © 2011 SGM.

Relevance:

90.00%

Publisher:

Abstract:

Solid dispersions can be used to improve dissolution of poorly soluble drugs and PVP is a common polymeric carrier in such systems. The mechanisms controlling release of drug from solid dispersions are not fully understood and proposed theories are dependent on an understanding of the dissolution behaviour of both components of the dispersion. This study uses microviscometry to measure small changes in the viscosity of the dissolution medium as the polymer dissolves from ibuprofen-PVP solid dispersions. The microviscometer determines the dynamic and kinematic viscosity of liquids based on the rolling/falling ball principle. Using a standard USP dissolution apparatus, the dissolution of the polymer from the solid dispersion was easily measured alongside drug release. Drug release was found to closely follow polymer dissolution at the molecular weights and ratios used. The combination of sensitivity and ease of use makes microviscometry a valuable technique for the elucidation of mechanisms governing drug release from polymeric delivery systems. © 2004 Elsevier B.V. All rights reserved.
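The rolling-ball principle mentioned here reduces to the Höppler relation η = K(ρ_ball − ρ_liquid)·t: viscosity is proportional to the ball's rolling time. The calibration constant and densities below are assumed values for illustration, not the instrument's:

```python
# Rolling-ball microviscometry sketch: dynamic viscosity from the ball's
# rolling time via eta = K * (rho_ball - rho_liquid) * t (Höppler relation).
# K and the densities are illustrative assumptions, not instrument constants.

def dynamic_viscosity(t_roll_s, rho_ball=7.85, rho_liquid=1.00, K=0.05):
    """Return dynamic viscosity in mPa·s (t in s, densities in g/cm^3)."""
    return K * (rho_ball - rho_liquid) * t_roll_s

def kinematic_viscosity(eta_mpas, rho_liquid=1.00):
    """Kinematic viscosity (mm^2/s) = dynamic viscosity / liquid density."""
    return eta_mpas / rho_liquid

t = 3.0  # seconds for the ball to traverse the measuring section
eta = dynamic_viscosity(t)
print(eta, kinematic_viscosity(eta))
```

Because the rolling time grows linearly with viscosity, even the small viscosity rise caused by dissolving PVP shows up as a measurable change in t, which is what makes the technique sensitive enough to track polymer dissolution alongside drug release.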

Relevance:

90.00%

Publisher:

Abstract:

Purpose - The purpose of this paper is to examine consumer emotions and the social science and observation measures that can be utilised to capture the emotional experiences of consumers. The paper is not setting out to solve the theoretical debate surrounding emotion research, rather to provide an assessment of methodological options available to researchers to aid their investigation into both the structure and content of the consumer emotional experience, acknowledging both the conscious and subconscious elements of that experience. Design/methodology/approach - A review of a wide range of prior research from the fields of marketing, consumer behaviour, psychology and neuroscience is undertaken to identify the different observation methods available to marketing researchers in the study of consumer emotion. This review also considers the self-report measures available to researchers and identifies the main theoretical debates concerning emotion, to provide a comprehensive overview of the issues surrounding the capture of emotional responses in a marketing context and to highlight the benefits that observation methods offer this area of research. Findings - This paper evaluates three observation methods and four widely used self-report measures of emotion used in a marketing context. Whilst it is recognised that marketers have shown a preference for self-report measures in prior research, mainly due to ease of implementation, it is posited that the benefits of observation methodology and the wealth of data that can be obtained using such methods can complement prior research. In addition, the use of observation methods can not only enhance our understanding of the consumer emotion experience but also enable us to collaborate with researchers from other fields in order to make progress in understanding emotion. Originality/value - This paper brings perspectives and methods together to provide an up-to-date consideration of emotion research for marketers. 
In order to generate valuable research in this area there is an identified need for discussion and implementation of the observation techniques available to marketing researchers working in this field. An evaluation of a variety of methods is undertaken as a starting point for discussion and consideration of different observation techniques and how they can be utilised.

Relevance:

90.00%

Publisher:

Abstract:

Due to its wide applicability and ease of use, the analytic hierarchy process (AHP) has been studied extensively for the last 20 years. Recently, it has been observed that the focus has been confined to applications of the integrated AHPs rather than the stand-alone AHP. The five tools most commonly combined with the AHP are mathematical programming, quality function deployment (QFD), meta-heuristics, SWOT analysis, and data envelopment analysis (DEA). This paper reviews the literature on the applications of the integrated AHPs. Related articles appearing in international journals from 1997 to 2006 are gathered and analyzed so that the following three questions can be answered: (i) which type of integrated AHP was paid most attention to? (ii) which areas were the integrated AHPs prevalently applied to? (iii) is there any inadequacy in the approaches? Based on the inadequacy, if any, some improvements and possible future work are recommended. This research not only provides evidence that the integrated AHPs are better than the stand-alone AHP, but also aids researchers and decision makers in applying the integrated AHPs effectively.
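The stand-alone AHP step that all these integrations build on can be sketched briefly: derive priority weights from a pairwise comparison matrix via power iteration, then check judgement consistency. The comparison values below are illustrative, not from any reviewed application:

```python
# Minimal AHP sketch: priority weights from a pairwise comparison matrix
# (principal eigenvector by power iteration) plus a consistency check.

def ahp_weights(A, iters=100):
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]          # normalise so weights sum to 1
    # Estimate the principal eigenvalue for the consistency check.
    v = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(v[i] / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)             # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index
    return w, ci / ri                    # weights, consistency ratio

# Three criteria compared on Saaty's 1-9 scale (illustrative judgements).
A = [[1.0,   3.0,   5.0],
     [1/3.0, 1.0,   2.0],
     [1/5.0, 1/2.0, 1.0]]
weights, cr = ahp_weights(A)
print([round(w, 3) for w in weights], round(cr, 3))
```

A consistency ratio below about 0.1 is conventionally taken as acceptable; the integrated variants feed weights like these into QFD, DEA, or a mathematical program rather than using them directly.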

Relevance:

90.00%

Publisher:

Abstract:

Globalization has become one of the most important topics within politics and economics. This new title explains some of the related terminology, summarizes the surrounding theories and examines the international organizations involved. With the proliferation of communications and the rise of the multi-national corporation, the concept of globalization is vitally important to the modern political environment. The structure of the modern economy, based on information production and diffusion, has made national boundaries largely irrelevant. A Dictionary of Globalization explains theories, philosophies and ideologies, and includes short biographies of leading activists, theorists and thinkers such as Noam Chomsky, Karl Marx and José Bové. Concepts, issues and terms key to the understanding of globalization also have clear and concise definitions, including democracy, civil society, non-governmental organizations and ethnicity. Cross-referenced for ease of use, this title aims to be of great benefit to anyone studying politics or sociology. It will prove essential to public and academic libraries, as well as to businesses, government departments, embassies and journalists.

Relevance:

90.00%

Publisher:

Abstract:

2-(2-Pyridyl)phenyl(p-ethoxyphenyl)tellurium(II) (RR1Te) reacts with HgCl2 at room temperature to give white HgCl2.RR1Te. On setting aside, or on warming the reaction mixture, a yellow material, [R1HgCl.(RTeCl)2], is formed. Multinuclear NMR (125Te, 199Hg, 1H) and mass spectroscopy confirm the formulation, and confirm the ease of transfer of the p-ethoxyphenyl group (R1) between the metal centres. The crystal structure of the yellow material consists of two discrete RTeCl molecules together with an R1HgCl molecule. There is no dative bond formation between these species, hence the preferred description of the formation of an inclusion complex. The reaction of RR1Te with copper(I) chloride in the cold gives an air-sensitive yellow product Cu3Cl3(RR1Te)2(0.5CH3CN), which under reflux in air changes to the green Cu2Cl(RR1Te)(0.5EtOH). By contrast, the reaction of RR1Te with acetonitrile solutions of copper(II) salts under mild conditions affords the white materials CuCl(RR1Te) and CuBr(RR1Te)H2O. RR1Te reacts with PdCl2 and PtCl2 to give materials which, albeit not well defined, can be seen as intermediates in the synthesis of inorganic phases of the type M3xTe2xCl2x. Paramagnetism is associated with some of the palladium and platinum products. 195Pt NMR measurements in DMSO establish the presence of six platinum species, which are assigned to Pt(IV), Pt(III) or Pt(II). The reactions show that in the presence of PdCl2 or PtCl2 both R and R1 are very labile. The reaction of RHgCl (R = 2-(2-pyridyl)phenyl) with SeX4 (X = Cl, Br) gives compounds which suggest that both trans-metallation and redox processes are involved. By varying reaction conditions, materials which appear to be intermediates in the trans-metallation process are isolated. Potentially bidentate tellurium ligands of molecular formula RTe(CH2)nTeR, Ln (R = Ph, (t-Bu)C6H4; n = 5, 10) are prepared. Palladium and platinum complexes containing these ligands are prepared. The complex Ph3SnClL (L = p-EtO.C6H4) is also prepared.

Relevance:

90.00%

Publisher:

Abstract:

Hard real-time systems are a class of computer control systems that must react to demands of their environment by providing `correct' and timely responses. Since these systems are increasingly being used in systems with safety implications, it is crucial that they are designed and developed to operate in a correct manner. This thesis is concerned with developing formal techniques that allow the specification, verification and design of hard real-time systems. Formal techniques for hard real-time systems must be capable of capturing the system's functional and performance requirements, and previous work has proposed a number of techniques which range from the mathematically intensive to those with some mathematical content. This thesis develops formal techniques that contain both an informal and a formal component because it is considered that the informality provides ease of understanding and the formality allows precise specification and verification. Specifically, the combination of Petri nets and temporal logic is considered for the specification and verification of hard real-time systems. Approaches that combine Petri nets and temporal logic by allowing a consistent translation between each formalism are examined. Previously, such techniques have been applied to the formal analysis of concurrent systems. This thesis adapts these techniques for use in the modelling, design and formal analysis of hard real-time systems. The techniques are applied to the problem of specifying a controller for a high-speed manufacturing system. It is shown that they can be used to prove liveness and safety properties, including qualitative aspects of system performance. The problem of verifying quantitative real-time properties is addressed by developing a further technique which combines the formalisms of timed Petri nets and real-time temporal logic. A unifying feature of these techniques is the common temporal description of the Petri net. 
A common problem with Petri net based techniques is the complexity associated with generating the reachability graph. This thesis addresses this problem by using concurrency sets to generate a partial reachability graph pertaining to a particular state. These sets also allow each state to be checked for the presence of inconsistencies and hazards. The problem of designing a controller for the high-speed manufacturing system is also considered. The approach adopted involves the use of a model-based controller. This type of controller uses the Petri net models developed, thus preserving the properties already proven of the controller. It also contains a model of the physical system which is synchronised to the real application to provide timely responses. The various ways of forming the synchronisation between these processes are considered and the resulting nets are analysed using concurrency sets.
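The reachability-graph generation behind the complexity problem can be sketched for a toy place/transition net; the net, place names and transitions below are illustrative assumptions, not the thesis's controller models:

```python
# Sketch of reachability-graph generation for a small place/transition net.
# Markings are dicts of token counts; a transition fires when every input
# place holds enough tokens. (Toy net, not the thesis's manufacturing model.)

def enabled(marking, pre):
    return all(marking[p] >= n for p, n in pre.items())

def fire(marking, pre, post):
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n                       # consume input tokens
    for p, n in post.items():
        m[p] = m.get(p, 0) + n          # produce output tokens
    return m

def reachability(m0, transitions):
    # Exhaustive exploration of every marking reachable from m0; this set
    # grows combinatorially, which is exactly the complexity problem.
    seen, frontier, edges = {tuple(sorted(m0.items()))}, [m0], []
    while frontier:
        m = frontier.pop()
        for name, (pre, post) in transitions.items():
            if enabled(m, pre):
                m2 = fire(m, pre, post)
                edges.append((m, name, m2))
                key = tuple(sorted(m2.items()))
                if key not in seen:
                    seen.add(key)
                    frontier.append(m2)
    return seen, edges

# Two-state machine cycle: idle -> busy -> idle.
net = {"start": ({"idle": 1}, {"busy": 1}),
       "done":  ({"busy": 1}, {"idle": 1})}
states, edges = reachability({"idle": 1, "busy": 0}, net)
print(len(states), len(edges))
```

Even this two-place net yields a graph the same size as its state space; the thesis's concurrency-set approach avoids building the full graph by expanding only the markings relevant to the state under analysis.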

Relevance:

90.00%

Publisher:

Abstract:

The spatial patterns of discrete beta-amyloid (Abeta) deposits in brain tissue from patients with Alzheimer disease (AD) were studied using a statistical method based on linear regression, the results being compared with the more conventional variance/mean (V/M) method. Both methods suggested that Abeta deposits occurred in clusters (400 to <12,800 µm in diameter) in all but 1 of the 42 tissues examined. In many tissues, a regular periodicity of the Abeta deposit clusters parallel to the tissue boundary was observed. In 23 of 42 (55%) tissues, the two methods revealed essentially the same spatial patterns of Abeta deposits; in 15 of 42 (36%), the regression method indicated the presence of clusters at a scale not revealed by the V/M method; and in 4 of 42 (9%), there was no agreement between the two methods. Perceived advantages of the regression method are that there is a greater probability of detecting clustering at multiple scales, the dimension of larger Abeta clusters can be estimated more accurately, and the spacing between the clusters may be estimated. However, both methods may be useful, with the regression method providing greater resolution and the V/M method providing greater simplicity and ease of interpretation. Estimates of the distance between regularly spaced Abeta clusters were in the range 2,200-11,800 µm, depending on tissue and cluster size. The regular periodicity of Abeta deposit clusters in many tissues would be consistent with their development in relation to clusters of neurons that give rise to specific neuronal projections.
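The V/M method's simplicity is easy to see: divide the tissue into quadrats, count deposits per quadrat, and compare the variance of the counts to their mean (a ratio above 1 suggests clustering, near 1 randomness). A sketch with invented counts:

```python
# Variance/mean (V/M) sketch: quadrat counts along a strip of tissue.
# V/M > 1 suggests clustered deposits; V/M near 1, a random scatter;
# V/M < 1, a regular spread. The counts below are invented for illustration.

def variance_mean_ratio(counts):
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)  # sample variance
    return var / mean

clustered = [0, 0, 9, 8, 0, 1, 10, 0, 0, 7]  # deposits bunched in a few quadrats
regular = [3, 4, 3, 4, 3, 4, 3, 4, 3, 4]     # deposits spread evenly
print(variance_mean_ratio(clustered), variance_mean_ratio(regular))
```

Its limitation, noted in the abstract, is also visible here: a single quadrat size probes a single spatial scale, whereas the regression method can detect clustering at multiple scales at once.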

Relevance:

90.00%

Publisher:

Abstract:

This research has two focal points: experiences of stigma and experiences of formal support services among teenage mothers. Twenty teenage mothers were interviewed in depth, ten from a one-to-one support service, and ten from a group based support service. Contributions to knowledge consisted of the following. First, regarding experiences of stigma, this research integrated concepts from the social psychology literature and established the effects of stigma which are experienced by teenage mothers, offering reasons for the same. Additionally, further coping mechanisms in response to being stigmatized were discovered and grouped into two new headings: active and passive coping mechanisms. It is acknowledged that for a minority of participants, stigma does have negative effects, however, the majority experiences no such serious negative effects. Secondly, regarding experiences of support services, this research was able to directly compare one-to-one with group based support for teenage mothers. Knowledge was unearthed as to influential factors in the selection of a mode of support and the functions of each of the modes of support, which were categorised under headings for ease of comparison. It was established that there is indeed a link between these two research foci in that both the one-to-one and group based support services fulfil a stigma management function, in which teenage mothers discuss the phenomenon, share experiences and offer advice to others. However, it was also established that this function is of minor importance compared to the other functions fulfilled by the support services.

Relevance:

90.00%

Publisher:

Abstract:

This thesis is an evaluation of practices to control antibiotic prescribing in UK NHS hospitals. Within the past ten years there has been increasing international concern about escalating antibiotic resistance, and the UK has issued several policy documents for prudent antibiotic prescribing. Chief Pharmacists in 253 UK NHS hospitals were surveyed about the availability and nature of documents to control antibiotic prescribing (formularies, policies and guidelines), and the role of pharmacists and medical microbiologists in monitoring prescribers' compliance with the recommendations of such documents. Although 235 hospitals had at least one document, only 60% had both an antibiotic formulary and guidelines, and only about one-half planned an annual revision of document(s). Pharmacists were reported as mostly checking antibiotic prescribing on every ward whilst medical microbiologists mostly visited selected units only. Responses to a similar questionnaire were obtained from the Chief Medical Microbiologists in 131 UK NHS hospitals. Comparisons of the questionnaires indicated areas of apparent disagreement about the roles of pharmacists and medical microbiologists. Eighty-three paired responses received from pharmacists and medical microbiologists in the same hospital revealed poor agreement and awareness about controls. A total of 205 institutional prescribing guidelines were analysed for recommendations for the empirical antibiotic prescribing of Community-Acquired Pneumonia (CAP). Variation was observed in recommendations and agreement with national guidance from the British Thoracic Society (BTS). A questionnaire was subsequently sent to 235 Chief Pharmacists to investigate their awareness of this new guidance from the BTS, and subsequent revision of institutional guidelines. Documents had been revised in only about one-half of hospitals where pharmacists were aware of the new guidance. 
An audit of empirical antibiotic prescribing practices for CAP was performed at one hospital. Although problems were experienced with retrieval of medical records, diagnostic criteria were poorly recorded, and only 57% of prescribing for non-severe CAP was compliant with institutional guidelines. A survey of clinicians at the same hospital identified that almost one-half used the institutional guidelines and most found them useful. However, areas for improvement concerning awareness of the guidelines and ease of access were identified. It is important that hospitals are equipped to react to changes in the hospital environment, including frequent movement of junior doctors between institutions, the employment of specialist "infectious diseases pharmacists" and the increasing benefits offered by information technology. Recommendations for policy have been suggested.

Relevance:

90.00%

Publisher:

Abstract:

Computerised production control developments have concentrated on Manufacturing Resources Planning (MRP II) systems. The literature suggests, however, that despite the massive investment in hardware, software and management education, successful implementation of such systems in manufacturing industries has proved difficult. This thesis reviews the development of production planning and control systems, in particular investigating the causes of failures in implementing MRP/MRP II systems in industrial environments, and argues that the centralised and top-down planning structure, as well as the routine operational methodology of such systems, is inherently prone to failure. The thesis reviews the control benefits of cellular manufacturing systems but concludes that in more dynamic manufacturing environments, techniques such as Kanban are inappropriate. The basic shortcomings of MRP II systems are highlighted and a new enhanced operational methodology based on distributed planning and control principles is introduced. Distributed Manufacturing Resources Planning (DMRP) was developed as a capacity-sensitive production planning and control solution for cellular manufacturing environments. The system utilises cell-based, independently operated MRP II systems, integrated into a plant-wide control system through a Local Area Network. The potential benefits of adopting the system in industrial environments are discussed, and the results of computer simulation experiments to compare the performance of the DMRP system against conventional MRP II systems are presented. DMRP methodology is shown to offer significant potential advantages which include ease of implementation, cost effectiveness, capacity sensitivity, shorter manufacturing lead times, lower work-in-progress levels and improved customer service.

Relevance:

90.00%

Publisher:

Abstract:

Over recent years, evidence has been accumulating in favour of the importance of long-term information as a variable which can affect the success of short-term recall. Lexicality, word frequency, imagery and meaning have all been shown to augment short term recall performance. Two competing theories as to the causes of this long-term memory influence are outlined and tested in this thesis. The first approach is the order-encoding account, which ascribes the effect to the usage of resources at encoding, hypothesising that word lists which require less effort to process will benefit from increased levels of order encoding, in turn enhancing recall success. The alternative view, trace redintegration theory, suggests that order is automatically encoded phonologically, and that long-term information can only influence the interpretation of the resultant memory trace. The free recall experiments reported here attempted to determine the importance of order encoding as a facilitatory framework and to determine the locus of the effects of long-term information in free recall. Experiments 1 and 2 examined the effects of word frequency and semantic categorisation over a filled delay, and experiments 3 and 4 did the same for immediate recall. Free recall was improved by both long-term factors tested. Order information was not used over a short filled delay, but was evident in immediate recall. Furthermore, it was found that both long-term factors increased the amount of order information retained. Experiment 5 induced an order encoding effect over a filled delay, leaving a picture of short-term processes which are closely associated with long-term processes, and which fit conceptions of short-term memory being part of language processes rather better than either the encoding or the retrieval-based models. Experiments 6 and 7 aimed to determine to what extent phonological processes were responsible for the pattern of results observed. 
Articulatory suppression affected the encoding of order information where speech rate had no direct influence, suggesting that it is ease of lexical access which is the most important factor in the influence of long-term memory on immediate recall tasks. The evidence presented in this thesis does not offer complete support for either the retrieval-based account or the order encoding account of long-term influence. Instead, the evidence sits best with models that are based upon language-processing. The path urged for future research is to find ways in which this diffuse model can be better specified, and which can take account of the versatility of the human brain.

Relevance:

90.00%

Publisher:

Abstract:

This study is concerned with several proposals concerning multiprocessor systems and with the various possible methods of evaluating such proposals. After a discussion of the advantages and disadvantages of several performance evaluation tools, the author decides that simulation is the only tool powerful enough to develop a model which would be of practical use in the design, comparison and extension of systems. The main aims of the simulation package developed as part of this study are cost effectiveness, ease of use and generality. The methodology on which the simulation package is based is described in detail. The fundamental principles are that model design should reflect actual systems design, that measuring procedures should be carried out alongside design, that models should be well documented and easily adaptable, and that models should be dynamic. The simulation package itself is modular, and in this way reflects current design trends. This approach also aids documentation and ensures that the model is easily adaptable. It contains a skeleton structure and a library of segments which can be added to or directly swapped with segments of the skeleton structure, to form a model which fits a user's requirements. The study also contains the results of some experimental work carried out using the model, the first part of which tests the model's capabilities by simulating a large operating system, the ICL George 3 system; the second part deals with general questions and some of the many proposals concerning multiprocessor systems.