117 results for Microcomputer


Relevance:

10.00%

Publisher:

Abstract:

A study of heat pump thermodynamic characteristics was made in the laboratory on a specially designed and instrumented air-to-water heat pump system. The design, using refrigerant R12, was based on the requirement to produce domestic hot water at a temperature of about 50 °C, and the system was assembled in the laboratory. All experimental data were fed to a microcomputer and stored on disk automatically from appropriate transducers via amplifiers and 16-channel analogue-to-digital converters. The measurements taken were R12 pressures and temperatures, water and R12 mass flow rates, air speed, fan and compressor input powers, water and air inlet and outlet temperatures, and wet and dry bulb temperatures. The time interval between observations could be varied. The results showed, as expected, that the COP was higher at higher air inlet temperatures and at lower hot water output temperatures. The optimum air speed was found to be that at which the fan input power was about 4% of the condenser heat output. It was also found that hot water can be produced at a temperature higher than the R12 condensing temperature corresponding to the condensing pressure. This was achieved by designing the condenser to take advantage of discharge superheat and by further heating the water using heat recovery from the compressor. Of the input power to the compressor, typically about 85% was transferred to the refrigerant (50% by compression work and 35% through heating of the refrigerant by the cylinder wall), and the remaining 15% was rejected to the cooling medium. The evaporator effectiveness was found to be about 75% and sensitive to air speed. Using the data collected, a steady-state computer model was developed. For given input conditions (air inlet temperature, air speed, degree of suction superheat, and water inlet and outlet temperatures), the model is capable of predicting the refrigerant cycle, compressor efficiency, evaporator effectiveness, condenser water flow rate and system COP.
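To make the reported energy accounting concrete, the following minimal Python sketch computes the COP and the compressor power split quoted above; all numerical inputs are assumed for illustration and are not measurements from the thesis.

def cop(condenser_heat_kw, compressor_power_kw, fan_power_kw):
    # Coefficient of performance: useful heat out per unit of input power.
    return condenser_heat_kw / (compressor_power_kw + fan_power_kw)

compressor_power = 2.0              # kW, assumed electrical input
condenser_heat = 6.0                # kW, assumed heat delivered to the water
fan_power = 0.04 * condenser_heat   # the reported optimum: ~4% of condenser output

# Reported split of the compressor input power:
to_refrigerant = 0.85 * compressor_power      # 85% reaches the refrigerant...
compression_work = 0.50 * compressor_power    # ...50% as compression work
cylinder_wall_gain = 0.35 * compressor_power  # ...35% via the hot cylinder wall
rejected = 0.15 * compressor_power            # remaining 15% lost to the cooling medium

print("COP = %.2f" % cop(condenser_heat, compressor_power, fan_power))
print("Refrigerant gain: %.2f kW (work %.2f kW + wall %.2f kW)"
      % (to_refrigerant, compression_work, cylinder_wall_gain))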

Relevance:

10.00%

Publisher:

Abstract:

Current analytical assay methods for ampicillin sodium and cloxacillin sodium are discussed and compared, with High Performance Liquid Chromatography (H.P.L.C.) chosen as the most accurate, specific and precise. New H.P.L.C. methods are described for the analysis of benzathine cloxacillin; benzathine penicillin V; procaine penicillin injection B.P.; benethamine penicillin injection, fortified B.P.C.; benzathine penicillin injection; benzathine penicillin injection, fortified B.P.C.; benzathine penicillin suspension; ampicillin syrups; and penicillin syrups. Mechanical or chemical damage to column packings is often associated with H.P.L.C. analysis, and one type, channel formation, is investigated. The high linear velocity of solvent and solvent pulsing during the pumping cycle were found to be the causes of this damage. The applicability of nonisothermal kinetic experiments to penicillin V preparations, including formulated paediatric syrups, is evaluated. A new type of nonisothermal analysis, based on slope estimation and using a 64K Random Access Memory (R.A.M.) microcomputer, is described; the program written for this analysis is named NONISO. The distribution of active penicillin in granules for reconstitution into ampicillin and penicillin V syrups, and its effect on the stability of the reconstituted products, are investigated. Changing the diluent used to reconstitute the syrups was found to affect the stability of the product. The dissolution and stability of benzathine cloxacillin at pH 2, pH 6 and pH 9 are described, with proposed dissolution mechanisms and kinetic analyses to support these mechanisms. Benzathine and cloxacillin were found to react in solution at pH 9, producing an insoluble amide.
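The program NONISO itself is not reproduced in the abstract; the sketch below only illustrates the general idea of slope-based nonisothermal analysis, assuming first-order degradation, a linear temperature ramp, and invented Arrhenius parameters.

import numpy as np

R = 8.314                              # gas constant, J/(mol K)

t = np.linspace(0.0, 10.0, 200)        # h, assumed sampling times
T = 313.0 + 4.0 * t                    # K, assumed linear temperature ramp
A, Ea = 1e12, 9.5e4                    # assumed Arrhenius parameters
k = A * np.exp(-Ea / (R * T))          # first-order rate constant along the ramp
lnC = -np.cumsum(k) * (t[1] - t[0])    # simulated ln(C/C0) by crude integration

# Slope estimation: the local slope of ln C versus t is -k at each temperature.
k_est = -np.gradient(lnC, t)

# The Arrhenius plot (ln k versus 1/T) is linear; a straight-line fit
# recovers the activation energy and pre-exponential factor.
slope, intercept = np.polyfit(1.0 / T, np.log(k_est), 1)
print("Estimated Ea = %.1f kJ/mol (assumed %.1f)" % (-slope * R / 1e3, Ea / 1e3))
print("Estimated ln A = %.1f (assumed %.1f)" % (intercept, np.log(A)))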

Relevance:

10.00%

Publisher:

Abstract:

Communication and portability are the two main problems facing the user. An operating system, called PORTOS, was developed to solve these problems for users on dedicated microcomputer systems. Firstly, an interface language was defined, according to the anticipated requirements and behaviour of its potential users. Secondly, the PORTOS operating system was developed as a processor for this language. The system is currently running on two minicomputers of highly different architectures. PORTOS achieves its portability through its high-level design and its implementation in CORAL66. The interface language consists of a set of user commands and system responses. Although only a subset has been implemented, owing to time and manpower constraints, promising results were achieved regarding the usability of the language and its portability.
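Neither the PORTOS command set nor its CORAL66 source is given in the abstract. Purely as a hypothetical illustration of an interface language built from user commands paired with system responses, a dispatcher of that general shape might look like the following Python sketch (the command names are invented):

def command_shell(commands):
    # Read a user command, dispatch it, and print an explicit system
    # response: the interface pairs every command with a response.
    while True:
        line = input("> ").strip()
        if not line:
            continue
        name, *args = line.split()
        if name.upper() == "QUIT":
            break
        handler = commands.get(name.upper())
        print(handler(*args) if handler else "UNKNOWN COMMAND: " + name)

# Invented command set, for illustration only.
command_shell({
    "ECHO": lambda *words: " ".join(words),
    "LIST": lambda: "FILES: (none)",
})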

Relevance:

10.00%

Publisher:

Abstract:

The civil engineering industry generally regards new methods and technology with considerable scepticism, preferring to use traditional and trusted methods. During the 1980s, competition for civil engineering consultancy work worldwide became fierce. Halcrow recognised the need to maintain and improve their competitive edge over other consultants, and the use of new technology in the form of microcomputers was seen as one way to maintain and improve their reputation in the world. This thesis examines the role of microcomputers in civil engineering consultancy, with particular reference to overseas projects. The involvement of civil engineers with computers, both past and present, has been investigated, and a survey of the use of microcomputers by consultancies was carried out; the results are presented and analysed. A résumé of the state of the art of microcomputer technology is given. Various case studies were carried out in order to examine the feasibility of using microcomputers on overseas projects. One case study involved the examination of two projects in Bangladesh and is used to illustrate the requirements and problems encountered in such situations. Two programming applications were undertaken: a dynamic programming model of a single-site reservoir and a simulation of the Bangladesh gas grid system. A cost-benefit analysis of the use of microcomputers on a water resources project in the Aguan Valley, Honduras, was also carried out. Although the initial cost of microcomputers is often small, the overall costs can prove to be very high and are likely to exceed the costs of traditional computer methods. A planned approach to the use of microcomputers is essential in order to reap the expected benefits, and recommendations for the implementation of such an approach are presented.
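The thesis's dynamic programming model is not reproduced in the abstract; the Python sketch below only illustrates the general technique for a single-site reservoir, with the inflows, capacity and benefit function all assumed for illustration.

inflows = [3, 2, 4, 1]        # assumed inflow per period (volume units)
max_storage = 5               # assumed reservoir capacity
releases = range(0, 6)        # candidate releases per period

def benefit(release):
    # Assumed concave benefit of released water (e.g. irrigation value).
    return release ** 0.5

# value[s] = best total benefit achievable from storage level s onwards.
value = {s: 0.0 for s in range(max_storage + 1)}
for inflow in reversed(inflows):              # backward recursion over periods
    new_value = {}
    for s in range(max_storage + 1):
        best = float("-inf")
        for r in releases:
            s_next = s + inflow - r           # mass balance
            if 0 <= s_next <= max_storage:    # feasibility (spill not modelled)
                best = max(best, benefit(r) + value[s_next])
        new_value[s] = best
    value = new_value

print("Optimal benefit from half-full storage: %.2f" % value[2])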

Relevance:

10.00%

Publisher:

Abstract:

Many planning and control tools, especially network analysis, have been developed in the last four decades. The majority of them were created in military organizations to solve the problem of planning and controlling research and development projects. The original version of the network model (i.e. C.P.M./PERT) was transplanted to the construction industry without consideration of the special nature and environment of construction projects. It suited the purpose of setting up targets and defining objectives, but it failed to satisfy the requirements of detailed planning and control at the site level. Several analytical and heuristic rule-based methods were designed and combined with the structure of C.P.M. to eliminate its deficiencies, but none of them provides a complete solution to the problem of resource, time and cost control. VERT was designed to deal with new ventures and is suitable for project evaluation at the development stage. CYCLONE, on the other hand, is concerned with the design and micro-analysis of the production process. This work introduces an extensive critical review of the available planning techniques and addresses the problem of planning for site operation and control. Based on an outline of the nature of site control, this research developed a simulation-based network model which combines part of the logic of both VERT and CYCLONE. Several new nodes were designed to model the availability and flow of resources and the overhead and operating costs, along with special nodes for evaluating time and cost. A large software package was written to handle the input, the simulation process and the output of the model. This package is designed to run on any microcomputer using the MS-DOS operating system. Data from real-life projects were used to demonstrate the capability of the technique. Finally, a set of conclusions is drawn regarding the features and limitations of the proposed model, and recommendations for future work are outlined at the end of this thesis.
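The package and its node types are not given in the abstract; the following Python sketch only illustrates the underlying idea of simulation-based network analysis, using an invented four-activity network with assumed triangular duration distributions and an assumed overhead rate.

import random

# Invented network: activity -> (predecessors, (low, mode, high) duration in days)
network = {
    "A": ([], (2, 4, 8)),
    "B": (["A"], (3, 5, 9)),
    "C": (["A"], (1, 2, 4)),
    "D": (["B", "C"], (2, 3, 6)),
}
OVERHEAD_PER_DAY = 100.0      # assumed site overhead rate, cost units/day

def simulate_once():
    finish = {}
    # Insertion order of the dict respects precedence in this toy network.
    for act, (preds, (low, mode, high)) in network.items():
        start = max((finish[p] for p in preds), default=0.0)
        finish[act] = start + random.triangular(low, high, mode)
    duration = max(finish.values())
    return duration, duration * OVERHEAD_PER_DAY

results = [simulate_once() for _ in range(10000)]
durations = sorted(d for d, _ in results)
print("Mean duration: %.1f days" % (sum(durations) / len(durations)))
print("90th percentile: %.1f days" % durations[int(0.9 * len(durations))])
print("Mean overhead cost: %.0f" % (sum(c for _, c in results) / len(results)))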

Relevance:

10.00%

Publisher:

Abstract:

Lyophilisation, or freeze drying, is the preferred dehydration method for pharmaceuticals liable to thermal degradation. Most biologics are unstable in aqueous solution, and freeze drying may be used to prolong their shelf life. Lyophilisation is, however, expensive, and much work has been aimed at reducing its cost. This thesis is motivated by the potential cost savings foreseen with the adoption of a cost-efficient bulk drying approach for large and small molecules. Initial studies identified ideal formulations that adapted well to bulk drying and to further powder handling requirements downstream in production. Low-cost techniques were used to disrupt large dried cakes into powder, while the effects of carrier agent concentration on powder flowability were investigated using standard pharmacopoeial methods. This revealed the superiority of crystalline mannitol over amorphous sucrose matrices and established that the cohesive, very poorly flowing nature of freeze-dried powders was a potential barrier to success. Powder characterisation studies showed that increased powder densification was mainly responsible for significant improvements in flow behaviour, and an initial bulking agent concentration of 10-15% w/v was recommended. Further optimisation studies evaluated the effects of freezing rates and thermal treatment on powder flow behaviour. Slow cooling (0.2 °C/min) with a -25 °C annealing hold (2 h) provided adequate mechanical strength and densification at 0.5-1 M mannitol concentrations. Stable bulk powders require transfer into either final vials or intermediate storage closures. Targeted dosing of powder formulations using volumetric and gravimetric powder dispensing systems was evaluated using immunoglobulin G (IgG), lactate dehydrogenase (LDH) and beta-galactosidase models. Final protein content uniformity in dosed vials was assessed using activity and protein recovery assays, drawing conclusions from deviations and pharmacopoeial acceptance values. A correlation between very poor flowability (p<0.05), solute concentration, dosing time and accuracy was revealed. LDH and IgG lyophilised in 0.5 M and 1 M mannitol passed pharmacopoeial acceptance value criteria (0.1-4), while formulations with micro-collapse showed the best dose accuracy (0.32-0.4% deviation). Bulk mannitol content above 0.5 M provided no additional benefit to dosing accuracy or content uniformity of dosed units. This study identified key considerations, including the type of protein, annealing, the cake disruption process, the physical form of the phases present and humidity control, and recommended gravimetric transfer as optimal for dispensing powder. Dosing lyophilised powders from bulk was demonstrated to be practical, time efficient and economical, and it met regulatory requirements in the cases studied. Finally, the use of a new non-destructive technique, X-ray microcomputer tomography (MCT), was explored for cake and particle characterisation. Studies demonstrated good correlation with traditional gas porosimetry (R² = 0.93) and with morphology studies using microscopy. Flow characterisation from sample sizes of less than 1 mL was demonstrated using three-dimensional X-ray quantitative image analyses. A platinum-mannitol dispersion model revealed a relationship between freezing rate, ice nucleation sites and variations in homogeneity between the top and bottom segments of a formulation.
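The pass/fail judgments above rest on the pharmacopoeial acceptance value (AV). A minimal Python sketch of that calculation in the harmonised USP <905> style follows; the vial assay results are invented for illustration and are not the thesis data.

import statistics

def acceptance_value(contents, k=2.4):
    # AV = |M - mean| + k*s for n = 10 units; contents are % of label claim.
    x_bar = statistics.mean(contents)
    s = statistics.stdev(contents)
    # Reference value M per the compendial rule, assuming target T = 100%.
    m = min(max(x_bar, 98.5), 101.5)
    return abs(m - x_bar) + k * s

# Ten hypothetical dosed vials, % of label claim.
vials = [99.1, 100.4, 99.8, 100.9, 99.5, 100.2, 99.9, 100.6, 99.3, 100.1]
av = acceptance_value(vials)
print("AV = %.2f -> %s (L1 limit 15.0)" % (av, "pass" if av <= 15.0 else "fail"))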

Relevance:

10.00%

Publisher:

Abstract:

In "Appraising Work Group Performance: New Productivity Opportunities in Hospitality Management," a discussion by Mark R. Edwards, Associate Professor, College of Engineering, Arizona State University, and Leslie Edwards Cummings, Assistant Professor, College of Hotel Administration, University of Nevada, Las Vegas, the authors initially provide: "Employee group performance variation accounts for a significant portion of the degree of productivity in the hotel, motel, and food service sectors of the hospitality industry. The authors discuss TEAMSG, a microcomputer based approach to appraising and interpreting group performance. TEAMSG appraisal allows an organization to profile and to evaluate groups, facilitating the targeting of training and development decisions and interventions, as well as the more equitable distribution of organizational rewards." "The caliber of employee group performance is a major determinant in an organization's productivity and success within the hotel and food service industries," Edwards and Cummings say. "Gaining accurate information about the quality of performance of such groups as organizational divisions, individual functional departments, or work groups can be as enlightening..." the authors further reveal. "This perspective is especially important not only for strategic human resources planning purposes, but also for diagnosing development needs and for differentially distributing organizational rewards." The authors would have you know that employee requirements in an unpredictable environment, which is what the hospitality industry largely is, are difficult to quantify. In an effort to measure elements of performance, Edwards and Cummings look to TEAMSG, an acronym for Team Evaluation and Management System for Groups, and develop the concept. In discussing background for employees, Edwards and Cummings point out that employees at the individual level must often possess and exercise varied skills. In group circumstances employees often work at locations outside the corporate unit, or move from unit to unit, as in the case of a project team. Being able to move from an individual to a group mentality is imperative. "A solution which addresses the frustration and lack of motivation on the part of the employee is to coach, develop, appraise, and reward employees on the basis of group achievement," say the authors. "An appraisal, effectively developed and interpreted, has at least three functions," Edwards and Cummings suggest, and go on to define them. The authors place a great emphasis on rewards and interventions to bolster the assertion set forth in their thesis statement. Edwards and Cummings warn that individual agendas can threaten, erode, and undermine group performance; there is no "I" in TEAM.

Relevance:

10.00%

Publisher:

Abstract:

In his discussion "Database As A Tool For Hospitality Management," William O'Brien, Assistant Professor, School of Hospitality Management at Florida International University, offers at the outset, "Database systems offer sweeping possibilities for better management of information in the hospitality industry. The author discusses what such systems are capable of accomplishing." The author opens with a bit of background on database system development, which also lends an impression as to the complexion of the rest of the article; uh, it's a shade technical. "In early 1981, Ashton-Tate introduced dBase II. It was the first microcomputer database management processor to offer relational capabilities and a user-friendly query system combined with a fast, convenient report writer," O'Brien informs. "When 16-bit microcomputers such as the IBM PC series were introduced late the following year, more powerful database products followed: dBase III, Friday!, and Framework. The effect on the entire business community, and the hospitality industry in particular, has been remarkable," he further offers with his informed outlook. Professor O'Brien offers a few anecdotal situations to illustrate how much a comprehensive database system means to a hospitality operation, especially when billing is involved. Although attitudes about computer systems, as well as the systems themselves, have changed since this article was written, there is pertinent, fundamental information to be gleaned. Regarding the loss of the personal touch when a customer is engaged with a computer system, O'Brien says, "A modern data processing system should not force an employee to treat valued customers as numbers..." He also cautions, "Any computer system that decreases the availability of the personal touch is simply unacceptable." On a system's ability to process information, O'Brien suggests that in the past businesses were so enamored with just having an automated system that they failed to take full advantage of its capabilities. O'Brien says that a lot of savings, in time and money, went unnoticed and/or under-appreciated. Today, everyone has an integrated system, and the wise business manager is the one who takes full advantage of all his resources. O'Brien invokes the 80/20 rule, and offers, "...the last 20 percent of results costs 80 percent of the effort. But times have changed. Everyone is automating data management, so that last 20 percent that could be ignored a short time ago represents a significant competitive differential." The evolution of data systems takes center stage for much of the article; pitfalls also emerge.
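As a modern stand-in for the relational capability O'Brien describes (this is Python with SQLite, not dBase II; the table and column names are invented), a guest folio of the kind his billing anecdotes concern can be assembled with a single relational query:

import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE guests  (id INTEGER PRIMARY KEY, name TEXT, room TEXT);
CREATE TABLE charges (guest_id INTEGER REFERENCES guests(id),
                      item TEXT, amount REAL);
INSERT INTO guests  VALUES (1, 'A. Traveller', '204');
INSERT INTO charges VALUES (1, 'Room night', 95.00),
                           (1, 'Restaurant', 32.50),
                           (1, 'Telephone',   4.25);
""")

# One query joins guests to their charges and totals the bill,
# a collation that once required manual ledger work.
for name, room, total in db.execute(
        "SELECT g.name, g.room, SUM(c.amount) "
        "FROM guests g JOIN charges c ON c.guest_id = g.id "
        "GROUP BY g.id"):
    print("%s (room %s): total due %.2f" % (name, room, total))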

Relevance:

10.00%

Publisher:

Abstract:

Microcomputers are emerging as a potent source for decision-making for hotel management. This article provides the hotel executive with a short course on the possibilities of microcomputers in his or her everyday work.

Relevance:

10.00%

Publisher:

Abstract:

In the last several years there has been an increase in the amount of qualitative research using in-depth interviews and comprehensive content analyses in sport psychology. However, no explicit method has been provided to deal with the large amount of unstructured data. This article provides common guidelines for organizing and interpreting unstructured data. Two main operations are suggested and discussed: first, coding meaningful text segments, or creating tags; and second, regrouping similar text segments, or creating categories. Furthermore, software programs for the microcomputer are presented as a way to facilitate the organization and interpretation of qualitative data.
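A minimal Python sketch of the two operations follows; the interview excerpts, tags, and category names are invented for illustration.

# Operation 1 (coding): each meaningful text segment carries a tag,
# assigned here by the analyst.
segments = [
    ("I get nervous before every race", "pre-competition anxiety"),
    ("Butterflies in my stomach at the start line", "pre-competition anxiety"),
    ("My coach talks me through my routine", "coach support"),
    ("My teammates keep me calm", "teammate support"),
]

# Operation 2 (regrouping): similar tags are gathered into
# higher-order categories.
categories = {
    "anxiety responses": {"pre-competition anxiety"},
    "social resources": {"coach support", "teammate support"},
}

grouped = {cat: [] for cat in categories}
for text, tag in segments:
    for cat, tags in categories.items():
        if tag in tags:
            grouped[cat].append(text)

for cat, texts in grouped.items():
    print("%s: %d segments" % (cat, len(texts)))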

Relevance:

10.00%

Publisher:

Abstract:

Master's dissertation, Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Mecânica, 2015.

Relevance:

10.00%

Publisher:

Abstract:

A purpose of this research study was to demonstrate the practical linguistic study and evaluation of dissertations by using two examples of the latest technology, the microcomputer and the optical scanner. This involved developing efficient methods for data entry plus creating computer algorithms appropriate for personal, linguistic studies. The goal was to develop a prototype investigation which demonstrated practical solutions for maximizing the linguistic potential of the dissertation database. The mode of text entry was a Dest PC Scan 1000 Optical Scanner, whose function was to copy the complete stack of educational dissertations from the Florida Atlantic University Library into an I.B.M. XT microcomputer. The optical scanner demonstrated its practical value by copying 15,900 pages of dissertation text directly into the microcomputer. A total of 199 dissertations, or 72% of the entire stack of education dissertations (277), were successfully copied into the microcomputer's word processor, where each dissertation was analyzed for a variety of syntax frequencies. The results of the study demonstrated the practical use of the optical scanner for data entry, the microcomputer for data and statistical analysis, and the availability of the college library as a natural setting for text studies. A supplemental benefit was the establishment of a computerized dissertation corpus which could be used for future research and study. The final step was to build a linguistic model of the differences in dissertation writing styles by creating 7 factors from 55 dependent variables through principal components factor analysis. The 7 factors (textual components) were then named and described on a hypothetical construct defined as a continuum from a conversational, interactional style to a formal, academic writing style. The 7 factors were then grouped through discriminant analysis to create discriminant functions for each of the 7 independent variables. The results indicated that a conversational, interactional writing style was associated with more recent dissertations (1972-1987), an increase in author's age, females, and the department of Curriculum and Instruction. A formal, academic writing style was associated with older dissertations (1972-1987), younger authors, males, and the department of Administration and Supervision. It was concluded that there were no significant differences in writing style due to subject matter (community college studies compared to other subject matter), and that there were no significant differences in writing style due to the location of dissertation origin (Florida Atlantic University, University of Central Florida, Florida International University).
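A minimal sketch of the two statistical steps described, factor extraction followed by discriminant analysis, is given below using scikit-learn (not the original software); PCA stands in for the principal components factor analysis, and the data are random stand-ins shaped like the study (199 dissertations, 55 variables, 7 factors).

import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X = rng.normal(size=(199, 55))          # 199 dissertations x 55 syntax frequencies
groups = rng.integers(0, 2, size=199)   # an assumed two-level grouping, e.g. gender

factors = PCA(n_components=7).fit_transform(X)   # the 7 textual components
lda = LinearDiscriminantAnalysis().fit(factors, groups)
print("Resubstitution accuracy: %.2f" % lda.score(factors, groups))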