848 resultados para Computer programs -- Development
Resumo:
At head of cover title: Generalized computer programs.
Resumo:
The increasing use of information and communications technologies among government departments and non-government agencies has fundamentally changed the implementation of employment services policy in Australia. The administrative arrangements for governing unemployment and unemployed people are now constituted by a complex contractual interplay between government departments as ‘purchasers’ and a range of small and large private organizations as ‘providers’. Assessing, tracking and monitoring the activities of unemployed people through the various parts of the employment services system has been made possible by developments in information technology and tailored computer programs. Consequently, the discretionary capacity that is traditionally associated with ‘street-level bureaucracy’ has been partly transformed into more prescriptive forms of ‘screen-level bureaucracy’. The knowledge embedded in these new computer-based technologies is considered superior because it is based on ‘objective calculations’, rather than subjective assessments of individual employees. The relationship between the sociopolitical context of unemployment policy and emerging forms of e-government is explored using illustrative findings from a qualitative pilot study undertaken in two Australian sites. The findings suggest that some of the new technologies in the employment services system are welcomed, while other applications are experienced as contradictory to the aims of delivering a personalized and respectful service.
Resumo:
The thesis is concerned with the development and testing of a mathematical model of a distillation process in which the components react chemically. The formaldehyde-methanol-water system was selected, and only the reversible reactions between formaldehyde and water giving methylene glycol and between formaldehyde and methanol producing hemiformal were assumed to occur under the distillation conditions. Accordingly, the system has been treated as a five-component system. The vapour-liquid equilibrium calculations were performed by iteratively solving the thermodynamic relationships expressing the phase equilibria together with the stoichiometric equations expressing the chemical equilibria. Using optimisation techniques, Wilson single parameters and Henry's constants were calculated for the binary systems containing formaldehyde, which was assumed to be a supercritical component, whilst Wilson binary parameters were calculated for the remaining binary systems. Thus the phase equilibria for the formaldehyde system could be calculated using these parameters, and good accuracy was obtained when calculated values were compared with experimental values. The distillation process was modelled using the mass and energy balance equations together with the phase equilibria calculations. The plate efficiencies were obtained from a modified A.I.Ch.E. Bubble Tray method. The resulting equations were solved by an iterative plate-to-plate calculation based on the Newton-Raphson method. Experiments were carried out in a 76 mm I.D., eight sieve plate distillation column and the results were compared with the mathematical model calculations. Overall, good agreement was obtained, but some discrepancies were observed in the concentration profiles; these may have been caused by limited physical property data and a limited understanding of the reaction mechanisms. The model equations were solved in the form of modular computer programs.
Although they were written to describe the steady state distillation with simultaneous chemical reaction of the formaldehyde system, the approach used may be of wider application.
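The coupling of chemical and phase equilibria described above can be illustrated with a minimal sketch: a Newton-Raphson solve of the reaction-extent equations for the two reversible reactions of the formaldehyde system. The equilibrium constants and feed composition below are hypothetical placeholders, and a finite-difference Jacobian stands in for the thesis's analytical plate-to-plate formulation.

```python
import numpy as np

def residuals(xi, K1, K2, n0):
    """Chemical-equilibrium residuals for the two reversible reactions:
    formaldehyde + water <-> methylene glycol, formaldehyde + methanol <-> hemiformal.
    xi = reaction extents; n0 = initial moles of (formaldehyde, water, methanol)."""
    x1, x2 = xi
    nF = n0[0] - x1 - x2                      # formaldehyde consumed by both reactions
    nW, nM = n0[1] - x1, n0[2] - x2           # water and methanol consumed
    nMG, nHF = x1, x2                         # methylene glycol and hemiformal formed
    ntot = nF + nW + nM + nMG + nHF
    xF, xW, xM, xMG, xHF = (n / ntot for n in (nF, nW, nM, nMG, nHF))
    return np.array([K1 * xF * xW - xMG,      # mole-fraction equilibrium conditions
                     K2 * xF * xM - xHF])

def newton_raphson(f, x0, tol=1e-10, max_iter=100):
    """Newton-Raphson iteration with a finite-difference Jacobian."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = f(x)
        if np.max(np.abs(r)) < tol:
            break
        J = np.empty((x.size, x.size))
        for j in range(x.size):
            xp = x.copy()
            xp[j] += 1e-7
            J[:, j] = (f(xp) - r) / 1e-7      # forward-difference column of the Jacobian
        x = x - np.linalg.solve(J, r)
    return x

K1, K2 = 5.0, 2.0                             # hypothetical equilibrium constants
n0 = np.array([1.0, 2.0, 1.0])                # hypothetical feed, mol
xi = newton_raphson(lambda v: residuals(v, K1, K2, n0), np.array([0.3, 0.2]))
```

The same pattern (residual function plus Newton update) extends to the full plate-to-plate balance equations, with extents replaced by plate compositions and temperatures.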
Resumo:
In the last few years, there has been considerable interest in using saturated magnetic objective lenses in high-resolution electron microscopes. Such lenses, in present commercial electron microscopes, are energized either by conventional or superconducting coils. Very little work, however, has been reported on the use of conventional coils in saturated magnetic electron lenses. The present investigation has been concerned with the design of high flux density saturated objective lenses of both single and double polepiece types which may be energized by conventional coils and, in some cases, by superconducting coils. Such coils have the advantage of being small and capable of carrying high current densities. The present work has been carried out with the aid of several computer programs based on the finite element method. The effect of the shape and position of the energizing coil on the electron optical parameters has been investigated. Electron optical properties such as chromatic and spherical aberration have been studied in detail for saturated single and double polepiece lenses. Several high flux density coils of different shapes have been investigated. The choice of the most favourable coil shape and position, subject to the operational requirements, has been studied in some detail. The focal properties of such optimised lenses have been computed and compared.
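As a toy counterpart to such focal-property computations, the paraxial weak-lens relation 1/f = (e / 8mV) ∫ B_z² dz can be evaluated numerically for a Glaser bell-shaped axial field. The peak flux density, half-width and accelerating voltage below are illustrative values only, and the relativistic correction of the voltage is omitted; real saturated-lens design requires the finite element field computations described in the abstract.

```python
import numpy as np

# Physical constants (SI); non-relativistic treatment is an assumption here.
e = 1.602176634e-19      # electron charge, C
m = 9.1093837015e-31     # electron mass, kg

def glaser_field(z, B0, a):
    """Glaser bell-shaped axial flux density model: B(z) = B0 / (1 + (z/a)^2)."""
    return B0 / (1.0 + (z / a) ** 2)

def weak_lens_focal_length(B, z, V):
    """Paraxial weak-lens estimate: 1/f = e/(8 m V) * integral of B^2 dz."""
    inv_f = e / (8.0 * m * V) * np.trapz(B ** 2, z)
    return 1.0 / inv_f

B0, a, V = 0.3, 2e-3, 100e3                   # illustrative: 0.3 T peak, 2 mm half-width, 100 kV
z = np.linspace(-50 * a, 50 * a, 20001)       # axial grid wide enough to capture the tails
f = weak_lens_focal_length(glaser_field(z, B0, a), z, V)
```

For the Glaser field the integral has the closed form B0² a π/2, which gives a convenient numerical check.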
Resumo:
Cold roll forming of thin-walled sections is a very useful process in the sheet metal industry. However, the conventional method for the design and manufacture of form-rolls, the special tooling used in the cold roll forming process, is a very time-consuming and skill-demanding exercise. This thesis describes the establishment of a stand-alone minicomputer-based CAD/CAM system for assisting the design and manufacture of form-rolls. The work was undertaken in collaboration with a leading manufacturer of thin-walled sections. A package of computer programs has been developed to provide computer aids for every aspect of work in form-roll design and manufacture. The programs have been successfully implemented, as an integrated CAD/CAM software system, on the ICL PERQ minicomputer with graphics facilities. Thus, the developed CAD/CAM system is a single-user workstation, with software facilities to help the user perform the conventional roll design activities, including the design of the finished section, the flower pattern, and the form-rolls. A roll editor program can then be used to modify, if required, the computer-generated roll profiles. As far as manufacturing is concerned, a special-purpose roll machining program and postprocessor can be used in conjunction to generate the NC part-programs for the production of form-rolls by NC turning. Graphics facilities have been incorporated into the CAD/CAM software to display drawings interactively on the computer screen throughout all stages of execution. It has been found that computerisation can shorten the lead time in all activities dealing with the design and manufacture of form-rolls, and that small or medium-sized manufacturing companies can benefit from CAD/CAM technology by developing, to their own specifications, a tailor-made CAD/CAM software system on a low-cost minicomputer.
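The final postprocessing step, turning a computed roll profile into an NC part-program, can be sketched as below. The profile points, feed rate and the generic G/M codes are illustrative lathe conventions, not the thesis's actual postprocessor output, which would also handle tool offsets, spindle speeds and machine-specific codes.

```python
def roll_profile_to_gcode(points, feed=100.0):
    """Turn a polyline roll profile (list of (x, z) points in mm, lathe
    coordinates) into a minimal NC part-program of linear moves."""
    lines = ["G21 G90"]                        # metric units, absolute positioning
    x0, z0 = points[0]
    lines.append(f"G00 X{x0:.3f} Z{z0:.3f}")   # rapid move to the start of the profile
    for x, z in points[1:]:
        lines.append(f"G01 X{x:.3f} Z{z:.3f} F{feed:.1f}")  # cutting move along the contour
    lines.append("M30")                        # end of program
    return lines

profile = [(40.0, 0.0), (40.0, -5.0), (35.0, -12.0), (35.0, -20.0)]  # hypothetical roll contour
program = roll_profile_to_gcode(profile)
```

Each `G01` block traces one straight segment of the roll contour; a real roll machining program would first decompose arcs and fillets into such segments at the required tolerance.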
Resumo:
Myopia is a refractive condition that develops because either the optical power of the eye is abnormally great or the eye is abnormally long, the optical consequence being that the focal length of the eye is too short for its physical length. The increase in axial length has been shown to match closely the dioptric error of the eye, in that a 1 mm increase in axial length usually generates 2 to 3 D of myopia. The most common form of myopia is early-onset myopia (EOM), which occurs between 6 and 14 years of age. The second most common form is late-onset myopia (LOM), which emerges in the late teens or early twenties, at a time when the eye should have ceased growing. The prevalence of LOM is increasing and research has indicated a link with excessive and sustained nearwork. The aim of this thesis was to examine the ocular biometric correlates associated with LOM and EOM development and progression. Biometric data were recorded on 50 subjects, aged 16 to 26 years. The group was divided into 26 emmetropic subjects and 24 myopic subjects. Keratometry, corneal topography, ultrasonography, lens shape, central and peripheral refractive error, ocular blood flow and accommodation were measured on three occasions during an 18-month to 2-year longitudinal study. Retinal contours were derived using a specially written computer program. The thesis shows that myopia progression is related to an increase in vitreous chamber depth, a finding which supports previous work. The myopes exhibited hyperopic relative peripheral refractive error (PRE) and the emmetropes exhibited myopic relative PRE. Myopes demonstrated a prolate retinal shape, and the retina became more prolate with myopia progression. The results show that a longitudinal, rather than equatorial, increase in the posterior segment is the principal structural correlate of myopia.
Retinal shape, relative PRE and the ratio of axial length to corneal curvature have been indicated, in this thesis, as predictive factors for myopia onset and development. Data from this thesis demonstrate that myopia progression in the LOM group is the result of an increase in anterior segment power, owing to an increase in lens thickness, in conjunction with posterior segment elongation. Myopia progression in the EOM group is the product of a long posterior segment, which over-compensates for a weak anterior segment power. The weak anterior segment power in the EOM group is related to a combination of crystalline lens thinning and surface flattening. The results presented in this thesis confirm that posterior segment elongation is the main structural correlate in both EOM and LOM progression. The techniques and computer programs employed in the thesis are reproducible and robust, providing a valuable framework for further myopia research and assessment of predictive factors.
Resumo:
This work is concerned with the development of techniques for the evaluation of large-scale highway schemes, with particular reference to the assessment of their costs and benefits in the context of the current transport planning (T.P.P.) process. It has been carried out in close cooperation with West Midlands County Council, although its approach and results are applicable elsewhere. The background to highway evaluation and its development in recent years are described, and the emergence of a number of deficiencies in current planning practice is noted. One deficiency in particular stood out, stemming from inadequate methods of scheme generation, and the research has concentrated upon improving this stage of appraisal, to ensure that subsequent stages of design, assessment and implementation are based upon a consistent and responsive foundation. Deficiencies of scheme evaluation were found to stem from appraisal methodologies suffering from difficulties of valuation, measurement and aggregation of the disparate variables that characterise highway evaluation. A failure to respond to local policy priorities was also noted. A 'problem' rather than 'goals' based approach to scheme generation was taken, as it represented the current and foreseeable resource allocation context more realistically. Techniques with potential for highway problem-based scheme generation that would work within a series of practical and theoretical constraints were reviewed, and multivariate analysis, classical factor analysis in particular, was selected because it offered considerable application to the existing difficulties of valuation, measurement and aggregation. Computer programs were written to adapt classical factor analysis to the requirements of T.P.P. highway evaluation, using it to derive a limited number of factors which described the extensive quantity of highway problem data.
From this, a series of composite problem scores for 1979 were derived for a case study area of south Birmingham, based upon the factorial solutions, and used to assess highway sites in terms of local policy issues. The methodology was assessed in the light of its ability to describe highway problems in both aggregate and disaggregate terms, to guide scheme design, to coordinate with current scheme evaluation methods, and in general to improve upon current appraisal. The results were analysed both in subjective, 'common-sense' terms and with statistical methods, to assess the changes in problem definition, distribution and priorities that emerged. Overall, the technique was found to improve upon current scheme generation methods in all respects, and in particular in overcoming the problems of valuation, measurement and aggregation without recourse to unsubstantiated and questionable assumptions. A number of remaining deficiencies have been outlined, and a series of research priorities described which need to be reviewed in the light of current and future evaluation needs.
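The factor-analysis step can be sketched as a generic principal-factor extraction: standardise the site-by-indicator matrix, eigendecompose its correlation matrix, retain factors by the Kaiser criterion, and read off composite site scores. The site count, the two latent dimensions and the noise level below are invented for illustration and are not the thesis's data or programs.

```python
import numpy as np

def factor_scores(X, n_factors=None):
    """Classical principal-factor extraction (sketch): standardise X,
    eigendecompose the correlation matrix, keep factors with eigenvalue > 1
    (Kaiser criterion), and return loadings plus composite site scores."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    R = np.corrcoef(Z, rowvar=False)
    eigval, eigvec = np.linalg.eigh(R)
    order = np.argsort(eigval)[::-1]              # sort factors by variance explained
    eigval, eigvec = eigval[order], eigvec[:, order]
    if n_factors is None:
        n_factors = int(np.sum(eigval > 1.0))     # Kaiser criterion
    loadings = eigvec[:, :n_factors] * np.sqrt(eigval[:n_factors])
    scores = Z @ eigvec[:, :n_factors]            # composite problem scores per site
    return loadings, scores

rng = np.random.default_rng(0)
# Hypothetical data: 40 highway sites x 6 problem indicators (delay, accidents, ...),
# driven by 2 underlying latent dimensions plus noise.
base = rng.normal(size=(40, 2))
X = base @ rng.normal(size=(2, 6)) + 0.3 * rng.normal(size=(40, 6))
loadings, scores = factor_scores(X)
```

Ranking sites by a weighted sum of `scores` columns is one simple way to obtain the kind of composite problem score the abstract describes.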
Resumo:
This dissertation studies context-aware applications and proposes algorithms for the client side. The required context-aware infrastructure is discussed in depth to illustrate that such an infrastructure collects the mobile user's context information, registers service providers, derives the mobile user's current context, distributes user context among context-aware applications, and provides tailored services. The proposed approach tries to strike a balance between the context server and mobile devices. Context acquisition is centralized at the server to ensure the reusability of context information among mobile devices, while context reasoning remains at the application level. Hence, centralized context acquisition combined with distributed context reasoning is viewed as the better overall solution. A context-aware search application is designed and implemented at the server side. A new algorithm is proposed that takes user context profiles into consideration. By feeding the dynamics of the system back, any prior user selection is saved for further analysis, so that it may help refine the results of a subsequent search. On the basis of these developments at the server side, various solutions are provided at the client side. A software-based proxy component is set up for the purpose of data collection. This research endorses the belief that the proxy at the client side should contain the context reasoning component; the implementation of such a component supports this belief in that context applications are able to derive user context profiles. Furthermore, a context cache scheme is implemented to manage the cache on the client device in order to minimize processing requirements and other resources (bandwidth, CPU cycles, power). Java and MySQL platforms are used to implement the proposed architecture and to test scenarios derived from a user's daily activities.
To meet the practical demands of a testing environment without imposing the heavy cost of establishing a comprehensive infrastructure, a software simulation using the free Yahoo search API is provided as a means to evaluate the effectiveness of the design approach in a realistic way. The integration of the Yahoo search engine into the context-aware architecture shows how a context-aware application can meet user demands for tailored services and products in and around the user's environment. The test results show that the overall design is highly effective, providing new features and enriching the mobile user's experience through a broad scope of potential applications.
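A client-side context cache of the kind described above can be sketched with a bounded store and least-recently-used eviction. The LRU policy, the capacity and the context keys are assumptions for illustration; the dissertation does not specify its exact scheme, and the real implementation is in Java rather than Python.

```python
from collections import OrderedDict

class ContextCache:
    """Sketch of a client-side context cache with least-recently-used (LRU)
    eviction, bounding memory use on the mobile device."""

    def __init__(self, capacity=64):
        self.capacity = capacity
        self._store = OrderedDict()

    def get(self, key):
        if key not in self._store:
            return None                       # cache miss: caller queries the context server
        self._store.move_to_end(key)          # mark entry as recently used
        return self._store[key]

    def put(self, key, value):
        if key in self._store:
            self._store.move_to_end(key)
        self._store[key] = value
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)   # evict the least recently used entry

cache = ContextCache(capacity=2)
cache.put("location", "office")
cache.put("activity", "meeting")
cache.get("location")                         # touch "location" so it survives eviction
cache.put("device", "phone")                  # capacity exceeded: "activity" is evicted
```

A cache hit avoids a round trip to the context server, which is exactly the bandwidth and power saving the abstract motivates.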
Resumo:
Distributed Generation (DG) from alternative sources and smart grid technologies represent good solutions for meeting increasing energy demand. Employment of these DG assets requires solutions for the new technical challenges that accompany their integration and interconnection into operational power systems. A DG infrastructure comprising alternative energy sources in addition to conventional sources is developed as a test bed. The test bed is operated by synchronizing wind, photovoltaic, fuel cell, micro generator and energy storage assets, in addition to standard AC generators. Connectivity of these DG assets is tested for viability and for their operational characteristics. Control and communication layers for dynamic operations are developed to improve the connectivity of alternative sources to the power system. A real-time application for the operation of alternative sources in microgrids is developed. A multi-agent approach is utilized to improve stability, and sequences of actions for black start are implemented. Experiments on control and stability issues related to dynamic operation under load conditions have been conducted and verified.
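A black-start sequence of the kind mentioned above can be sketched as a simple ordering rule: energise a grid-forming source first to establish voltage and frequency, then reconnect the remaining DG assets one at a time subject to a stability check. The asset names, the ordering rule and the placeholder stability test are illustrative assumptions, not this work's actual multi-agent procedure.

```python
def stable(online):
    """Placeholder stability criterion; a real agent would check measured
    voltage and frequency of the island before each reconnection."""
    return len(online) >= 1

def black_start(assets):
    """Illustrative black-start sequencing for a microgrid test bed:
    grid-forming assets (e.g. energy storage) come first, then the
    grid-following DG assets are reconnected in order."""
    online = []
    ordered = sorted(assets, key=lambda a: 0 if a["grid_forming"] else 1)
    for asset in ordered:
        if online and not asset["grid_forming"] and not stable(online):
            break                             # abort reconnection if the island is unstable
        online.append(asset["name"])
    return online

assets = [
    {"name": "wind", "grid_forming": False},
    {"name": "storage", "grid_forming": True},
    {"name": "pv", "grid_forming": False},
]
sequence = black_start(assets)
```

In a multi-agent implementation each asset agent would execute its own step of this sequence on receipt of a coordination message rather than in a central loop.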
Resumo:
One of the major problems in the analysis of beams whose moment of inertia varies along their length is finding the fixed end moments, stiffness, and carry-over factors. In order to determine fixed end moments, it is necessary to treat the non-prismatic member as composed of a large number of small sections with constant moment of inertia, and to find the M/EI values for each individual section. This process takes a great deal of time from designers and structural engineers. The object of this thesis is to design a computer program to simplify this repetitive process, obtaining the final moments and shears in continuous non-prismatic beams rapidly and effectively. For this purpose the column analogy and moment distribution methods of Professor Hardy Cross have been utilized as the principles behind the methodical computer solutions. The program has been specifically designed to analyze continuous beams of up to four spans of any length, composed of symmetrical members with rectangular cross sections and with rectilinear variation of the moment of inertia. Any load or combination of uniform and concentrated loads can be considered. Finally, sample problems are solved both with the new computer program and with traditional methods, to determine the accuracy and applicability of the program.
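The column analogy at the heart of such a program can be sketched numerically: slice the member into small sections, weight each slice by 1/EI, and read the rotational stiffness and carry-over factor off the analogous column's area, centroid and second moment. This is an illustrative reconstruction of the classical method, not the thesis's program; it is checked against the known prismatic-beam results K = 4EI/L and a carry-over factor of 1/2.

```python
import numpy as np

def stiffness_and_carry_over(EI, L, n=2000):
    """Column-analogy sketch for a beam whose EI varies along its length.
    The 'analogous column' has elastic width 1/EI(x); its area, centroid and
    second moment give the rotational stiffness at end A and the carry-over
    factor to end B. EI is a function of position x in [0, L]."""
    x = (np.arange(n) + 0.5) * L / n          # midpoints of the n slices
    w = (L / n) / EI(x)                       # elastic weights dx / EI
    A = w.sum()                               # analogous column area
    xbar = (w * x).sum() / A                  # centroid of the analogous column
    Ic = (w * (x - xbar) ** 2).sum()          # second moment about the centroid
    eA, eB = 0.0 - xbar, L - xbar             # eccentricities of ends A and B
    K = 1.0 / A + eA ** 2 / Ic                # stiffness at A (unit rotation applied at A)
    carry = abs(1.0 / A + eA * eB / Ic) / K   # carry-over factor from A to B
    return K, carry

# Check against the prismatic case (EI = 1, L = 1): K = 4EI/L, carry-over = 0.5.
K, c = stiffness_and_carry_over(lambda x: np.ones_like(x), L=1.0)
```

Replacing the constant-EI function with a rectilinear taper, e.g. `lambda x: 1.0 + 2.0 * x`, gives the non-prismatic stiffness and carry-over factors the abstract is concerned with.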
Resumo:
This study focuses on the properties that the presence of a fluorine atom confers on molecules, specifically on fluoroquinolones, antibiotics that are increasingly used. Several parameters were analysed to obtain information about the drug-receptor interaction in fluoroquinolones. Computational chemistry techniques were used to characterise the fluoroquinolones electronically and structurally (3D), complementing the semi-empirical methods used initially. As is well known, specificity and affinity for the target site are essential to a drug's efficacy. Fluoroquinolones have undergone great development since the first quinolone was synthesised in 1958, and numerous derivatives have been synthesised since then. This is because they are easily manipulated, yielding highly potent drugs with a broad spectrum, optimised pharmacokinetic properties and reduced adverse effects. The major pharmacological change that increased interest in this group was the substitution of a fluorine atom for a hydrogen at C6. To obtain information about the influence of fluorine on the structural and electronic properties of fluoroquinolones, a comparison was made between fluoroquinolones bearing fluorine at C6 and bearing hydrogen at C6. The four fluoroquinolones included in this study were ciprofloxacin, moxifloxacin, sparfloxacin and pefloxacin. The information was obtained using quantum mechanics and molecular mechanics computer programs. It was concluded that the fluorine substituent does not significantly modify the geometry of the molecules, but does change the charge distribution on the vicinal carbon and on the atoms in the alpha, beta and gamma positions relative to it. This change in electronic distribution may affect the binding of the drug to its receptor, modifying its pharmacological activity.
Resumo:
Force plate or pressure plate analysis came as an innovative tool to biomechanics and sports medicine. It allows engineers, scientists and doctors to virtually reconstruct the way a person steps while running or walking, using a measuring system and a computer. With this information they can calculate and analyze a whole set of variables and factors that characterize the step. They are then able to make corrections and/or optimizations, designing appropriate shoes and insoles for the patient. The idea of this work is to study and understand all the hardware and software implications of this process and all the components involved, and then propose an alternative solution. This solution should have at least similar performance to existing systems, and should increase the accuracy and/or the sampling frequency to obtain better results. By the end, there should be a working prototype of a pressure measuring system and a mathematical model to govern it. The costs of the system have to be lower than most of the systems in the market.
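One of the basic step-characterising variables such a system computes is the centre of pressure over the sensor grid. The sketch below shows the computation for a synthetic pressure patch; the grid size, sensor pitch and pressure values are illustrative assumptions, not the prototype's specification.

```python
import numpy as np

def centre_of_pressure(p, dx=0.01, dy=0.01):
    """Total vertical force and centre of pressure from a pressure-sensor grid.
    p is an (ny, nx) array of pressures in Pa; dx, dy are sensor pitches in m."""
    ny, nx = p.shape
    xs = (np.arange(nx) + 0.5) * dx           # sensor-cell centre coordinates
    ys = (np.arange(ny) + 0.5) * dy
    area = dx * dy                            # area of one sensor cell
    F = p.sum() * area                        # total vertical force, N
    cop_x = (p.sum(axis=0) * xs).sum() * area / F   # pressure-weighted mean x
    cop_y = (p.sum(axis=1) * ys).sum() * area / F   # pressure-weighted mean y
    return F, (cop_x, cop_y)

p = np.zeros((10, 10))
p[2:5, 6:9] = 50e3                            # synthetic 'heel strike' patch of 50 kPa
F, cop = centre_of_pressure(p)
```

Tracking `cop` over successive samples traces the path of the step, which is what the shoe and insole corrections are designed from.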
Resumo:
Power generation from alternative sources is at present the subject of extensive research and development in science and industry. Wind energy stands out in this scenario as one of the most prominent alternatives for the generation of electricity, owing to its numerous advantages. In research work, computer simulation and experimental emulation of wind turbine behaviour are very suitable tools for the development and study of new technologies and for exploiting the wind potential of a given region. These tools are generally expected to include simulation of the mechanical and electrical parameters that directly affect the energy conversion. This work presents the energy conversion process in wind systems for power generation, and develops an experimental wind turbine emulation tool using LabVIEW® software. The purpose of this tool is to emulate the torque developed at the shaft of a wind turbine. The physical setup consists of a three-phase induction motor and a permanent magnet synchronous generator, which are evaluated under different wind speed conditions. The tool is designed to be flexible to other laboratory arrangements, and can be used in other wind power generation structures in real time. A model of the wind power system is presented, from the turbine to the electrical generator. A simulation tool is developed using Matlab/Simulink® to pre-validate the experimental setup. Finally, the design is implemented in a laboratory setup.
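The torque reference driving such an emulator follows from the standard conversion relations P = ½ρA·Cp(λ)·v³ and T = P/ω. The sketch below uses a widely used analytic approximation of the power coefficient curve (pitch angle zero); its coefficients, the 2 m rotor radius and the operating point are illustrative assumptions, not this work's measured turbine data.

```python
import numpy as np

def cp(lam):
    """Power coefficient versus tip-speed ratio: a common analytic
    approximation of a Cp curve, evaluated at zero pitch angle."""
    inv_li = 1.0 / lam - 0.035
    return 0.5176 * (116.0 * inv_li - 5.0) * np.exp(-21.0 * inv_li) + 0.0068 * lam

def turbine_torque(v_wind, omega, R=2.0, rho=1.225):
    """Torque reference for the motor drive emulating the turbine shaft:
    P = 0.5 * rho * A * Cp(lambda) * v^3, T = P / omega."""
    lam = omega * R / v_wind                  # tip-speed ratio
    A = np.pi * R ** 2                        # rotor swept area
    P = 0.5 * rho * A * cp(lam) * v_wind ** 3
    return P / omega

T = turbine_torque(v_wind=8.0, omega=30.0)   # 8 m/s wind, 30 rad/s shaft speed
```

In the emulator, this torque is recomputed in real time from the wind-speed profile and the measured shaft speed, then applied through the induction motor drive.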
Resumo:
In this preliminary study, eighteen p-substituted benzoic acid [(5-nitro-thiophen-2-yl)-methylene]-hydrazides with antimicrobial activity were evaluated against multidrug-resistant Staphylococcus aureus, correlating the three-dimensional characteristics of the ligands with their respective bioactivities. The computer programs Sybyl and CORINA were used, respectively, for the design and three-dimensional conversion of the ligands. Molecular interaction fields were calculated using the GRID program. Calculations using Volsurf resulted in a statistically consistent model with 48 structural descriptors, showing that hydrophobicity is a fundamental property in the analyzed biological response.
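The descriptor-to-activity modelling step can be sketched in miniature. Volsurf models are built with partial least squares on GRID-derived descriptors; the stand-in below fits ordinary least squares to synthetic descriptors purely for illustration, with the descriptor count, weights and noise level all invented.

```python
import numpy as np

# Sketch of correlating structural descriptors with bioactivity for a small
# compound series. Everything here is synthetic: 18 'compounds' echoing the
# study size, 5 toy descriptors, and a known weight vector to recover.
rng = np.random.default_rng(1)
n_compounds, n_descriptors = 18, 5
X = rng.normal(size=(n_compounds, n_descriptors))       # descriptor matrix
true_w = np.array([1.2, 0.0, -0.8, 0.0, 0.4])           # e.g. hydrophobicity dominates
y = X @ true_w + 0.05 * rng.normal(size=n_compounds)    # activity (pMIC-like) with noise

Xc = np.hstack([X, np.ones((n_compounds, 1))])          # append an intercept column
w, *_ = np.linalg.lstsq(Xc, y, rcond=None)              # least-squares fit
r2 = 1 - ((y - Xc @ w) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```

With 48 correlated descriptors and 18 compounds, plain least squares would be underdetermined, which is precisely why Volsurf uses PLS to project the descriptors onto a few latent variables first.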
Impact of Commercial Search Engines and International Databases on Engineering Teaching and Research
Resumo:
For the last three decades, the engineering higher education and professional environments have been completely transformed by the "electronic/digital information revolution" that has included the introduction of the personal computer, the development of email and the world wide web, and broadband Internet connections at home. Herein the writer compares the performance of several digital tools with traditional library resources. While new specialised search engines and open-access digital repositories may fill a gap between conventional search engines and traditional references, they should not be confused with real libraries and international scientific databases that encompass textbooks and peer-reviewed scholarly works. Absence from some Internet search listings, databases and repositories is not an indication of standing. Researchers, engineers and academics should remember these key differences when assessing the quality of bibliographic "research" based solely upon Internet searches.