834 results for Design and Technology, Professional Development, Curriculum Implementation
Abstract:
This paper investigates how government policy directions embracing deregulation and market liberalism, together with significant pre-existing tensions within the Australian medical profession, produced ground-breaking change in the funding and delivery of medical education for general practitioners. From an initial shared view, between and within the medical profession and government, about the goal of improving the standards of general practice education and training, segments of the general practice community, particularly those located in rural and remote settings, voiced increasingly strong concerns about the approach and solutions proffered by the predominantly urban-influenced Royal Australian College of General Practitioners (RACGP). The extent of dissatisfaction culminated in the establishment of the Australian College of Rural and Remote Medicine (ACRRM) in 1997 and the development of an alternative curriculum for general practice. This paper focuses on two decades of changes in general practice training and how competition policy acted as a justificatory mechanism for putting general practice education out to competitive tender against a background of significant intra-professional conflict. The government's interest in increasing efficiency and deregulating the 'closed shop' practices of the professions, as expressed through national competition policy, ultimately exposed the existing antagonisms within the profession to public view and allowed the government some influence over the sacred cow of professional training. Government policy has acted as a mechanism of resolution for long-standing grievances of rural GPs and propelled professional training towards an open competition model. The findings have implications for future research looking at the unanticipated outcomes of competition and internal markets.
Abstract:
Email has been used for some years as a low-cost telemedicine medium to provide support for developing countries. However, all operations have been relatively small in scale and fairly labour-intensive to administer. A scalable, automatic message-routing system was constructed which automates many of these tasks. During a four-month study period in 2002, 485 messages were processed automatically. There were 31 referrals from eight hospitals in three countries, handled by 25 volunteer specialists from a panel of 42. Two system operators, located 10 time zones apart, managed the system. The median time from receipt of a new referral to its allocation to a specialist was 1.0 days (interquartile range 0.7-2.4), and the median interval between allocation and first reply was 0.7 days (interquartile range 0.3-2.3). Automatic message handling solves many of the problems of manual email telemedicine systems and represents a potentially scalable way of delivering low-cost telemedicine in the developing world.
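The abstract does not describe the routing software itself; purely to illustrate the allocation step such a system automates, the sketch below (in Python, with invented names such as Referral, Specialist and allocate) assigns an incoming referral to the least-recently-used volunteer covering the requested specialty, falling back to a human operator when no one matches.

```python
# Minimal sketch (not the actual system) of the core allocation step in an
# automatic email telemedicine router. All names here are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class Specialist:
    email: str
    specialties: set[str]
    last_assigned: datetime = field(default_factory=lambda: datetime.min)

@dataclass
class Referral:
    referrer_email: str
    specialty: str
    body: str
    received: datetime = field(default_factory=datetime.utcnow)

def allocate(referral: Referral, panel: list[Specialist]) -> Optional[Specialist]:
    """Pick the matching specialist who has waited longest since their last case."""
    candidates = [s for s in panel if referral.specialty in s.specialties]
    if not candidates:
        return None  # escalate to a human system operator
    chosen = min(candidates, key=lambda s: s.last_assigned)
    chosen.last_assigned = datetime.utcnow()
    return chosen
```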
Abstract:
The purpose of this research is to propose a procurement system that works across disciplines and shares retrieved information with the relevant parties, so as to achieve better co-ordination between the supply and demand sides. This paper demonstrates how to analyze data with an agent-based procurement system (APS) to re-engineer and improve the existing procurement process. The intelligent agents take responsibility for searching for potential suppliers, negotiating with the short-listed suppliers, and evaluating supplier performance against the selection criteria using a mathematical model. Manufacturing firms and trading companies spend more than half of their sales dollar on the purchase of raw materials and components. Efficient, accurate data collection is one of the key success factors in achieving quality procurement, that is, purchasing the right material at the right quality from the right suppliers. In general, enterprises spend a significant amount of resources on data collection and storage, but too little on facilitating data analysis and sharing. To validate the feasibility of the approach, a case study on a manufacturing small and medium-sized enterprise (SME) has been conducted. APS supports data and information analysis techniques to facilitate decision making, so that the agents can improve the efficiency of negotiation and supplier evaluation by saving time and cost.
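As a rough illustration of the kind of weighted-criteria model an agent could apply when evaluating short-listed suppliers, the sketch below ranks suppliers by a normalised weighted score; the criteria, weights and scores are invented for demonstration and are not drawn from the case study.

```python
# Illustrative weighted-score supplier ranking; criteria, weights and data are
# invented for the example and do not come from the paper's case study.
def rank_suppliers(suppliers, weights):
    """suppliers: {name: {criterion: score in [0, 1]}}, weights: {criterion: weight}."""
    total_w = sum(weights.values())
    ranked = []
    for name, scores in suppliers.items():
        composite = sum(weights[c] * scores.get(c, 0.0) for c in weights) / total_w
        ranked.append((name, round(composite, 3)))
    return sorted(ranked, key=lambda x: x[1], reverse=True)

example = rank_suppliers(
    {"Supplier A": {"price": 0.8, "quality": 0.9, "lead_time": 0.6},
     "Supplier B": {"price": 0.9, "quality": 0.7, "lead_time": 0.8}},
    weights={"price": 0.4, "quality": 0.4, "lead_time": 0.2},
)
```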
Abstract:
Product design and sourcing decisions are among the most difficult and important of all decisions facing multinational manufacturing companies, yet the associated decision support and evaluation systems tend to be myopic in nature. Design for manufacture and assembly techniques, for example, generally focus on manufacturing capability and ignore capacity, although both should be considered. Similarly, most modelling and evaluation tools available to examine the performance of various solution and improvement techniques have a narrower scope than desired. A unique collaboration between researchers in the USA and the UK, funded by the US National Science Foundation, currently addresses these problems. This paper describes a technique known as Design For the Existing Environment (DFEE) and a holistic evaluation system based on enterprise simulation that was used to demonstrate the business benefits of DFEE in a simple product development and manufacturing case study. A project that will extend these techniques to evaluate global product sourcing strategies is described, along with the practical difficulties of building an enterprise simulation on the scale and at the level of detail required.
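The paper's enterprise simulation is far richer than anything shown here, but a deliberately simple Monte-Carlo sketch conveys the underlying idea of evaluating a design choice against capacity rather than capability alone; all processing times, station counts and the simulate_throughput function are invented for illustration.

```python
# Toy capacity evaluation: compare two hypothetical design variants by the
# throughput they achieve on a capacity-constrained line. Figures are invented.
import random

def simulate_throughput(mean_assembly_min, stations, shift_min=480, runs=200):
    """Average units completed per shift for a serial line with `stations` stations."""
    totals = []
    for _ in range(runs):
        completed, clock = 0, 0.0
        while clock < shift_min:
            # Bottleneck approximation: each cycle takes as long as the slowest station's draw.
            clock += max(random.expovariate(1.0 / mean_assembly_min) for _ in range(stations))
            completed += 1
        totals.append(completed)
    return sum(totals) / runs

design_a = simulate_throughput(mean_assembly_min=6.0, stations=3)   # easier to assemble
design_b = simulate_throughput(mean_assembly_min=7.5, stations=3)   # cheaper parts, slower build
```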
Abstract:
A new surface analysis technique has been developed which has a number of benefits compared with conventional Low Energy Ion Scattering Spectrometry (LEISS). A major potential advantage, arising from the absence of charge-exchange complications, is the possibility of quantification. The instrumentation that has been developed also offers the possibility of unique studies of the interaction between low-energy ions and atoms and solid surfaces. From these studies it may also be possible, in principle, to generate sensitivity factors with which to quantify LEISS data. The instrumentation, referred to as a Time-of-Flight Fast Atom Scattering Spectrometer (ToFFASS), was developed to investigate these conjectures in practice. The development involved a number of modifications to an existing instrument; it allowed samples to be bombarded with a monoenergetic pulsed beam of either atoms or ions, and provided the capability to analyse the spectra of scattered atoms and ions separately. In addition, a system was designed and constructed to allow the incident, exit and azimuthal angles of the particle beam to be varied independently. The key development was that of a pulsed, mass-filtered atom source, which was developed through a cyclic process of design, modelling and experimentation. Although it was possible to demonstrate the unique capabilities of the instrument, problems relating to surface contamination prevented the measurement of the neutralisation probabilities. However, these problems appear to be technical rather than scientific in nature, and could be readily resolved given the appropriate resources. Experimental spectra obtained from a number of samples demonstrate some fundamental differences between the scattered ion and neutral spectra. For practical, non-ordered surfaces the ToF spectra are more complex than their LEISS counterparts. This is particularly true for helium scattering, where it appears, in the absence of detailed computer simulation, that quantitative analysis is limited to ordered surfaces. Despite this limitation, the ToFFASS instrument opens the way for quantitative analysis of the 'true' surface region for a wider range of surface materials.
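For readers unfamiliar with the measurement principle, the sketch below shows the textbook kinematics a time-of-flight scattering analysis rests on: converting a flight time over a known path length to particle energy, and the single binary-collision kinematic factor relating scattered to incident energy. The numerical example (2 keV He on Cu at 135°) is illustrative and not taken from the thesis.

```python
# Basic ToF scattering kinematics (textbook relations, not instrument code).
import math

AMU = 1.660_539e-27   # kg per atomic mass unit
EV = 1.602_177e-19    # J per electronvolt

def energy_from_tof(mass_amu, flight_time_s, path_length_m):
    """Kinetic energy (eV) of a scattered atom or ion from its time of flight."""
    v = path_length_m / flight_time_s
    return 0.5 * mass_amu * AMU * v**2 / EV

def kinematic_factor(m_projectile_amu, m_target_amu, scattering_angle_deg):
    """E1/E0 for elastic binary scattering through the given angle (target heavier than projectile)."""
    a = m_target_amu / m_projectile_amu
    th = math.radians(scattering_angle_deg)
    root = math.sqrt(a**2 - math.sin(th)**2)
    return ((math.cos(th) + root) / (1.0 + a))**2

# e.g. 2 keV He scattered through 135 degrees from Cu (illustrative numbers):
e1 = 2000.0 * kinematic_factor(4.0, 63.5, 135.0)
```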
Abstract:
Advances in both computer technology and the mathematical models capable of capturing the geometry of arbitrarily shaped objects have led to the development in this thesis of a surface generation package called 'IBSCURF', aimed at providing a more economically viable solution to free-form surface manufacture. A suite of computer programs written in FORTRAN 77 has been developed to provide computer aids for every aspect of the work of designing and machining free-form surfaces. A vector-valued parametric method was used for shape description, and a lofting technique was employed for the construction of the surface. The development of the 'IBSCURF' package consists of two phases. The first deals with CAD. The design process commences with the definition of the cross-sections, which are represented by uniform B-spline curves as approximations to given polygons. The order of the curve and the position and number of the polygon vertices can be used as parameters for modification to achieve the required curves. When the definition of the sectional curves is complete, the surface is interpolated over them by cubic cardinal splines. To use the CAD functions of the package to design a mould for a plastic handle, a mathematical model was developed. To facilitate the integration of design and machining using the mathematical representation of the surface, the second phase of the package is concerned with CAM: it enables the generation of tool offset positions for ball-nosed cutters, and a general post-processor has been developed which automatically generates NC tape programs for any CNC milling machine. The two phases have been successfully implemented as a CAD/CAM package for free-form surfaces on the VAX 11/750 super-minicomputer, with graphics facilities for displaying drawings interactively on the terminal screen. The development of this package has been beneficial in all aspects of the design and machining of free-form surfaces.
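To make the curve-definition step concrete, the sketch below evaluates a uniform cubic B-spline from a control polygon, so that moving or adding polygon vertices reshapes the section curve in the way the abstract describes. This is a generic textbook formulation written in Python, not code from the IBSCURF package, and the sample polygon is invented.

```python
# Uniform cubic B-spline evaluation from a control polygon (textbook basis functions).
def cubic_bspline_point(p0, p1, p2, p3, t):
    """Point on the uniform cubic B-spline segment defined by four control vertices, t in [0, 1]."""
    b0 = (1 - t)**3 / 6.0
    b1 = (3*t**3 - 6*t**2 + 4) / 6.0
    b2 = (-3*t**3 + 3*t**2 + 3*t + 1) / 6.0
    b3 = t**3 / 6.0
    return tuple(b0*a + b1*b + b2*c + b3*d for a, b, c, d in zip(p0, p1, p2, p3))

def sample_bspline(polygon, samples_per_segment=10):
    """Sample the whole curve over consecutive vertex quadruples of the control polygon."""
    pts = []
    for i in range(len(polygon) - 3):
        for k in range(samples_per_segment):
            pts.append(cubic_bspline_point(*polygon[i:i+4], k / samples_per_segment))
    return pts

section = sample_bspline([(0, 0), (1, 2), (3, 3), (5, 1), (6, 0)])  # invented control polygon
```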
Abstract:
The recent explosive growth in advanced manufacturing technology (AMT) and the continued development of sophisticated information technologies (IT) are expected to have a profound effect on the way we design and operate manufacturing businesses. Furthermore, the escalating capital requirements associated with these developments have significantly increased the level of risk associated with initial design, ongoing development and operation. This dissertation has examined the integration of two key sub-elements of the Computer Integrated Manufacturing (CIM) system, namely the manufacturing facility and the production control system. The research has concentrated on the interactions between production control (MRP) and an AMT-based production facility. The disappointing performance of such systems is discussed in the context of a number of potential technological and performance incompatibilities between these two elements. It is argued that the design and selection of operating policies for both is the key to successful integration, and policy decisions are shown to play an important role in matching the performance of the total system to the demands of the marketplace. It is demonstrated that a holistic approach to policy design must be adopted if successful integration is to be achieved, and that the complexity of the issues resulting from such an approach requires the formulation of a structured design methodology. Such a methodology was subsequently developed and discussed; it combines a first-principles approach to the behaviour of system elements with the specification of a detailed holistic model for use in the policy design environment. The methodology aimed to make full use of the 'low inertia' characteristics of AMT, whilst adopting a JIT configuration of MRP and re-coupling the total system to market demands. The dissertation discusses the application of the methodology to an industrial case study and the subsequent design of operational policies. A novel approach to production control resulted, a central feature of which was a move towards reduced manual intervention in the MRP processing and scheduling logic, with increased human involvement and motivation in the management of work-flow on the shopfloor. Experimental results indicated that significant performance advantages would result from the adoption of the recommended policy set.
Abstract:
Computerised production control developments have concentrated on Manufacturing Resources Planning (MRP II) systems. The literature suggests, however, that despite the massive investment in hardware, software and management education, successful implementation of such systems in manufacturing industries has proved difficult. This thesis reviews the development of production planning and control systems and, in particular, investigates the causes of failure in implementing MRP/MRP II systems in industrial environments, arguing that the centralised and top-down planning structure, as well as the routine operational methodology of such systems, is inherently prone to failure. The thesis reviews the control benefits of cellular manufacturing systems but concludes that in more dynamic manufacturing environments, techniques such as Kanban are inappropriate. The basic shortcomings of MRP II systems are highlighted and a new enhanced operational methodology based on distributed planning and control principles is introduced. Distributed Manufacturing Resources Planning (DMRP) was developed as a capacity-sensitive production planning and control solution for cellular manufacturing environments. The system utilises cell-based, independently operated MRP II systems, integrated into a plant-wide control system through a Local Area Network. The potential benefits of adopting the system in industrial environments are discussed, and the results of computer simulation experiments comparing the performance of the DMRP system against conventional MRP II systems are presented. The DMRP methodology is shown to offer significant potential advantages, which include ease of implementation, cost effectiveness, capacity sensitivity, shorter manufacturing lead times, lower work-in-progress levels and improved customer service.
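As an illustration only (not the DMRP implementation), the sketch below captures the capacity-sensitive idea behind distributed planning: each cell accepts offered work up to its own capacity and feeds the shortfall back to the plant-wide planner rather than being silently overloaded. The function names, the largest-job-first rule and all figures are assumptions made for the example.

```python
# Capacity-sensitive load distribution across manufacturing cells (illustrative only).
def distribute_load(planned_orders, cell_capacity_hours):
    """planned_orders: {cell: [order_hours, ...]}; returns (accepted, deferred) per cell."""
    accepted, deferred = {}, {}
    for cell, orders in planned_orders.items():
        remaining = cell_capacity_hours[cell]
        accepted[cell], deferred[cell] = [], []
        for hours in sorted(orders, reverse=True):   # largest jobs first (illustrative rule)
            if hours <= remaining:
                accepted[cell].append(hours)
                remaining -= hours
            else:
                deferred[cell].append(hours)         # fed back to the plant-wide planner
    return accepted, deferred

acc, defer = distribute_load(
    {"cell_A": [8, 5, 5, 3], "cell_B": [10, 6, 2]},
    cell_capacity_hours={"cell_A": 16, "cell_B": 12},
)
```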
Abstract:
Communication and portability are the two main problems facing the user. An operating system, called PORTOS, was developed to solve these problems for users of dedicated microcomputer systems. Firstly, an interface language was defined according to the anticipated requirements and behaviour of its potential users. Secondly, the PORTOS operating system was developed as a processor for this language. The system is currently running on two minicomputers of markedly different architectures. PORTOS achieves its portability through its high-level design and its implementation in CORAL66. The interface language consists of a set of user commands and system responses. Although only a subset has been implemented, owing to time and manpower constraints, promising results were achieved regarding the usability of the language and its portability.
Abstract:
INTAMAP is a Web Processing Service for the automatic spatial interpolation of measured point data. The requirements were (i) to use open standards for spatial data, such as those developed in the context of the Open Geospatial Consortium (OGC); (ii) to use a suitable environment for statistical modelling and computation; and (iii) to produce an integrated, open-source solution. The system couples an open-source Web Processing Service (developed by 52°North), which accepts data in the form of standardised XML documents (conforming to the OGC Observations and Measurements standard), with a computing back-end realised in the R statistical environment. The probability distribution of interpolation errors is encoded with UncertML, a markup language designed to encode uncertain data. Automatic interpolation needs to be useful for a wide range of applications, and the algorithms have been designed to cope with anisotropy, extreme values, and data with known error distributions. Besides a fully automatic mode, the system can be used with different levels of user control over the interpolation process.
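INTAMAP's back-end runs in R and selects geostatistical models automatically; the Python snippet below is only a language-neutral illustration of the core task being automated, namely predicting a value at an unsampled location from scattered measurements, here with plain inverse distance weighting rather than INTAMAP's own algorithms. The sample points and the idw_predict name are invented.

```python
# Inverse-distance-weighted interpolation of scattered point measurements
# (a stand-in for the geostatistical interpolation INTAMAP automates).
import math

def idw_predict(points, target, power=2.0):
    """points: [(x, y, value), ...]; returns the IDW estimate at target=(x, y)."""
    num = den = 0.0
    for x, y, v in points:
        d = math.hypot(x - target[0], y - target[1])
        if d == 0.0:
            return v                      # target coincides with a measurement
        w = 1.0 / d**power
        num += w * v
        den += w
    return num / den

estimate = idw_predict([(0, 0, 12.1), (1, 0, 14.3), (0, 1, 11.7), (2, 2, 15.0)], (0.5, 0.5))
```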
Abstract:
In recent years, mobile technology has been one of the major growth areas in computing. Designing the user interface for mobile applications, however, is a very complex undertaking which is made even more challenging by the rapid technological developments in mobile hardware. Mobile human-computer interaction, unlike desktop-based interaction, must be cognizant of a variety of complex contextual factors affecting both users and technology. The Handbook of Research on User Interface Design and Evaluation provides students, researchers, educators, and practitioners with a compendium of research on the key issues surrounding the design and evaluation of mobile user interfaces, such as the physical environment and social context in which a mobile device is being used and the impact of multitasking behavior typically exhibited by mobile-device users. Compiling the expertise of over 150 leading experts from 26 countries, this exemplary reference tool will make an indispensable addition to every library collection.
Abstract:
Cationic liposomes have been extensively explored for their efficacy in delivering nucleic acids, offering the ability to protect plasmid DNA against degradation, promote gene expression and, in the case of DNA vaccines, induce both humoural and cellular immune responses. DNA vaccines may also offer advantages in terms of safety, but they are less effective and need an adjuvant to enhance their immunogenicity. Therefore, cationic liposomes can be utilised as delivery systems and/or adjuvants for DNA vaccines to stimulate stronger immune responses. To explore the role of liposomal systems in plasmid DNA delivery, parameters such as the effect of lipid composition, the method of liposome preparation and the presence of electrolytes in the formulation were investigated in characterisation studies, in vitro transfection studies, and in vivo biodistribution and immunisation studies. Liposomes composed of 1,2-dioleoyl-sn-glycero-3-phosphoethanolamine (DOPE) in combination with 1,2-dioleoyl-3-trimethylammonium-propane (DOTAP) or 1,2-stearoyl-3-trimethylammonium-propane (DSTAP) were prepared by the lipid hydration method and hydrated in aqueous media with or without the presence of electrolytes. Whilst the in vitro transfection efficiency of all liposomes proved higher than that of Lipofectin, DSTAP-based liposomes showed significantly higher transfection efficiency than DOTAP-based formulations. Furthermore, upon intramuscular injection of liposomal DNA vaccines, DSTAP-based liposomes showed a significantly stronger depot effect at the injection site. This could explain the result of the heterologous immunisation studies, which revealed that DSTAP-based liposomal vaccines induce stronger immune responses than DOTAP-based formulations. Previous studies have shown that retaining more liposomally associated antigen at the injection site leads to greater drainage of antigen into the local lymph nodes; consequently, more antigen is presented to the antigen-presenting cells circulating in the lymph nodes, initiating a stronger immune response. Finally, in a comparative study, liposomes composed of dimethyldioctadecylammonium bromide (DDA) in combination with DOPE or the immunostimulatory molecule trehalose 6,6'-dibehenate (TDB) were prepared and investigated in vitro and in vivo. Results showed that although DDA:TDB is not able to transfect cells efficiently in vitro, this formulation induces stronger immunity than DDA:DOPE owing to the immunostimulatory effects of TDB. This study demonstrated that, while the presence of electrolytes did not improve immune responses, small unilamellar vesicle (SUV) liposomes induced stronger humoural immune responses than dehydration-rehydration vesicle (DRV) liposomes. Moreover, lipid composition was shown to play a key role in the in vitro and in vivo behaviour of the formulations, as saturated cationic lipids provided stronger immune responses than unsaturated lipids. Finally, heterologous prime/boost immunisation promoted significantly stronger immune responses than homologous vaccination with DNA vaccines; however, a single immunisation with the subunit vaccine provoked levels of immune response comparable to the heterologous regimen, suggesting greater immunisation efficiency for subunit vaccines compared with DNA vaccines.