812 results for KM technology-centred approach


Relevance:

40.00%

Publisher:

Abstract:

The Electronic Patient Record (EPR) is being developed by many hospitals in the UK and across the globe. We class an EPR system as a type of Knowledge Management System (KMS), in that it is a technological tool developed to support the process of knowledge management (KM). Healthcare organisations aim to use these systems as a vehicle for more informed and improved clinical decision making, thereby reducing errors and risks, enhancing quality and consequently improving patient safety. Finding an effective way for a healthcare organisation to implement these systems in practice is essential. In this study we use the concept of the business process approach to KM as a theoretical lens to analyse and explore how a large NHS teaching hospital developed, executed and practically implemented an EPR system. This theory advocates taking into account all organisational activities - the business processes - when considering any KM initiative. Approaching KM through business processes allows for a more holistic view of the requirements across a process: emphasis is placed on how particular activities are performed, how they are structured, and what knowledge is demanded, not just supplied, across each process. This falls in line with the increased emphasis in healthcare on patient-centred approaches to care delivery. We have found in previous research that hospitals are happy for the delivery of patient care to be referred to as their 'business'. A qualitative study was conducted over a two-and-a-half-year period, with data collected from semi-structured interviews with eight members of the strategic management team, 12 clinical users and 20 patients, in addition to non-participant observation of meetings and documentary data. We believe that the inclusion of patients within the study may well be the first time this has been done in examining the implementation of a KMS. The theoretical propositions strategy was used as the overarching approach for data analysis: initial theoretical research themes and propositions were used to help shape and organise the case study analysis. This paper presents preliminary findings about the hospital's business strategy and its links to the KMS strategy and process.

Relevance:

40.00%

Publisher:

Abstract:

Contemporary web-based software solutions are usually composed of many interoperating applications. The classical approach is for the different applications of a solution to be created within a single technology/platform, e.g. Java technology, .NET technology, etc. Widespread technologies/platforms practically discourage (and sometimes deliberately make impossible) cooperation with elements of competing technologies/platforms. To make it possible to use attractive features of one technology/platform in another, some “cross-technology” approach is necessary. The paper discusses the possibility of combining two existing instruments – interoperability protocols and the “lifting” of procedures – in order to obtain such a cross-technology approach.
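
As a generic illustration only (an assumption, not the mechanism proposed in the paper), the core idea behind an interoperability protocol is that a procedure implemented in one technology can be exposed through a language-neutral wire format and invoked from clients built on any other platform. The sketch below uses JSON over HTTP with Python's standard library; the procedure name and port are hypothetical.

```python
# Generic illustration (not the paper's mechanism): expose a procedure through
# a language-neutral protocol (JSON over HTTP) so that clients written in any
# other technology/platform can invoke it.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def compute_sum(values):
    """The 'procedure' made available across the technology boundary."""
    return sum(values)

class RPCHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        request = json.loads(self.rfile.read(length))      # e.g. {"values": [1, 2, 3]}
        body = json.dumps({"result": compute_sum(request["values"])}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Any Java, .NET or other client can now POST JSON to http://localhost:8000/
    HTTPServer(("localhost", 8000), RPCHandler).serve_forever()
```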

Relevance:

40.00%

Publisher:

Abstract:

Technology-Enhanced Learning in Higher Education is an anthology produced by the international association Learning in Higher Education (LiHE). LiHE, whose scope includes the activities of colleges, universities and other institutions of higher education, has been one of the leading organisations supporting a shift in the education process from a transmission-based philosophy to a student-centred, learning-based approach. Traditionally, education has been envisaged as a process in which the teacher disseminates knowledge and information to students and directs them to perform – instructing, cajoling and encouraging them as appropriate – regardless of individual students’ abilities. Yet higher education is currently experiencing rapid transformation, with the introduction of a broad range of technologies that have the potential to enhance student learning. This anthology draws upon the experiences of practitioners who have been pioneering new applications of technology in higher education, highlighting not only the technologies themselves but also the impact they have had on student learning. The anthology illustrates how new technologies – increasingly well known and accepted by today’s ‘digital natives’ undertaking higher education – can be adopted and incorporated. One key conclusion is that learning remains a social process even in technology-enhanced learning contexts, so the technology-based proxies we construct need to retain and reflect the agency of the teacher. Technology-Enhanced Learning in Higher Education showcases some of the latest pedagogical technologies and their most creative, state-of-the-art applications to learning in higher education from around the world. Each chapter explores technology-enhanced learning in higher education in terms of either policy or practice, containing detailed descriptions of approaches taken in very different curriculum areas and demonstrating clearly that technology can enhance learning only if it is designed with the learning process of students at its core. The use of technology in education is thus more closely linked to pedagogy than to bits and bytes.

Relevance:

40.00%

Publisher:

Abstract:

Economic policy-making has long been more integrated than social policy-making, in part because the statistics and much of the analysis that support economic policy are based on a common conceptual framework – the system of national accounts. People interested in economic analysis and economic policy share a common language of communication, one that includes both concepts and numbers. This paper examines early attempts to develop a system of social statistics that would mirror the system of national accounts, particularly the work on the development of social accounts that took place mainly in the 1960s and 1970s. It explores the reasons why these early initiatives failed, but argues that the preconditions now exist to develop a new conceptual framework to support integrated social statistics – and hence a more coherent, effective social policy. Optimism is warranted for two reasons. First, we can make use of the radical transformation that has taken place in information technology, both in processing data and in providing wide access to the knowledge that can flow from the data. Second, the conditions exist to begin to shift away from the straitjacket of government-centric social statistics, with its implicit assumption that governments must be the primary actors in finding solutions to social problems. By supporting the decision-making of all the players (particularly individual citizens) who affect social trends and outcomes, we can start to move beyond the sterile, ideological discussions that have dominated much social discourse in the past and begin to build social systems and structures that evolve, almost automatically, based on empirical evidence of ‘what works best for whom’. The paper describes a Canadian approach to developing a framework, or common language, to support the evolution of an integrated, citizen-centric system of social statistics and social analysis. This language supports the traditional social policy that we have today; nothing is lost. However, it also supports a quite different social policy world, one where individual citizens and families (not governments) are seen as the central players – a more empirically driven world that we have referred to as the ‘enabling society’.

Relevance:

40.00%

Publisher:

Abstract:

Given recent demands for more co-creational university technology commercialisation processes involving industry and end users, this paper adopts a micro-level approach to explore the challenges faced by universities when managing quadruple helix stakeholders within the technology commercialisation process. To address this research question, a qualitative methodology relying upon comparative case analysis was adopted to examine the technology commercialisation process in two universities within a UK region. The findings revealed that university type affects quadruple helix stakeholder salience and engagement, and consequently university technology commercialisation activities and processes. This is important because recent European regional policy fails to account for contextual influences when promoting quadruple helix stakeholder relationships in co-creational university technology commercialisation.

Relevance:

30.00%

Publisher:

Abstract:

We develop a combined hydro-kinetic approach which incorporates a hydrodynamical expansion of the systems formed in A + A collisions and their dynamical decoupling described by escape probabilities. The method corresponds to a generalized relaxation time (τ_rel) approximation for the Boltzmann equation applied to inhomogeneous expanding systems; at small τ_rel it also allows one to capture viscous effects in the hadronic component (hadron-resonance gas). We demonstrate how the approximation of sudden freeze-out can be obtained within this dynamical picture of continuous emission, and find that the hypersurfaces corresponding to a sharp freeze-out limit are momentum dependent. The pion m_T spectra are computed in the developed hydro-kinetic model and compared with those obtained from ideal hydrodynamics with the Cooper-Frye isothermal prescription. Our results indicate that there is no universal freeze-out temperature for pions with different momenta, and they support an earlier decoupling of higher-p_T particles. By performing numerical simulations for various initial conditions and equations of state, we identify several characteristic features of the bulk QCD matter evolution preferred in view of the current analysis of heavy ion collisions at RHIC energies.
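
For reference, the standard relaxation-time (Anderson-Witting type) approximation of the Boltzmann equation and the Cooper-Frye prescription mentioned above take the forms below; the paper's generalized version for inhomogeneous expanding systems may differ in detail.

```latex
% Standard relaxation-time approximation of the Boltzmann equation
p^{\mu}\,\partial_{\mu} f(x,p)
  = -\,\frac{p^{\mu} u_{\mu}(x)}{\tau_{\mathrm{rel}}}
    \left[\, f(x,p) - f_{\mathrm{eq}}(x,p) \,\right]

% Cooper-Frye prescription for the spectrum emitted from a hypersurface \sigma
E\,\frac{dN}{d^{3}p} = \int_{\sigma} f(x,p)\, p^{\mu}\, d\sigma_{\mu}
```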

Relevance:

30.00%

Publisher:

Abstract:

The aim of this paper was to study a method based on the gas production technique to measure the biological effects of tannins on rumen fermentation. Six feeds were used as fermentation substrates in a semi-automated gas method: feed A – aroeira (Astronium urundeuva); feed B – jurema preta (Mimosa hostilis); feed C – sorghum grains (Sorghum bicolor); feed D – Tifton-85 (Cynodon sp.); and two others prepared by mixing 450 g of sorghum leaves, 450 g of concentrate (maize and soybean meal) and 100 g of either acacia (Acacia mearnsii) tannin extract (feed E) or quebracho (Schinopsis lorentzii) tannin extract (feed F) per kg (w/w). Three assays were carried out to standardize the bioassay for tannins. The first assay compared two binding agents (polyethylene glycol – PEG – and polyvinyl polypyrrolidone – PVPP) for attenuating the tannin effects; the complex formed by PEG and tannins proved to be more stable than that formed by PVPP and tannins. In the second assay, PEG was therefore used as the binding agent, and levels of PEG (0, 500, 750, 1000 and 1250 mg/g DM) were evaluated to minimize the tannin effect. All the tested levels of PEG produced a measurable response, but the best response was obtained with the dose of 1000 mg/g DM. Using this dose of PEG, the final assay tested three compounds (tannic acid, quebracho extract and acacia extract) to establish a curve of biologically equivalent tannin effects. For this, five levels of each compound were added to 1 g of a standard feed (lucerne hay). The equivalent effect was shown not to be directly related to the chemical analysis for tannins, indicating that different sources of tannins have different activities or reactivities. The curves of biological equivalence can provide information about tannin reactivity, and their use seems to be important as an additional factor alongside chemical analysis. (C) 2007 Elsevier B.V. All rights reserved.
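
A hedged sketch of how such an equivalence curve might be derived (all values are hypothetical; the paper's actual readings and fitted model are not reproduced): regress the depression in gas production against the dose of a reference tannin, then express a test extract's observed effect as an equivalent dose of that reference.

```python
# Hypothetical sketch: fit a linear dose-response of gas-production depression
# (%) against tannic acid dose, then express a test extract's observed
# depression as a "tannic acid equivalent" dose. All numbers are illustrative.
import numpy as np

tannic_acid_dose_mg = np.array([0.0, 50.0, 100.0, 150.0, 200.0])   # per g of standard feed
gas_depression_pct = np.array([0.0, 6.0, 13.0, 21.0, 27.0])        # hypothetical readings

slope, intercept = np.polyfit(tannic_acid_dose_mg, gas_depression_pct, 1)

observed_depression = 17.0   # hypothetical depression caused by a quebracho extract dose
equivalent_dose = (observed_depression - intercept) / slope
print(f"tannic acid equivalent: {equivalent_dose:.1f} mg/g of feed")
```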

Relevance:

30.00%

Publisher:

Abstract:

Electrodeposition of a thin copper layer was carried out on titanium wires in an acidic sulphate bath. The influence of titanium surface preparation, cathodic current density, copper sulphate and sulphuric acid concentrations, electrical charge density and stirring of the solution on the adhesion of the electrodeposits was studied using the Taguchi statistical method. An L16 orthogonal array with six control factors at two levels each and three interactions was employed. The analysis of variance of the mean adhesion response and of the signal-to-noise ratio showed the strong influence of cathodic current density on adhesion. In contrast, the other factors, as well as the three investigated interactions, showed little or no significant effect. From this study, optimized electrolysis conditions were defined. The copper electrocoating improved the electrical conductivity of the titanium wire, showing that copper-electrocoated titanium wires could be employed for both electrical purposes and mechanical reinforcement in superconducting magnets. (C) 2008 Elsevier B.V. All rights reserved.
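
As a minimal sketch of the kind of Taguchi analysis described (the design column and adhesion scores below are hypothetical, not the paper's data), one factor of an L16 array is evaluated by its mean response and its larger-is-better signal-to-noise ratio.

```python
# Minimal sketch (hypothetical data): Taguchi analysis of one factor of an L16
# design, computing the mean adhesion response and the larger-is-better
# signal-to-noise ratio for each level.
import numpy as np

# One column of the orthogonal array (levels 1/2) for "cathodic current density".
current_density_level = np.array([1, 1, 1, 1, 1, 1, 1, 1,
                                  2, 2, 2, 2, 2, 2, 2, 2])
adhesion = np.array([  # illustrative adhesion scores per run (3 replicates each)
    [62, 65, 60], [58, 61, 59], [64, 63, 66], [60, 62, 61],
    [63, 60, 64], [59, 58, 62], [65, 66, 63], [61, 60, 62],
    [80, 82, 79], [77, 80, 78], [83, 81, 84], [79, 78, 81],
    [82, 80, 83], [78, 77, 80], [84, 83, 85], [80, 81, 79],
], dtype=float)

def sn_larger_is_better(y):
    """Taguchi larger-is-better S/N ratio: -10*log10(mean(1/y^2))."""
    return -10.0 * np.log10(np.mean(1.0 / y**2))

for level in (1, 2):
    runs = adhesion[current_density_level == level]
    sn = np.mean([sn_larger_is_better(r) for r in runs])
    print(f"current density level {level}: mean adhesion {runs.mean():.1f}, S/N {sn:.2f} dB")
```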

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this paper is to propose a multiobjective optimization approach for solving the manufacturing cell formation problem, explicitly considering the performance of the resulting manufacturing system. Cells are formed so as to simultaneously minimize three conflicting objectives, namely the level of work-in-process, the number of intercell moves and the total machinery investment. A genetic algorithm performs a search in the design space in order to approximate the Pareto-optimal set. The objective values of each candidate solution in the population are assigned by running a discrete-event simulation, in which the model is automatically generated according to the number of machines and their distribution among cells implied by that solution. The potential of this approach is evaluated through its application to an illustrative example and to a case from the relevant literature. The results are analyzed, and it is concluded that the approach is capable of generating a set of alternative manufacturing cell configurations that optimize multiple performance measures, greatly improving the decision-making process involved in planning and designing cellular systems. (C) 2010 Elsevier Ltd. All rights reserved.
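
The sketch below illustrates the general structure of such a multiobjective genetic algorithm, not the paper's implementation: machine-to-cell assignments are evolved and the non-dominated (Pareto) front is extracted. The discrete-event simulation is replaced by a placeholder objective function, and all parameter values are assumptions.

```python
# Minimal sketch (not the paper's implementation): a multiobjective genetic
# algorithm that assigns machines to cells and approximates the Pareto set.
import random

N_MACHINES, N_CELLS, POP, GENS = 12, 3, 40, 50

def evaluate(assignment):
    """Stand-in for the discrete-event simulation: returns the three
    objectives (work-in-process, intercell moves, machinery investment)."""
    wip = sum(assignment) / len(assignment)            # placeholder objective 1
    moves = len(set(assignment))                       # placeholder objective 2
    invest = sum(1 for c in assignment if c == 0)      # placeholder objective 3
    return (wip, moves, invest)

def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(scored):
    """Keep the solutions not dominated by any other solution."""
    return [(ind, obj) for i, (ind, obj) in enumerate(scored)
            if not any(dominates(o2, obj) for j, (_, o2) in enumerate(scored) if j != i)]

population = [[random.randrange(N_CELLS) for _ in range(N_MACHINES)] for _ in range(POP)]
for _ in range(GENS):
    scored = [(ind, evaluate(ind)) for ind in population]
    parents = [ind for ind, _ in non_dominated(scored)]
    children = []
    while len(children) < POP:
        p1, p2 = random.sample(parents, 2) if len(parents) > 1 else (parents[0], parents[0])
        cut = random.randrange(1, N_MACHINES)          # one-point crossover
        child = p1[:cut] + p2[cut:]
        if random.random() < 0.1:                      # mutation: reassign one machine
            child[random.randrange(N_MACHINES)] = random.randrange(N_CELLS)
        children.append(child)
    population = children

final = [(ind, evaluate(ind)) for ind in population]
print("Approximate Pareto front:", [obj for _, obj in non_dominated(final)])
```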

Relevance:

30.00%

Publisher:

Abstract:

An accurate estimate of machining time is very important for predicting delivery times and manufacturing costs, and also to help production process planning. Most commercial CAM software systems estimate the machining time in milling operations simply by dividing the entire tool path length by the programmed feed rate. This estimate differs drastically from the real process time because the feed rate is not always constant, due to machine and computer numerical control (CNC) limitations. This study presents a practical mechanistic method for milling time estimation when machining free-form geometries. The method considers a variable called machine response time (MRT), which characterizes the real CNC machine's capacity to move at high feed rates in free-form geometries. MRT is a global performance feature which can be obtained for any type of CNC machine configuration by carrying out a simple test. To validate the methodology, a workpiece was used to generate NC programs for five different types of CNC machines, and a practical industrial case study was also carried out. The results indicated that MRT, and consequently the real machining time, depends on the CNC machine's capabilities; furthermore, the greater the MRT, the larger the difference between the predicted and the real milling time. The proposed method achieved an error range from 0.3% to 12% of the real machining time, whereas the CAM estimation produced errors from 211% to 1244%. The MRT-based procedure is also suggested as an instrument to help in machine tool benchmarking.
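
To make the gap concrete, the sketch below contrasts the naive CAM estimate (path length divided by programmed feed rate) with a hypothetical correction that charges a response-time penalty per tool-path segment. This simple per-segment model is an assumption for illustration only, not the paper's MRT method; all numbers are invented.

```python
# Hypothetical model (not the paper's method): compare the naive CAM time
# estimate with an estimate that adds a machine response time (MRT) penalty
# per tool-path segment, reflecting that the CNC cannot reach the programmed
# feed rate instantly on short free-form segments.
segment_lengths_mm = [0.8, 1.2, 0.5, 2.0, 0.9, 1.5]   # illustrative tool path
feed_rate_mm_per_min = 3000.0                          # programmed feed rate
mrt_s = 0.04                                           # assumed response time per segment

naive_time_s = sum(segment_lengths_mm) / feed_rate_mm_per_min * 60.0
adjusted_time_s = naive_time_s + mrt_s * len(segment_lengths_mm)

print(f"naive CAM estimate:    {naive_time_s:.3f} s")
print(f"MRT-adjusted estimate: {adjusted_time_s:.3f} s")
```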

Relevance:

30.00%

Publisher:

Abstract:

This work presents a statistical study of the variability of the mechanical properties of hardened self-compacting concrete, including the compressive strength, splitting tensile strength and modulus of elasticity. Comparison of the experimental results with those derived from several codes and recommendations allows evaluating whether the hardened behaviour of self-compacting concrete can be appropriately predicted by the existing formulations. The variables analyzed include the maximum aggregate size and the paste and gravel contents. The self-compacting concretes analyzed presented variability measures in the same range as expected for conventional vibrated concrete, with all results within a 95% confidence level. Among the formulations for conventional concrete considered in this study, it was observed that a safe estimate of the modulus of elasticity can be obtained from the compressive strength, with lower-strength self-compacting concretes presenting higher safety margins. However, most codes overestimate the material's tensile strength. (C) 2010 Elsevier Ltd. All rights reserved.
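
As an illustration of the kind of code-based prediction being compared against (these two formulations are common examples and may not be the specific codes evaluated in the paper), the sketch below estimates the modulus of elasticity and tensile strength from compressive strength.

```python
# Illustrative only: estimate modulus of elasticity and tensile strength from
# compressive strength using two well-known code formulas; not necessarily the
# formulations examined in the paper.
def ec_aci318(fc_mpa):
    """ACI 318 modulus of elasticity for normal-weight concrete, in MPa."""
    return 4700.0 * fc_mpa ** 0.5

def ecm_ec2(fcm_mpa):
    """Eurocode 2 mean modulus of elasticity, converted from GPa to MPa."""
    return 22.0 * (fcm_mpa / 10.0) ** 0.3 * 1000.0

def fctm_ec2(fck_mpa):
    """Eurocode 2 mean tensile strength, in MPa (valid for fck <= 50 MPa)."""
    return 0.30 * fck_mpa ** (2.0 / 3.0)

for fck in (25.0, 35.0, 45.0):          # illustrative characteristic strengths (MPa)
    fcm = fck + 8.0                      # EC2 mean strength from characteristic strength
    print(f"fck = {fck:.0f} MPa: E(ACI) = {ec_aci318(fck)/1000:.1f} GPa, "
          f"E(EC2) = {ecm_ec2(fcm)/1000:.1f} GPa, fctm(EC2) = {fctm_ec2(fck):.2f} MPa")
```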

Relevance:

30.00%

Publisher:

Abstract:

A broader characterization of industrial wastewaters, especially with respect to hazardous compounds and their potential toxicity, is often necessary in order to determine the best practical treatment (or pretreatment) technology available to reduce the discharge of harmful pollutants to the environment or to publicly owned treatment works. Using a toxicity-directed approach, this paper sets the basis for a rational treatability study of polyester resin manufacturing wastewater. Relevant physical and chemical characteristics were determined, and respirometry was used for toxicity reduction evaluation after physical and chemical effluent fractionation. Of all the procedures investigated, only air stripping was significantly effective in reducing wastewater toxicity: air stripping at pH 7 reduced toxicity by 18.2%, while at pH 11 a toxicity reduction of 62.5% was observed. The results indicated that the toxicants responsible for the most significant fraction of the effluent's instantaneous toxic effect on unadapted activated sludge were organic compounds that are poorly volatilized, or not volatilized at all, under acidic conditions. These results provide useful directions for conducting treatability studies grounded in actual effluent properties rather than on empirical assumptions or on the scarce specific data available for this kind of industrial wastewater. (C) 2008 Elsevier B.V. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

The increasing adoption of information systems in healthcare has led to a scenario where patient information security is more and more regarded as a critical issue. Allowing patient information to be placed in jeopardy may lead to irreparable physical, moral and social damage to the patient, potentially shaking the credibility of the healthcare institution. Medical images play a crucial role in this context, given their importance in diagnosis, treatment and research. It is therefore vital to take measures to prevent tampering with medical images and to determine their provenance, which demands the adoption of security mechanisms to assure information integrity and authenticity. A number of works in this field are based on two major approaches: the use of metadata and the use of watermarking. However, both approaches still have limitations that must be properly addressed. This paper presents a new method that uses cryptographic means to improve the trustworthiness of medical images, providing a stronger link between the image and the information on its integrity and authenticity without compromising image quality for the end user. The use of Digital Imaging and Communications in Medicine (DICOM) structures is also an advantage for ease of development and deployment.
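
A generic sketch of one way to bind integrity and authenticity information to image pixel data is given below; it uses a keyed hash (HMAC-SHA256) from Python's standard library and is not the paper's specific scheme, which would more plausibly rely on asymmetric digital signatures stored in DICOM structures. The key handling and pixel data here are placeholders.

```python
# Generic sketch (not the paper's specific scheme): bind integrity and
# authenticity information to medical image pixel data with a keyed hash.
import hashlib, hmac, os

def protect(pixel_data: bytes, key: bytes) -> dict:
    """Return integrity/authenticity metadata for the given pixel data."""
    digest = hashlib.sha256(pixel_data).hexdigest()              # integrity
    tag = hmac.new(key, pixel_data, hashlib.sha256).hexdigest()  # authenticity
    return {"sha256": digest, "hmac": tag}

def verify(pixel_data: bytes, key: bytes, meta: dict) -> bool:
    """Check that the image still matches its stored integrity/authenticity data."""
    expected = hmac.new(key, pixel_data, hashlib.sha256).hexdigest()
    return (hashlib.sha256(pixel_data).hexdigest() == meta["sha256"]
            and hmac.compare_digest(expected, meta["hmac"]))

key = os.urandom(32)                # shared secret (illustrative key management)
image = b"\x00\x01\x02\x03"         # placeholder for DICOM pixel data
meta = protect(image, key)
print("intact image verified:", verify(image, key, meta))
print("tampered image verified:", verify(image + b"\xff", key, meta))
```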

Relevance:

30.00%

Publisher:

Abstract:

The application of airborne laser scanning (ALS) technologies in forest inventories has shown great potential to improve the efficiency of forest planning activities. Precise estimates, fast assessment and relatively low complexity explain the good results in terms of efficiency. The evolution of GPS and inertial measurement technologies, as well as the lower assessment costs observed when these technologies are applied to large-scale studies, explain the increasing dissemination of ALS technologies. The good quality of the results can be expressed by estimates of volume and basal area with estimated errors below 8.4%, depending on the size of the sampled area, the number of laser pulses per square metre and the number of control plots. This paper analyzes the potential of an ALS assessment to produce certain forest inventory statistics in plantations of cloned Eucalyptus spp. with precision equal or superior to conventional methods. The statistics of interest were: volume, basal area, mean height and mean height of dominant trees. The ALS flight covered two strips of approximately 2 by 20 km, in which clouds of points were sampled in circular plots with a radius of 13 m. Plots were sampled in different parts of the strips to cover different stand ages. From the point clouds generated by the ALS assessment, the following statistics were calculated: overall mean height, standard error, five percentiles (the heights below which 10%, 30%, 50%, 70% and 90% of the ALS points above ground level are found) and the density of points above ground level in each percentile. The ALS statistics were then used in regression models to estimate mean diameter, mean height, mean height of dominant trees, basal area and volume, with conventional forest inventory sample plots providing the reference field data. For volume, an exploratory assessment involving different combinations of ALS statistics allowed the definition of the most promising relationships and of fitting tests based on well-known forest biometric models. The models based on ALS statistics that produced the best results involved: the 30th percentile to estimate mean diameter (R² = 0.88 and MQE% = 0.0004); the 10th and 90th percentiles to estimate mean height (R² = 0.94 and MQE% = 0.0003); the 90th percentile to estimate dominant height (R² = 0.96 and MQE% = 0.0003); the 10th percentile and the mean height of ALS points to estimate basal area (R² = 0.92 and MQE% = 0.0016); and age together with the 30th and 90th percentiles to estimate volume (R² = 0.95 and MQE% = 0.002). Among the tested forest biometric models, the best fits were provided by the modified Schumacher model using age and the 90th percentile, the modified Clutter model using age, the mean height of ALS points and the 70th percentile, and the modified Buckman model using age, the mean height of ALS points and the 10th percentile.
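
A minimal sketch of the underlying workflow (hypothetical point heights and plot volumes; the paper's actual models, such as the modified Schumacher, are not reproduced): compute height percentiles of the ALS returns per plot and regress field-measured volume on those percentiles.

```python
# Minimal sketch (hypothetical data): derive height percentiles from ALS point
# clouds for a few plots and fit a simple least-squares regression of plot
# volume on those percentiles.
import numpy as np

rng = np.random.default_rng(0)

def als_percentiles(heights_above_ground, probs=(10, 30, 50, 70, 90)):
    """Height percentiles of ALS returns above ground level for one plot."""
    return np.percentile(heights_above_ground, probs)

# Hypothetical plots: ALS return heights (m) and field-measured volume (m3/ha).
plots = [rng.normal(loc=mu, scale=2.0, size=500).clip(min=0.0) for mu in (12, 16, 20, 24, 28)]
volumes = np.array([110.0, 160.0, 220.0, 290.0, 360.0])

X = np.array([als_percentiles(p) for p in plots])      # predictors: p10..p90
X = np.column_stack([np.ones(len(X)), X])              # add intercept column
coef, *_ = np.linalg.lstsq(X, volumes, rcond=None)     # least-squares fit
pred = X @ coef
print("fitted coefficients:", np.round(coef, 2))
print("predicted volumes (m3/ha):", np.round(pred, 1))
```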