934 results for Computer Knowledge Bank on Medical Diagnostics


Relevance: 100.00%

Abstract:

The aim of this research is to investigate how risk management in a healthcare organisation can be supported by knowledge management. The subject of the research is the development and management of existing logs called "risk registers", through specific risk management processes employed in an N.H.S. (Foundation) Trust in England, in the U.K. Existing literature on organisational risk management stresses the importance of knowledge for the effective implementation of risk management programmes, arguing that the knowledge used to perceive risk is biased by the beliefs of the individuals and groups involved in risk management and is therefore incomplete. Further, the literature on organisational knowledge management presents several definitions and categorisations of knowledge, and approaches for manipulating knowledge in the organisational context as a whole. However, there is no specific approach regarding how to deal with knowledge in the course of organisational risk management. The research is based on a single case study of an N.H.S. (Foundation) Trust and is informed by principles of interpretivism and the frame of mind of Soft Systems Methodology (S.S.M.), investigating the management of risk registers from the viewpoint of the people involved in the situation. The data revealed that knowledge about risks and about the existing risk management policy and procedures is situated in several locations in the Trust and is neither consolidated nor present where and when required. This study proposes a framework that identifies the knowledge required for each of the risk management processes and outlines methods for converting this knowledge, based on the SECI knowledge conversion model, together with activities to facilitate that conversion, so that knowledge is used effectively for the development of risk registers and the monitoring of risks throughout the Trust under study. The study contributes to the management science literature by addressing the issue of incomplete knowledge raised in the risk management literature using concepts from the knowledge management literature, such as the knowledge conversion model. In essence, combining the required risk and risk-management-related knowledge with the required type of communication for risk management yields the proposed methods for supporting each risk management process for the risk registers. Further, the demonstration of the importance of knowledge in risk management, and the presentation of a framework that consolidates the knowledge required for the risk management processes and proposes ways of communicating this knowledge within a healthcare organisation, have practical implications for the management of healthcare organisations.
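
To make the shape of such a framework concrete, the sketch below pairs some hypothetical risk management processes with the knowledge each requires and a SECI conversion mode. It is a minimal illustration only: the process names, knowledge descriptions and pairings are assumptions, not the thesis's actual framework.

```python
# Hypothetical sketch: pairing risk-management processes with required
# knowledge and a SECI conversion mode. All entries are illustrative
# assumptions, not the framework proposed in the thesis.
from dataclasses import dataclass

@dataclass
class ProcessKnowledge:
    process: str             # a risk management process feeding the risk register
    required_knowledge: str  # the knowledge that process depends on
    seci_mode: str           # socialisation / externalisation / combination / internalisation

framework = [
    ProcessKnowledge("risk identification",
                     "front-line staff experience of incidents", "externalisation"),
    ProcessKnowledge("risk assessment",
                     "documented policy and scoring criteria", "internalisation"),
    ProcessKnowledge("risk register update",
                     "register entries scattered across departments", "combination"),
    ProcessKnowledge("risk monitoring",
                     "practice shared between review meetings", "socialisation"),
]

for item in framework:
    print(f"{item.process}: needs {item.required_knowledge} ({item.seci_mode})")
```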

Relevance: 100.00%

Abstract:

This thesis documents the design, implementation and testing of a smart sensing platform that is able to discriminate between differences or small changes in a person's walking. The distributive tactile sensing method is used to monitor the deflection of the platform surface using just a small number of sensors and, through the use of neural networks, infer the characteristics of the object in contact with the surface. The thesis first describes the development of a mathematical model which uses a novel method to track the position of a moving load as it passes over the smart sensing surface. Experimental methods are then described for using the platform to track the position of a swinging pendulum in three dimensions, and it is demonstrated that the method extends to real-time measurement of the balance and sway of a person during quiet standing. Current classification methods are then investigated for use in classifying different gait patterns, in particular to identify individuals by their unique gait pattern. Based on these observations, a novel algorithm is developed that is able to discriminate between normal and affected gait. This algorithm, using the distributive tactile sensing method, was found to be more accurate than the other methods investigated and was designed to cope with any type of gait variation. The system developed in this thesis has applications in the area of medical diagnostics, either as an initial screening tool for detecting walking disorders or as a means of automatically detecting changes in gait over time. The system could also be used as a discreet biometric identification method, for example identifying office workers as they pass over the surface.
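
A rough sketch of the distributive tactile sensing idea follows: a handful of deflection sensors feed a small neural network that infers the gait class. The sensor count, feature layout, synthetic data and network size are all invented stand-ins for illustration, not the configuration used in the thesis.

```python
# Illustrative sketch: classify gait from a few surface-deflection sensors
# with a small neural network, in the spirit of distributive tactile sensing.
# Data, sensor count and network size are assumptions, not the thesis's setup.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
N_SENSORS, N_STEPS = 4, 200  # a few sensors; simulated footsteps per class

# Toy data: each row holds the peak deflection at each sensor for one footstep.
# Class 0 = "normal" gait; class 1 = "affected" gait (a shifted load pattern).
x_normal = rng.normal([1.0, 0.8, 0.6, 0.4], 0.1, size=(N_STEPS, N_SENSORS))
x_affected = rng.normal([0.6, 0.9, 0.9, 0.6], 0.1, size=(N_STEPS, N_SENSORS))
X = np.vstack([x_normal, x_affected])
y = np.array([0] * N_STEPS + [1] * N_STEPS)

# A small feed-forward network infers the gait class from the deflection pattern.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```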

Relevance: 100.00%

Abstract:

The last major study of the sales performance variance explained by salesperson attributes was by Churchill et al. (1985). They examined the effects of role, skills, motivation, personal factors, aptitude, and organizational/environmental factors on sales performance, factors that have dominated the sales performance area. At about the same time, Weitz, Sujan, and Sujan (1986) introduced the concept of salespeople's knowledge structures. Considerable work on the relationship between the elements of knowledge structures and performance can be found in the literature. In this research note, we determine the degree to which sales performance can be explained by knowledge structure variables, a heretofore unexplored question. If knowledge structure variables explain more variance than traditional variables, then this paper serves as a call for further research in this area. Examining this research question in a retail context, we find that knowledge structure variables explain 50.2 percent of the variance in sales performance. We also find that the variance explained by knowledge structures differs significantly by gender: the impact of knowledge structures on performance was higher for men than for women. Models using education demonstrated smaller differences.
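
For readers unfamiliar with the statistic, "variance explained" here corresponds to the R-squared of a model predicting performance from the attribute variables. The sketch below shows that computation on synthetic data; the variables, coefficients and sample are made up and do not reproduce the study.

```python
# Illustrative sketch: "variance explained" as the R-squared of a regression
# of sales performance on knowledge-structure variables. Synthetic data only.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 120  # hypothetical number of salespeople

# Hypothetical knowledge-structure scores (e.g. customer and product schemas).
X = rng.normal(size=(n, 3))
# Synthetic performance: part signal, part noise.
performance = X @ np.array([0.8, 0.5, 0.3]) + rng.normal(scale=1.0, size=n)

model = LinearRegression().fit(X, performance)
print(f"variance explained (R^2): {model.score(X, performance):.3f}")
```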

Relevance: 100.00%

Abstract:

Multi-agent systems are complex systems composed of multiple intelligent agents that act either independently or in cooperation with one another. Agent-based modelling is a method for studying complex systems such as economies, societies and ecologies. Because of their complexity, mathematical analysis is often limited in its ability to analyse such systems; in these cases, agent-based modelling offers a practical, constructive method of analysis. The objective of this book is to shed light on some emergent properties of multi-agent systems. The authors focus their investigation on the effect of knowledge exchange on the convergence of complex multi-agent systems.
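
As a minimal illustration of the theme, the sketch below implements a simple pairwise-averaging ("gossip") model of knowledge exchange; the update rule is an assumption chosen for brevity, not the specific mechanism studied in the book. Repeated exchange drives the agents' knowledge values toward a common consensus, a basic example of convergence.

```python
# Illustrative sketch: knowledge exchange driving convergence in a
# multi-agent system. The pairwise-averaging rule is an assumption
# (a gossip/DeGroot-style model), not the book's specific mechanism.
import numpy as np

rng = np.random.default_rng(42)
N_AGENTS = 50
knowledge = rng.uniform(0.0, 1.0, N_AGENTS)  # initially divergent "beliefs"

for step in range(2001):
    i, j = rng.choice(N_AGENTS, size=2, replace=False)
    # Exchange: both agents move to the midpoint of their knowledge values.
    knowledge[i] = knowledge[j] = (knowledge[i] + knowledge[j]) / 2.0
    if step % 500 == 0:
        spread = knowledge.max() - knowledge.min()
        print(f"step {step:4d}: spread = {spread:.4f}")  # shrinks toward 0
```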

Relevance: 100.00%

Abstract:

The rationale for carrying out this research was to address the clear lack of knowledge surrounding the measurement of public hospital performance in Ireland. The objectives were to develop a comprehensive model for measuring hospital performance and to use this model to measure the performance of public acute hospitals in Ireland in 2007. After the advantages and disadvantages of various measurement models had been assessed, Data Envelopment Analysis (DEA) was chosen for this research. DEA was initiated by Charnes, Cooper and Rhodes in 1978 and further developed by Fare et al. (1983) and Banker et al. (1984). The method used to choose the relevant inputs and outputs for the model followed that adopted by Casu et al. (2005), which included the use of focus groups. The main conclusions of the research are threefold. Firstly, it is clear that each stakeholder group has differing opinions on what constitutes good performance; it is therefore imperative that any performance measurement model be designed within parameters that are clearly understood by its intended audience. Secondly, there is a lack of publicly available qualitative information in Ireland, which inhibits detailed analysis of hospital performance. Thirdly, based on the available qualitative and quantitative data, the results indicated a high level of efficiency among the public acute hospitals in Ireland in their staffing and non-pay costs, averaging 98.5%. As DEA scores are sensitive to the number of input and output variables as well as to the sample size, it should be borne in mind that a high level of efficiency could result from using DEA with too many variables relative to the number of hospitals. No hospital was deemed scale efficient in any of the models, even though the average scale efficiency across all hospitals was relatively high at 90.3%. Arising from this research, the main recommendations are that information on medical outcomes, survival rates and patient satisfaction should be made publicly available in Ireland; that, despite the high average efficiency level, many individual hospitals need to focus on improving their technical and scale efficiencies; and that performance measurement models should be developed that include more qualitative data.
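
For orientation, the sketch below solves the input-oriented CCR envelopment programme of Charnes, Cooper and Rhodes (1978) for a made-up set of hospitals, using SciPy's linear programming routine; the input/output choice and the figures are illustrative assumptions, not the data of this research.

```python
# Illustrative sketch: input-oriented CCR DEA as a linear programme. For each
# hospital, find the smallest factor theta by which its inputs can be scaled
# down while a convex combination of peers still matches its outputs.
# The hospitals, inputs and outputs below are made up.
import numpy as np
from scipy.optimize import linprog

# Rows = hospitals; inputs (staffing, non-pay costs) and outputs (cases treated).
inputs = np.array([[120.0, 40.0], [150.0, 55.0], [90.0, 35.0], [200.0, 80.0]])
outputs = np.array([[500.0], [520.0], [430.0], [780.0]])
n, m = inputs.shape
s = outputs.shape[1]

def ccr_efficiency(k):
    """Efficiency of unit k; decision variables are [theta, lambda_1..lambda_n]."""
    c = np.zeros(1 + n)
    c[0] = 1.0  # minimise theta
    A_ub, b_ub = [], []
    for i in range(m):   # sum_j lambda_j * x_ij <= theta * x_ik
        A_ub.append(np.concatenate(([-inputs[k, i]], inputs[:, i])))
        b_ub.append(0.0)
    for r in range(s):   # sum_j lambda_j * y_rj >= y_rk
        A_ub.append(np.concatenate(([0.0], -outputs[:, r])))
        b_ub.append(-outputs[k, r])
    result = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub))
    return result.fun    # theta: 1.0 means efficient, < 1.0 means inefficient

for k in range(n):
    print(f"hospital {k}: efficiency = {ccr_efficiency(k):.3f}")
```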

Relevance: 100.00%

Abstract:

Introduction: There is a growing public perception that serious medical error is commonplace and largely tolerated by the medical profession. The Government and medical establishment's response to this perceived epidemic of error has included tighter controls over practising doctors and individual stick-and-carrot reforms of medical practice. Discussion: This paper critically reviews the literature on medical error, professional socialization and medical student education, and suggests that common themes such as uncertainty, necessary fallibility, exclusivity of professional judgement and extensive use of medical networks find their genesis, in part, in aspects of medical education and socialization into medicine. The nature and comparative failure of recent reforms of medical practice and the tension between the individualistic nature of the reforms and the collegiate nature of the medical profession are discussed. Conclusion: A more theoretically informed and longitudinal approach to decreasing medical error might be to address the genesis of medical thinking about error through reforms to the aspects of medical education and professional socialization that help to create and perpetuate the existence of avoidable error, and reinforce medical collusion concerning error. Further changes in the curriculum to emphasize team working, communication skills, evidence-based practice and strategies for managing uncertainty are therefore potentially key components in helping tomorrow's doctors to discuss, cope with and commit fewer medical errors.

Relevance: 100.00%

Abstract:

With the development of Internet culture, applications are becoming simpler and simpler, and users need less IT knowledge than before; from the status of 'reader' they have advanced to that of content creator and editor. Nowadays the effects of the web are growing ever stronger, and computer-aided work is conventional almost everywhere. The spread of Internet applications has several reasons: first of all, they are widely accessible; second, their use is not limited to the one computer or network on which they have been installed. Moreover, the quantity of information accessible now is not even comparable with what was available earlier. Setting aside applications that need high bandwidth or high computing capacity (for example, video editing), Internet applications are approaching the functionality of their thick-client counterparts. The most serious disadvantage of Internet applications is that, for security reasons, the resources of the client computer are not fully accessible, or are accessible only to a restricted extent. Thick clients still have some advantages, however: better multimedia performance, more flexibility thanks to local resources, and the possibility of working offline.

Relevance: 100.00%

Abstract:

Congenital nystagmus (CN) is an ocular-motor disorder characterised by involuntary, conjugated ocular oscillations, and its pathogenesis is still under investigation. This kind of nystagmus is termed congenital (or infantile) since it may be present at birth or arise in the first months of life. Most CN patients show a considerable decrease in visual acuity: image fixation on the retina is disturbed by the continuous, mainly horizontal, oscillations of the nystagmus. However, the image of a given target can still be stable during short periods in which eye velocity slows down while the target image lies on the fovea (called foveation intervals). To quantify the extent of nystagmus, eye movement recordings are routinely employed, allowing physicians to extract and analyse its main features, such as waveform shape, amplitude and frequency. Using eye movement recordings, it is also possible to compute estimated visual acuity predictors: analytical functions that estimate expected visual acuity from signal features such as foveation time and foveation position variability. Use of these functions extends the information obtained from typical visual acuity measurements (e.g. the Landolt C test) and could support therapy planning or monitoring. This study focuses on detecting the waveform type of CN patients and on measuring foveation time. Specifically, it proposes a robust method to recognise the cycles corresponding to the specific CN waveform in the eye movement pattern and, for those cycles, to evaluate the exact signal segments in which a subject foveates. About 40 eye movement recordings, either infrared-oculographic or electro-oculographic, were acquired from 16 CN subjects. The results suggest that applying an adaptive threshold to the eye velocity signal can improve the estimation of the slow-phase start point. This can enhance the computation of foveation time and reduce the influence of repositioning saccades and data noise on waveform type identification.
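
The sketch below illustrates the adaptive-threshold idea on a toy pendular waveform: samples are counted as foveating when the absolute eye velocity stays below a threshold recomputed from the local velocity distribution. The sampling rate, window length, threshold rule and signal are assumptions for illustration and do not reproduce the study's algorithm.

```python
# Illustrative sketch: foveation detection via an adaptive velocity threshold.
# Signal, window and threshold rule are assumptions, not the study's method.
import numpy as np

FS = 500.0                      # sampling rate in Hz (assumed)
t = np.arange(0, 2.0, 1.0 / FS)
position = np.sin(2 * np.pi * 4.0 * t)      # toy pendular nystagmus, in degrees
velocity = np.gradient(position, 1.0 / FS)  # eye velocity in deg/s

# Adaptive threshold: a fraction of a robust local velocity scale, recomputed
# over short windows so it follows changes in nystagmus intensity.
WINDOW = int(0.25 * FS)
foveating = np.zeros(len(velocity), dtype=bool)
for start in range(0, len(velocity), WINDOW):
    seg = slice(start, start + WINDOW)
    threshold = 0.2 * np.percentile(np.abs(velocity[seg]), 90)
    foveating[seg] = np.abs(velocity[seg]) < threshold

# Foveation happens near the oscillation peaks, where eye velocity is lowest.
print(f"estimated foveation time: {foveating.sum() / FS:.3f} s of {t[-1]:.1f} s")
```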

Relevance: 100.00%

Abstract:

An effective mathematical method is developed for obtaining new knowledge about the structure of complex objects with required properties. The method comprehensively takes into account information on the properties and relations of the primary objects that compose the complex objects. It is based on measuring distances between groups of predicates under some interpretation. An optimal measure for these distances, maximising the discernibility of different groups of predicates, is constructed. The method is tested on the problem of obtaining a new compound with electro-optical properties.
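
Because the abstract does not define its optimal measure, the sketch below uses an ordinary Jaccard distance between sets of predicates purely to illustrate what a "distance between predicate groups" can mean; both the measure and the example predicates are hypothetical stand-ins.

```python
# Illustrative sketch only: a Jaccard distance between predicate groups.
# The paper constructs its own optimal measure, which is not reproduced here.
def predicate_group_distance(group_a: set, group_b: set) -> float:
    """Jaccard distance: 0.0 for identical groups, 1.0 for disjoint groups."""
    if not group_a and not group_b:
        return 0.0
    return 1.0 - len(group_a & group_b) / len(group_a | group_b)

# Hypothetical predicates describing two candidate compounds.
compound_1 = {"aromatic(x)", "donor(x, e)", "planar(x)"}
compound_2 = {"aromatic(x)", "acceptor(x, e)"}
print(predicate_group_distance(compound_1, compound_2))  # 0.75
```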

Relevance: 100.00%

Abstract:

The work was supported by the RFBR under Grants N07-01-00331a and 08-07-00136a.

Relevance: 100.00%

Abstract:

This article takes stock of the current state of research on knowledge processes in virtual teams (VTs) and consolidates the extant research findings. Virtual teams, on the one hand, constitute important organisational entities that facilitate the integration of diverse and distributed knowledge resources; on the other hand, collaborating in a virtual environment creates particular challenges for knowledge processes. The article seeks to consolidate the diverse evidence on knowledge processes in VTs, with a specific focus on identifying the factors that influence the effectiveness of these processes. It draws on the four basic knowledge processes outlined by Alavi and Leidner (2001) (i.e. creation, transfer, storage/retrieval and application) to frame the investigation and discuss the extant research. The consolidation of existing research findings allows us to recognise the gaps in the understanding of knowledge processes in VTs and to identify important avenues for future research.

Relevance: 100.00%

Abstract:

The subject of dropout prevention/reduction is deservedly receiving attention as a problem that, if not resolved, could threaten our national future. This study investigates a small segment of the overall dropout problem, one with apparently unique features of program design and population selection. The evidence presented here should add to the knowledge bank on this complicated problem. Project Trio was one of a number of dropout prevention programs and activities conducted in Dade County in the 1984-85 and 1985-86 school years, and it is investigated here longitudinally through the end of the 1987-88 school year. It involved 17 junior and senior high schools and 27 programs, 10 in the first year and 17 in the second, with over 1,000 students in total. The students had been selected by the schools from a list of "at risk" students provided by the district and were divided approximately evenly, following the classical research design, into an experimental group and a control group, the latter taking the regular school curriculum in accordance with standard procedure. No school had more than 25 students in either group. Each school modified the basic design of the project to accommodate its individual characteristics and the perceived needs of its students; however, all school projects were to include some form of academic enhancement, counseling and career awareness study. The conclusion of this study was that the control group had a significantly lower dropout rate than the experimental group. Though it is impossible to determine with certainty the reasons for this unexpected result, the evidence presented suggests that one cause may have been inadequate administration at the local level. This study was also a longitudinal investigation of the "at risk" population as a whole over the three- and four-year period, to determine whether academic factors present in student records may be used to identify dropout proneness. A significant correlation was found between dropping out and various measures, including scores on the Quality of School Life Instrument, attendance, grade point averages, mathematics grades, and being overage for grade, all of which are important identifiers for selection into dropout prevention programs.

Relevance: 100.00%

Abstract:

Many culturally and linguistically diverse (CLD) students with specific learning disabilities (SLD) struggle with the writing process. In particular, they have difficulty developing and expanding ideas, organizing and elaborating sentences, and revising and editing their compositions (Graham, Harris, & Larsen, 2001; Myles, 2002). Computer graphic organizers offer a possible solution to assist them in their writing. This study investigated the effects of a computer graphic organizer on the persuasive writing compositions of Hispanic middle school students with SLD. A multiple baseline design across subjects was used to examine its effects on six dependent variables: number of arguments and supporting details, number and percentage of transferred arguments and supporting details, planning time, writing fluency, syntactic maturity (measured in T-units, the shortest grammatical sentences without fragments), and overall organization. Data were collected and analyzed throughout baseline and intervention. Participants were taught persuasive writing and the writing process prior to baseline. During baseline, participants were given a prompt and asked to use paper and pencil to plan their compositions; a computer was used for typing and editing. The intervention required participants to use a computer graphic organizer for planning and then a computer for typing and editing. The planning sheets and written compositions were printed and analyzed daily, along with the time each participant spent on planning. The use of the computer graphic organizer had a positive effect on planning and on the persuasive writing compositions. Increases were noted in the number of supporting details planned, the percentage of supporting details transferred, planning time, writing fluency, syntactic maturity in number of T-units, and the overall organization of the compositions. Minimal to negligible increases were noted in the mean number of arguments planned and written. Varying effects were noted in the percentage of transferred arguments, and mean T-unit length decreased. This study extends the limited literature on the effects of computer graphic organizers as a prewriting strategy for Hispanic students with SLD. To fully gauge the potential of this intervention, future research should investigate different features of computer graphic organizer programs, their effects with other writing genres, and their use with different populations.

Relevance: 100.00%

Abstract:

Modern System-on-a-Chip (SoC) systems have grown rapidly in processing power while the size of the hardware circuit has been maintained. The number of transistors on a chip continues to increase, but current SoC designs may not be able to exploit the potential performance, especially as energy consumption and chip area have become two major concerns. Traditional SoC designs usually separate software and hardware, so improving system performance is a complicated task for both software and hardware designers. The aim of this research is to develop a hardware acceleration workflow for software applications, so that system performance can be improved under constraints on energy consumption and on-chip resource costs. The characteristics of software applications can be identified using profiling tools, and hardware acceleration can bring significant performance improvements for mathematically intensive or frequently repeated functions. The performance of an SoC system can thus be improved if hardware acceleration is applied to the element that incurs the performance overhead. The concepts presented in this study can easily be applied to a variety of sophisticated software applications. The contributions of SoC-based hardware acceleration in the hardware-software co-design platform include the following: (1) Software profiling methods are applied to an H.264 Coder-Decoder (CODEC) core. The hotspot function of the target application is identified using critical attributes such as cycles per loop and loop rounds. (2) A hardware acceleration method based on a Field-Programmable Gate Array (FPGA) is used to resolve system bottlenecks and improve system performance. The identified hotspot function is converted to a hardware accelerator and mapped onto the hardware platform. Two types of hardware acceleration method, central bus design and co-processor design, are implemented for comparison in the proposed architecture. (3) System characteristics such as performance, energy consumption, and resource costs are measured and analyzed, and the trade-offs among these three factors are compared and balanced. Different hardware accelerators are implemented and evaluated against system requirements. (4) The system verification platform is designed based on the Integrated Circuit (IC) workflow, and hardware optimization techniques are used for higher performance and lower resource costs. Experimental results show that the proposed hardware acceleration workflow for software applications is an efficient technique: the system reaches a 2.8X performance improvement and saves 31.84% of energy consumption with the Bus-IP design, while the co-processor design achieves 7.9X performance and saves 75.85% of energy consumption.
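
As a back-of-the-envelope companion to the profiling step, the sketch below applies Amdahl's law: once profiling reveals what fraction of runtime a hotspot consumes, the law bounds the overall speedup obtainable by accelerating that hotspot in hardware. The fractions and local speedups used are hypothetical, not measurements from this work.

```python
# Illustrative sketch: Amdahl's law bounds the overall speedup from
# accelerating a profiled hotspot. The numbers below are hypothetical.
def overall_speedup(hotspot_fraction: float, local_speedup: float) -> float:
    """Amdahl's law: new time = (1 - f) + f / s; speedup is its reciprocal."""
    return 1.0 / ((1.0 - hotspot_fraction) + hotspot_fraction / local_speedup)

# A hotspot taking 70% of runtime, accelerated 20x by an FPGA accelerator:
print(f"{overall_speedup(0.70, 20.0):.2f}x overall")   # ~2.99x

# Even an infinitely fast accelerator is capped by the software that remains:
print(f"{overall_speedup(0.70, 1e9):.2f}x ceiling")    # ~3.33x
```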