969 results for computer evidence


Relevance:

30.00%

Publisher:

Abstract:

The goal of evidence-based medicine is to uniformly apply evidence gained from scientific research to aspects of clinical practice. In order to achieve this goal, new applications that integrate increasingly disparate health care information resources are required. Access to and provision of evidence must be seamlessly integrated with existing clinical workflow and evidence should be made available where it is most often required - at the point of care. In this paper we address these requirements and outline a concept-based framework that captures the context of a current patient-physician encounter by combining disease and patient-specific information into a logical query mechanism for retrieving relevant evidence from the Cochrane Library. Returned documents are organized by automatically extracting concepts from the evidence-based query to create meaningful clusters of documents which are presented in a manner appropriate for point of care support. The framework is currently being implemented as a prototype software agent that operates within the larger context of a multi-agent application for supporting workflow management of emergency pediatric asthma exacerbations. © 2008 Springer-Verlag Berlin Heidelberg.
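As a rough illustration of the concept-based query mechanism described above, the sketch below combines disease- and patient-specific concepts into a boolean query and then groups returned documents by the concepts they mention. All names (build_query, cluster_by_concept, the toy documents) are hypothetical; the actual framework targets the Cochrane Library and a richer concept model.

from collections import defaultdict

def build_query(disease_concepts, patient_concepts):
    """Combine disease- and patient-specific concepts into a single
    logical retrieval query (illustrative boolean form only)."""
    disease = " AND ".join(disease_concepts)
    patient = " OR ".join(patient_concepts)
    return f"({disease}) AND ({patient})"

def cluster_by_concept(documents, query_concepts):
    """Group returned documents under the query concepts they mention,
    mimicking concept-driven clustering of results."""
    clusters = defaultdict(list)
    for doc in documents:
        for concept in query_concepts:
            if concept.lower() in doc["text"].lower():
                clusters[concept].append(doc["id"])
    return dict(clusters)

# Example encounter: pediatric asthma exacerbation (toy data).
query = build_query(["asthma", "exacerbation"], ["pediatric", "child"])
docs = [
    {"id": "cd001", "text": "Inhaled corticosteroids for pediatric asthma..."},
    {"id": "cd002", "text": "Management of acute exacerbation in adults..."},
]
print(query)
print(cluster_by_concept(docs, ["asthma", "exacerbation", "pediatric"]))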

Relevance:

30.00%

Publisher:

Abstract:

This thesis initially presents an 'assay' of the literature pertaining to individual differences in human-computer interaction. A series of experiments is then reported, designed to investigate the association between a variety of individual characteristics and various computer task and interface factors. Predictor variables included age, computer expertise, and psychometric tests of spatial visualisation, spatial memory, logical reasoning, associative memory, and verbal ability. These were studied in relation to a variety of computer-based tasks, including: (i) word processing and its component elements; (ii) the location of target words within passages of text; (iii) the navigation of networks and menus; (iv) command generation using menus and command line interfaces; (v) the search and selection of icons and text labels; (vi) information retrieval. A measure of self-reported workload was also included in several of these experiments. The main experimental findings included: (i) an interaction between spatial ability and the manipulation of semantic but not spatial interface content; (ii) verbal ability being predictive of only certain task components of word processing; (iii) age differences in word processing and information retrieval speed but not accuracy; (iv) evidence of compensatory strategies being employed by older subjects; (v) evidence of performance strategy differences which disadvantaged high-spatial subjects in conditions of low spatial information content; (vi) interactive effects of associative memory, expertise and command strategy; (vii) an association between logical reasoning and word processing but not information retrieval; (viii) an interaction between expertise and cognitive demand; and (ix) a stronger association between cognitive ability and novice performance than expert performance.

Relevance:

30.00%

Publisher:

Abstract:

The present study describes a pragmatic approach to the implementation of production planning and scheduling techniques in foundries of all types and looks at the use of 'state-of-the-art' management control and information systems. Following a review of systems for the classification of manufacturing companies, a definitive statement is made which highlights the important differences between foundries (i.e. 'component makers') and other manufacturing companies (i.e. 'component buyers'). An investigation of the manual procedures which are used to plan and control the manufacture of components reveals the inherent problems facing foundry production management staff, which suggests the unsuitability of many manufacturing techniques that have been applied to general engineering companies. From the literature it was discovered that computer-assisted systems are required which are primarily 'information-based' rather than 'decision-based', whilst the availability of low-cost computers and 'packaged software' has enabled foundries to 'get their feet wet' without the financial penalties which characterized many of the early attempts at computer assistance (i.e. pre-1980). Moreover, no evidence of a single methodology for foundry scheduling emerged from the review. A philosophy for the development of a CAPM system is presented, which details the essential information requirements and puts forward proposals for the subsequent interactions between types of information and the sub-systems of CAPM which they support. The work developed was oriented specifically at the functions of production planning and scheduling and introduces the concept of 'manual interaction' for effective scheduling. The techniques developed were designed to use the information which is readily available in foundries and were found to be practically successful following their implementation in a wide variety of foundries. The limitations of the techniques developed are subsequently discussed within the wider issues which form a CAPM system, prior to a presentation of the conclusions which can be drawn from the study.

Relevance:

30.00%

Publisher:

Abstract:

Many manufacturing companies have long endured the problems associated with the presence of 'islands of automation'. Due to rapid computerisation, 'islands' such as Computer-Aided Design (CAD), Computer-Aided Manufacturing (CAM), Flexible Manufacturing Systems (FMS) and Material Requirements Planning (MRP) have emerged and, with a lack of co-ordination, often lead to inefficient performance of the overall system. The main objective of Computer-Integrated Manufacturing (CIM) technology is to form a cohesive network between these islands. Unfortunately, a commonly used approach - the centralised system approach - has imposed major technical constraints and design complications on development strategies. As a consequence, small companies have experienced difficulties in participating in CIM technology. The research described in this thesis has aimed to examine alternative approaches to CIM system design. Through research and experimentation, the cellular system approach, which has existed in the form of manufacturing layouts, has been found to simplify the complexity of an integrated manufacturing system, leading to better control and far higher system flexibility. Based on the cellular principle, some central management functions have also been distributed to smaller cells within the system. This concept is known, specifically, as distributed planning and control. Through the development of an embryo cellular CIM system, the influence of both the cellular principle and the distribution methodology has been evaluated. Based on the evidence obtained, it has been concluded that distributed planning and control methodology can greatly enhance cellular features within an integrated system. Both the cellular system approach and the distributed control concept will therefore make significant contributions to the design of future CIM systems, particularly systems designed with respect to small company requirements.

Relevance:

30.00%

Publisher:

Abstract:

In response to increasing international competitiveness, many manufacturing businesses are rethinking their management strategies and philosophies towards achieving a computer-integrated environment. The explosive growth in Advanced Manufacturing Technology (AMT) has resulted in the formation of functional "Islands of Automation" such as Computer Aided Design (CAD), Computer Aided Manufacturing (CAM), Computer Aided Process Planning (CAPP) and Manufacturing Resources Planning (MRPII). This has resulted in an environment with focussed areas of excellence but poor overall efficiency, co-ordination and control. The main role of Computer Integrated Manufacturing (CIM) is to integrate these islands of automation and develop a totally integrated and controlled environment. However, the various perceptions of CIM, although developing, remain focussed on a very narrow integration scope and have consequently resulted in mere linked islands of automation with little improvement in overall co-ordination and control. The research described in this thesis develops and examines a more holistic view of CIM, which is based on the integration of various business elements. One particular business element, namely control, has been shown to have a multi-faceted and underpinning relationship with the CIM philosophy. This relationship impacts various CIM system design aspects, including the CIM business analysis and modelling technique, the specification of systems integration requirements, the CIM system architectural form and the degree of business redesign. The research findings show that fundamental changes to CIM system design are required; these are incorporated in a generic CIM design methodology. The effect and influence of this holistic view of CIM on a manufacturing business have been evaluated through various industrial case study applications. Based on the evidence obtained, it has been concluded that this holistic, control-based approach to CIM can provide a greatly improved means of achieving a totally integrated and controlled business environment. This generic CIM methodology will therefore make a significant contribution to the planning, modelling, design and development of future CIM systems.

Relevance:

30.00%

Publisher:

Abstract:

The manufacture of copper alloy flat rolled metals involves hot and cold rolling operations, together with annealing and other secondary processes, to transform castings (mainly slabs and cakes) into such shapes as strip, plate and sheet. Production is mainly to customer orders in a wide range of specifications for dimensions and properties; however, order quantities are often small, and so process planning plays an important role in this industry. Much research work has been done in the past on the technology of flat rolling and the details of the operations; however, there is little or no evidence of any research into the planning of processes for this type of manufacture. Practical observation in a number of rolling mills has established the type of manual process planning traditionally used in this industry. This manual approach has inherent drawbacks, being particularly dependent on individual planners who gain their knowledge over a long span of practical experience. The introduction of the retrieval CAPP approach to this industry was a first step towards reducing these problems, but it could not provide a long-term answer because of the need for an experienced planner to supervise the generation of any plan. It also failed to take account of the dynamic nature of the parameters involved in planning, such as the availability of resources, operating conditions and variations in costs. The other alternative is the use of a generative approach to planning in the rolling-mill context. In this thesis, generative methods are developed for the selection of optimal routes for single orders and then for batches of orders, bearing in mind equipment restrictions, production costs and material yield. The batch order process planning involves the use of a special cluster analysis algorithm for optimal grouping of the orders. This research concentrates on cold-rolling operations. A prototype model of the proposed CAPP system, including both single order and batch order planning options, has been developed and tested on real order data in the industry. The results were satisfactory and compared very favourably with the existing manual and retrieval methods.
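The abstract names a special cluster analysis algorithm for grouping batch orders but does not specify it; purely as an illustration of the idea, the sketch below greedily groups orders whose required rolling operations overlap strongly. The Jaccard threshold and the toy operation names are assumptions, not the thesis's algorithm.

def route_similarity(order_a, order_b):
    """Jaccard similarity of the operation sets two orders require."""
    a, b = set(order_a["ops"]), set(order_b["ops"])
    return len(a & b) / len(a | b)

def group_orders(orders, threshold=0.5):
    """Greedily place each order into the first group whose seed order
    shares enough operations; otherwise start a new group."""
    groups = []
    for order in orders:
        for group in groups:
            if route_similarity(group[0], order) >= threshold:
                group.append(order)
                break
        else:
            groups.append([order])
    return groups

# Toy cold-rolling orders described only by their required operations.
orders = [
    {"id": 1, "ops": ["break-down", "anneal", "finish-roll"]},
    {"id": 2, "ops": ["break-down", "anneal", "finish-roll", "slit"]},
    {"id": 3, "ops": ["anneal", "temper-roll"]},
]
for g in group_orders(orders):
    print([o["id"] for o in g])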

Relevance:

30.00%

Publisher:

Abstract:

Links the concept of market-driven business strategies with the design of production systems. It draws upon the case of a firm which, during the last decade, changed its strategy from being “technology led” to “market driven”. The research, based on interdisciplinary fieldwork involving long-term participant observation, investigated the factors which contribute to the successful design and implementation of flexible production systems in electronics assembly. These investigations were conducted in collaboration with a major computer manufacturer, with other electronics firms being studied for comparison. The research identified a number of strategies and actions seen as crucial to the development of efficient flexible production systems, namely: effective integration of subsystems, development of appropriate controls and performance measures, compatibility between production system design and organization structure, and the development of a climate conducive to organizational change. Overall, the analysis suggests that in the electronics industry there exists an extremely high degree of environmental complexity and turbulence. This serves to shape the strategic, technical and social structures that are developed to match this complexity, examples of which are niche marketing, flexible manufacturing and employee harmonization.

Relevance:

30.00%

Publisher:

Abstract:

Automated ontology population using information extraction algorithms can produce inconsistent knowledge bases. Confidence values assigned by the extraction algorithms may serve as evidence in helping to repair inconsistencies. The Dempster-Shafer theory of evidence is a formalism that allows appropriate interpretation of extractors' confidence values. This chapter presents an algorithm for translating the subontologies containing conflicts into belief propagation networks and repairing conflicts based on Dempster-Shafer plausibility.
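The chapter's own repair algorithm is not reproduced in the abstract; the sketch below only shows the standard Dempster rule of combination on the frame {true, false}, combining two extractors' confidence values for a single candidate statement. The mass assignments are illustrative.

from itertools import product

THETA = frozenset({"true", "false"})  # frame of discernment

def combine(m1, m2):
    """Dempster's rule of combination: multiply masses of every pair of
    focal sets, keep non-empty intersections, renormalise by 1 - conflict."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

def plausibility(m, hypothesis):
    """Pl(H): total mass of focal sets consistent with H."""
    return sum(v for s, v in m.items() if s & hypothesis)

# Extractor 1 asserts the statement with confidence 0.8; extractor 2
# asserts its negation with confidence 0.6; remaining mass is ignorance.
m1 = {frozenset({"true"}): 0.8, THETA: 0.2}
m2 = {frozenset({"false"}): 0.6, THETA: 0.4}
m12 = combine(m1, m2)
print(round(plausibility(m12, frozenset({"true"})), 3))  # Pl(true) ~ 0.769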

Relevance:

30.00%

Publisher:

Abstract:

Mixed-content miscellanies (very frequent in the Byzantine and mediaeval Slavic written heritage) are usually defined as collections of works with non-occupational, non-liturgical application, whose texts are selected and arranged according to no identifiable principle. They are a “readable” type of miscellany, compiled mainly on the basis of the cognitive interests of compilers and readers. Just like the occupational ones, they appeared to satisfy public needs but were intended for individual usage. My textological comparison showed that mixed-content miscellanies often give evidence of a stable content – some of them include the same constituent works in the same order, even though the manuscripts have no obvious genetic relationship. These correspondences were sufficiently numerous and distinctive that they could not be merely fortuitous, and the only sensible interpretation was that even when the operative organizational principle was not based on independently identifiable criteria, such as the church calendar, liturgical function, or thematic considerations, mixed-content miscellanies (or, at least, portions of their contents) nonetheless fell into types. In this respect, the apparent free selection and arrangement of texts in mixed-content miscellanies turns out to be illusory. The problem was that, as the corpus of manuscripts that my colleagues and I needed to examine grew, our ability to keep track of the structure of each one, and to identify structural correspondences among manuscripts within the corpus, diminished. So, at the end of 1993 I addressed a letter to Prof. David Birnbaum (University of Pittsburgh, PA) asking him to help me solve the problem. He and my colleague Andrey Boyadzhiev (Sofia University) pointed out that computers are well suited to recording, processing, and analyzing large amounts of data, and to identifying patterns within the data, and they proposed that we try to develop a computer system for the description of manuscripts, for their analysis and, of course, for searching the data. Our collaboration in this project is now ten years old, and our talk today presents an overview of that collaboration.

Relevance:

30.00%

Publisher:

Abstract:

It is important to help researchers find valuable papers from a large literature collection. To this end, many graph-based ranking algorithms have been proposed. However, most of these algorithms suffer from the problem of ranking bias. Ranking bias hurts the usefulness of a ranking algorithm because it returns a ranking list with an undesirable time distribution. This paper is a focused study on how to alleviate ranking bias by leveraging the heterogeneous network structure of the literature collection. We propose a new graph-based ranking algorithm, MutualRank, that integrates mutual reinforcement relationships among networks of papers, researchers, and venues to achieve a more synthetic, accurate, and less biased ranking than previous methods. MutualRank provides a unified model that involves both intra- and inter-network information for ranking papers, researchers, and venues simultaneously. We use the ACL Anthology Network as the benchmark data set and construct the gold standard from the computational linguistics course websites of well-known universities and two well-known textbooks. The experimental results show that MutualRank greatly outperforms state-of-the-art competitors, including PageRank, HITS, CoRank, FutureRank, and P-Rank, in ranking papers, both in improving ranking effectiveness and in alleviating ranking bias. Rankings of researchers and venues by MutualRank are also quite reasonable.
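The exact MutualRank update equations are not given in the abstract; the sketch below illustrates only the general mutual reinforcement idea it describes, letting scores flow both within a paper citation network and between papers and venues under power-iteration-style normalisation. The matrices, mixing weight and toy data are all assumptions.

import numpy as np

def mutual_reinforcement(P, A, iters=50, alpha=0.7):
    """Toy intra- plus inter-network ranking of papers and venues.
    P[i, j] = 1 if paper j cites paper i; A[i, v] = 1 if paper i
    appeared in venue v. Not the published MutualRank algorithm."""
    papers = np.full(P.shape[0], 1.0 / P.shape[0])
    venues = np.full(A.shape[1], 1.0 / A.shape[1])
    for _ in range(iters):
        papers_new = alpha * P @ papers + (1 - alpha) * A @ venues
        venues_new = A.T @ papers
        papers = papers_new / papers_new.sum()   # normalise each round
        venues = venues_new / venues_new.sum()
    return papers, venues

# 3 papers, 2 venues: paper 2 cites papers 0 and 1.
P = np.array([[0, 0, 1], [0, 0, 1], [0, 0, 0]], dtype=float)
A = np.array([[1, 0], [1, 0], [0, 1]], dtype=float)
print(mutual_reinforcement(P, A))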

Relevance:

30.00%

Publisher:

Abstract:

Brain-computer interfaces (BCI) have the potential to restore communication or control abilities in individuals with severe neuromuscular limitations, such as those with amyotrophic lateral sclerosis (ALS). The role of a BCI is to extract and decode relevant information that conveys a user's intent directly from brain electro-physiological signals and translate this information into executable commands to control external devices. However, the BCI decision-making process is error-prone due to noisy electro-physiological data, representing the classic problem of efficiently transmitting and receiving information via a noisy communication channel.

This research focuses on P300-based BCIs which rely predominantly on event-related potentials (ERP) that are elicited as a function of a user's uncertainty regarding stimulus events, in either an acoustic or a visual oddball recognition task. The P300-based BCI system enables users to communicate messages from a set of choices by selecting a target character or icon that conveys a desired intent or action. P300-based BCIs have been widely researched as a communication alternative, especially in individuals with ALS who represent a target BCI user population. For the P300-based BCI, repeated data measurements are required to enhance the low signal-to-noise ratio of the elicited ERPs embedded in electroencephalography (EEG) data, in order to improve the accuracy of the target character estimation process. As a result, BCIs have relatively slower speeds when compared to other commercial assistive communication devices, and this limits BCI adoption by their target user population. The goal of this research is to develop algorithms that take into account the physical limitations of the target BCI population to improve the efficiency of ERP-based spellers for real-world communication.
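The role of repeated measurements can be made precise with the standard signal-averaging argument: if each trial is the time-locked ERP s(t) plus zero-mean noise n_k(t), independent across trials with variance σ², then

\bar{x}(t) = \frac{1}{N}\sum_{k=1}^{N}\big(s(t) + n_k(t)\big) = s(t) + \frac{1}{N}\sum_{k=1}^{N} n_k(t),
\qquad
\operatorname{Var}\!\left[\frac{1}{N}\sum_{k=1}^{N} n_k(t)\right] = \frac{\sigma^2}{N}.

The averaged noise amplitude therefore falls as 1/√N while the signal is preserved, so the SNR grows as √N. This square-root gain is the source of the speed/accuracy trade-off described here: each added repetition sharpens the target character estimate but slows communication.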

In this work, it is hypothesised that building adaptive capabilities into the BCI framework can potentially give the BCI system the flexibility to improve performance by adjusting system parameters in response to changing user inputs. The research in this work addresses three potential areas for improvement within the P300 speller framework: information optimisation, target character estimation and error correction. The visual interface and its operation control the method by which the ERPs are elicited through the presentation of stimulus events. The parameters of the stimulus presentation paradigm can be modified to modulate and enhance the elicited ERPs. A new stimulus presentation paradigm is developed in order to maximise the information content that is presented to the user by tuning stimulus paradigm parameters to positively affect performance. Internally, the BCI system determines the amount of data to collect and the method by which these data are processed to estimate the user's target character. Algorithms that exploit language information are developed to enhance the target character estimation process and to correct erroneous BCI selections. In addition, a new model-based method to predict BCI performance is developed, an approach which is independent of stimulus presentation paradigm and accounts for dynamic data collection. The studies presented in this work provide evidence that the proposed methods for incorporating adaptive strategies in the three areas have the potential to significantly improve BCI communication rates, and the proposed method for predicting BCI performance provides a reliable means to pre-assess BCI performance without extensive online testing.
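Of the three areas, target character estimation with language information admits a simple Bayesian reading; the sketch below fuses per-character EEG classifier log-likelihoods with a character prior conditioned on the preceding text. It illustrates the general idea only, not the thesis's algorithm, and all names and values are hypothetical.

import math

def estimate_target(classifier_loglik, language_prior):
    """Pick argmax_c [ log P(EEG | c) + log P(c | history) ]."""
    best, best_score = None, -math.inf
    for ch, loglik in classifier_loglik.items():
        score = loglik + math.log(language_prior.get(ch, 1e-9))
        if score > best_score:
            best, best_score = ch, score
    return best

# EEG evidence slightly favours 'Q', but after the history "THE_"
# a language prior makes 'S' the better estimate (toy numbers).
scores = {"Q": -1.1, "S": -1.3, "X": -4.0}
prior_after_history = {"Q": 0.01, "S": 0.30, "X": 0.01}
print(estimate_target(scores, prior_after_history))  # -> 'S'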

Relevance:

30.00%

Publisher:

Abstract:

All A’s was designed to support the agency’s family strengthening initiatives in South Florida. All A’s uses evidence-informed strategies in an inclusive curriculum that teaches self-determination and adaptive behavior skills. The framework incorporates problem-based learning and adult learning theory and follows the Universal Design for Learning. Since 2012, the agency has served over 8,500 youth and 4,000 adults using the framework. The framework addresses educational underachievement and career readiness in at-risk populations. It is used to enhance participants’ AWARENESS of setting SMART goals to achieve future goals and career aspirations. Participants are provided with ACCESS to resources and opportunities for creating and implementing an ACTION plan as they pursue and ACHIEVE their goals. All A’s promotes protective factors and exposes youth to career pathways in Science, Technology, Engineering and Math (STEM) related fields. Youth participate in college tours, job site visits, job shadowing, high school visits, online college and career preparation assistance, service learning projects, STEM projects, and the Winning Futures© mentoring program. Adults are assisted with résumé development and learn job search strategies and interview techniques; they also take part in job shadowing experiences and in computer and financial literacy programs. Adults and youth are also given the opportunity to complete industry-recognized certifications in high-demand industries (food service, general labor, and construction) and to prepare for the General Educational Development Test.

Relevance:

30.00%

Publisher:

Abstract:

Lactase persistence, the ability to digest the milk sugar lactose in adulthood, is highly associated with a T allele situated 13,910 bp upstream from the actual lactase gene in Europeans. The frequency of this allele rose rapidly in Europe after transition from hunter–gatherer to agriculturalist lifestyles and the introduction of milkable domestic species from Anatolia some 8000 years ago. Here we first introduce the archaeological and historic background of early farming life in Europe, then summarize what is known of the physiological and genetic mechanisms of lactase persistence. Finally, we compile the evidence for a co-evolutionary process between dairying culture and lactase persistence. We describe the different hypotheses on how this allele spread over Europe and the main evolutionary forces shaping this process. We also summarize three different computer simulation approaches, which offer a means of developing a coherent and integrated understanding of the process of spread of lactase persistence and dairying.
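As a minimal, purely illustrative stand-in for the simulation approaches mentioned above, the recurrence below tracks the frequency of the −13,910*T allele under a constant selective advantage s for carriers (lactase persistence behaves as dominant), with random mating and none of the demographic or cultural structure the real models include; the starting frequency and s are assumptions.

def next_freq(p, s):
    """One generation of selection for a dominant allele T: genotypes
    TT and TC have fitness 1 + s, CC has fitness 1 (random mating)."""
    q = 1.0 - p
    w_bar = (1.0 - q * q) * (1.0 + s) + q * q  # population mean fitness
    return p * (1.0 + s) / w_bar

p, s = 0.01, 0.05  # assumed initial T frequency and carrier advantage
for gen in range(0, 301, 50):
    print(f"generation {gen:3d}: T frequency = {p:.3f}")
    for _ in range(50):
        p = next_freq(p, s)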

Relevance:

30.00%

Publisher:

Abstract:

Background: Many school-based interventions are being delivered in the absence of evidence of effectiveness (Snowling & Hulme, 2011, Br. J. Educ. Psychol., 81, 1).
Aim: This study sought to address this oversight by evaluating the effectiveness of the commonly used Lexia Reading Core5 intervention with 4- to 6-year-old pupils in Northern Ireland.
Sample: A total of 126 primary school pupils in year 1 and year 2 were screened on the Phonological Assessment Battery 2nd Edition (PhAB-2). Children were recruited from the equivalent year groups to Reception and Year 1 in England and Wales, and Pre-kindergarten and Kindergarten in North America.
Methods: A total of 98 below-average pupils were randomized (T0) to either an 8-week block (M = 647.51 min, SD = 158.21) of daily access to Lexia Reading Core5 (n = 49) or a waiting-list control group (n = 49). Assessment of phonological skills was completed at post-intervention (T1) and at 2-month follow-up (T2) for the intervention group only.
Results: Analysis of covariance, which controlled for baseline scores, found that the Lexia Reading Core5 intervention group made significantly greater gains in blending, F(1, 95) = 6.50, p = .012, partial η2 = .064 (small effect size), and non-word reading, F(1, 95) = 7.20, p = .009, partial η2 = .070 (small effect size). Analysis of the 2-month follow-up of the intervention group found that all group treatment gains were maintained. However, improvements were not uniform among the intervention group, with 35% failing to make progress despite access to support. Post-hoc analysis revealed that higher T0 phonological working memory scores predicted improvements made in phonological skills.
Conclusions: An early-intervention, computer-based literacy program can be effective in boosting the phonological skills of 4- to 6-year-olds, particularly if these literacy difficulties are not linked to phonological working memory deficits.
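The reported effect sizes are internally consistent with the F statistics: for a one-degree-of-freedom effect, partial eta squared can be recovered as

\eta_p^2 = \frac{F \cdot df_1}{F \cdot df_1 + df_2},
\qquad
\text{blending: } \frac{6.50}{6.50 + 95} \approx .064,
\qquad
\text{non-word reading: } \frac{7.20}{7.20 + 95} \approx .070.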