893 results for Implementation Model


Relevance:

40.00%

Publisher:

Abstract:

BACKGROUND: Patients, clinicians, researchers and payers are seeking to understand the value of using genomic information (as reflected by genotyping, sequencing, family history or other data) to inform clinical decision-making. However, challenges exist to widespread clinical implementation of genomic medicine, a prerequisite for developing evidence of its real-world utility. METHODS: To address these challenges, the National Institutes of Health-funded IGNITE (Implementing GeNomics In pracTicE; www.ignite-genomics.org) Network, comprising six projects and a coordinating center, was established in 2013 to support the development, investigation and dissemination of genomic medicine practice models that seamlessly integrate genomic data into the electronic health record and deploy tools for point-of-care decision making. IGNITE site projects are aligned in their purpose of testing these models, but individual projects vary in scope and design, including exploring genetic markers for disease risk prediction and prevention, developing tools for using family history data, incorporating pharmacogenomic data into clinical care, refining disease diagnosis using sequence-based mutation discovery, and creating novel educational approaches. RESULTS: This paper describes the IGNITE Network and member projects, including network structure, collaborative initiatives, clinical decision support strategies, methods for return of genomic test results, and educational initiatives for patients and providers. Clinical and outcomes data from individual sites and network-wide projects are expected to be published over the next few years. CONCLUSIONS: The IGNITE Network is an innovative series of projects and pilot demonstrations that aims to enhance the translation of validated, actionable genomic information into clinical settings and to develop and use outcome measures for genome-based clinical interventions, applying a pragmatic framework to provide early data and proofs of concept on the utility of these interventions. Through these efforts and collaboration with other stakeholders, IGNITE is poised to have a significant impact on the acceleration of genomic information into medical practice.
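As an illustration of the point-of-care decision support described above, the sketch below maps a pharmacogenomic result stored in a patient record to a prescribing alert. The genotype-to-phenotype mapping and the clinical advice follow published CPIC guidance for CYP2C19 and clopidogrel, but the record layout, field names and rule logic are hypothetical and are not drawn from any IGNITE site implementation.

```python
# Hypothetical sketch of a point-of-care pharmacogenomic alert.
# Record layout and function names are illustrative, not from IGNITE.

def cyp2c19_phenotype(diplotype: str) -> str:
    """Map a CYP2C19 diplotype to a metabolizer phenotype (simplified)."""
    no_function = {"*2", "*3"}          # common no-function alleles
    alleles = diplotype.split("/")
    lost = sum(a in no_function for a in alleles)
    if lost == 2:
        return "poor metabolizer"
    if lost == 1:
        return "intermediate metabolizer"
    return "normal metabolizer"

def prescribing_alert(ehr_record: dict, drug: str) -> str | None:
    """Return an alert string if the genotype suggests reconsidering the drug."""
    if drug.lower() != "clopidogrel":
        return None
    phenotype = cyp2c19_phenotype(ehr_record["CYP2C19_diplotype"])
    if phenotype in ("poor metabolizer", "intermediate metabolizer"):
        return (f"CYP2C19 {phenotype}: reduced clopidogrel activation expected; "
                "consider an alternative antiplatelet agent per CPIC guidance.")
    return None

patient = {"id": "example-001", "CYP2C19_diplotype": "*2/*2"}
print(prescribing_alert(patient, "clopidogrel"))
```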

Relevance:

40.00%

Publisher:

Abstract:

Parallel processing techniques have been used in the past to provide high-performance computing resources for activities such as fire-field modelling. This has traditionally been achieved using specialized hardware and software, the expense of which would be difficult to justify for many fire engineering practices. In this article we demonstrate how typical office-based PCs attached to a Local Area Network have the potential to offer the benefits of parallel processing with minimal costs associated with the purchase of additional hardware or software. It was found that good speedups could be achieved on homogeneous networks of PCs: for example, a problem composed of ~100,000 cells would run 9.3 times faster on a network of twelve 800 MHz PCs than on a single 800 MHz PC. It was also found that a network of eight 3.2 GHz Pentium 4 PCs would run 7.04 times faster than a single 3.2 GHz Pentium 4 PC. A dynamic load balancing scheme was also devised to allow the effective use of the software on heterogeneous networks of PCs. This scheme also ensured that the impact of the parallel processing task on other computer users on the network was minimized.
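As a minimal sketch of the kind of load balancing described above, the snippet below divides the cells of a fire-field problem across PCs in proportion to each machine's measured throughput and computes the resulting speedup. The weighting heuristic, function names and timings are illustrative assumptions, not the scheme or measurements from the article; the example times are chosen only to reproduce the quoted 9.3x figure.

```python
# Hypothetical sketch of balancing a cell-based fire-field domain across
# PCs of differing speed; the weighting heuristic is illustrative only.

def partition_cells(total_cells: int, cells_per_second: list[float]) -> list[int]:
    """Assign each node a share of cells proportional to its measured throughput."""
    total_rate = sum(cells_per_second)
    shares = [int(total_cells * rate / total_rate) for rate in cells_per_second]
    shares[-1] += total_cells - sum(shares)   # absorb the rounding remainder
    return shares

def speedup(serial_time: float, parallel_time: float) -> float:
    """Classic speedup: run time on one PC divided by run time on the network."""
    return serial_time / parallel_time

# Example: 100,000 cells spread over three PCs whose benchmark rates differ.
rates = [8000.0, 8000.0, 3200.0]          # cells processed per second per PC
print(partition_cells(100_000, rates))    # faster PCs receive more cells
print(speedup(930.0, 100.0))              # illustrative times giving a 9.3x speedup
```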

Relevance:

40.00%

Publisher:

Abstract:

Computer egress simulation has the potential to be used in large-scale incidents to provide live advice to incident commanders. While there are many considerations which must be taken into account when applying such models to live incidents, one of the first concerns the computational speed of the simulations. No matter how important the insight provided by the simulation, numerical hindsight will not prove useful to an incident commander; thus, for this type of application to be useful, it is essential that the simulation can be run many times faster than real time. Parallel processing is a method of reducing run times for very large computational simulations by distributing the workload amongst a number of CPUs. In this paper we examine the development of a parallel version of the buildingEXODUS software. The parallel strategy implemented is based on a systematic partitioning of the problem domain onto an arbitrary number of sub-domains. Each sub-domain is computed on a separate processor and runs its own copy of the EXODUS code. The software has been designed to work on typical office-based networked PCs but will also function on a Windows-based cluster. Two evaluation scenarios using the parallel implementation of EXODUS are described: a large open area and a 50-story high-rise building. Speed-ups of up to 3.7 are achieved using up to six computers, with the high-rise building evacuation simulation achieving run times 6.4 times faster than real time.
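The sketch below illustrates the general idea of partitioning a problem domain into contiguous sub-domains, one per processor, and expressing performance as a faster-than-real-time factor. The strip-based partitioner and the example numbers are assumptions for illustration and do not reproduce the actual buildingEXODUS partitioning strategy; the timings are chosen only to match the quoted 6.4x figure.

```python
# Hypothetical sketch of splitting an egress-simulation floor grid into
# contiguous strips, one per processor; not the actual EXODUS partitioner.

def strip_partition(n_rows: int, n_procs: int) -> list[range]:
    """Divide grid rows into nearly equal contiguous strips, one per processor."""
    base, extra = divmod(n_rows, n_procs)
    strips, start = [], 0
    for p in range(n_procs):
        size = base + (1 if p < extra else 0)
        strips.append(range(start, start + size))
        start += size
    return strips

def real_time_factor(simulated_seconds: float, wall_clock_seconds: float) -> float:
    """How many times faster than real time the simulation ran."""
    return simulated_seconds / wall_clock_seconds

print(strip_partition(200, 6))            # six sub-domains for six PCs
print(real_time_factor(1920.0, 300.0))    # illustrative times giving 6.4x real time
```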

Relevance:

40.00%

Publisher:

Abstract:

Continuous large-scale changes in technology and the globalization of markets have resulted in the need for many SMEs to use innovation as a means of seeking competitive advantage, where innovation encompasses both technological and organizational perspectives (Tapscott, 2009). However, there is a paucity of systematic and empirical research relating to the implementation of innovation management in the context of SMEs. The aim of this article is to redress this imbalance via an empirical study designed to develop and test a model of innovation implementation in SMEs. The study uses Structural Equation Modelling (SEM) to test the plausibility of an innovation model, developed from earlier studies, which formed the basis of a questionnaire survey of 395 SMEs in the UK. The resultant model and construct relationship results are further probed using an explanatory multiple-case analysis to explore ‘how’ and ‘why’ type questions within the model and construct relationships. The findings show that the effects of leadership, people and culture on innovation implementation are mediated by business improvement activities relating to Total Quality Management/Continuous Improvement (TQM/CI) and product and process developments. It is concluded that SMEs have an opportunity to leverage existing quality and process improvement activities to move beyond continuous improvement outcomes towards effective innovation implementation. The article concludes by suggesting areas suitable for further research.
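The mediation finding reported above can be illustrated with a simple regression-based mediation check in the spirit of Baron and Kenny; the study itself used full structural equation modelling, and the data file, column names and construct scores below are hypothetical.

```python
# Hypothetical sketch of a regression-based mediation check (Baron-Kenny style);
# the study used full SEM, and all column names and data are invented.
import pandas as pd
import statsmodels.formula.api as smf

# Assumed survey construct scores, one row per SME (illustrative file name);
# columns: leadership, tqm_ci, innovation_impl.
df = pd.read_csv("sme_survey_scores.csv")

total    = smf.ols("innovation_impl ~ leadership", data=df).fit()            # path c
a_path   = smf.ols("tqm_ci ~ leadership", data=df).fit()                     # path a
mediated = smf.ols("innovation_impl ~ leadership + tqm_ci", data=df).fit()   # paths b, c'

# Mediation is suggested when the direct effect (c') shrinks relative to the
# total effect (c) while the indirect path a*b remains substantial.
print("total effect c  :", total.params["leadership"])
print("direct effect c':", mediated.params["leadership"])
print("indirect a*b    :", a_path.params["leadership"] * mediated.params["tqm_ci"])
```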

Relevance:

40.00%

Publisher:

Abstract:

This study evaluates the implementation of Menter's gamma-Re-theta Transition Model within the CFX12 solver for turbulent transition prediction on a natural laminar flow nacelle. Some challenges associated with this type of modeling have been identified. The computational fluid dynamics transitional flow simulation results are presented for a series of cruise cases with freestream Mach numbers ranging from 0.8 to 0.88, angles of attack from 2 to 0 degrees, and mass flow ratios from 0.60 to 0.75. These were validated against a series of wind-tunnel tests on the nacelle by comparing the predicted and experimental surface pressure distributions and transition locations. A selection of the validation cases is presented in this paper. In all cases, the computational fluid dynamics simulations agreed reasonably well with the experiments. The results indicate that Menter's gamma-Re-theta Transition Model is capable of predicting laminar boundary-layer transition to turbulence on a nacelle. Nonetheless, some limitations exist both in Menter's gamma-Re-theta Transition Model and in the implementation of the computational fluid dynamics model. The implementation of a more comprehensive experimental correlation in Menter's gamma-Re-theta Transition Model, preferably one derived from nacelle experiments and including the effects of compressibility and streamline curvature, is necessary for an accurate transitional flow simulation on a nacelle. In addition, improvements to the computational fluid dynamics model are also suggested, including the consideration of varying distributed surface roughness and an appropriate empirical correction derived from nacelle experimental transition location data.
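For illustration, the sketch below evaluates the zero-pressure-gradient form of the transition-onset correlation published with the gamma-Re-theta model, which sets the transition momentum-thickness Reynolds number from the freestream turbulence intensity. The constants follow Langtry and Menter's published correlation and should be checked against the original; the pressure-gradient factor F(lambda_theta) and any nacelle-specific corrections of the kind discussed above are omitted.

```python
# Sketch of the zero-pressure-gradient transition-onset correlation used with the
# gamma-Re-theta model (Langtry & Menter); constants are the published values,
# quoted here for illustration, and the pressure-gradient factor is omitted.

def re_theta_t(turbulence_intensity_percent: float) -> float:
    """Transition-onset momentum-thickness Reynolds number vs. freestream Tu (%)."""
    tu = turbulence_intensity_percent
    if tu <= 1.3:
        return 1173.51 - 589.428 * tu + 0.2196 / tu**2
    return 331.50 * (tu - 0.5658) ** -0.671

# Lower freestream turbulence (typical of flight or a quiet wind tunnel) gives a
# larger Re_theta_t, i.e. later transition, consistent with laminar-flow nacelles.
for tu in (0.1, 0.5, 1.0, 3.0):
    print(f"Tu = {tu:4.1f}%  ->  Re_theta_t = {re_theta_t(tu):8.1f}")
```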