954 results for Interoperability Protocols


Relevance: 20.00%

Abstract:

Aims/Purpose: Protocols are evidence-based, structured guides for directing care to achieve improvements, but translating that evidence into practice is a major challenge. It is not acceptable to simply introduce a protocol and expect it to be adopted and lead to a change in practice; implementation requires effective leadership and management. This presentation describes a strategy for implementation that should promote successful adoption and lead to practice change.
Presentation description: There are many social and behavioural change models to assist and guide practice change. Choosing a model to guide implementation is important for providing a framework for action. The change process requires careful thought, from the protocol itself to the policies and politics within the ICU. In this presentation, I discuss a useful pragmatic guide called the 6SQUID (6 Steps in QUality Intervention Development). This was initially designed for public health interventions, but the model has wider applicability and has similarities with other change process models. Steps requiring consideration include examining the purpose and the need for change; the staff that will be affected and the impact on their workload; and the evidence base supporting the protocol. Subsequent steps in the process that the ICU manager should consider are the change mechanism (widespread multi-disciplinary consultation; adapting the protocol to the local ICU); and identifying how to deliver the change mechanism (educational workshops and preparing staff for the changes are imperative). Recognising the barriers to implementation and change and addressing these locally is also important. Once the protocol has been implemented, there is generally a learning curve before it becomes embedded in practice. Audit and feedback on adherence are useful strategies to monitor and sustain the changes.
Conclusion: Managing change successfully will promote a positive experience for staff. In turn, this will encourage a culture of enthusiasm for translating evidence into practice.

Relevance: 20.00%

Abstract:

In a European BIOMED-2 collaborative study, multiplex PCR assays have successfully been developed and standardized for the detection of clonally rearranged immunoglobulin (Ig) and T-cell receptor (TCR) genes and the chromosome aberrations t(11;14) and t(14;18). This has resulted in 107 different primers in only 18 multiplex PCR tubes: three VH-JH, two DH-JH, two Ig kappa (IGK), one Ig lambda (IGL), three TCR beta (TCRB), two TCR gamma (TCRG), one TCR delta (TCRD), three BCL1-Ig heavy chain (IGH), and one BCL2-IGH. The PCR products of Ig/TCR genes can be analyzed for clonality assessment by heteroduplex analysis or GeneScanning. The detection rate of clonal rearrangements using the BIOMED-2 primer sets is unprecedentedly high. This is mainly based on the complementarity of the various BIOMED-2 tubes. In particular, combined application of IGH (VH-JH and DH-JH) and IGK tubes can detect virtually all clonal B-cell proliferations, even in B-cell malignancies with high levels of somatic mutations. The contribution of IGL gene rearrangements seems limited. Combined usage of the TCRB and TCRG tubes detects virtually all clonal T-cell populations, whereas the TCRD tube has added value in case of TCRγδ+ T-cell proliferations. The BIOMED-2 multiplex tubes can now be used for diagnostic clonality studies as well as for the identification of PCR targets suitable for the detection of minimal residual disease.
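
As a quick consistency check only (tube names and counts are taken from the abstract above, not from an official BIOMED-2 data structure), the panel composition sums to the stated 18 multiplex tubes:

```python
# Illustrative tally of the multiplex tube panel described in the abstract.
biomed2_tubes = {
    "VH-JH (IGH)": 3,
    "DH-JH (IGH)": 2,
    "IGK": 2,
    "IGL": 1,
    "TCRB": 3,
    "TCRG": 2,
    "TCRD": 1,
    "BCL1-IGH": 3,
    "BCL2-IGH": 1,
}

total_tubes = sum(biomed2_tubes.values())
print(f"Total multiplex tubes: {total_tubes}")  # 18, matching the abstract
```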

Relevance: 20.00%

Abstract:

Scientific workflows orchestrate the execution of complex experiments, frequently using distributed computing platforms. Meta-workflows represent an emerging type of such workflows, which aim to reuse existing workflows from potentially different workflow systems to achieve more complex experimentation while minimizing workflow design and testing efforts. Workflow interoperability plays a profound role in achieving this objective. This paper is focused on fostering interoperability across meta-workflows that combine workflows of different workflow systems from diverse scientific domains. This is achieved by formalizing definitions of a meta-workflow and its different types, to standardize the data structures used to describe workflows to be published and shared via public repositories. The paper also includes a thorough formalization of two workflow interoperability approaches based on this formal description: the coarse-grained and the fine-grained workflow interoperability approach. The paper presents a case study from Astrophysics which successfully demonstrates the use of the concepts of meta-workflows and workflow interoperability within a scientific simulation platform.
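
As a sketch only: the data structures below (the class names, fields and the astrophysics example are illustrative assumptions, not the paper's formal definitions) show how a coarse-grained meta-workflow description might reference workflows published by different workflow systems, each treated as a black box; a fine-grained approach would instead expose and embed the individual tasks of each sub-workflow.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class WorkflowRef:
    """Reference to an existing workflow published in a public repository."""
    workflow_id: str
    system: str          # the native workflow system that executes it (hypothetical label)
    repository_url: str  # where the shared workflow description is published

@dataclass
class MetaWorkflow:
    """A workflow whose nodes are themselves workflows, possibly from
    different workflow systems (coarse-grained interoperability)."""
    name: str
    nodes: List[WorkflowRef] = field(default_factory=list)
    # data-flow edges as (producer, consumer) pairs over node workflow_ids
    edges: List[Tuple[str, str]] = field(default_factory=list)

# Hypothetical example loosely echoing the astrophysics case study:
meta = MetaWorkflow(
    name="galaxy-simulation",
    nodes=[
        WorkflowRef("preprocess", "SystemA", "https://example.org/wf/preprocess"),
        WorkflowRef("simulate", "SystemB", "https://example.org/wf/simulate"),
    ],
    edges=[("preprocess", "simulate")],
)
print(f"{meta.name}: {len(meta.nodes)} sub-workflows from different systems")
```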

Relevance: 20.00%

Abstract:

Physical exercise programmes are routinely prescribed in clinical practice to treat impairments and improve activity and participation in daily life, because of their known physiological, health and psychological benefits (RCP, 2009). Progressive resistance exercise is a type of exercise prescribed specifically to improve skeletal muscle strength (Latham et al., 2004). The effectiveness of progressive resistance exercise varies considerably between studies and populations. This thesis focuses on how training parameters influence the delivery of progressive resistance exercise. In order to appropriately evaluate the influence of training parameters, this thesis argues the need to record training performance and the total work completed by participants as prescribed by training protocols. In the first study, participants were taken through a series of protocols differentiated by the intensity and volume of training. Training intensity was defined as a proportion of the mean peak torque achieved during maximal voluntary contractions (MVC) and was set at 80% and 40% of the MVC mean peak torque for the high- and low-intensity protocols respectively. Training volume was defined as the total external work achieved over the training period. Measures of training performance were developed to accurately report the intensity, repetitions and work completed during the training period. A second study evaluated training performance of the training protocols over repeated sessions. These protocols were then applied to three stroke survivors. Study 1 found that sedentary participants could achieve a differentiated training intensity: participants completing the high- and low-intensity protocols trained at 80% and 40% of the MVC mean peak torque respectively. The total work achieved in the high-intensity, low-repetition protocol was lower than the total work achieved in the low-intensity, high-repetition protocol. Study 2 found that, with repeated practice, participants improved their ability to perform the manoeuvres, as shown by a reduction in the variation of the mean training intensity and by achieving the total work specified by the protocol within a smaller margin of error. When these protocols were applied to three stroke survivors, they were able to achieve the specified training intensity but not the total work expected for the protocol, most likely because they were unable to maintain a consistent force throughout each contraction. These results support the need to record and report training performance characteristics during progressive resistance exercise, including the total work achieved, in order to elucidate the influence of training parameters. The lack of accurate training performance data may partly explain the inconsistencies between studies on optimal training parameters for progressive resistance exercise.
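
A minimal sketch of these definitions, with hypothetical numbers and a simplified work formula (torque times angular displacement) assumed for illustration rather than taken from the thesis protocol:

```python
# Illustrative calculation of a training target and total external work,
# following the definitions in the abstract: intensity as a proportion of the
# MVC mean peak torque, and volume as total external work.
mvc_mean_peak_torque = 120.0  # N·m, hypothetical maximal voluntary contraction value

high_intensity_target = 0.80 * mvc_mean_peak_torque  # 80% protocol
low_intensity_target = 0.40 * mvc_mean_peak_torque   # 40% protocol

def external_work(torques_nm, angular_displacement_rad):
    """Approximate external work (J) as torque x angular displacement per repetition."""
    return sum(t * angular_displacement_rad for t in torques_nm)

# e.g. 10 repetitions held near the high-intensity target over 1.5 rad of movement
reps = [high_intensity_target] * 10
print(f"High-intensity target: {high_intensity_target:.1f} N·m")
print(f"Total work for 10 reps: {external_work(reps, 1.5):.0f} J")
```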

Relevance: 20.00%

Abstract:

Cancer remains an unresolved challenge for modern medicine. Every year millions of people, from children to adults, die because current treatments cannot meet the challenge. Research must therefore continue into new biomarkers for tumors. Molecular biology has advanced considerably in recent years; however, this knowledge has not yet been translated into clinical medicine. Biological findings should be used to improve diagnostics and treatment modalities. In this thesis, human formalin-fixed, paraffin-embedded colorectal and breast cancer samples were used to optimize a double immunofluorescence staining protocol. Immunohistochemistry was also performed in order to visualize the expression pattern of each biomarker. For double immunofluorescence, the feasibility of using primary antibodies raised in different host species, as well as in the same host species, was also tested. Finally, the established methods for simultaneous multicolor immunofluorescence imaging of formalin-fixed, paraffin-embedded specimens were applied to the detection of pairs of potential biomarkers of colorectal cancer (EGFR, pmTOR, pAKT, Vimentin, Cytokeratin Pan, Ezrin, E-cadherin) and breast cancer (Securin, PTTG1IP, Cleaved caspase 3, Ki67).

Relevance: 20.00%

Abstract:

An extended formulation of a polyhedron P is a linear description of a polyhedron Q together with a linear map π such that π(Q)=P. These objects are of fundamental importance in polyhedral combinatorics and optimization theory, and the subject of a number of studies. Yannakakis’ factorization theorem (Yannakakis in J Comput Syst Sci 43(3):441–466, 1991) provides a surprising connection between extended formulations and communication complexity, showing that the smallest size of an extended formulation of P equals the nonnegative rank of its slack matrix S. Moreover, Yannakakis also shows that the nonnegative rank of S is at most 2^c, where c is the complexity of any deterministic protocol computing S. In this paper, we show that the latter result can be strengthened when we allow protocols to be randomized. In particular, we prove that the base-2 logarithm of the nonnegative rank of any nonnegative matrix equals the minimum complexity of a randomized communication protocol computing the matrix in expectation. Using Yannakakis’ factorization theorem, this implies that the base-2 logarithm of the smallest size of an extended formulation of a polytope P equals the minimum complexity of a randomized communication protocol computing the slack matrix of P in expectation. We show that allowing randomization in the protocol can be crucial for obtaining small extended formulations. Specifically, we prove that for the spanning tree and perfect matching polytopes, small variance in the protocol forces large size in the extended formulation.
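
Stated compactly, and using notation assumed for this summary rather than taken from the paper itself (xc(P) for the smallest size of an extended formulation of P, rank_+ for nonnegative rank, S_P for the slack matrix), the two results combined read:

```latex
% Yannakakis' factorization theorem
\[
\mathrm{xc}(P) \;=\; \operatorname{rank}_+(S_P)
\]
% The strengthening shown in this paper, stated informally
\[
\log_2 \operatorname{rank}_+(M) \;=\;
\min\{\, c : \text{some randomized protocol of complexity } c
            \text{ computes } M \text{ in expectation} \,\}
\]
```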

Relevance: 20.00%

Abstract:

Transmitting sensitive data over non-secret channels has always required encryption technologies to ensure that the data arrives without exposure to eavesdroppers. The Internet has made it possible to transmit vast volumes of data more rapidly and cheaply, and to a wider audience, than ever before. At the same time, strong encryption makes it possible to send data securely, to digitally sign it, to prove it was sent or received, and to guarantee its integrity. The Internet and encryption make bulk transmission of data a commercially viable proposition. However, there are implementation challenges to solve before commercial bulk transmission becomes mainstream. Powerful encryption has a performance cost and may affect quality of service. Without encryption, intercepted data may be illicitly duplicated and re-sold, or its commercial value diminished because its secrecy is lost. Performance degradation and the potential for commercial loss discourage the bulk transmission of data over the Internet in any commercial application. This paper outlines technical solutions to these problems. We develop new technologies and combine existing ones in new and powerful ways to minimise commercial loss without compromising performance or inflating overheads.

Relevance: 20.00%

Abstract:

Secure transmission of bulk data is of interest to many content providers. A commercially-viable distribution of content requires technology to prevent unauthorised access. Encryption tools are powerful, but have a performance cost. Without encryption, intercepted data may be illicitly duplicated and re-sold, or its commercial value diminished because its secrecy is lost. Two technical solutions make it possible to perform bulk transmissions while retaining security without too high a performance overhead:

1. Hierarchical encryption. The stronger the encryption, the harder it is to break, but also the more computationally expensive it is. A hierarchical approach to key exchange means that simple and relatively weak encryption and keys are used to encrypt small chunks of data, for example 10 seconds of video. Each chunk has its own key. New keys for this bottom-level encryption are exchanged using a slightly stronger encryption; for example, a whole-video key could govern the exchange of the 10-second chunk keys. At a higher level again, there could be daily or weekly keys securing the exchange of whole-video keys, and at a yet higher level, a subscriber key could govern the exchange of weekly keys. At higher levels the encryption becomes stronger but is used less frequently, so that the overall computational cost is minimal. The main observation is that the value of each encrypted item determines the strength of the key used to secure it.

2. Non-symbolic fragmentation with signal diversity. Communications are usually assumed to be sent over a single communications medium, and the data to have been encrypted and/or partitioned in whole-symbol packets. Network and path diversity break up a file or data stream into fragments which are then sent over many different channels, either in the same network or different networks. For example, a message could be transmitted partly over the phone network and partly via satellite. While TCP/IP does a similar thing in sending different packets over different paths, this is done for load-balancing purposes and is invisible to the end application. Network and path diversity deliberately introduce the same principle as a secure communications mechanism: an eavesdropper would need to intercept not just one transmission path but all paths used. Non-symbolic fragmentation of data is also introduced to further confuse any intercepted stream of data. This involves breaking up data into bit strings which are subsequently disordered prior to transmission. Even if all transmissions were intercepted, the cryptanalyst would still need to determine fragment boundaries and correctly order them.

These two solutions depart from the usual idea of data encryption. Hierarchical encryption is an extension of the combined encryption of systems such as PGP, but with the distinction that the strength of encryption at each level is determined by the "value" of the data being transmitted. Non-symbolic fragmentation suppresses or destroys bit patterns in the transmitted data in what is essentially a bit-level transposition cipher, but with unpredictable, irregularly-sized fragments. Both technologies have applications outside the commercial sphere and can be used in conjunction with other forms of encryption, being functionally orthogonal.
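
A toy sketch of the non-symbolic fragmentation idea only (not the authors' scheme, and with no cryptographic strength claimed): the data become a bit string, are cut into irregularly sized fragments that ignore symbol boundaries, and are reordered before transmission. For simplicity the permutation is handed to the receiver directly here, whereas a real system would derive it from a shared secret.

```python
import random

def fragment_bits(data: bytes, seed: int, min_len: int = 3, max_len: int = 11):
    """Split data into irregular bit-string fragments and shuffle their order."""
    bits = "".join(f"{byte:08b}" for byte in data)
    rng = random.Random(seed)
    fragments, i = [], 0
    while i < len(bits):
        step = rng.randint(min_len, max_len)   # fragment boundaries ignore byte boundaries
        fragments.append(bits[i:i + step])
        i += step
    order = list(range(len(fragments)))
    rng.shuffle(order)                         # disorder the fragments prior to "transmission"
    return [fragments[j] for j in order], order

def reassemble(shuffled, order):
    """Receiver side: restore the fragment order and rebuild the original bytes."""
    fragments = [None] * len(order)
    for position, j in enumerate(order):
        fragments[j] = shuffled[position]
    bits = "".join(fragments)
    return bytes(int(bits[k:k + 8], 2) for k in range(0, len(bits), 8))

shuffled, order = fragment_bits(b"example payload", seed=42)
assert reassemble(shuffled, order) == b"example payload"
```

In a full system the shuffled fragments would additionally be spread over diverse transmission paths, so an eavesdropper would have to capture every path and still recover the fragment boundaries and ordering.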

Relevance: 20.00%

Abstract:

Public agencies are increasingly required to collaborate with each other in order to provide high-quality e-government services. This collaboration is usually based on the service-oriented approach and supported by interoperability platforms. Such platforms are specialized middleware-based infrastructures enabling the provision, discovery and invocation of interoperable software services. At the same time, given that personal data handled by governments are often very sensitive, most governments have developed some form of legislation focusing on data protection. This paper proposes solutions for monitoring and enforcing data protection laws within an E-government Interoperability Platform. In particular, the proposal addresses requirements posed by the Uruguayan Data Protection Law and the Uruguayan E-government Platform, although it can also be applied in similar scenarios. The solutions are based on well-known integration mechanisms (e.g. Enterprise Service Bus) as well as recognized security standards (e.g. eXtensible Access Control Markup Language) and were fully prototyped using the SwitchYard ESB product.
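
A hypothetical sketch of the kind of policy enforcement point such a platform could place in front of a service invocation; the attribute names and the simple decision logic are illustrative stand-ins for an XACML-style policy engine, not components of the actual Uruguayan platform.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    """Illustrative subset of the attributes an XACML-style request would carry."""
    subject_agency: str
    service: str
    data_category: str    # e.g. "personal" or "sensitive"
    purpose: str
    citizen_consent: bool

def decide(request: AccessRequest) -> str:
    """Toy policy decision point: permit only purpose-bound, consented access
    to sensitive personal data, logging every decision for later auditing."""
    if request.data_category == "sensitive" and not request.citizen_consent:
        decision = "Deny"
    elif request.purpose not in {"service-delivery", "legal-obligation"}:
        decision = "Deny"
    else:
        decision = "Permit"
    print(f"[audit] {request.subject_agency} -> {request.service}: {decision}")
    return decision

# An ESB interceptor would call decide() before routing the service message.
decide(AccessRequest("HealthAgency", "getCitizenRecord",
                     "sensitive", "service-delivery", True))
```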

Relevance: 20.00%

Abstract:

A rapid rate and high percentage of macadamia nut germination, together with production of vigorous seedlings, are required by nurseries and breeding programs. Germination of nuts is typically protracted, however, and rarely reaches 100%. Many studies have been conducted into macadamia germination, but most have assessed percent germination only. This study investigated the effects of various treatments on percent germination, germination rate, and plant, shoot and root dry weights. The treatments tested were combinations of: (i) soaking or not soaking seeds in a dilute fungicide solution prior to planting; (ii) four different planting media; and (iii) leaving seed trays open or placing them inside clear plastic bags. For freshly harvested nuts, sowing in potting mix under clear plastic and without soaking produced the highest percent germination and germination rate, the largest shoots, and longest lateral roots.
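
For illustration, the factorial structure of the treatments can be enumerated; only potting mix is named in the abstract, so the other media labels below are hypothetical placeholders.

```python
from itertools import product

# Treatment factors described in the abstract; unnamed media are placeholders.
soaking = ["fungicide soak", "no soak"]
media = ["potting mix", "medium B", "medium C", "medium D"]
cover = ["open tray", "clear plastic bag"]

treatments = list(product(soaking, media, cover))
print(f"{len(treatments)} treatment combinations")  # 2 x 4 x 2 = 16
```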