955 results for cryptographic protocols


Relevance:

20.00%

Publisher:

Abstract:

The design and analysis of conceptually different cooling systems for human heart preservation are numerically investigated. A heart cooling container with the required connections was designed for a normal-sized human heart. A three-dimensional, high-resolution geometric model of the human heart obtained from CT angiography data was used for the simulations. Nine different cooling designs are introduced in this research. The first design (Case 1) used a cooling gelatin only outside of the heart. In the second design (Case 2), the internal parts of the heart were cooled by pumping a cooling liquid through both the heart's pulmonary and systemic circulation systems. An unsteady conjugate heat transfer analysis was performed to simulate the temperature field variations within the heart during the cooling process. Case 3 simulated the currently used cooling method, in which the coolant is stagnant. Case 4 was a combination of Case 1 and Case 2. A linear thermoelasticity analysis was performed to assess the stresses applied to the heart during the cooling process. In Cases 5 through 9, the coolant solution was used for both internal and external cooling. For the external circulation in Cases 5 and 6, two inlets and two outlets were designed on the walls of the cooling container. Case 5 used laminar flow for the coolant circulations inside and outside of the heart. The effects of turbulent flow on cooling of the heart were studied in Case 6. In Case 7, an additional inlet was designed on the cooling container wall to create a jet impinging on the hot region of the heart's wall. Unsteady periodic inlet velocities were applied in Cases 8 and 9. The average temperature of the heart in Case 5 was +5.0 °C after 1500 s of cooling. A multi-objective constrained optimization was performed for Case 5. The inlet velocities of the two internal and one external coolant circulations were the three design variables for the optimization. The three objectives were minimizing the average temperature of the heart, the wall shear stress, and the total volumetric flow rate. The only constraint was to keep the von Mises stress below the ultimate tensile stress of the heart's tissue.
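
As a rough illustration of how such a constrained optimization over the three inlet velocities could be set up, the sketch below uses a weighted-sum scalarization of the three objectives and SLSQP. The `evaluate_design` surrogate, the velocity bounds, and the tissue-strength value are placeholder assumptions, not values from the study; the actual work couples the optimizer to full CFD and thermoelasticity simulations, and a true multi-objective method would return a Pareto front rather than a single point.

```python
# Illustrative sketch only: scalarized form of the Case 5 optimization.
import numpy as np
from scipy.optimize import minimize

SIGMA_UTS = 110.0e3  # assumed ultimate tensile stress of heart tissue, Pa (placeholder)

def evaluate_design(v):
    """Stub for the coupled CFD / thermoelasticity solver.

    v = [v_internal_1, v_internal_2, v_external] inlet velocities (m/s).
    Returns (avg_temperature, wall_shear_stress, volumetric_flow, von_mises).
    """
    # Placeholder surrogate; the real quantities come from simulation.
    t_avg = 5.0 + 2.0 / (1.0 + v.sum())
    tau_wall = 0.8 * v.sum()
    q_total = 1.2 * v.sum()
    sigma_vm = 50.0e3 * v.max()
    return t_avg, tau_wall, q_total, sigma_vm

def scalarized_objective(v, weights=(1.0, 1.0, 1.0)):
    # Weighted sum of average temperature, wall shear stress and flow rate.
    t_avg, tau_wall, q_total, _ = evaluate_design(v)
    w1, w2, w3 = weights
    return w1 * t_avg + w2 * tau_wall + w3 * q_total

def stress_constraint(v):
    # SLSQP inequality constraints must be non-negative when satisfied.
    return SIGMA_UTS - evaluate_design(v)[3]

x0 = np.array([0.1, 0.1, 0.1])   # initial inlet velocities (m/s)
bounds = [(0.01, 1.0)] * 3       # assumed velocity bounds
res = minimize(scalarized_objective, x0, method="SLSQP", bounds=bounds,
               constraints=[{"type": "ineq", "fun": stress_constraint}])
print(res.x, res.fun)
```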

Relevance:

20.00%

Publisher:

Abstract:

An array of Bio-Argo floats equipped with radiometric sensors has recently been deployed in various open-ocean areas representative of the diversity of trophic and bio-optical conditions prevailing in so-called Case 1 waters. Around solar noon and almost every day, each float acquires 0-250 m vertical profiles of Photosynthetically Available Radiation and downward irradiance at three wavelengths (380, 412 and 490 nm). To date, more than 6,500 profiles have been acquired for each radiometric channel. Because these radiometric data are collected without operator control and regardless of meteorological conditions, specific, automatic data-processing protocols have to be developed. Here, we present a data quality-control procedure aimed at verifying profile shapes and providing near-real-time data distribution. This procedure is specifically developed to: 1) identify the main measurement issues (i.e. dark signal, atmospheric clouds, spikes and wave-focusing occurrences); and 2) validate the final data with a hierarchy of tests to ensure they are suitable for scientific use. The procedure, adapted to each of the four radiometric channels, is designed to flag each profile in a way compliant with the data management procedure used by the Argo program. The main perturbations in the light field are identified by the new protocols with good performance over the whole dataset, which highlights the procedure's potential applicability at the global scale. Finally, comparison with modeled surface irradiances allows the accuracy of the quality-controlled irradiance measurements to be assessed and any possible evolution over the float lifetime due to biofouling and instrumental drift to be identified.
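
A minimal, hypothetical sketch of the kind of shape checks such a procedure might apply to a single profile is shown below. The thresholds, the despiking scheme and the flag mapping are illustrative assumptions, not the published criteria; only the use of Argo-style flag codes (1 = good, 3 = probably bad, 4 = bad) follows the Argo convention mentioned in the abstract.

```python
import numpy as np

def qc_irradiance_profile(depth_m, ed, dark_threshold=1e-4, spike_factor=5.0):
    """Flag a single 0-250 m downward irradiance profile (illustrative only).

    Returns an Argo-style flag: 1 = good, 3 = probably bad, 4 = bad.
    All thresholds are placeholder assumptions, not the published criteria.
    """
    depth_m = np.asarray(depth_m, dtype=float)
    ed = np.asarray(ed, dtype=float)

    # Dark-signal test: the profile should rise well above the dark level near the surface.
    if np.nanmax(ed) < dark_threshold:
        return 4

    # Spike / wave-focusing test: compare each point to a running median.
    kernel = 5
    pad = kernel // 2
    padded = np.pad(ed, pad, mode="edge")
    running_median = np.array([np.median(padded[i:i + kernel]) for i in range(ed.size)])
    residual = np.abs(ed - running_median)
    spike_fraction = np.mean(residual > spike_factor * np.median(residual + 1e-12))
    if spike_fraction > 0.2:
        return 4

    # Cloud / shape test: irradiance should broadly decrease with depth.
    increasing_fraction = np.mean(np.diff(ed[np.argsort(depth_m)]) > 0)
    return 3 if increasing_fraction > 0.3 else 1
```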

Relevance:

20.00%

Publisher:

Abstract:

Aims/Purpose: Protocols are evidence-based, structured guides for directing care to achieve improvements, but translating that evidence into practice is a major challenge. It is not enough simply to introduce a protocol and expect it to be adopted and to change practice; implementation requires effective leadership and management. This presentation describes an implementation strategy that should promote successful adoption and lead to practice change.
Presentation description: There are many social and behavioural change models to assist and guide practice change, and choosing one to guide implementation is important because it provides a framework for action. The change process requires careful thought, from the protocol itself to the policies and politics within the ICU. In this presentation, I discuss a useful pragmatic guide called 6SQUID (6 Steps in QUality Intervention Development). It was initially designed for public health interventions, but the model has wider applicability and has similarities with other change-process models. Steps requiring consideration include examining the purpose of and need for change; the staff that will be affected and the impact on their workload; and the evidence base supporting the protocol. Subsequent steps that the ICU manager should consider are identifying the change mechanism (widespread multidisciplinary consultation; adapting the protocol to the local ICU) and how to deliver it (educational workshops and preparing staff for the changes are imperative). Recognising the barriers to implementation and change, and addressing these locally, is also important. Once the protocol has been implemented, there is generally a learning curve before it becomes embedded in practice. Audit and feedback on adherence are useful strategies to monitor and sustain the changes.
Conclusion: Managing change successfully will promote a positive experience for staff. In turn, this will encourage a culture of enthusiasm for translating evidence into practice.

Relevance:

20.00%

Publisher:

Abstract:

In a European BIOMED-2 collaborative study, multiplex PCR assays have been successfully developed and standardized for the detection of clonally rearranged immunoglobulin (Ig) and T-cell receptor (TCR) genes and the chromosome aberrations t(11;14) and t(14;18). This has resulted in 107 different primers distributed over only 18 multiplex PCR tubes: three VH-JH, two DH-JH, two Ig kappa (IGK), one Ig lambda (IGL), three TCR beta (TCRB), two TCR gamma (TCRG), one TCR delta (TCRD), three BCL1-Ig heavy chain (IGH), and one BCL2-IGH. The PCR products of Ig/TCR genes can be analyzed for clonality assessment by heteroduplex analysis or GeneScanning. The detection rate of clonal rearrangements using the BIOMED-2 primer sets is unprecedentedly high, mainly owing to the complementarity of the various BIOMED-2 tubes. In particular, the combined application of the IGH (VH-JH and DH-JH) and IGK tubes can detect virtually all clonal B-cell proliferations, even in B-cell malignancies with high levels of somatic mutations. The contribution of IGL gene rearrangements appears limited. Combined use of the TCRB and TCRG tubes detects virtually all clonal T-cell populations, whereas the TCRD tube has added value in the case of TCRγδ+ T-cell proliferations. The BIOMED-2 multiplex tubes can now be used for diagnostic clonality studies as well as for the identification of PCR targets suitable for the detection of minimal residual disease.

Relevance:

20.00%

Publisher:

Abstract:

Physical exercise programmes are routinely prescribed in clinical practice to treat impairments and to improve activity and participation in daily life, because of their known physiological, health and psychological benefits (RCP, 2009). Progressive resistance exercise is a type of exercise prescribed specifically to improve skeletal muscle strength (Latham et al., 2004). The effectiveness of progressive resistance exercise varies considerably between studies and populations. This thesis focuses on how training parameters influence the delivery of progressive resistance exercise. To evaluate the influence of training parameters appropriately, this thesis argues the need to record training performance and the total work completed by participants, as prescribed by the training protocols. In the first study, participants were taken through a series of protocols differentiated by the intensity and volume of training. Training intensity was defined as a proportion of the mean peak torque achieved during maximal voluntary contractions (MVC) and was set at 80% and 40% of the MVC mean peak torque for the high- and low-intensity protocols respectively. Training volume was defined as the total external work achieved over the training period. Measures of training performance were developed to accurately report the intensity, repetitions and work completed during the training period. A second study evaluated training performance on the training protocols over repeated sessions. These protocols were then applied to three stroke survivors. Study 1 found that sedentary participants could achieve a differentiated training intensity: participants completing the high- and low-intensity protocols trained at 80% and 40% of the MVC mean peak torque respectively. The total work achieved in the high-intensity, low-repetition protocol was lower than that achieved in the low-intensity, high-repetition protocol. Study 2 found that, with repeated practice, participants improved their ability to perform the manoeuvres, as shown by a reduction in the variation of the mean training intensity and by achieving the total work specified by the protocol within a smaller margin of error. When these protocols were applied to three stroke survivors, they were able to achieve the specified training intensity but not the total work expected for the protocol, most likely because of an inability to maintain a consistent force throughout the contraction. These results support the need to record and report training performance characteristics during progressive resistance exercise, including the total work achieved, in order to elucidate the influence of training parameters. The lack of accurate training performance reporting may partly explain the inconsistencies between studies on optimal training parameters for progressive resistance exercise.
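
The sketch below illustrates how the two performance measures described above, relative training intensity and total external work, might be computed from an isokinetic torque/angle trace. The definitions used (intensity as peak torque relative to the MVC mean peak torque, work as the integral of torque over angular displacement) and all names and values are illustrative assumptions, not the thesis' exact measures.

```python
import numpy as np

def training_performance(torque_nm, angle_rad, mvc_peak_torque_nm):
    """Summarize one contraction from an isokinetic torque/angle trace.

    Assumed definitions (illustrative only):
    - intensity: peak torque of the contraction as a % of the MVC mean peak torque
    - total external work: integral of torque over angular displacement (joules)
    """
    torque_nm = np.asarray(torque_nm, dtype=float)
    angle_rad = np.asarray(angle_rad, dtype=float)

    # Total external work: trapezoidal integration of torque w.r.t. joint angle.
    total_work_j = np.trapz(torque_nm, angle_rad)

    # Relative intensity of the contraction.
    intensity_pct = 100.0 * torque_nm.max() / mvc_peak_torque_nm
    return intensity_pct, total_work_j

# Example: one contraction through 1.5 rad at roughly 80% of a 100 N·m MVC.
angle = np.linspace(0.0, 1.5, 200)
torque = 80.0 * np.sin(np.pi * angle / 1.5)   # synthetic torque profile
print(training_performance(torque, angle, mvc_peak_torque_nm=100.0))
```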

Relevance:

20.00%

Publisher:

Abstract:

This thesis focuses on the private membership test (PMT) problem and presents three single-server protocols to solve it. In the presented solutions, a client can test whether some record x is included in a server's database without revealing the record; moreover, after the protocols are executed, the contents of the server's database remain secret. In each solution, a different cryptographic protocol is used to construct a privacy-preserving variant of a Bloom filter. The three solutions differ slightly from each other, both from a privacy perspective and in terms of complexity; their use cases therefore differ, and no single solution is clearly the best of the three. We present software implementations of the three protocols, described with pseudocode, and measure the performance of our implementation on a real-world scenario. This thesis is a spin-off from the Academy of Finland research project "Cloud Security Services".
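
The shared building block of the three solutions is a Bloom filter; a minimal plaintext sketch is given below. The privacy-preserving variants wrap this structure in a cryptographic protocol that is not shown here, and the hashing scheme and parameters are illustrative choices, not the thesis' construction.

```python
import hashlib

class BloomFilter:
    """Plain (non-private) Bloom filter for approximate membership tests."""

    def __init__(self, num_bits: int = 1 << 16, num_hashes: int = 4):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(num_bits // 8)

    def _positions(self, record: bytes):
        # Derive k bit positions from independently salted hashes of the record.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(i.to_bytes(4, "big") + record).digest()
            yield int.from_bytes(digest[:8], "big") % self.num_bits

    def add(self, record: bytes) -> None:
        for pos in self._positions(record):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def might_contain(self, record: bytes) -> bool:
        # False positives are possible; false negatives are not.
        return all(self.bits[pos // 8] & (1 << (pos % 8)) for pos in self._positions(record))

# Server-side: insert database records. In the PMT setting, the client would
# query this structure only through a privacy-preserving protocol.
bf = BloomFilter()
bf.add(b"alice@example.com")
print(bf.might_contain(b"alice@example.com"), bf.might_contain(b"bob@example.com"))
```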

Relevance:

20.00%

Publisher:

Abstract:

Cancer remains an unresolved challenge for modern medicine. Every year millions of people, from children to adults, die because current treatments cannot meet the challenge, so research must continue into new biomarkers for tumors. Molecular biology has advanced considerably in recent years; however, this knowledge has not yet been fully translated into medicine, and biological findings should be used to improve diagnostics and treatment modalities. In this thesis, human formalin-fixed paraffin-embedded colorectal and breast cancer samples were used to optimize a double immunofluorescence staining protocol. Immunohistochemistry was also performed in order to visualize the expression pattern of each biomarker. For double immunofluorescence, the feasibility of using primary antibodies raised in different host species, as well as in the same host species, was also tested. Finally, the established methods for simultaneous multicolor immunofluorescence imaging of formalin-fixed paraffin-embedded specimens were applied to the detection of pairs of potential biomarkers of colorectal cancer (EGFR, pmTOR, pAKT, Vimentin, Cytokeratin Pan, Ezrin, E-cadherin) and breast cancer (Securin, PTTG1IP, Cleaved caspase 3, Ki67).

Relevance:

20.00%

Publisher:

Abstract:

An extended formulation of a polyhedron P is a linear description of a polyhedron Q together with a linear map π such that π(Q)=P. These objects are of fundamental importance in polyhedral combinatorics and optimization theory, and the subject of a number of studies. Yannakakis’ factorization theorem (Yannakakis in J Comput Syst Sci 43(3):441–466, 1991) provides a surprising connection between extended formulations and communication complexity, showing that the smallest size of an extended formulation of P equals the nonnegative rank of its slack matrix S. Moreover, Yannakakis also shows that the nonnegative rank of S is at most 2^c, where c is the complexity of any deterministic protocol computing S. In this paper, we show that the latter result can be strengthened when we allow protocols to be randomized. In particular, we prove that the base-2 logarithm of the nonnegative rank of any nonnegative matrix equals the minimum complexity of a randomized communication protocol computing the matrix in expectation. Using Yannakakis’ factorization theorem, this implies that the base-2 logarithm of the smallest size of an extended formulation of a polytope P equals the minimum complexity of a randomized communication protocol computing the slack matrix of P in expectation. We show that allowing randomization in the protocol can be crucial for obtaining small extended formulations. Specifically, we prove that for the spanning tree and perfect matching polytopes, small variance in the protocol forces large size in the extended formulation.
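
For reference, the statements discussed above can be summarized as follows; the notation (xc for extension complexity, rank_+ for nonnegative rank, cc(Π) for the complexity of protocol Π) is shorthand chosen here, not necessarily the paper's.

```latex
% Yannakakis' factorization theorem and the deterministic-protocol bound:
\[
  \mathrm{xc}(P) = \operatorname{rank}_{+}\!\big(S(P)\big),
  \qquad
  \operatorname{rank}_{+}(S) \le 2^{c}
  \ \text{for any deterministic protocol of complexity } c \text{ computing } S .
\]
% The strengthening proved in this paper, for randomized protocols that
% compute the matrix in expectation:
\[
  \log_{2} \operatorname{rank}_{+}(S)
  = \min\Big\{\, \mathrm{cc}(\Pi) \;:\; \Pi \text{ randomized},\
    \mathbb{E}\big[\Pi(i,j)\big] = S_{ij} \ \forall i,j \,\Big\}.
\]
```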

Relevance:

20.00%

Publisher:

Abstract:

Transmitting sensitive data over non-secret channels has always required encryption technologies to ensure that the data arrives without exposure to eavesdroppers. The Internet has made it possible to transmit vast volumes of data more rapidly and cheaply, and to a wider audience, than ever before. At the same time, strong encryption makes it possible to send data securely, to digitally sign it, to prove it was sent or received, and to guarantee its integrity. Together, the Internet and encryption make bulk transmission of data a commercially viable proposition. However, there are implementation challenges to solve before commercial bulk transmission becomes mainstream. Powerful encryption has a performance cost and may affect quality of service. Without encryption, intercepted data may be illicitly duplicated and re-sold, or its commercial value diminished because its secrecy is lost. Performance degradation and the potential for commercial loss discourage the bulk transmission of data over the Internet in any commercial application. This paper outlines technical solutions to these problems. We develop new technologies and combine existing ones in new and powerful ways to minimise commercial loss without compromising performance or inflating overheads.
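
The abstract does not specify the authors' techniques. Purely as an illustration of the trade-off being discussed, the snippet below encrypts a bulk payload with authenticated symmetric encryption from the Python `cryptography` library, which provides both confidentiality and integrity at some CPU cost; it is a generic example, not the scheme proposed in the paper.

```python
# Generic illustration of authenticated encryption for a bulk payload; not the
# scheme proposed in the paper. Requires the third-party 'cryptography' package.
import time
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()   # shared symmetric key (key distribution not shown)
fernet = Fernet(key)

payload = b"x" * (10 * 1024 * 1024)   # a 10 MB bulk payload

start = time.perf_counter()
token = fernet.encrypt(payload)       # AES-CBC + HMAC: confidentiality and integrity
elapsed = time.perf_counter() - start
print(f"encrypted {len(payload)} bytes in {elapsed:.2f} s")

assert fernet.decrypt(token) == payload   # round trip succeeds for the intact token

# Integrity in practice: a truncated or altered token is rejected on decryption.
try:
    fernet.decrypt(token[:-1])
except InvalidToken:
    print("tampered ciphertext rejected")
```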