982 results for cache coherence protocols


Relevance: 20.00%

Publisher:

Abstract:

Physical exercise programmes are routinely prescribed in clinical practice to treat impairments and improve activity and participation in daily life because of their known physiological, health and psychological benefits (RCP, 2009). Progressive resistance exercise is a type of exercise prescribed specifically to improve skeletal muscle strength (Latham et al., 2004). The effectiveness of progressive resistance exercise varies considerably between studies and populations. This thesis focuses on how training parameters influence the delivery of progressive resistance exercise. In order to appropriately evaluate the influence of training parameters, this thesis argues the need to record training performance and the total work completed by participants as prescribed by training protocols. In the first study, participants were taken through a series of protocols differentiated by the intensity and volume of training. Training intensity was defined as a proportion of the mean peak torque achieved during maximal voluntary contractions (MVC), and was set at 80% and 40% of the MVC mean peak torque for the high- and low-intensity protocols respectively. Training volume was defined as the total external work achieved over the training period. Measures of training performance were developed to accurately report the intensity, repetitions and work completed during the training period. A second study evaluated training performance on the training protocols over repeated sessions. These protocols were then applied to 3 stroke survivors. Study 1 found sedentary participants could achieve a differentiated training intensity: participants completing the high- and low-intensity protocols trained at 80% and 40% respectively of the MVC mean peak torque. The total work achieved in the high-intensity, low-repetition protocol was lower than the total work achieved in the low-intensity, high-repetition protocol. Study 2 found that, with repeated practice, participants improved their ability to perform the manoeuvres, as shown by a reduction in the variation of the mean training intensity and by achieving the total work specified by the protocol to within a smaller margin of error. When these protocols were applied to 3 stroke survivors, they were able to achieve the specified training intensity but not the total work expected for the protocol, most likely because of an inability to achieve a consistent force throughout the contraction. These results demonstrate the evaluation of training characteristics and support the need to record and report training performance characteristics during progressive resistance exercise, including the total work achieved, in order to elucidate the influence of training parameters on progressive resistance exercise. The lack of accurate training performance data may partly explain the inconsistencies between studies on optimal training parameters for progressive resistance exercise.
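
To make the two training-parameter definitions above concrete, here is a minimal Python sketch that derives a prescribed target torque from an MVC result and accumulates total external work over a set of repetitions. All numbers, variable names, and the approximation of external work as mean torque times angular displacement over a fixed range of motion are illustrative assumptions, not values or methods taken from the thesis.

    import math

    def target_torque(mvc_mean_peak_torque_nm: float, intensity: float) -> float:
        """Prescribed torque at a proportion of MVC mean peak torque,
        e.g. 0.8 for the high-intensity and 0.4 for the low-intensity protocol."""
        return intensity * mvc_mean_peak_torque_nm

    def total_external_work_j(rep_mean_torques_nm, rom_deg: float = 60.0) -> float:
        """Approximate total external work (joules) as the sum over repetitions of
        mean torque times angular displacement; the 60-degree range of motion is
        a placeholder value."""
        rom_rad = math.radians(rom_deg)
        return sum(t * rom_rad for t in rep_mean_torques_nm)

    mvc = 120.0                                   # hypothetical MVC mean peak torque, N*m
    high_target = target_torque(mvc, 0.8)         # 96.0 N*m
    low_target = target_torque(mvc, 0.4)          # 48.0 N*m
    session_work = total_external_work_j([95.0, 97.0, 94.0])   # three example repetitions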

Relevance: 20.00%

Publisher:

Abstract:

Cancer remains an unresolved challenge for modern medicine. Every year millions of people, from children to adults, die because current treatments are unable to meet the challenge, so research must continue into new biomarkers for tumors. Molecular biology has evolved considerably in recent years; however, this knowledge has not yet been applied in clinical medicine. Biological findings should be used to improve diagnostics and treatment modalities. In this thesis, human formalin-fixed, paraffin-embedded colorectal and breast cancer samples were used to optimize a double immunofluorescence staining protocol. Immunohistochemistry was also performed in order to visualize the expression pattern of each biomarker. For double immunofluorescence, the feasibility of using primary antibodies raised in different host species, and in the same host species, was also tested. Finally, the established methods for simultaneous multicolor immunofluorescence imaging of formalin-fixed, paraffin-embedded specimens were applied to the detection of pairs of potential biomarkers of colorectal cancer (EGFR, pmTOR, pAKT, Vimentin, Cytokeratin Pan, Ezrin, E-cadherin) and breast cancer (Securin, PTTG1IP, Cleaved caspase 3, ki67).

Relevance: 20.00%

Publisher:

Abstract:

The rate of consumption of alcoholic beverages has changed, as have the factors that influence it. In order to understand the significance of drinking patterns, this study was conducted with a sample of young adults (N = 260) aged 20 to 30 in Lisbon. The instruments used were the Alcohol Use Disorders Identification Test and the Sense of Coherence Questionnaire. The results show that 10.8% had problems with alcohol. Those with a lower sense of coherence, especially in the dimension of investment capacity, presented more harmful and risky consumption patterns. We conclude that health promotion behaviors should include measures to strengthen the sense of coherence.

Relevance: 20.00%

Publisher:

Abstract:

An extended formulation of a polyhedron P is a linear description of a polyhedron Q together with a linear map π such that π(Q) = P. These objects are of fundamental importance in polyhedral combinatorics and optimization theory, and the subject of a number of studies. Yannakakis’ factorization theorem (Yannakakis in J Comput Syst Sci 43(3):441–466, 1991) provides a surprising connection between extended formulations and communication complexity, showing that the smallest size of an extended formulation of P equals the nonnegative rank of its slack matrix S. Moreover, Yannakakis also shows that the nonnegative rank of S is at most 2^c, where c is the complexity of any deterministic protocol computing S. In this paper, we show that the latter result can be strengthened when we allow protocols to be randomized. In particular, we prove that the base-2 logarithm of the nonnegative rank of any nonnegative matrix equals the minimum complexity of a randomized communication protocol computing the matrix in expectation. Using Yannakakis’ factorization theorem, this implies that the base-2 logarithm of the smallest size of an extended formulation of a polytope P equals the minimum complexity of a randomized communication protocol computing the slack matrix of P in expectation. We show that allowing randomization in the protocol can be crucial for obtaining small extended formulations. Specifically, we prove that for the spanning tree and perfect matching polytopes, small variance in the protocol forces large size in the extended formulation.
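
Written out in symbols (using the common shorthand xc(P) for the smallest size of an extended formulation of P and S_P for its slack matrix, notation the abstract itself does not introduce), the two statements read roughly as follows:

    % Yannakakis' factorization theorem (1991)
    \mathrm{xc}(P) \;=\; \operatorname{rank}_+\!\left(S_P\right)

    % This paper, for any nonnegative matrix M
    \log_2 \operatorname{rank}_+(M) \;=\; \min\left\{\, \mathrm{cc}(\Pi) \;:\; \Pi \text{ a randomized protocol computing } M \text{ in expectation} \right\}

    % Combining the two, for a polytope P with slack matrix S_P
    \log_2 \mathrm{xc}(P) \;=\; \min\left\{\, \mathrm{cc}(\Pi) \;:\; \Pi \text{ computes } S_P \text{ in expectation} \right\}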

Relevance: 20.00%

Publisher:

Abstract:

The primary objective of the present thesis was to determine the extent of intertextual coherence and inter-filmic discourse retained in the Finnish DVD subtitles of the first twelve feature films set in the Marvel Cinematic Universe and ten episodes of the first season of Agents of S.H.I.E.L.D., a transmedia extension of the MCU. The cinematic world of Marvel was chosen as the research data for this study because of the inherent, abundant, and conspicuous nature of its intertextuality. Two categories through which intertextual coherence can be retained in translation were set as the premise of the study: 1) the consistent application of the same form of MCU-related proper names in translation, and 2) the retention of MCU-related allusions in translation when retaining the allusion is a strategic choice. The data was collected and analyzed primarily in terms of this juxtaposition. The examination of the gathered data and the set research questions necessitated the division of audiovisual allusions into three categories: verbal visual allusions, secondary spoken allusions, and primary spoken allusions, the last of which was further divided into ambiguous and unambiguous types. Because of their qualitative inadequacies, unambiguous primary spoken allusions were not eligible as data in the present study. Of the proper names that qualified as data, 33.3 percent were translated consistently in every installment in which they were referenced. In terms of allusions, 76.2 percent of the qualified source-text instances were retained in translation. The results indicate that intertextual elements are more easily identified and retained within the context of a single narrative than when their retention requires treating multiple connected narratives as one interwoven universe.

Relevance: 20.00%

Publisher:

Abstract:

Transmitting sensitive data over non-secret channels has always required encryption technologies to ensure that the data arrives without exposure to eavesdroppers. The Internet has made it possible to transmit vast volumes of data more rapidly and cheaply, and to a wider audience, than ever before. At the same time, strong encryption makes it possible to send data securely, to digitally sign it, to prove it was sent or received, and to guarantee its integrity. Together, the Internet and encryption make bulk transmission of data a commercially viable proposition. However, there are implementation challenges to solve before commercial bulk transmission becomes mainstream. Powerful encryption has a performance cost and may affect quality of service. Without encryption, intercepted data may be illicitly duplicated and re-sold, or its commercial value diminished because its secrecy is lost. Performance degradation and the potential for commercial loss discourage the bulk transmission of data over the Internet in any commercial application. This paper outlines technical solutions to these problems. We develop new technologies and combine existing ones in new and powerful ways to minimise commercial loss without compromising performance or inflating overheads.

Relevance: 20.00%

Publisher:

Abstract:

Secure transmission of bulk data is of interest to many content providers. A commercially viable distribution of content requires technology to prevent unauthorised access. Encryption tools are powerful, but have a performance cost. Without encryption, intercepted data may be illicitly duplicated and re-sold, or its commercial value diminished because its secrecy is lost. Two technical solutions make it possible to perform bulk transmissions while retaining security without too high a performance overhead:

a) Hierarchical encryption - the stronger the encryption, the harder it is to break, but also the more computationally expensive it is. A hierarchical approach to key exchange means that simple and relatively weak encryption and keys are used to encrypt small chunks of data, for example 10 seconds of video. Each chunk has its own key. New keys for this bottom-level encryption are exchanged using slightly stronger encryption; for example, a whole-video key could govern the exchange of the 10-second chunk keys. At a higher level again, there could be daily or weekly keys securing the exchange of whole-video keys, and at a yet higher level a subscriber key could govern the exchange of weekly keys. At higher levels the encryption becomes stronger but is used less frequently, so the overall computational cost is minimal. The main observation is that the value of each encrypted item determines the strength of the key used to secure it.

b) Non-symbolic fragmentation with signal diversity - communications are usually assumed to be sent over a single communications medium, and the data to have been encrypted and/or partitioned in whole-symbol packets. Network and path diversity break up a file or data stream into fragments which are then sent over many different channels, either in the same network or in different networks. For example, a message could be transmitted partly over the phone network and partly via satellite. While TCP/IP does a similar thing in sending different packets over different paths, this is done for load-balancing purposes and is invisible to the end application. Network and path diversity deliberately introduce the same principle as a secure communications mechanism: an eavesdropper would need to intercept not just one transmission path but all paths used. Non-symbolic fragmentation of data is also introduced to further confuse any intercepted stream of data. This involves breaking up data into bit strings which are subsequently disordered prior to transmission. Even if all transmissions were intercepted, the cryptanalyst would still need to determine the fragment boundaries and correctly order them.

These two solutions depart from the usual idea of data encryption. Hierarchical encryption is an extension of the combined encryption of systems such as PGP, but with the distinction that the strength of encryption at each level is determined by the "value" of the data being transmitted. Non-symbolic fragmentation suppresses or destroys bit patterns in the transmitted data in what is essentially a bit-level transposition cipher, but with unpredictable, irregularly sized fragments. Both technologies have applications outside the commercial sphere and can be used in conjunction with other forms of encryption, being functionally orthogonal.
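
By way of illustration, the following Python sketch mimics the two ideas above: a small key hierarchy in which each content chunk is encrypted under its own key, chunk keys are wrapped under a whole-video key and the video key under a subscriber key, plus a bit-level fragmentation step that cuts a message into irregularly sized fragments and disorders them before transmission. The Fernet recipe from the cryptography package stands in for the differently strong ciphers at each level, and the shared shuffle seed stands in for whatever ordering secret sender and receiver agree on; none of these specifics come from the abstract.

    import random
    from cryptography.fernet import Fernet

    # --- a) hierarchical key wrapping: chunk keys <- video key <- subscriber key ---
    subscriber_key = Fernet.generate_key()          # highest level, used rarely
    video_key = Fernet.generate_key()               # governs exchange of chunk keys
    wrapped_video_key = Fernet(subscriber_key).encrypt(video_key)

    def encrypt_chunk(chunk: bytes):
        """Encrypt one small chunk (e.g. ~10 s of video) under its own key and
        wrap that key under the whole-video key."""
        chunk_key = Fernet.generate_key()
        return Fernet(chunk_key).encrypt(chunk), Fernet(video_key).encrypt(chunk_key)

    def decrypt_chunk(ciphertext: bytes, wrapped_chunk_key: bytes) -> bytes:
        """Receiver side: unwrap the video key, then the chunk key, then the data."""
        vk = Fernet(subscriber_key).decrypt(wrapped_video_key)
        return Fernet(Fernet(vk).decrypt(wrapped_chunk_key)).decrypt(ciphertext)

    # --- b) non-symbolic fragmentation: irregular bit-string fragments, disordered ---
    def fragment(data: bytes, seed: int):
        """Cut the message's bit string at irregular positions and shuffle the pieces."""
        bits = "".join(f"{b:08b}" for b in data)
        rng = random.Random(seed)
        pieces, i = [], 0
        while i < len(bits):
            size = rng.randint(3, 11)               # arbitrary fragment sizes
            pieces.append(bits[i:i + size])
            i += size
        rng.shuffle(pieces)                         # disorder before transmission
        return pieces

    def defragment(shuffled, seed: int) -> bytes:
        """Replay the same PRNG to recover the fragment count and ordering."""
        total = sum(len(p) for p in shuffled)
        rng = random.Random(seed)
        n, count = 0, 0
        while n < total:
            n += rng.randint(3, 11)
            count += 1
        order = list(range(count))
        rng.shuffle(order)                          # same permutation the sender applied
        pieces = [None] * count
        for k, pos in enumerate(order):
            pieces[pos] = shuffled[k]
        bits = "".join(pieces)
        return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

    payload = b"ten seconds of video"
    ct, wrapped = encrypt_chunk(payload)
    assert decrypt_chunk(ct, wrapped) == payload
    assert defragment(fragment(payload, seed=2024), seed=2024) == payload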