998 results for 13368-014


Relevance: 10.00%

Publisher:

Abstract:

Background and purpose: Accelerated partial breast irradiation (APBI) is a strategy that allows adjuvant treatment to be delivered over a shorter period of time to smaller volumes. This study was undertaken to assess the effectiveness and outcomes of APBI in breast cancer compared with whole-breast irradiation (WBI). Material and methods: Systematic review and meta-analysis of randomized controlled trials of WBI versus APBI. Two authors independently selected and assessed the studies against the eligibility criteria. Results: Eight studies were selected. A total of 8653 patients were randomly assigned to WBI versus APBI. Six studies reported local recurrence outcomes. Two studies could be matched at 5 years of follow-up and only one study at other follow-up times. Meta-analysis of two trials assessing 1407 participants showed a significant difference between the WBI and APBI groups in the 5-year local recurrence rate (HR = 4.54, 95% CI: 1.78-11.61, p = 0.002). A significant difference in favor of WBI was also found at other follow-up times. No differences in nodal recurrence, systemic recurrence, overall survival or mortality rates were observed. Conclusions: APBI is associated with higher local recurrence than WBI, without compromising other clinical outcomes. (C) 2014 Elsevier Ireland Ltd. All rights reserved.
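
For orientation on how a pooled estimate like the HR of 4.54 (95% CI 1.78-11.61) reported above is typically computed, the sketch below shows fixed-effect inverse-variance pooling of log hazard ratios in Python. The two per-trial log hazard ratios and standard errors are hypothetical placeholders, not the values of the trials in this review.

```python
import math

# Hypothetical per-trial (log HR, SE of log HR) pairs -- illustration only.
trials = [(math.log(5.2), 0.55), (math.log(3.9), 0.70)]

weights = [1.0 / se ** 2 for _, se in trials]              # inverse-variance weights
pooled_log_hr = sum(w * lhr for (lhr, _), w in zip(trials, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))                  # SE of the pooled log HR

hr = math.exp(pooled_log_hr)
lo, hi = (math.exp(pooled_log_hr + z * pooled_se) for z in (-1.96, 1.96))
print(f"pooled HR = {hr:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```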

Relevance: 10.00%

Publisher:

Abstract:

On a Dreyfusian account, performers choke when they reflect upon and interfere with established routines of purely embodied expertise. This basic explanation of choking remains popular even today and apparently enjoys empirical support. Its driving insight can be understood through the lens of diverse philosophical visions of the embodied basis of expertise. These range from accounts of embodied cognition that are ultra-conservative with respect to representational theories of cognition to those that are more radically embodied. This paper provides an account of the acquisition of embodied expertise, and an explanation of the choking effect, from the most radically enactive, embodied perspective, spelling out some of its practical implications and addressing some possible philosophical challenges. Specifically, we propose: (i) an explanation of how skills can be acquired on the basis of ecological dynamics; and (ii) a non-linear pedagogy that takes into account how contentful representations might scaffold skill acquisition from a radically enactive perspective.

Relevance: 10.00%

Publisher:

Abstract:

McArdle disease is arguably the paradigm of exercise intolerance in humans. This disorder is caused by an inherited deficiency of myophosphorylase, the enzyme isoform that initiates glycogen breakdown in skeletal muscles. Because patients are unable to obtain energy from their muscle glycogen stores, this disease provides an interesting model of study for exercise physiologists, allowing insight to be gained into glycogen-dependent muscle functions. Also of special interest in the fields of muscle physiology and sports medicine are some specific (if not unique) characteristics of this disorder, such as the so-called 'second wind' phenomenon, the frequent exercise-induced rhabdomyolysis and myoglobinuria episodes suffered by patients (with muscle damage also occurring under basal conditions), and the early appearance of fatigue and contractures, among others. In this article we review the main pathophysiological features of this disorder that lead to exercise intolerance, as well as the currently available therapeutic possibilities.

Relevance: 10.00%

Publisher:

Abstract:

Gillmore, G., Gilbertson, D., Grattan, J., Hunt, C., McLaren, S., Pyatt, B., Banda, R., Barker, G., Denman, A., Phillips, P. and Reynolds, T. The potential risk from ²²²radon posed to archaeologists and earth scientists: reconnaissance study of radon concentrations, excavations and archaeological shelters in the Great Cave of Niah, Sarawak, Malaysia. Ecotoxicology and Environmental Safety, 60 (2005), pp. 213-227.

Relevance: 10.00%

Publisher:

Abstract:

R. Jensen and Q. Shen, 'Fuzzy-Rough Data Reduction with Ant Colony Optimization,' Fuzzy Sets and Systems, vol. 149, no. 1, pp. 5-20, 2005.

Relevance: 10.00%

Publisher:

Abstract:

R. Zwiggelaar, T.C. Parr, J.E. Schumm, I.W. Hutt, S.M. Astley, C.J. Taylor and C.R.M. Boggis, 'Model-based detection of spiculated lesions in mammograms', Medical Image Analysis 3 (1), 39-62 (1999)

Relevance: 10.00%

Publisher:

Abstract:

Davison G, Gleeson M (2006). The effect of 2 weeks vitamin C supplementation on immunoendocrine responses to 2.5 h cycling exercise in man. European Journal of Applied Physiology 97(4): 454-461

Relevance: 10.00%

Publisher:

Abstract:

This paper analyses the asymptotic properties of nonlinear least squares estimators of the long run parameters in a bivariate unbalanced cointegration framework. Unbalanced cointegration refers to the situation where the integration orders of the observables are different, but their corresponding balanced versions (with equal integration orders after filtering) are cointegrated in the usual sense. Within this setting, the long run linkage between the observables is driven by both the cointegrating parameter and the difference between the integration orders of the observables, which we consider to be unknown. Our results reveal three noticeable features. First, superconsistent (faster than √n-consistent) estimators of the difference between memory parameters are achievable. Next, the joint limiting distribution of the estimators of both parameters is singular, and, finally, a modified version of the "Type II" fractional Brownian motion arises in the limiting theory. A Monte Carlo experiment and the discussion of an economic example are included.
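
As a stylized illustration of the setting (not the paper's estimator or its asymptotic theory), the sketch below simulates a bivariate unbalanced pair with x_t ~ I(1) and y_t ~ I(1 + d0), then recovers both the cointegrating parameter and the unknown gap between the integration orders by nonlinear least squares, profiling out the slope over a grid of candidate memory gaps. Sample size, true parameter values and the grid are arbitrary choices.

```python
import numpy as np

def fracdiff(y, d):
    """Type II fractional difference (1-L)^d of a series that starts at t = 0."""
    n = len(y)
    w = np.zeros(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k      # binomial expansion of (1-L)^d
    out = np.zeros(n)
    for t in range(n):
        out[t] = np.dot(w[:t + 1], y[t::-1])
    return out

rng = np.random.default_rng(0)
n, beta0, d0 = 500, 2.0, 0.4

x = np.cumsum(rng.standard_normal(n))          # x_t ~ I(1)
z = fracdiff(x, -d0)                           # (1-L)^{-d0} x_t, i.e. I(1 + d0)
y = beta0 * z + rng.standard_normal(n)         # y_t ~ I(1 + d0): unbalanced with x_t

# NLS: minimise over (beta, d) the sum of ((1-L)^d y_t - beta * x_t)^2,
# profiling beta out by OLS for each candidate d on a grid.
best = None
for d in np.linspace(0.0, 0.8, 81):
    yd = fracdiff(y, d)
    b = np.dot(yd, x) / np.dot(x, x)
    ssr = np.sum((yd - b * x) ** 2)
    if best is None or ssr < best[0]:
        best = (ssr, d, b)

print(f"estimated d = {best[1]:.2f} (true {d0}), beta = {best[2]:.2f} (true {beta0})")
```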

Relevance: 10.00%

Publisher:

Abstract:

Postgraduate project/dissertation presented to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Dental Medicine

Relevance: 10.00%

Publisher:

Abstract:

This paper presents an algorithm which extends the relatively new notion of speculative concurrency control by delaying the commitment of transactions, thus allowing other conflicting transactions to continue execution and commit rather than restart. This algorithm propagates uncommitted data to other outstanding transactions thus allowing more speculative schedules to be considered. The algorithm is shown always to find a serializable schedule, and to avoid cascading aborts. Like speculative concurrency control, it considers strictly more schedules than traditional concurrency control algorithms. Further work is needed to determine which of these speculative methods performs better on actual transaction loads.
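
The toy sketch below is not the algorithm of the paper, only an illustration of its central idea: an uncommitted write is propagated to a reader transaction as an extra speculative branch, so the reader can commit without a restart whichever way the writer's fate is resolved. Transaction names and values are invented for the example.

```python
class Branch:
    """One speculative execution branch of a reader transaction."""
    def __init__(self, snapshot, depends_on):
        self.snapshot = dict(snapshot)    # the values this branch reads
        self.depends_on = depends_on      # writers whose commit this branch assumes
        self.result = None

committed = {"x": 0}          # last committed database state
t1_uncommitted = {"x": 42}    # T1 has written x = 42 but has not committed yet

# T2 wants to read x.  Instead of blocking T2 or restarting anyone, keep two
# branches: one assuming T1 aborts, one assuming T1 commits (and therefore
# reading T1's uncommitted value, which has been propagated to T2).
branches = [
    Branch(committed, depends_on=frozenset()),
    Branch({**committed, **t1_uncommitted}, depends_on=frozenset({"T1"})),
]

for b in branches:                        # T2 executes speculatively on every branch
    b.result = b.snapshot["x"] + 1        # T2's computation

def resolve(t1_commits: bool):
    """Once T1's fate is known, keep exactly the branch whose assumption holds,
    so T2 can commit immediately without a restart."""
    assumption = frozenset({"T1"}) if t1_commits else frozenset()
    return next(b.result for b in branches if b.depends_on == assumption)

print(resolve(t1_commits=True))    # 43: T2 commits having read T1's value
print(resolve(t1_commits=False))   # 1:  T2 commits having read the old value
```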

Relevance: 10.00%

Publisher:

Abstract:

Two new notions of reduction for terms of the λ-calculus are introduced and the question of whether a λ-term is beta-strongly normalizing is reduced to the question of whether a λ-term is merely normalizing under one of the new notions of reduction. This leads to a new way to prove beta-strong normalization for typed λ-calculi. Instead of the usual semantic proof style based on Girard's "candidats de réductibilité", termination can be proved using a decreasing metric over a well-founded ordering in a style more common in the field of term rewriting. This new proof method is applied to the simply-typed λ-calculus and the system of intersection types.
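
For background only, the sketch below implements ordinary beta-reduction with a leftmost-outermost (normal-order) strategy and a bounded check of whether a term reaches a normal form; it does not implement the paper's two new notions of reduction or the reduction of strong normalization to normalization, and the step bound is an arbitrary cutoff.

```python
# Terms: ("var", name) | ("lam", name, body) | ("app", f, a)

def free_vars(t):
    tag = t[0]
    if tag == "var":
        return {t[1]}
    if tag == "lam":
        return free_vars(t[2]) - {t[1]}
    return free_vars(t[1]) | free_vars(t[2])

def fresh(name, avoid):
    while name in avoid:
        name += "'"
    return name

def subst(t, x, s):
    """Capture-avoiding substitution t[x := s]."""
    tag = t[0]
    if tag == "var":
        return s if t[1] == x else t
    if tag == "app":
        return ("app", subst(t[1], x, s), subst(t[2], x, s))
    y, body = t[1], t[2]
    if y == x:
        return t
    if y in free_vars(s):                      # rename the binder to avoid capture
        z = fresh(y, free_vars(s) | free_vars(body))
        body, y = subst(body, y, ("var", z)), z
    return ("lam", y, subst(body, x, s))

def step(t):
    """One leftmost-outermost beta step, or None if t is in normal form."""
    tag = t[0]
    if tag == "app":
        f, a = t[1], t[2]
        if f[0] == "lam":
            return subst(f[2], f[1], a)
        r = step(f)
        if r is not None:
            return ("app", r, a)
        r = step(a)
        return None if r is None else ("app", f, r)
    if tag == "lam":
        r = step(t[2])
        return None if r is None else ("lam", t[1], r)
    return None

def normalizes(t, max_steps=1000):
    for _ in range(max_steps):
        r = step(t)
        if r is None:
            return True, t
        t = r
    return False, t    # gave up: possibly non-normalizing

I = ("lam", "y", ("var", "y"))
self_app = ("lam", "x", ("app", ("var", "x"), ("var", "x")))
print(normalizes(("app", self_app, I)))           # normalizes to the identity
print(normalizes(("app", self_app, self_app)))    # Omega: never reaches a normal form
```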

Relevance: 10.00%

Publisher:

Abstract:

Extensible systems allow services to be configured and deployed for the specific needs of individual applications. This paper describes a safe and efficient method for user-level extensibility that requires only minimal changes to the kernel. A sandboxing technique is described that supports multiple logical protection domains within the same address space at user level. This approach allows applications to register sandboxed code with the system, which may then be executed in the context of any process. Our approach differs from other implementations that require special hardware support, such as segmentation or tagged translation look-aside buffers (TLBs), to either implement multiple protection domains in a single address space or to support fast switching between address spaces. Likewise, we do not require the entire system to be written in a type-safe language to provide fine-grained protection domains. Instead, our user-level sandboxing technique requires only page-based virtual memory support, together with the requirement that extension code be written either in a type-safe language or by a trusted source. Using a fast method of upcalls, we show how our sandboxing technique for implementing logical protection domains provides significant performance improvements over traditional methods of invoking user-level services. Experimental results show our approach to be an efficient method for extensibility, with inter-protection-domain communication costs close to those of hardware-based solutions leveraging segmentation.
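
The paper's mechanism depends on kernel-assisted fast upcalls, but the core idea of carving multiple logical protection domains out of one address space using only paged virtual memory can be sketched from user level. The Linux-only snippet below is an illustration under that assumption, not the authors' implementation: each 'extension' gets its own anonymous page, and mprotect makes only the active domain's page accessible on a domain switch.

```python
import ctypes, mmap, os

# Linux-only sketch: one anonymous page per logical protection domain,
# all living inside this single process's address space.
libc = ctypes.CDLL(None, use_errno=True)
libc.mprotect.argtypes = (ctypes.c_void_p, ctypes.c_size_t, ctypes.c_int)
libc.mprotect.restype = ctypes.c_int

PAGE = mmap.PAGESIZE
PROT_NONE = 0
PROT_RW = mmap.PROT_READ | mmap.PROT_WRITE

domains = {name: mmap.mmap(-1, PAGE) for name in ("ext_a", "ext_b")}
# Cache each page's address while every page is still readable/writable.
addrs = {name: ctypes.addressof(ctypes.c_char.from_buffer(m))
         for name, m in domains.items()}

def switch_to(active):
    """Make only the active domain's page accessible; the others become
    PROT_NONE, so a stray access by misbehaving extension code faults
    instead of silently corrupting another domain's state."""
    for name in domains:
        prot = PROT_RW if name == active else PROT_NONE
        if libc.mprotect(ctypes.c_void_p(addrs[name]), PAGE, prot) != 0:
            err = ctypes.get_errno()
            raise OSError(err, os.strerror(err))

switch_to("ext_a")
domains["ext_a"][:5] = b"hello"   # allowed: ext_a is the active domain
switch_to("ext_b")                # ext_a's page is now inaccessible until reactivated
```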

Relevance: 10.00%

Publisher:

Abstract:

BoostMap is a recently proposed method for efficient approximate nearest neighbor retrieval in arbitrary non-Euclidean spaces with computationally expensive and possibly non-metric distance measures. Database and query objects are embedded into a Euclidean space, in which similarities can be rapidly measured using a weighted Manhattan distance. The key idea is formulating embedding construction as a machine learning task, where AdaBoost is used to combine simple, 1D embeddings into a multidimensional embedding that preserves a large amount of the proximity structure of the original space. This paper demonstrates that, using the machine learning formulation of BoostMap, we can optimize embeddings for indexing and classification, in ways that are not possible with existing alternatives for constructive embeddings, and without additional costs in retrieval time. First, we show how to construct embeddings that are query-sensitive, in the sense that they yield a different distance measure for different queries, so as to improve nearest neighbor retrieval accuracy for each query. Second, we show how to optimize embeddings for nearest neighbor classification tasks, by tuning them to approximate a parameter space distance measure, instead of the original feature-based distance measure.
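
As a minimal sketch of the style of embedding BoostMap builds on (reference-object 1D embeddings F_r(x) = D(x, r) combined under a weighted Manhattan distance, used in a filter-and-refine lookup), the example below uses edit distance as the expensive non-Euclidean measure. The AdaBoost weight learning and the query-sensitive and parameter-space extensions described in the abstract are not implemented; the weights are simply uniform.

```python
import random

def edit_distance(a, b):
    """Levenshtein distance: the expensive, non-Euclidean distance we want to avoid."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

random.seed(0)
database = ["".join(random.choices("abcd", k=random.randint(5, 12))) for _ in range(300)]

# 1D "reference object" embeddings F_r(x) = D(x, r); BoostMap learns weights for many
# such embeddings with AdaBoost, here they are simply uniform.
references = random.sample(database, 10)
weights = [1.0] * len(references)

def embed(x):
    return [edit_distance(x, r) for r in references]

db_embedded = [embed(x) for x in database]

def weighted_l1(u, v):
    return sum(w * abs(a - b) for w, a, b in zip(weights, u, v))

def approx_nn(query, shortlist_size=15):
    """Filter-and-refine: rank cheaply in the embedded space, then re-rank a
    short candidate list with the exact distance."""
    q = embed(query)
    ranked = sorted(range(len(database)), key=lambda i: weighted_l1(q, db_embedded[i]))
    shortlist = [database[i] for i in ranked[:shortlist_size]]
    return min(shortlist, key=lambda x: edit_distance(query, x))

query = "abcabca"
print("approximate NN:", approx_nn(query))
print("exact NN      :", min(database, key=lambda x: edit_distance(query, x)))
```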

Relevance: 10.00%

Publisher:

Abstract:

This is an addendum to our technical report BUCS TR-94-014 of December 19, 1994. It clarifies some statements, adds information on some related research, includes a comparison with research by de Groote, and fixes two minor mistakes in a proof.

Relevance: 10.00%

Publisher:

Abstract:

As distributed information services like the World Wide Web become increasingly popular on the Internet, problems of scale are clearly evident. A promising technique that addresses many of these problems is service (or document) replication. However, when a service is replicated, clients then need the additional ability to find a "good" provider of that service. In this paper we report on techniques for finding good service providers without a priori knowledge of server location or network topology. We consider the use of two principal metrics for measuring distance in the Internet: hops, and round-trip latency. We show that these two metrics yield very different results in practice. Surprisingly, we show data indicating that the number of hops between two hosts in the Internet is not strongly correlated to round-trip latency. Thus, the distance in hops between two hosts is not necessarily a good predictor of the expected latency of a document transfer. Instead of using known or measured distances in hops, we show that the extra cost at runtime incurred by dynamic latency measurement is well justified based on the resulting improved performance. In addition, we show that selection based on dynamic latency measurement performs much better in practice than any static selection scheme. Finally, the difference between the distribution of hops and latencies is fundamental enough to suggest differences in algorithms for server replication. We show that conclusions drawn about service replication based on the distribution of hops need to be revised when the distribution of latencies is considered instead.
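
A minimal sketch of dynamic latency-based server selection in the spirit of these findings: probe each candidate replica at request time (here, the round trip is approximated by TCP connection-setup time) and pick the fastest, rather than relying on a static metric such as hop count. The hostnames and port are placeholders.

```python
import socket
import time

# Hypothetical replica hostnames and port; a real deployment would obtain
# this list from the replicated service itself.
REPLICAS = ["replica1.example.org", "replica2.example.org", "replica3.example.org"]
PORT = 80

def connect_latency(host, port=PORT, timeout=2.0):
    """Estimate round-trip latency from the time a TCP handshake takes."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return time.monotonic() - start
    except OSError:
        return float("inf")    # unreachable servers simply lose the selection

def pick_server(hosts):
    """Dynamic selection: probe every candidate at request time, pick the fastest."""
    latencies = {h: connect_latency(h) for h in hosts}
    return min(latencies, key=latencies.get), latencies

if __name__ == "__main__":
    best, latencies = pick_server(REPLICAS)
    for host, rtt in sorted(latencies.items(), key=lambda kv: kv[1]):
        label = "unreachable" if rtt == float("inf") else f"{rtt * 1000:.1f} ms"
        print(f"{host:30s} {label}")
    print("selected:", best)
```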