946 results for incremental computation


Relevance:

20.00%

Publisher:

Abstract:

Determination of an 'anaerobic threshold' plays an important role in the interpretation of an incremental cardiopulmonary exercise test and describes prominent changes in blood lactate accumulation with increasing workload. Two lactate thresholds are discerned during cardiopulmonary exercise testing and used for estimating physical fitness or prescribing training. A multitude of different terms is, however, found in the literature to describe the two thresholds. Furthermore, the term 'anaerobic threshold' is used synonymously for both the 'first' and the 'second' lactate threshold, creating great potential for confusion. The aim of this review is therefore to organize the terminology, present threshold concepts, and describe methods for lactate threshold determination using a three-phase model, with reference to the historical and physiological background, to facilitate the practical application of the term 'anaerobic threshold'.
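
The review surveys determination methods rather than prescribing one, but a minimal Python sketch, with illustrative data and criteria, shows how two common constructions operate on workload and blood-lactate measurements from an incremental test: a baseline-plus-fixed-rise rule for the first threshold and a Dmax-style maximal-distance rule for the second. The +0.5 mmol/L rise, the example data, and the function names are assumptions for illustration only.

    import numpy as np

    def lactate_thresholds(workload, lactate):
        """Illustrative estimates of the first and second lactate thresholds.

        workload : increasing power or speed steps of the incremental test
        lactate  : blood lactate (mmol/L) measured at each step
        """
        workload = np.asarray(workload, dtype=float)
        lactate = np.asarray(lactate, dtype=float)

        # First threshold: first step where lactate exceeds baseline + 0.5 mmol/L.
        baseline = lactate[:2].mean()
        lt1 = workload[np.argmax(lactate > baseline + 0.5)]

        # Second threshold (Dmax-style): point of maximal distance between the
        # lactate curve and the straight line joining its first and last points.
        p0 = np.array([workload[0], lactate[0]])
        line = np.array([workload[-1], lactate[-1]]) - p0
        pts = np.column_stack([workload, lactate]) - p0
        dist = np.abs(pts[:, 0] * line[1] - pts[:, 1] * line[0]) / np.hypot(*line)
        lt2 = workload[np.argmax(dist)]
        return lt1, lt2

    print(lactate_thresholds([100, 150, 200, 250, 300, 350],
                             [1.1, 1.2, 1.8, 2.6, 4.1, 7.0]))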

Relevance:

20.00%

Publisher:

Abstract:

Highly available software systems occasionally need to be updated while avoiding downtime. Dynamic software updates reduce downtime, but still require the system to reach a quiescent state in which a global update can be performed. This can be difficult for multi-threaded systems. We present a novel approach to dynamic updates using first-class contexts, called Theseus. First-class contexts make global updates unnecessary: existing threads run to termination in an old context, while new threads start in a new, updated context; consistency between contexts is ensured with the help of bidirectional transformations. We show that for multi-threaded systems with coherent memory, first-class contexts offer a practical and flexible approach to dynamic updates, with acceptable overhead.
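
A minimal Python sketch of the idea, not of Theseus itself: a context object bundles one version of the program's behaviour and state, each thread keeps the context it captured when it started, and newly started threads pick up the most recently installed context. Only the forward direction of the state transformation is shown; a full bidirectional transformation would also map updates made in the new context back to the old one. All names here are hypothetical.

    import threading

    class Context:
        """One version of the program's behaviour and data layout."""
        def __init__(self, version, handler, state):
            self.version, self.handler, self.state = version, handler, state

    _current = Context(1, lambda req, state: f"v1:{req}", {"count": 0})
    _lock = threading.Lock()

    def install_update(new_handler, to_new):
        """Install an updated context; threads already running keep the old one."""
        global _current
        with _lock:
            old = _current
            _current = Context(old.version + 1, new_handler, to_new(old.state))

    def worker(request):
        ctx = _current              # captured once: this thread runs to completion in ctx
        print(ctx.handler(request, ctx.state))

    # The first thread runs in v1; the thread started after the update runs in v2.
    t1 = threading.Thread(target=worker, args=("a",))
    t1.start(); t1.join()
    install_update(lambda req, state: f"v2:{req}", to_new=lambda s: dict(s, migrated=True))
    threading.Thread(target=worker, args=("b",)).start()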

Relevance:

20.00%

Publisher:

Abstract:

Researchers suggest that personalization on the Semantic Web will eventually add up to a Web 3.0. In this Web, personalized agents, rather than humans, process and thus generate the largest share of information. In the sense of emergent semantics, which supplements the traditional formal semantics of the Semantic Web, this is well conceivable. An emergent Semantic Web underpinned by a fuzzy grassroots ontology can be accomplished by inducing knowledge from users' common parlance in mutual Web 2.0 interactions [1]. These ontologies can also be matched against existing Semantic Web ontologies to create comprehensive top-level ontologies. If augmented on the Web with information in the form of restrictions and associated reliability (Z-numbers) [2], this collection of fuzzy ontologies constitutes an important basis for an implementation of Zadeh's restriction-centered theory of reasoning and computation (RRC) [3]. By considering the real world's fuzziness, RRC differs from traditional approaches because it can handle restrictions described in natural language. A restriction is an answer to a question about the value of a variable, such as the duration of an appointment. In addition to mathematically well-defined answers, RRC can likewise deal with unprecisiated answers such as "about one hour." Inspired by mental functions, it constitutes an important basis for leveraging present-day Web efforts into a natural Web 3.0. Based on natural language information, RRC may be accomplished with Z-number calculation to achieve personalized Web reasoning and computation. Finally, through their understanding of natural language, Web agents can react to humans more intuitively and thus generate and process information.
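
As a concrete illustration of the restriction-and-reliability pairing described above, the sketch below models a Z-number Z = (A, B) in Python: A is a fuzzy restriction on a variable's value (here a triangular membership for "about one hour") and B is a fuzzy restriction on the reliability of A. The class, the triangular shapes, and the chosen parameters are illustrative assumptions, not definitions taken from [2] or [3].

    from dataclasses import dataclass
    from typing import Callable

    def triangular(a: float, b: float, c: float) -> Callable[[float], float]:
        """Triangular membership function with support (a, c) and peak at b."""
        def mu(x: float) -> float:
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
        return mu

    @dataclass
    class ZNumber:
        """Z = (A, B): A restricts the variable's value, B restricts A's reliability."""
        restriction: Callable[[float], float]   # A, e.g. "about one hour"
        reliability: Callable[[float], float]   # B, e.g. "quite sure"

    duration = ZNumber(
        restriction=triangular(45, 60, 75),      # minutes: "about one hour"
        reliability=triangular(0.7, 0.85, 1.0),  # degree of sureness: "quite sure"
    )
    print(duration.restriction(55))  # membership of 55 minutes in "about one hour"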

Relevance:

20.00%

Publisher:

Abstract:

The responses of carbon dioxide (CO2) and other climate variables to an emission pulse of CO2 into the atmosphere are often used to compute the Global Warming Potential (GWP) and Global Temperature change Potential (GTP), to characterize the response timescales of Earth System models, and to build reduced-form models. In this carbon cycle-climate model intercomparison project, which spans the full model hierarchy, we quantify responses to emission pulses of different magnitudes injected under different conditions. The CO2 response shows the known rapid decline in the first few decades followed by a millennium-scale tail. For a 100 Gt-C emission pulse added to a constant CO2 concentration of 389 ppm, 25 ± 9% is still found in the atmosphere after 1000 yr; the ocean has absorbed 59 ± 12% and the land the remainder (16 ± 14%). The response in global mean surface air temperature is an increase of 0.20 ± 0.12 °C within the first twenty years; thereafter and until year 1000, temperature decreases only slightly, whereas ocean heat content and sea level continue to rise. Our best estimate for the Absolute Global Warming Potential (AGWP), given by the time-integrated response in CO2 at year 100 multiplied by its radiative efficiency, is 92.5 × 10⁻¹⁵ yr W m⁻² per kg CO2. This value very likely (5 to 95% confidence) lies within the range of (68 to 117) × 10⁻¹⁵ yr W m⁻² per kg CO2. Estimates of the time-integrated response in CO2 published in the IPCC First, Second, and Fourth Assessment Reports and our multi-model best estimate all agree within 15% during the first 100 yr. The integrated CO2 response, normalized by the pulse size, is lower for pre-industrial conditions than for present day, and lower for smaller pulses than for larger ones. In contrast, the responses in temperature, sea level, and ocean heat content are less sensitive to these choices. Although choices of pulse size, background concentration, and model lead to uncertainties, the most important and subjective choice determining the AGWP of CO2, and hence the GWP, is the time horizon.
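
The AGWP definition used above (time-integrated CO2 response multiplied by the radiative efficiency) can be written down in a few lines; the Python sketch below uses a generic multi-exponential impulse-response fit with placeholder coefficients and a placeholder radiative efficiency, not the fitted values of this study.

    import numpy as np

    # Illustrative impulse-response fit: fraction of a CO2 pulse still airborne after
    # t years, IRF(t) = a0 + sum_i a_i * exp(-t / tau_i)   (placeholder coefficients)
    A   = [0.22, 0.28, 0.28, 0.22]
    TAU = [np.inf, 300.0, 30.0, 5.0]

    def irf(t):
        return sum(a * np.exp(-t / tau) for a, tau in zip(A, TAU))

    def agwp_co2(horizon_yr=100.0, radiative_efficiency=1.7e-15, dt=0.1):
        """AGWP = radiative efficiency (W m-2 per kg CO2) x time integral of the IRF."""
        t = np.arange(0.0, horizon_yr + dt, dt)
        return radiative_efficiency * np.trapz(irf(t), t)   # yr W m-2 per kg CO2

    print(f"AGWP-100 ~ {agwp_co2():.2e} yr W m-2 per kg CO2")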

Relevance:

20.00%

Publisher:

Abstract:

Na(+)/Ca(2+) exchangers (NCX) constitute a major Ca(2+) export system that facilitates the re-establishment of cytosolic Ca(2+) levels in many tissues. Ca(2+) interactions at its Ca(2+) binding domains (CBD1 and CBD2) are essential for the allosteric regulation of Na(+)/Ca(2+) exchange activity. The structure of the Ca(2+)-bound form of CBD1, the primary Ca(2+) sensor from canine NCX1, has been reported, but not the Ca(2+)-free form, and the molecular mechanism of Ca(2+) regulation remains unclear. Here, we report crystal structures for three distinct Ca(2+) binding states of CBD1 from CALX, a Na(+)/Ca(2+) exchanger found in Drosophila sensory neurons. The fully Ca(2+)-bound CALX-CBD1 structure shows that four Ca(2+) ions bind at Ca(2+) binding sites identical to those found in NCX1, and the partially occupied and apo-form structures exhibit progressive conformational transitions, indicating incremental regulation of CALX exchange by successive Ca(2+) binding at CBD1. The structures also predict that the primary Ca(2+) pair plays the main role in triggering functional conformational changes. Confirming this prediction, mutagenesis of Glu(455), which coordinates the primary Ca(2+) pair, produces dramatic reductions of the regulatory Ca(2+) affinity for exchange current, whereas mutagenesis of Glu(520), which coordinates the secondary Ca(2+) pair, has much smaller effects. Furthermore, our structures indicate that Ca(2+) binding only enhances the stability of the Ca(2+) binding site of CBD1 near the hinge region while the overall structure of CBD1 remains largely unaffected, implying that the Ca(2+) regulatory function of CBD1, and possibly that of the entire NCX family, is mediated through domain interactions between CBD1 and the adjacent CBD2 at this hinge.

Relevance:

20.00%

Publisher:

Abstract:

There is great demand for easily accessible, user-friendly dietary self-management applications. Yet accurate, fully automatic estimation of nutritional intake using computer vision methods remains an open research problem. One key element of this problem is volume estimation, which can be computed from 3D models obtained using multi-view geometry. This paper presents a computational system for volume estimation based on the processing of two meal images: a 3D model of the served meal is reconstructed from the acquired images and the volume is computed from the resulting shape. The algorithm was tested on food models (dummy foods) with known volume and on real served food. Volume accuracy was on the order of 90%, while the total execution time was below 15 seconds per image pair. The proposed system combines simple and computationally affordable methods for 3D reconstruction, proved stable throughout the experiments, operates in near real time, and places minimal constraints on users.
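
The last step of such a pipeline, going from a reconstructed 3D shape to a volume, can be sketched as follows in Python; the convex hull is just one simple choice of shape model, and the point cloud below is a hypothetical stand-in for the output of the two-view reconstruction, not the paper's actual method or data.

    import numpy as np
    from scipy.spatial import ConvexHull

    def volume_from_points(points_m):
        """Volume (m^3) of the convex hull of a reconstructed 3D point cloud."""
        return ConvexHull(points_m).volume

    # Hypothetical reconstructed point cloud of a served meal, coordinates in metres.
    rng = np.random.default_rng(0)
    cloud = rng.uniform([0.0, 0.0, 0.0], [0.12, 0.10, 0.04], size=(2000, 3))
    print(f"estimated volume: {volume_from_points(cloud) * 1e6:.0f} ml")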

Relevance:

20.00%

Publisher:

Abstract:

Well-known data mining algorithms rely on inputs in the form of pairwise similarities between objects. For large datasets it is computationally infeasible to perform all pairwise comparisons. We therefore propose a novel approach that uses approximate Principal Component Analysis to efficiently identify groups of similar objects. The effectiveness of the approach is demonstrated in the context of binary classification, using the supervised normalized cut as the classifier. For large datasets from the UCI repository, the approach significantly improves run times with minimal loss in accuracy.
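
A minimal Python sketch of the general idea, not necessarily the authors' exact algorithm: compute a few approximate principal components with a randomized projection, then group objects by their low-dimensional coordinates instead of evaluating all O(n^2) pairwise similarities. The random data, number of components, and quantization-based grouping rule are illustrative.

    import numpy as np

    def approximate_pcs(X, k, oversample=10, seed=0):
        """Top-k approximate principal components via a randomized range finder + SVD."""
        rng = np.random.default_rng(seed)
        Xc = X - X.mean(axis=0)
        Q, _ = np.linalg.qr(Xc @ rng.standard_normal((X.shape[1], k + oversample)))
        _, _, Vt = np.linalg.svd(Q.T @ Xc, full_matrices=False)
        return Vt[:k]                                    # (k, n_features)

    X = np.random.default_rng(1).standard_normal((10000, 50))
    V = approximate_pcs(X, k=3)
    coords = (X - X.mean(axis=0)) @ V.T                  # low-dimensional embedding

    # Objects whose rounded coordinates coincide are candidates for the same group.
    buckets = {}
    for i, key in enumerate(map(tuple, np.round(coords).astype(int))):
        buckets.setdefault(key, []).append(i)
    print(f"{len(buckets)} candidate groups from {X.shape[0]} objects")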

Relevance:

20.00%

Publisher:

Abstract:

Pre-combined SLR-GNSS solutions are studied and the impact of different types of datum definition on the estimated parameters is assessed. The origin is realized best by using only the SLR core network to define the geodetic datum; including the GNSS core sites degrades it. The orientation, however, requires a dense and continuous network, so the inclusion of the GNSS core network is essential.
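
As a loose illustration of how such datum choices can be expressed, the Python sketch below estimates small Helmert parameters by least squares: the translation (origin) over a set of SLR core sites only and the rotation (orientation) over the combined SLR+GNSS core network. The station coordinates, site selections, and linearized small-rotation model are illustrative assumptions, not the study's processing setup.

    import numpy as np

    def fit_translation(est, ref):
        """Origin: mean offset between estimated and reference coordinates."""
        return (ref - est).mean(axis=0)

    def fit_rotation(est, ref):
        """Orientation: small rotation angles (rad) from dX ~ r x X by least squares."""
        A = np.vstack([np.array([[0.0,  x[2], -x[1]],
                                 [-x[2], 0.0,  x[0]],
                                 [x[1], -x[0], 0.0]]) for x in est])
        r, *_ = np.linalg.lstsq(A, (ref - est).ravel(), rcond=None)
        return r

    # Hypothetical core networks (geocentric coordinates in metres) and a simulated
    # millimetre-level origin offset between the solution and the reference frame.
    rng = np.random.default_rng(2)
    slr_core = rng.normal(scale=6.4e6, size=(10, 3))
    gnss_core = rng.normal(scale=6.4e6, size=(60, 3))
    shift = np.array([0.005, -0.003, 0.008])
    slr_ref, gnss_ref = slr_core + shift, gnss_core + shift

    origin = fit_translation(slr_core, slr_ref)                      # SLR core sites only
    orientation = fit_rotation(np.vstack([slr_core, gnss_core]),
                               np.vstack([slr_ref, gnss_ref]))       # combined core network
    print(origin, orientation)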