911 results for Lubrication and cooling techniques


Relevance:

100.00%

Publisher:

Abstract:

During thermoregulation in the bearded dragon Pogona barbata, heart rate during heating is significantly faster than during cooling at any given body temperature (heart rate hysteresis), resulting in faster rates of heating than cooling. However, the mechanisms that control heart rate during heating and cooling are unknown. The aim of this study was to test the hypothesis that changes in cholinergic and adrenergic tone on the heart are responsible for the heart rate hysteresis during heating and cooling in P. barbata. Heating and cooling trials were conducted before and after the administration of atropine, a muscarinic antagonist, and sotalol, a beta-adrenergic antagonist. Cholinergic and beta-adrenergic blockade did not abolish the heart rate hysteresis, as heart rate during heating was significantly faster than during cooling in all cases. Adrenergic tone was extremely high (92.3%) at the commencement of heating and decreased to 30.7% by the end of the cooling period. Moreover, in four lizards there was an instantaneous drop in heart rate (up to 15 beats min⁻¹) as the heat source was switched off, and this drop in heart rate coincided with either a drop in beta-adrenergic tone or an increase in cholinergic tone. Rates of heating were significantly faster during cholinergic blockade, and slowest under combined cholinergic and beta-adrenergic blockade. The results showed that the cholinergic and beta-adrenergic systems are not the only control mechanisms acting on the heart during heating and cooling, but they do have a significant effect on heart rate and on rates of heating and cooling.

Relevance:

100.00%

Publisher:

Abstract:

Teaching the PSP: Challenges and Lessons Learned by Jurgen Borstler, David Carrington, Gregory W Hislop, Susan Lisack, Keith Olson, and Laurie Williams, pp. 42-48. Software engineering educators need to provide environments where students learn about the size and complexity of modern software systems and the techniques available for managing these difficulties. Five universities used the Personal Software Process to teach software engineering concepts in a variety of contexts.

Relevance:

100.00%

Publisher:

Abstract:

Arguably the most complex cortical functions are seated in human cognition, the how and why of which have been debated for centuries by theologians, philosophers and scientists alike. In his best-selling book, The Astonishing Hypothesis: The Scientific Search for the Soul, Francis Crick refined the view that these qualities are determined solely by cortical cells and circuitry. Put simply, cognition is nothing more, or less, than a biological function. Accepting this to be the case, it should be possible to identify the mechanisms that subserve cognitive processing. Since the pioneering studies of Lorente de Nó and Hebb, and the more recent studies of Fuster, Miller and Goldman-Rakic, to mention but a few, much attention has been focused on the role of persistent neural activity in cognitive processes. Application of modern technologies and modelling techniques has led to new hypotheses about the mechanisms of persistent activity. Here I focus on how regional variations in the pyramidal cell phenotype may determine the complexity of cortical circuitry and, in turn, influence neural activity. Data obtained from thousands of individually injected pyramidal cells in sensory, motor, association and executive cortex reveal marked differences in the numbers of putative excitatory inputs received by these cells. Pyramidal cells in prefrontal cortex have, on average, up to 23 times more dendritic spines than those in the primary visual area. I propose that without these specializations in the structure of pyramidal cells, and the circuits they form, human cognitive processing would not have evolved to its present state. I also present data from both New World and Old World monkeys that show varying degrees of complexity in the pyramidal cell phenotype in their prefrontal cortices, suggesting that cortical circuitry and, thus, cognitive styles are evolving independently in different species.

Relevance:

100.00%

Publisher:

Abstract:

A color model allows any color of the visible-light spectrum, i.e. with a wavelength between 400 nm and 700 nm, to be characterized in a quantitative manner. To accomplish that, each model, or color space, is associated with a function that maps the spectral power distribution of the visible electromagnetic radiation into a space defined by a set of discrete values that quantify the color components composing the model. Some color spaces are sensitive to changes in lighting conditions; others preserve certain chromatic features, remaining immune to these changes. It therefore becomes necessary to identify the strengths and weaknesses of each model in order to justify the adoption of specific color spaces in image processing and analysis techniques. This chapter addresses the topic of digital imaging and its main standards and formats. We then define a mathematical model of the image acquisition sensor's response, which enables an assessment of the various color spaces with the aim of determining their invariance to illumination changes.
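The kind of illumination invariance discussed above can be checked numerically. A minimal sketch using Python's standard colorsys module, with an illustrative pixel value and scaling factor (not data from the chapter): in the HSV color space, a uniform intensity change leaves hue and saturation untouched and is absorbed entirely by the value component.

```python
import colorsys

# An illustrative RGB pixel (normalized to [0, 1]) and the same pixel
# under uniformly halved illumination.
bright = (0.8, 0.4, 0.2)
dim = tuple(c * 0.5 for c in bright)

h1, s1, v1 = colorsys.rgb_to_hsv(*bright)
h2, s2, v2 = colorsys.rgb_to_hsv(*dim)

# Hue and saturation are unchanged by a uniform intensity scaling;
# only the value (brightness) component tracks the illumination.
print(abs(h1 - h2) < 1e-9, abs(s1 - s2) < 1e-9)  # True True
print(v1, v2)  # 0.8 0.4
```

This is why chromaticity-based spaces are often preferred over raw RGB when segmentation must survive lighting changes, as the chapter goes on to evaluate.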

Relevance:

100.00%

Publisher:

Abstract:

Background and Purpose: Precise needle puncture of the kidney is a challenging and essential step for successful percutaneous nephrolithotomy (PCNL). Many devices and surgical techniques have been developed to easily achieve suitable renal access. This article presents a critical review to address the methodologies and techniques for conducting kidney targeting and the puncture step during PCNL. Based on this study, research paths are also provided for PCNL procedure improvement. Methods: Most relevant works concerning PCNL puncture were identified by a search of the Medline/PubMed, ISI Web of Science, and Scopus databases from 2007 to December 2012. Two authors independently reviewed the studies. Results: A total of 911 abstracts and 346 full-text articles were assessed and discussed; 52 were included in this review as a summary of the main contributions to kidney targeting and puncturing. Conclusions: Multiple paths and technologic advances have been proposed in the field of urology and minimally invasive surgery to improve PCNL puncture. The most relevant contributions, however, have been provided by the application of medical imaging guidance, new surgical tools, motion tracking systems, robotics, and image processing and computer graphics. Despite the multiple research paths for PCNL puncture guidance, no widely acceptable solution has yet been reached, and it remains an active and challenging research field. Future developments should focus on real-time methods, robust and accurate algorithms, and radiation-free imaging techniques.


Relevance:

100.00%

Publisher:

Abstract:

Phenolic compounds have been extensively studied in recent years. The presence of these compounds in various foods has been associated with sensory and health-promoting properties. These products of the secondary metabolism of plants act as defense mechanisms against environmental stress and attack by other organisms. They are divided into different classes according to their chemical structures. The objective of this study was to describe the different classes of phenolic compounds, their main food sources and factors of variation, as well as the methods commonly used for their identification and quantification. Moreover, the role of phenolic compounds in scavenging oxidative stress and the techniques of in vitro antioxidant evaluation are discussed. In vivo studies evaluating the biological effects of these compounds and their impact on chronic disease prevention are presented as well. Finally, the role of these compounds in the sensory quality of foods is discussed.
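In vitro antioxidant assays of the kind mentioned above (e.g. the widely used DPPH radical-scavenging assay) usually report activity as percent inhibition computed from absorbance readings. A minimal sketch, where the function name and the absorbance values are illustrative assumptions, not data from this study:

```python
def percent_inhibition(a_control: float, a_sample: float) -> float:
    """Radical-scavenging activity: the relative drop in absorbance of
    the radical solution after reaction with the phenolic extract."""
    return (a_control - a_sample) / a_control * 100.0

# Illustrative DPPH absorbance readings at 515 nm.
a_control = 0.90   # radical solution alone
a_sample = 0.36    # radical solution plus phenolic extract

print(round(percent_inhibition(a_control, a_sample), 1))  # 60.0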

Relevance:

100.00%

Publisher:

Abstract:

Seismic recordings of IRIS/IDA/GSN station CMLA and of several temporary stations in the Azores archipelago are processed with P and S receiver function (PRF and SRF) techniques. Contrary to regional seismic tomography, these methods provide estimates of the absolute velocities and of the Vp/Vs ratio down to a depth of ~300 km. Joint inversion of PRFs and SRFs for a few data sets consistently reveals a division of the subsurface medium into four zones with distinctly different Vp/Vs ratios: the crust, ~20 km thick, with a ratio of ~1.9 in the lower crust; the high-Vs mantle lid with a strongly reduced Vp/Vs ratio relative to the standard 1.8; the low-velocity zone (LVZ) with a velocity ratio of ~2.0; and the underlying upper-mantle layer with a standard velocity ratio. Our estimates of crustal thickness greatly exceed previous estimates (~10 km). The base of the high-Vs lid (the Gutenberg discontinuity) is at a depth of ~80 km. The LVZ, with a reduction in S velocity of ~15% relative to the standard (IASP91) model, terminates at a depth of ~200 km. The average thickness of the mantle transition zone (TZ) is evaluated from the time difference between the S410p and SKS660p seismic phases, which are robustly detected in the S and SKS receiver functions. This thickness is practically identical to the standard IASP91 value of 250 km and is characteristic of a large region of the North Atlantic outside the Azores plateau. Our data are indicative of a reduction in S-wave velocity of several percent relative to the standard velocity in the depth interval from 460 to 500 km. This reduction is found in the nearest vicinity of the Azores, in the region sampled by the PRFs, but, as evidenced by the SRFs, it is missing at a distance of a few hundred kilometers from the islands. We speculate that this anomaly may correspond to the source of a plume which generated the Azores hotspot. Previously, a low S velocity in this depth range was found with SRF techniques beneath a few other hotspots.
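Receiver-function methods of this kind exploit the fact that the delay of a P-to-S conversion from an interface grows with the Vp/Vs ratio of the layer above it, which is what makes the ratio resolvable. A minimal sketch of the standard moveout relation (the layer thickness and velocities below are illustrative round numbers, not the values inverted in this study):

```python
import math

def ps_delay(depth_km: float, vp: float, vp_vs: float, p: float = 0.0) -> float:
    """Delay of a Ps conversion from an interface at depth_km (km)
    relative to the direct P wave, for ray parameter p (s/km).
    Velocities vp in km/s; p = 0 corresponds to vertical incidence."""
    vs = vp / vp_vs
    return depth_km * (math.sqrt(vs**-2 - p**2) - math.sqrt(vp**-2 - p**2))

# A 20 km thick crust (as found beneath the Azores) with an assumed
# Vp of 6.5 km/s: the Ps delay grows visibly with the Vp/Vs ratio.
for ratio in (1.7, 1.8, 1.9):
    print(f"Vp/Vs = {ratio}: Ps delay = {ps_delay(20.0, 6.5, ratio):.2f} s")
```

A few tenths of a second of extra delay per 0.1 in Vp/Vs is well above typical receiver-function timing uncertainty, which is why joint PRF/SRF inversion can separate the four zones described above.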

Relevance:

100.00%

Publisher:

Abstract:

There are complex and diverse methodological problems involved in the clinical and epidemiological study of respiratory diseases and their etiological factors. The association of urban growth, industrialization, and environmental deterioration with respiratory diseases makes it necessary to pay more attention to this research area with a multidisciplinary approach. Appropriate study designs and statistical techniques to analyze and improve our understanding of pathological events and their causes must be implemented to reduce the growing morbidity and mortality through better preventive actions and health programs. The objective of this article is to review the most common methodological problems in this research area and to present the statistical tools most commonly used.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a new methodology for the creation and management of coalitions in electricity markets. This approach is tested using the multi-agent market simulator MASCEM, taking advantage of its ability to model and simulate VPPs (Virtual Power Producers). VPPs are represented as coalitions of agents with the capability of negotiating both in the market and internally, with their members, in order to combine and manage their individual characteristics and goals with the strategy and objectives of the VPP itself. The new features include the development of dedicated individual facilitators to manage the communications among the members of each coalition independently from the rest of the simulation, as well as mechanisms for the classification of the agents that are candidates to join a coalition. In addition, a global study of the results of the Iberian Electricity Market is performed to compare and analyze different approaches for defining consistent and adequate strategies to integrate into the agents of MASCEM. This, combined with the application of learning and prediction techniques, provides the agents with the ability to learn and adapt, adjusting their actions to the continually evolving states of the world in which they are playing.
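The adaptive behaviour described, agents adjusting their actions as market conditions evolve, can be sketched with a simple multiplicative strategy-weight update. The strategy names, profit figures, and learning rule below are illustrative assumptions for exposition, not MASCEM's actual algorithms:

```python
def update_weights(weights, rewards, lr=0.5):
    """Multiplicative-weights update: strategies that earned more
    (normalized) profit in the last market round gain probability mass."""
    scaled = [w * (1 + lr * r) for w, r in zip(weights, rewards)]
    total = sum(scaled)
    return [w / total for w in scaled]

# Three illustrative bidding strategies for a seller agent, starting
# with no preference among them.
strategies = ["bid low", "bid average", "bid high"]
weights = [1 / 3] * 3

# Normalized profits observed for each strategy over simulated rounds.
for rewards in ([0.2, 0.6, 0.1], [0.3, 0.7, 0.0], [0.1, 0.5, 0.2]):
    weights = update_weights(weights, rewards)

# After the observed rounds the middle strategy dominates.
best = strategies[weights.index(max(weights))]
print(best)
```

In a simulator the reward signal would come from the cleared market price and the agent's accepted bids rather than fixed numbers, but the weight-shifting mechanism is the same idea.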

Relevance:

100.00%

Publisher:

Abstract:

Master's degree in Informatics Engineering

Relevance:

100.00%

Publisher:

Abstract:

There is an imminent need for rapid methods to detect and determine pathogenic bacteria in food products as alternatives to laborious and time-consuming culture procedures. In this work, an electrochemical immunoassay using iron/gold core/shell nanoparticles (Fe@Au) conjugated with anti-Salmonella antibodies was developed. The chemical synthesis and functionalization of magnetic and gold-coated magnetic nanoparticles are reported. Fe@Au nanoparticles were functionalized with different self-assembled monolayers and characterized using ultraviolet-visible spectrometry, transmission electron microscopy, and voltammetric techniques. The determination of Salmonella typhimurium on screen-printed carbon electrodes was performed by square-wave anodic stripping voltammetry through the use of CdS nanocrystals. The calibration curve was established between 1×10¹ and 1×10⁶ cells/mL, and the limit of detection was 13 cells/mL. The developed method showed that it is possible to determine the bacteria in milk at low concentrations and is suitable for the rapid (less than 1 h) and sensitive detection of S. typhimurium in real samples. Therefore, the developed methodology could contribute to improving the quality control of food samples.
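Calibration curves like the one described are typically built by fitting the measured signal against the logarithm of the standard concentrations, then inverting the fit for unknowns. A minimal sketch with made-up peak currents (the numbers are illustrative assumptions, not the study's measurements):

```python
import math

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Illustrative stripping-voltammetry peak currents (arbitrary units)
# for standards spanning the 10^1 to 10^6 cells/mL range.
cells_per_ml = [1e1, 1e2, 1e3, 1e4, 1e5, 1e6]
peak_current = [1.1, 2.0, 2.9, 4.1, 5.0, 6.1]

slope, intercept = linear_fit([math.log10(c) for c in cells_per_ml],
                              peak_current)

# Read an unknown concentration back from a measured current of 3.5 units.
unknown = 10 ** ((3.5 - intercept) / slope)
print(f"slope = {slope:.2f} per decade, estimate = {unknown:.0f} cells/mL")
```

In practice the limit of detection is then derived from the calibration (commonly as 3× the blank's standard deviation divided by the slope), which is how figures such as the 13 cells/mL reported here are obtained.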

Relevance:

100.00%

Publisher:

Abstract:

Total petroleum hydrocarbons (TPH) are important environmental contaminants that are toxic to human and environmental receptors. Several analytical methods have been used to quantify TPH levels in contaminated soils, notably infrared spectrometry (IR) and gas chromatography (GC). Although these are two of the most widely used techniques, some issues remain inadequately studied: a) the applicability of both techniques to soils contaminated with two distinct types of fuel (petrol and diesel), b) the influence of the soil's natural organic matter content on the results achieved by the various analytical methods, and c) the performance of both techniques in analyses of soils with different levels of contamination (presumably non-contaminated and potentially contaminated). The main objectives of this work were to answer these questions and to provide more complete information about the potential and limitations of the GC and IR techniques. The results led us to the following conclusions: a) IR analysis of soils contaminated with petrol is not suitable due to volatilisation losses, b) organic matter significantly influences IR analysis, and c) both techniques can accurately quantify TPH in soils, irrespective of their contamination levels.

Relevance:

100.00%

Publisher:

Abstract:

Localization is a fundamental task in Cyber-Physical Systems (CPS), where data is tightly coupled with the environment and the location where it is generated. The research literature on localization has reached a critical mass, and several surveys have also emerged. This review paper contributes to the state of the art by proposing a new and holistic taxonomy of the fundamental concepts of localization in CPS, based on a comprehensive analysis of previous research works and surveys. The main objective is to pave the way towards a deep understanding of the main localization techniques and to unify their descriptions. Furthermore, this review paper provides a complete overview of the most relevant localization and geolocation techniques. We also present the most important metrics for measuring the accuracy of localization approaches, understood as the gap between the real location and its estimate. Finally, we present open issues and research challenges pertaining to localization. We believe that this review will serve as an important and complete reference on localization techniques in CPS for researchers and practitioners, providing added value compared to previous surveys.
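The accuracy metric described, the gap between the real location and its estimate, is most often the Euclidean distance, averaged over a set of test points. A minimal sketch with illustrative positions (the coordinates below are assumptions for exposition):

```python
import math

def localization_error(true_pos, est_pos):
    """Euclidean gap between the real location and its estimate."""
    return math.dist(true_pos, est_pos)

def mean_error(true_positions, estimates):
    """Mean localization error over a set of test points, a common
    metric for comparing localization techniques."""
    errors = [localization_error(t, e)
              for t, e in zip(true_positions, estimates)]
    return sum(errors) / len(errors)

# Illustrative ground-truth node positions (metres) and the positions
# reported by a hypothetical localization technique.
truth = [(0.0, 0.0), (5.0, 5.0), (10.0, 0.0)]
estimates = [(0.5, 0.0), (5.0, 6.0), (9.0, 0.0)]

print(f"mean error = {mean_error(truth, estimates):.2f} m")
```

Surveys also commonly report the root-mean-square error or the full error distribution (e.g. a CDF of per-point errors), since a mean alone can hide occasional large failures.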