979 results for dynamic source routing
Abstract:
Inconsistencies regarding the dynamic asymmetry between the on- and off-transient responses in V̇O2 are found in the literature. Therefore, the purpose of this study was to examine V̇O2 on- and off-transients during moderate- and heavy-intensity cycling exercise in trained subjects. Ten men underwent an initial incremental test for the estimation of the ventilatory threshold (VT) and, on different days, two bouts of square-wave exercise at moderate (<VT) and heavy (>VT) intensities. V̇O2 kinetics in exercise and recovery were better described by a single exponential model (<VT) or by a double exponential with two time delays (>VT). For moderate exercise, we found symmetry of V̇O2 kinetics between the on- and off-transients (i.e., the fundamental component), consistent with a system manifesting linear control dynamics. For heavy exercise, a slow component superimposed on the fundamental phase was expressed in both exercise and recovery, with similar parameter estimates. However, the on-transient values of the time constant were appreciably faster than the associated off-transient values, and this was independent of the work rate imposed (<VT and >VT). Our results do not support a dynamically linear system model of V̇O2 during cycling exercise in the heavy-intensity domain.
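For concreteness, the two model forms referred to above can be written as follows (the parameter symbols are the conventional ones in V̇O2 kinetics work, not necessarily the study's exact notation); the off-transient uses the same forms with decaying rather than rising exponentials:

    % moderate intensity (<VT): single exponential with one time delay
    \dot{V}O_2(t) = \dot{V}O_{2,\mathrm{base}} + A_1\bigl(1 - e^{-(t - TD_1)/\tau_1}\bigr), \quad t \ge TD_1
    % heavy intensity (>VT): fundamental component plus slow component, each with its own time delay
    \dot{V}O_2(t) = \dot{V}O_{2,\mathrm{base}} + A_1\bigl(1 - e^{-(t - TD_1)/\tau_1}\bigr) + A_2\bigl(1 - e^{-(t - TD_2)/\tau_2}\bigr), \quad t \ge TD_2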
Abstract:
Natural products have long provided important drug leads for infectious diseases. Leishmaniasis is a protozoan parasitic disease found mainly in developing countries; its available therapies are toxic, with few alternatives. Fungal infections have been the main cause of death in immunocompromised patients, and new drugs are urgently needed. In this work, a total of 16 plant species belonging to 11 families, selected on an ethnopharmacological basis, were analyzed in vitro against Leishmania (L.) chagasi, Leishmania (L.) amazonensis, Candida krusei, and C. parapsilosis. Of these plant species, seven showed antifungal activity against C. krusei, five showed antileishmanial activity against L. chagasi and four against L. amazonensis, among them species of the genus Plectranthus. Our findings confirm the traditional therapeutic use of these plants in the treatment of infectious and inflammatory disorders and also offer insights into the isolation of active and novel drug prototypes, especially against neglected diseases such as leishmaniasis.
Abstract:
This case study introduces our ongoing work to enhance the virtual classroom in order to provide faculty and students with an environment open to their needs, compliant with learning standards (and therefore compatible with other e-learning environments), and based on open source software. The result is a modular, sustainable and interoperable learning environment that can be adapted to different teaching and learning situations by incorporating the LMS's integrated tools as well as wikis, blogs, forums and Moodle activities, among others.
Abstract:
Smarthistory.org is a proven, sustainable model for open educational resources in the Humanities. We discuss lessons learned during its agile development. Smarthistory.org is a free, Creative Commons-licensed, multimedia web-book designed as a dynamic enhancement of, or substitute for, the traditional art history textbook. It uses conversation instead of the impersonal voice of the typical textbook in order to reveal disagreement, emotion, and the experience of looking. The listener remains engaged with both the content and the interaction of the speakers. These conversations model close looking and a willingness to encounter and engage the unfamiliar. Smarthistory takes the inherently dialogic and multimedia nature of the web and uses it as a pedagogical method. This extensible Humanities framework uses an open-source content management system, making Smarthistory inexpensive to create and easy to manage and update. Its chronological timeline/chapter-based format integrates new contributions into a single historical framework, a structure applicable across the Humanities.
Abstract:
The Internet is a fundamental part of adolescents' daily lives, and they consider it a safe and confidential source of information on health matters. The aim of this study is to describe the experience of Spanish adolescents searching for health information on the Internet. Methods: A cross-sectional study of 811 school-age adolescents in Granada was carried out, using an adapted and piloted questionnaire administered by trained personnel. Sociodemographic and health variables were included, together with those concerning the conditions governing access to and use of information and communication technologies (ICT). Results: 811 adolescents were surveyed (99.38% response rate), with a mean age of 17 years. Of these, 88% used the Internet; 57.5% used it on a daily or weekly basis and 38.7% used it occasionally. More than half of the sample (55.7%) stated that they used the Internet to search for health-related information. The main problems reported when searching for e-health information were not knowing which web pages were good (54.8%) and a lack of confidence or search skills (23.2%). Conclusions: It seems plausible to claim that websites designed and managed by health services should have a predominant position among interventions specifically addressed to young people.
Abstract:
Background. The use of hospital discharge administrative data (HDAD) has been recommended for automating, improving, and even substituting population-based cancer registries. The frequency of false positive and false negative cases makes local validation advisable. Methods. The aim of this study was to detect newly diagnosed, false positive and false negative cases of cancer from hospital discharge claims, using four Spanish population-based cancer registries as the gold standard. Prostate cancer was used as a case study. Results. A total of 2286 incident cases of prostate cancer registered in 2000 were used for validation. For the most sensitive algorithm (using five diagnostic codes), sensitivity estimates ranged from 14.5% (95% CI 10.3-19.6) to 45.7% (95% CI 41.4-50.1). For the most predictive algorithm (using five diagnostic and five surgical codes), positive predictive value (PPV) estimates ranged from 55.9% (95% CI 42.4-68.8) to 74.3% (95% CI 67.0-80.6). The most frequent reason for false positives was prevalent cases inadequately considered as newly diagnosed cancers, accounting for 61.1% to 82.3% of false positive cases. The most frequent reason for false negatives was cases not attended in hospital settings, accounting for 34.4% to 69.7% of false negative cases in the most predictive algorithm. Conclusions. HDAD might be a helpful tool for cancer registries to reach their goals. The findings suggest that, for automating cancer registries, algorithms combining diagnoses and procedures are the best option. However, for cancer surveillance purposes, in cancers such as prostate cancer in which care is not exclusively hospital-based, combining inpatient and outpatient information will be required.
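For reference, the validation measures quoted above follow the standard definitions, with TP, FP and FN denoting the counts of true positive, false positive and false negative cases against the registry gold standard:

    \mathrm{Sensitivity} = \frac{TP}{TP + FN}, \qquad \mathrm{PPV} = \frac{TP}{TP + FP}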
Abstract:
Bacteria isolated from marine sponges found off the coast of Rio de Janeiro, Brazil, were screened for the production of antimicrobial substances. We report a new Pseudomonas putida strain (designated P. putida Mm3), isolated from the sponge Mycale microsigmatosa, that produces a powerful antimicrobial substance active against multidrug-resistant bacteria. P. putida Mm3 was identified on the basis of 16S rRNA gene sequencing and phenotypic tests. Molecular typing of Mm3 was performed by RAPD-PCR, and the results were compared with those of other Pseudomonas strains. Our results contribute to the search for new antimicrobial agents, an important strategy for developing alternative therapies to treat infections caused by multidrug-resistant bacteria.
Abstract:
Many multivariate methods that are apparently distinct can be linked by introducing one or more parameters in their definition. Methods that can be linked in this way are correspondence analysis, unweighted or weighted logratio analysis (the latter also known as "spectral mapping"), nonsymmetric correspondence analysis, principal component analysis (with and without logarithmic transformation of the data) and multidimensional scaling. In this presentation I will show how several of these methods, which are frequently used in compositional data analysis, may be linked through parametrizations such as power transformations, linear transformations and convex linear combinations. Since the methods of interest here all lead to visual maps of data, a "movie" can be made where the linking parameter is allowed to vary in small steps: the results are recalculated "frame by frame" and one can see the smooth change from one method to another. Several of these "movies" will be shown, giving a deeper insight into the similarities and differences between these methods.
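As a hedged illustration of the kind of parametrization described (the alpha-power scheme, the centring choices and all function names below are our own simplification, not the author's exact formulation), a single parameter can carry a data table smoothly from an ordinary principal-axis map toward a logratio-style map, recalculating the display "frame by frame":

    import numpy as np

    def power_transform(X, alpha):
        """Box-Cox-style power transform; tends to log(X) as alpha -> 0 (X must be strictly positive)."""
        X = np.asarray(X, dtype=float)
        if alpha == 0.0:
            return np.log(X)
        return (X**alpha - 1.0) / alpha

    def movie_frames(X, steps=11):
        """Recalculate a two-dimensional map as the linking parameter alpha varies from 1 to 0."""
        frames = []
        for alpha in np.linspace(1.0, 0.0, steps):
            Z = power_transform(X, alpha)
            Z = Z - Z.mean(axis=1, keepdims=True)      # row-centring gives the logratio flavour
            Z = Z - Z.mean(axis=0, keepdims=True)      # column-centring as in PCA/CA-style maps
            U, s, _ = np.linalg.svd(Z, full_matrices=False)
            frames.append((alpha, U[:, :2] * s[:2]))   # principal coordinates of the rows
        return frames

    # toy compositional-style table (rows = samples, columns = parts)
    table = np.array([[10.0, 30.0, 60.0], [20.0, 20.0, 60.0], [50.0, 25.0, 25.0]])
    for alpha, coords in movie_frames(table, steps=3):
        print(round(alpha, 2), np.round(coords, 3))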
Abstract:
The introduction of engineered nanostructured materials into a rapidly increasing number of industrial and consumer products will result in enhanced exposure to engineered nanoparticles. Workplace exposure has been identified as the most likely source of uncontrolled inhalation of engineered aerosolized nanoparticles, but release of engineered nanoparticles may occur at any stage of the lifecycle of (consumer) products. The dynamic development of nanomaterials with possibly unknown toxicological effects poses a challenge for the assessment of nanoparticle-induced toxicity and safety.

In this consensus document from a workshop on in-vitro cell systems for nanoparticle toxicity testing (Workshop on 'In-Vitro Exposure Studies for Toxicity Testing of Engineered Nanoparticles', sponsored by the Association for Aerosol Research (GAeF), 5-6 September 2009, Karlsruhe, Germany), an overview is given of the main issues concerning exposure to airborne nanoparticles, lung physiology, biological mechanisms of (adverse) action, in-vitro cell exposure systems, realistic tissue doses, risk assessment and social aspects of nanotechnology. The workshop participants recognized the large potential of in-vitro cell exposure systems for reliable, high-throughput screening of nanoparticle toxicity. For the investigation of lung toxicity, a strong preference was expressed for air-liquid interface (ALI) cell exposure systems (rather than submerged cell exposure systems), as they more closely resemble in-vivo conditions in the lungs and allow for unaltered and dosimetrically accurate delivery of aerosolized nanoparticles to the cells.

An important aspect, which is frequently overlooked, is the comparison of typically used in-vitro dose levels with realistic in-vivo nanoparticle doses in the lung. If we consider average ambient urban exposure and occupational exposure at 5 mg/m³ (the maximum level allowed by the Occupational Safety and Health Administration (OSHA)) as the boundaries of human exposure, the corresponding upper-limit range of nanoparticle flux delivered to the lung tissue is 3×10⁻⁵ to 5×10⁻³ μg/h per cm² of lung tissue, or 2-300 particles/h per (epithelial) cell. This range can easily be matched, and even exceeded, by almost all currently available cell exposure systems.

The consensus statement includes a set of recommendations for conducting in-vitro cell exposure studies with pulmonary cell systems and identifies urgent needs for future development. As these issues are crucial for the introduction of safe nanomaterials into the marketplace and the living environment, they deserve more attention and more interaction between biologists and aerosol scientists. The members of the workshop believe that further advances in in-vitro cell exposure studies would be greatly facilitated by a more active role of the aerosol scientists. The technical know-how for developing and running ALI in-vitro exposure systems is available in the aerosol community, and at the same time biologists/toxicologists are required for proper assessment of the biological impact of nanoparticles.
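The quoted upper-limit flux can be checked with back-of-the-envelope arithmetic; the ventilation rate, deposition fraction and alveolar surface area used below are generic assumptions of ours for illustration, not values taken from the workshop document:

    # Rough dose-rate estimate at the occupational boundary concentration (all inputs are assumed values).
    concentration_ug_m3 = 5000.0   # 5 mg/m3, the OSHA-type boundary quoted above, in ug/m3
    ventilation_m3_h    = 0.5      # assumed resting ventilation (~8 L/min)
    deposition_fraction = 0.3      # assumed fraction of inhaled nanoparticles deposited in the lung
    alveolar_area_cm2   = 1.0e6    # assumed alveolar surface area (~100 m2)

    deposited_ug_h = concentration_ug_m3 * ventilation_m3_h * deposition_fraction
    flux_ug_h_cm2  = deposited_ug_h / alveolar_area_cm2
    print(f"{flux_ug_h_cm2:.1e} ug/h/cm2")   # ~7.5e-04, inside the quoted 3e-05 to 5e-03 range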
Abstract:
In this paper, a novel methodology is introduced that aims at minimizing the probability of network failure and the failure impact (in terms of QoS degradation) while optimizing resource consumption. A detailed study of MPLS recovery techniques and their GMPLS extensions is also presented. In this scenario, features for reducing the failure impact while offering minimum failure probabilities are also analyzed. Novel two-step routing algorithms using this methodology are proposed. Results show that these methods offer high protection levels with optimal resource consumption.
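As a minimal sketch of the two-step idea (the toy graph, the per-link failure probabilities and impacts, and the weighting are hypothetical, not the paper's algorithm), one can first enumerate candidate paths and then select the one minimizing a combined failure-probability/impact score, breaking ties by resource consumption:

    def simple_paths(graph, src, dst, path=None):
        """Enumerate loop-free paths in a small adjacency-dict graph (illustrative only)."""
        path = (path or []) + [src]
        if src == dst:
            yield path
            return
        for nxt in graph.get(src, {}):
            if nxt not in path:
                yield from simple_paths(graph, nxt, dst, path)

    def two_step_route(graph, fail_prob, impact, src, dst, w1=0.5, w2=0.5):
        """Step 1: collect candidate paths; step 2: minimize a weighted failure score, then hop count."""
        def failure_score(p):
            links = list(zip(p, p[1:]))
            ok = 1.0
            for l in links:
                ok *= 1.0 - fail_prob[l]          # independent link failures assumed
            return w1 * (1.0 - ok) + w2 * max(impact[l] for l in links)
        return min(simple_paths(graph, src, dst), key=lambda p: (failure_score(p), len(p)))

    # toy usage: impact values are normalized to [0, 1]
    g = {"a": {"b": 1, "c": 1}, "b": {"d": 1}, "c": {"d": 1}, "d": {}}
    probs = {("a", "b"): 0.01, ("b", "d"): 0.05, ("a", "c"): 0.02, ("c", "d"): 0.02}
    imp = {("a", "b"): 0.2, ("b", "d"): 0.8, ("a", "c"): 0.3, ("c", "d"): 0.3}
    print(two_step_route(g, probs, imp, "a", "d"))   # prefers the a-c-d path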
Abstract:
IP-based networks still do not offer the degree of reliability required by new multimedia services, and achieving such reliability will be crucial to the success or failure of the new Internet generation. Most existing schemes for QoS routing do not take into consideration parameters concerning the quality of protection, such as packet loss or restoration time. In this paper, we define a new paradigm for developing protection strategies for building reliable MPLS networks, based on what we have called the network protection degree (NPD). The NPD consists of an a priori evaluation, the failure sensibility degree (FSD), which provides the failure probability, and an a posteriori evaluation, the failure impact degree (FID), which determines the impact on the network in case of failure. Having mathematically formulated these components, we point out the most relevant ones. Experimental results demonstrate the benefits of using the NPD to enhance current QoS routing algorithms so that they offer a certain degree of protection.
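Purely as an illustrative sketch (the paper gives the actual mathematical formulation; the weighted-sum form below is our assumption), the two evaluations can be pictured as being combined into a single score per path p, where lower FSD and FID values indicate better protection:

    % illustrative combination only, not the paper's exact expression
    NPD(p) = \alpha \, FSD(p) + (1 - \alpha) \, FID(p), \qquad 0 \le \alpha \le 1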
Abstract:
One of the most effective techniques for offering QoS routing is minimum interference routing. However, it is complex in terms of computation time and is not oriented toward improving the network protection level. In order to provide better levels of protection, new minimum interference routing algorithms are necessary. Minimizing the failure recovery time is also a complex process involving different failure recovery phases. Some of these phases, such as minimizing the failure notification time, depend entirely on correct route selection. The level of protection also involves other aspects, such as the amount of resources used; in this case, shared backup techniques should be considered. Therefore, minimum interference techniques should also be modified to include resource sharing for protection among their objectives. These aspects are reviewed and analyzed in this article, and a new proposal combining minimum interference with fast protection using shared segment backups is introduced. Results show that our proposed method improves both the request rejection ratio and the percentage of bandwidth allocated to backup paths in networks with low and medium protection requirements.
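The resource-sharing point can be made concrete with a small sketch (our own illustration, with a hypothetical data format): under a single-failure assumption, the backup bandwidth reserved on a link only has to cover the worst-case demand triggered by any one failure, not the sum over all backups routed through it:

    from collections import defaultdict

    def shared_backup_reservation(backups):
        """backups: list of (protected_element, backup_links, bandwidth) tuples (hypothetical format).
        Returns the bandwidth to reserve on each backup link under a single-failure assumption."""
        per_link = defaultdict(lambda: defaultdict(float))
        for protected, links, bw in backups:
            for link in links:
                per_link[link][protected] += bw    # demand on this link if 'protected' fails
        return {link: max(demand.values()) for link, demand in per_link.items()}

    # two backups protecting different elements can share capacity on the common link X-Y
    demo = [("link-AB", ["X-Y", "Y-Z"], 10.0), ("link-CD", ["X-Y"], 7.0)]
    print(shared_backup_reservation(demo))   # {'X-Y': 10.0, 'Y-Z': 10.0}, not 17.0 on X-Y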
Abstract:
In this paper, a method for enhancing current QoS routing methods by means of QoS protection is presented. In an MPLS network, the segments (links) to be protected are predefined, and an LSP request involves, apart from establishing a working path, creating a specific type of backup path (local, reverse or global). Different QoS parameters, such as network load balancing, resource optimization and minimization of LSP request rejection, should be considered. QoS protection is defined as a function of QoS parameters such as packet loss, restoration time and resource optimization. A framework to add QoS protection to many current QoS routing algorithms is introduced. A backup decision module to select the most suitable protection method is formulated, and different case studies are analyzed.
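A minimal sketch of what such a backup decision module could look like (the threshold values, parameter names and the mapping onto local, reverse and global backups are hypothetical, for illustration only):

    def select_backup_type(max_restoration_ms, max_packet_loss, resource_budget):
        """Choose a backup method from coarse QoS protection requirements (illustrative rules only)."""
        if max_restoration_ms < 50 or max_packet_loss == 0.0:
            return "local"     # fastest switchover and lowest packet loss, highest resource cost
        if resource_budget == "low":
            return "global"    # a single end-to-end backup path, cheapest in resources
        return "reverse"       # intermediate trade-off between restoration time and resources

    print(select_backup_type(max_restoration_ms=30, max_packet_loss=0.0, resource_budget="high"))  # local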
Abstract:
Because of the high cost of running a large ATM network at full strength to test our ideas on network management, i.e., dynamic virtual path (VP) management and fault restoration, we developed a distributed simulation platform for performing our experiments. This platform also had to support other sorts of tests, such as connection admission control (CAC) algorithms, routing algorithms, and accounting and charging methods. The platform was designed as a very simple, event-oriented and scalable simulation. The main goal was the simulation of a working ATM backbone network with a potentially large number of nodes (hundreds). As research into control algorithms and low-level, or rather cell-level, methods was beyond the scope of this study, the simulation took place at the connection level, i.e., there was no real traffic of cells. The simulated network behaved like a real network accepting and rejecting SNMP ones, or experimental tools using the API node
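To indicate what an event-oriented, connection-level simulation looks like in practice, here is a minimal sketch (entirely our own illustration, not the platform's code): connections are set up and released as timestamped events, and a trivial CAC rule accepts or rejects each request against the available capacity:

    import heapq, random
    from itertools import count

    def simulate(duration=1000.0, capacity=100.0, mean_bw=5.0, arrival_rate=0.5, mean_hold=20.0):
        """Connection-level event simulation: no cell traffic, only setup/release events (toy parameters)."""
        tie = count()                                   # tie-breaker so simultaneous events never compare payloads
        events = [(random.expovariate(arrival_rate), next(tie), "arrival", 0.0)]
        used, accepted, rejected = 0.0, 0, 0
        while events:
            t, _, kind, bw = heapq.heappop(events)
            if t > duration:
                break
            if kind == "arrival":
                demand = random.expovariate(1.0 / mean_bw)
                if used + demand <= capacity:           # simple CAC test against link capacity
                    used += demand
                    accepted += 1
                    heapq.heappush(events, (t + random.expovariate(1.0 / mean_hold), next(tie), "release", demand))
                else:
                    rejected += 1
                heapq.heappush(events, (t + random.expovariate(arrival_rate), next(tie), "arrival", 0.0))
            else:
                used -= bw                              # release event frees the connection's bandwidth
        return accepted, rejected

    print(simulate())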