921 results for MVP protocol 2
Abstract:
INTRODUCTION: Incremental running tests allow metabolic and neuromuscular thresholds to be determined. The aim of the present study was to compare electromyographic and metabolic indices between two incremental running protocols with different rest intervals between speed stages. METHODS: Fourteen male volunteers took part in the study. The incremental treadmill running protocols started at 8 km·h⁻¹, with increments of 1 km·h⁻¹ every three minutes until voluntary exhaustion. The two protocols differed in the rest interval between speed stages: 30 seconds (protocol 1) and 120 seconds (protocol 2). The electromyographic fatigue threshold (EMGFT) was determined for the rectus femoris, biceps femoris (BF), tibialis anterior, and lateral gastrocnemius muscles. To this end, the RMS value was correlated with running time, and linear regression was used to determine the slope coefficients. The lactate threshold was identified from the inflection point of the lactate-intensity curve, and the anaerobic threshold was determined by linear interpolation. Student's t-test for paired data was applied (p<0.05). RESULTS: Protocol 2 showed a higher EMGFT speed than protocol 1 only for the BF muscle (p=0.023), which characterizes a specific response of this muscle in incremental running protocols. CONCLUSION: Running protocols with rest intervals of up to two minutes between incremental stages produced similar results for the determination of the EMGFT of most of the muscles studied and of the metabolic thresholds.
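The slope-based EMGFT determination described above can be sketched as follows. The data are synthetic, and the zero-crossing operationalization (the speed at which the RMS-vs-time slope extrapolates to zero) is one common approach in the EMG literature, not necessarily the exact procedure used in the study:

```python
import numpy as np

def stage_slope(times, rms):
    """Slope of a linear fit of RMS amplitude against time within one speed stage."""
    slope, _intercept = np.polyfit(times, rms, 1)
    return slope

def emg_fatigue_threshold(speeds, slopes):
    """Estimate the EMG fatigue threshold as the speed at which the
    slope-vs-speed regression crosses zero (one common operationalization)."""
    a, b = np.polyfit(speeds, slopes, 1)  # slope = a * speed + b
    return -b / a

# Synthetic example: RMS slopes grow with speed, crossing zero near 11 km/h.
speeds = np.array([8.0, 9.0, 10.0, 11.0, 12.0, 13.0])
slopes = 0.004 * (speeds - 11.0)  # fabricated values for illustration only
print(round(emg_fatigue_threshold(speeds, slopes), 1))  # -> 11.0
```

The per-stage slopes themselves would come from `stage_slope` applied to each stage's RMS time series before the threshold regression.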
Abstract:
Statement of problem. Masticatory performance analysis of conventional complete denture wearers who use denture adhesives is scarce in the dental literature. Purpose. The purpose of this study was to assess the effect of the use of 2 denture adhesives on the masticatory performance of conventional complete denture wearers by means of a crossover study. Material and methods. Forty individuals who were edentulous received new maxillary and mandibular complete dentures and, after an adaptation period, were submitted to masticatory performance analysis without denture adhesive (control). The participants were randomly divided and assigned to 2 protocols: protocol 1, denture adhesive 1 (Ultra Corega cream tasteless) use during the first 15 days, followed by no use of denture adhesive over the next 15 days (washout), and then use of denture adhesive 2 (Ultra Corega powder tasteless) for 15 days; protocol 2, denture adhesive 2 (Ultra Corega powder tasteless) use during the first 15 days, followed by no use of denture adhesive during the next 15 days (washout), and then use of denture adhesive 1 (Ultra Corega cream tasteless) for 15 days. The masticatory performance was assessed immediately after the use of denture adhesive by means of the sieve method, in which participants were instructed to deliberately chew 5 almonds for 20 chewing strokes. Masticatory performance was calculated by the weight of comminuted material that passed through the sieves. Data were analyzed by a 1-way ANOVA for paired samples and the multiple comparison of means by using the Bonferroni test (α=.05). Results. A significant increase in masticatory performance was noted after using the Ultra Corega cream (mean, 32.6%) and Ultra Corega powder (mean, 31.2%) when compared with the control group (mean, 19.8%) (P<.001). No significant difference was found between the 2 denture adhesives evaluated.
Conclusion. The use of denture adhesive improved the masticatory performance of conventional complete denture wearers. No difference was found in masticatory performance with the use of cream or powder denture adhesive.
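The sieve-method score is a simple weight ratio. A minimal sketch, with hypothetical weights chosen to reproduce the reported cream-adhesive mean of 32.6% (the actual per-participant weights are not given in the abstract):

```python
def masticatory_performance(weight_through_sieve, total_weight):
    """Masticatory performance as the percentage (by weight) of comminuted
    test food that passed through the sieves."""
    if total_weight <= 0:
        raise ValueError("total weight must be positive")
    return 100.0 * weight_through_sieve / total_weight

# Hypothetical sample: 1.63 g of a 5.0 g chewed-almond bolus passed the sieves.
print(round(masticatory_performance(1.63, 5.0), 1))  # -> 32.6
```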
Abstract:
The peer-to-peer (P2P) network paradigm is drawing the attention of both end users and researchers for its features. P2P networks shift from the classic client-server approach to a high level of decentralization, where there is no central control and every node should be able not only to request services but also to provide them to other peers. While such a high level of decentralization can lead to interesting properties like scalability and fault tolerance, it also implies many new problems to deal with. A key feature of many P2P systems is openness, meaning that anybody can potentially join a network with no need for subscription or payment. The combination of openness and lack of central control makes it feasible for a user to free-ride, that is, to increase its own benefit by using services without allocating resources to satisfy other peers' requests. One of the main goals when designing a P2P system is therefore to achieve cooperation between users. Given that P2P systems are based on simple local interactions of many peers having partial knowledge of the whole system, an interesting way to achieve desired properties at the system scale is to obtain them as emergent properties of the many interactions occurring at the local node level. Two methods are typically used to address the problem of cooperation in P2P networks: 1) engineering emergent properties when designing the protocol; 2) studying the system as a game and applying game-theoretic techniques, especially to find Nash equilibria and to reach them, making the system stable against possible deviant behaviours. In this work we present an evolutionary framework for enforcing cooperative behaviour in P2P networks that is an alternative to both of the methods mentioned above.
Our approach is based on an evolutionary algorithm inspired by computational sociology and evolutionary game theory, in which each peer periodically tries to copy another peer that is performing better. The proposed algorithms, called SLAC and SLACER, draw inspiration from tag systems originating in computational sociology; the main idea is that low-performance nodes copy high-performance ones. The algorithm is run locally by every node and leads to an evolution of the network both in its topology and in the nodes' strategies. Initial tests with a simple Prisoner's Dilemma application show that SLAC is able to bring the network to a state of high cooperation independently of the initial network conditions. Interesting results are obtained when studying the effect of cheating nodes on the SLAC algorithm: in some cases, selfish nodes rationally exploiting the system for their own benefit can actually improve system performance from the point of view of cooperation formation. The final step is to apply our results to more realistic scenarios. We focused on studying and improving the BitTorrent protocol. BitTorrent was chosen not only for its popularity but also because it has many points in common with the SLAC and SLACER algorithms, ranging from its game-theoretic inspiration (a tit-for-tat-like mechanism) to its swarm topology. We found fairness, understood as the ratio between uploaded and downloaded data, to be a weakness of the original BitTorrent protocol, and we drew on the cooperation formation and maintenance mechanisms identified in the development and analysis of SLAC and SLACER to improve fairness and tackle free-riding and cheating in BitTorrent. We produced an extension of BitTorrent called BitFair that has been evaluated through simulation and has shown its ability to enforce fairness and to counter free-riding and cheating nodes.
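The copy-the-better-peer idea behind SLAC can be sketched in a few lines. This is a toy version: the payoff function, node count, and step count are illustrative, and mutation and probabilistic link-dropping (both part of the real algorithm) are omitted for brevity:

```python
import random

def evaluate(node):
    # Toy payoff: in this sketch, cooperators simply earn more than defectors.
    return 1.0 if node["strategy"] == "C" else 0.5

def slac_step(nodes, rng):
    """One SLAC-like update: a randomly chosen node compares its utility with
    another random node's and, if the other is doing better, copies that
    node's strategy and links (attaching to the copied node as well)."""
    for n in nodes.values():
        n["utility"] = evaluate(n)
    a, b = rng.sample(sorted(nodes), 2)
    if nodes[b]["utility"] > nodes[a]["utility"]:
        nodes[a]["strategy"] = nodes[b]["strategy"]
        nodes[a]["links"] = set(nodes[b]["links"]) | {b}
    return nodes

rng = random.Random(1)
nodes = {i: {"strategy": "C" if i == 0 else "D", "utility": 0.0, "links": set()}
         for i in range(6)}
for _ in range(200):
    slac_step(nodes, rng)
print(sum(n["strategy"] == "C" for n in nodes.values()))
```

With a single initial cooperator, repeated copying of the better-performing peer drives the whole toy network to cooperation, mirroring the emergent behaviour described above.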
Abstract:
Understanding the canopy cover of an urban environment leads to better estimates of carbon storage and more informed management decisions by urban foresters. The most commonly used method for assessing urban forest cover type extent is ground surveys, which can be both time-consuming and expensive. The analysis of aerial photos is an alternative method that is faster, cheaper, and can cover a larger number of sites, but may be less accurate. The objectives of this paper were (1) to compare three methods of cover type assessment for Los Angeles, CA: hand-delineation of aerial photos in ArcMap, supervised classification of aerial photos in ERDAS Imagine, and ground-collected data using the Urban Forest Effects (UFORE) model protocol; (2) to determine how well remote sensing methods estimate carbon storage as predicted by the UFORE model; and (3) to explore the influence of tree diameter and tree density on carbon storage estimates. Four major cover types (bare ground, fine vegetation, coarse vegetation, and impervious surfaces) were determined from 348 plots (0.039 ha each) randomly stratified according to land use. Hand-delineation was better than supervised classification at predicting ground-based measurements of cover type and UFORE model-predicted carbon storage. Most error in supervised classification resulted from shadow, which was interpreted as unknown cover type. Neither tree diameter nor tree density per plot significantly affected the relationship between carbon storage and canopy cover. The efficiency of remote sensing, rather than in situ data collection, gives urban forest managers the ability to quickly assess a city and plan accordingly while preserving their often-limited budget.
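Plot-level agreement between a remote-sensing method and the ground survey can be scored as a simple percentage of matching cover-type labels. The labels below are hypothetical, using the study's four cover types:

```python
def percent_agreement(reference, predicted):
    """Percentage of plots where the assessed cover type matches the
    ground-reference cover type."""
    if len(reference) != len(predicted):
        raise ValueError("lists must align plot-by-plot")
    hits = sum(r == p for r, p in zip(reference, predicted))
    return 100.0 * hits / len(reference)

# Hypothetical plot labels for illustration only.
ground = ["coarse", "fine", "impervious", "bare", "coarse", "impervious"]
hand   = ["coarse", "fine", "impervious", "bare", "fine",   "impervious"]
print(round(percent_agreement(ground, hand), 1))  # -> 83.3
```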
Abstract:
Background: Bernese mountain dogs are reported to have a shorter life expectancy than other breeds. A major reason for this has been attributed to a high tumour prevalence, especially of histiocytic sarcoma. The efforts made by the breeding clubs to improve longevity with the help of genetic tests and breeding value estimations are impeded by insufficiently reliable diagnoses of the cause of death. The current standard for post mortem examination in animals is autopsy. In human forensic medicine, imaging modalities such as computed tomography and magnetic resonance imaging are used with increasing frequency as a complement to autopsy. The present study investigates whether post mortem computed tomography in combination with core needle biopsy is able to provide a definitive diagnosis of histiocytic sarcoma. For this purpose we analysed the results of post mortem computed tomography and core needle biopsy in eleven Bernese mountain dogs. At the subsequent autopsy, every dog had a definitive diagnosis of histiocytic sarcoma based on immunohistochemistry. Results: Computed tomography revealed space-occupying lesions in all dogs. Lesion detection by post mortem computed tomography was similar to lesion detection at autopsy for lung tissue (9 cases in computed tomography / 8 cases in autopsy), thoracic lymph nodes (9/8), spleen (6/7), kidney (2/2), and bone (3/3). Hepatic nodules, however, were difficult to detect with our scanning protocol (2/7). Histology of the core needle biopsies provided definitive diagnoses of histiocytic sarcoma in ten dogs, including confirmation by immunohistochemistry in six dogs. The biopsy samples of the remaining dog did not contain any identifiable neoplastic cells. Autolysis was the main reason for uncertain histological diagnoses.
Conclusions: Post mortem computed tomography is a fast and effective method for the detection of lesions suspicious for histiocytic sarcoma in pulmonary, thoracic lymphatic, splenic, osseous and renal tissue. Optimization of the procedure regarding the scanning protocol and tissue sample size and number will improve the accuracy of the method. Keywords: Post mortem computed tomography, Core needle biopsy, Bernese mountain dog, Histiocytic sarcoma, Autopsy
Abstract:
BACKGROUND Information about the impact of cancer treatments on patients' quality of life (QoL) is of paramount importance to patients and treating oncologists. Cancer trials that do not specify QoL as an outcome, or that fail to report collected QoL data, omit crucial information for decision making. To estimate the magnitude of these problems, we investigated how frequently QoL outcomes were specified in protocols of cancer trials and subsequently reported. DESIGN Retrospective cohort study of RCT protocols approved by six research ethics committees in Switzerland, Germany, and Canada between 2000 and 2003. We compared protocols to corresponding publications, which were identified through literature searches and investigator surveys. RESULTS Of the 173 cancer trials, 90 (52%) specified QoL outcomes in their protocol, 2 (1%) as primary and 88 (51%) as secondary outcome. Of the 173 trials, 35 (20%) reported QoL outcomes in a corresponding publication (4 modified from the protocol), 18 (10%) were published but failed to report QoL outcomes in the primary or a secondary publication, and 37 (21%) were not published at all. Of the 83 (48%) trials that did not specify QoL outcomes in their protocol, none subsequently reported QoL outcomes. Failure to report pre-specified QoL outcomes was not associated with industry sponsorship (versus non-industry), sample size, or multicentre (versus single centre) status, but possibly with trial discontinuation. CONCLUSIONS About half of cancer trials specified QoL outcomes in their protocols. However, only 20% reported any QoL data in associated publications. Highly relevant information for decision making is often unavailable to patients, oncologists, and health policymakers.
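The reported proportions follow directly from the cohort counts; a quick arithmetic check:

```python
def pct(n, total):
    """A count expressed as a whole-number percentage of the cohort."""
    return round(100 * n / total)

# Counts from the abstract: cohort of 173 cancer trial protocols.
total = 173
print(pct(90, total), pct(35, total), pct(83, total))  # -> 52 20 48
```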
Abstract:
OBJECTIVE To evaluate the role of an ultra-low-dose dual-source CT coronary angiography (CTCA) scan with high pitch for delimiting the range of the subsequent standard CTCA scan. METHODS 30 patients with an indication for CTCA were prospectively examined using a two-scan dual-source CTCA protocol (2.0 × 64.0 × 0.6 mm; pitch, 3.4; rotation time of 280 ms; 100 kV): Scan 1 was acquired with one-fifth of the tube current suggested by the automatic exposure control software [CareDose 4D™ (Siemens Healthcare, Erlangen, Germany) using 100 kV and 370 mAs as a reference], with the scan length extending from the tracheal bifurcation to the diaphragmatic border. Scan 2 was acquired with the standard tube current and a reduced scan length based on Scan 1. Nine central coronary artery segments were analysed qualitatively on both scans. RESULTS Scan 2 (105.1 ± 10.1 mm) was significantly shorter than Scan 1 (127.0 ± 8.7 mm). Image quality scores were significantly better for Scan 2. However, in 5 of 6 (83%) patients with stenotic coronary artery disease, a stenosis was already detected in Scan 1, and in 13 of 24 (54%) patients with non-stenotic coronary arteries, a stenosis was already excluded by Scan 1. Using Scan 2 as reference, the positive and negative predictive values of Scan 1 were 83% (5 of 6 patients) and 100% (13 of 13 patients), respectively. CONCLUSION An ultra-low-dose CTCA planning scan enables a reliable scan length reduction of the following standard CTCA scan and allows for correct diagnosis in a substantial proportion of patients. ADVANCES IN KNOWLEDGE Further dose reductions are possible owing to a change in the individual patient's imaging strategy, as a prior ultra-low-dose CTCA scan may already rule out the presence of a stenosis or may lead to a direct transferal to an invasive catheter procedure.
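The predictive values reported for Scan 1 follow directly from the 2×2 counts given in the abstract (5 true positives, 1 false positive, 13 true negatives, 0 false negatives):

```python
def predictive_values(tp, fp, tn, fn):
    """Positive and negative predictive value from 2x2 contingency counts."""
    ppv = tp / (tp + fp) if (tp + fp) else float("nan")
    npv = tn / (tn + fn) if (tn + fn) else float("nan")
    return ppv, npv

# Scan 1 vs Scan 2 (reference): stenosis detected in 5 of 6 positives,
# correctly excluded in 13 of 13 negatives.
ppv, npv = predictive_values(tp=5, fp=1, tn=13, fn=0)
print(round(100 * ppv), round(100 * npv))  # -> 83 100
```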
Abstract:
Background Compared to the general population, Helicobacter pylori infection is more common among adults with intellectual disability (ID) and is associated with greater levels of disability, maladaptive behaviour, and institutionalization. Little information exists about the effects of eradication therapy in this group, so we aimed to evaluate: (1) success of a standard H. pylori eradication protocol; (2) frequency of side-effects; and (3) impact of eradication on level of functional ability and maladaptive behaviour. Method A cohort of adults with ID underwent assessment of their levels of function and maladaptive behaviour, medical history, physical examination, and H. pylori testing using serology and faecal antigen tests. Some received standard H. pylori eradication therapy. Twelve months later, participants underwent repeat assessment, were grouped by change in H. pylori status and compared. Results Of 168 participants, 117 (70%) were currently infected with H. pylori at baseline, and 96 (82%) of the 117 were given standard H. pylori eradication therapy. The overall eradication rate was 61% but 31% reported side-effects. Institutional status of the participants, their level of behaviour or function, and number of comorbid medical conditions were not associated with failure of eradication. There were no statistically significant differences in level of behaviour or function, ferritin, or weight between the groups in whom H. pylori was eradicated or stayed positive. Conclusion Adults with ID have lower H. pylori eradication and higher side-effect rates than the general population. Levels of maladaptive behaviour and disability did not improve with eradication and thus greater levels of maladaptive behaviour or disability appear to be risk factors for, rather than consequences of, H. pylori infection.