979 results for low level laser
Abstract:
Pós-graduação em Odontologia - FOA
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Pós-graduação em Bases Gerais da Cirurgia - FMB
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Hsp70 is an essential molecular chaperone in protein metabolism, since it acts as a pivot with other molecular chaperone families. Several co-chaperones regulate the Hsp70 action cycle, for instance Hip (Hsp70-interacting protein). Hip is a tetratricopeptide repeat (TPR) protein that interacts with the ATPase domain of Hsp70 in the ADP state, stabilizing it and preventing substrate dissociation. Molecular chaperones from protozoans, which cause several neglected diseases, are poorly studied in terms of structure and function. Here, we investigated the structural features of Hip from the protozoan Leishmania braziliensis (LbHip), one of the causative agents of leishmaniasis. LbHip was heterologously expressed and purified in the folded state, as attested by circular dichroism and intrinsic fluorescence emission techniques. LbHip forms an elongated dimer, as observed by analytical gel filtration chromatography, analytical ultracentrifugation and small-angle X-ray scattering (SAXS). From the SAXS data, a low-resolution model was reconstructed, which sheds light on the structure of this protein, emphasizing its elongated shape and suggesting its domain organization. We also investigated the chemically induced unfolding behavior of LbHip, and two transitions were observed: the first related to the unfolding of the TPR domain of each protomer and the second to dissociation of the dimer. Altogether, LbHip presents a structure similar to mammalian Hip, despite their low level of conservation, suggesting that this class of eukaryotic protein may use a similar mechanism of action. (C) 2012 Elsevier Inc. All rights reserved.
Abstract:
Traditional supervised data classification considers only physical features (e.g., distance or similarity) of the input data. Here, this type of learning is called low-level classification. The human (animal) brain, on the other hand, performs both low and high orders of learning and readily identifies patterns according to the semantic meaning of the input data. Data classification that considers not only physical attributes but also pattern formation is here referred to as high-level classification. In this paper, we propose a hybrid classification technique that combines both types of learning. The low-level term can be implemented by any classification technique, while the high-level term is realized by extracting features of the underlying network constructed from the input data. Thus, the former classifies test instances by their physical features or class topologies, while the latter measures the compliance of test instances with the pattern formation of the data. Our study shows that the proposed technique not only realizes classification according to pattern formation, but also improves the performance of traditional classification techniques. Furthermore, as the complexity of the class configuration increases, for example through greater mixture among different classes, a larger portion of the high-level term is required for correct classification. This confirms that high-level classification is especially important in complex classification settings. Finally, we show how the proposed technique can be employed in a real-world application, where it identifies variations and distortions of handwritten digit images, yielding an improvement in the overall pattern recognition rate.
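The hybrid rule described above can be sketched as a convex combination of the two terms. The function names, the toy membership scores, and the weighting scheme below are illustrative assumptions, not the paper's actual implementation:

```python
# Hypothetical sketch of the hybrid classification rule: the final score is a
# convex combination of a low-level (physical-feature) term and a high-level
# (pattern-compliance) term, weighted by lam in [0, 1].

def hybrid_score(low_level, high_level, lam):
    """Combine the two terms; lam weights the high-level term."""
    if not 0.0 <= lam <= 1.0:
        raise ValueError("lam must lie in [0, 1]")
    return (1.0 - lam) * low_level + lam * high_level

def classify(scores_low, scores_high, lam):
    """Pick the class whose combined membership score is highest."""
    return max(scores_low, key=lambda c: hybrid_score(scores_low[c], scores_high[c], lam))

# Toy example: the low-level term favors class "A", but the high-level
# (pattern-formation) term favors "B"; a large lam flips the decision,
# mirroring the abstract's point about complex class configurations.
low  = {"A": 0.8, "B": 0.2}
high = {"A": 0.3, "B": 0.9}
print(classify(low, high, lam=0.1))  # "A" (decision dominated by low level)
print(classify(low, high, lam=0.9))  # "B" (decision dominated by high level)
```

In this sketch the low-level scores could come from any off-the-shelf classifier's class probabilities, which matches the abstract's claim that the low-level term is pluggable.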
Abstract:
To assess adherence to proton pump inhibitor (PPI) treatment and associated variables in patients with gastroesophageal reflux disease (GERD). A cross-sectional, prospective study comprising 240 consecutive adult patients diagnosed with GERD, for whom continuous use of a standard or double dose of omeprazole had been prescribed. Patients were classified as ne-GERD (162; 67.5%) or e-GERD, graded according to the Los Angeles classification as A (48; 20.0%), B (21; 8.6%), C (1; 0.5%), D (1; 0.5%), or Barrett's esophagus (7; 2.9%). The Morisky questionnaire was applied to assess adherence to therapy and a GERD questionnaire to assess symptoms and their impact. Adherence was correlated with demographics, co-therapies, comorbidities, treatment duration, symptom scores, endoscopic findings, and patients' awareness of their disease. 126 patients (52.5%) exhibited a high level of adherence and 114 (47.5%) a low level. Younger (P = 0.002) or married (OR 2.41, P = 0.03 vs. widowers) patients had lower levels of adherence, as did symptomatic patients (P = 0.02). None of the other variables studied influenced adherence. Patients with GERD attending a tertiary referral hospital in Sao Paulo exhibited a high rate of low adherence to the prescribed PPI therapy, which may play a role in therapy failure. Age < 60 years, marital status, and being symptomatic were risk factors for low adherence.
Abstract:
Over the last few years, low-level light therapy (LLLT) has shown remarkable suitability for a wide range of applications in central nervous system (CNS) related diseases. In this therapeutic modality light dosimetry is extremely critical, so the study of light propagation through CNS organs is of great importance. To better understand how light intensity is delivered to the most relevant neural sites, we evaluated optical transmission through slices of rat brain point by point. We used red (λ = 660 nm) and near-infrared (λ = 808 nm) diode laser light, analyzing light penetration and distribution in the whole brain. A fresh Wistar rat (Rattus norvegicus) brain was cut into sagittal slices and illuminated with a broad light beam. A high-resolution digital camera was employed to acquire images of the transmitted light. Spatial profiles of the light transmitted through the sample were obtained from the images. Peaks and valleys in the profiles indicate sites where light was less or more attenuated. The peak intensities provide information about total attenuation, and the peak widths correlate with the scattering coefficient of that individual portion of the sample. The outcomes of this study provide valuable information for LLLT dose-dependent studies involving the CNS and highlight the importance of LLLT dosimetry in CNS organs for a wide range of applications in animal and human diseases.
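The relation between transmitted intensity and total attenuation that the profile analysis relies on can be sketched with the standard Beer-Lambert law. This is a generic illustration, not the study's dosimetry code; the attenuation coefficients below are order-of-magnitude assumptions, not measured values:

```python
import math

# Beer-Lambert sketch: transmitted intensity through a slab of thickness d
# with effective attenuation coefficient mu_t is I = I0 * exp(-mu_t * d).

def transmitted_intensity(i0_mw_cm2, mu_t_per_mm, thickness_mm):
    """Intensity remaining after the beam crosses the tissue slice."""
    return i0_mw_cm2 * math.exp(-mu_t_per_mm * thickness_mm)

# Assumed effective coefficients for illustration only: near-infrared light
# (808 nm) is typically attenuated less by tissue than red light (660 nm).
for label, mu in (("660 nm", 0.9), ("808 nm", 0.6)):
    i = transmitted_intensity(100.0, mu, 2.0)  # 2 mm slice, 100 mW/cm^2 input
    print(f"{label}: {i:.1f} mW/cm^2 transmitted")
```

Under these assumed coefficients the NIR wavelength delivers more light through the same slice, which is the qualitative behavior that makes wavelength choice a dosimetry variable.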
Abstract:
Virtual machines emulating hardware devices are generally implemented in low-level languages and in a low-level style for performance reasons. This trend results in systems that are largely difficult to understand, difficult to extend, and unmaintainable. As new general techniques for virtual machines arise, it becomes harder to incorporate or test them because of early design and optimization decisions. In this paper we show how such decisions can be postponed to later phases by separating virtual machine implementation issues from the high-level machine-specific model. We construct compact models of whole-system VMs in a high-level language, which exclude all low-level implementation details. We use the pluggable translation toolchain PyPy to translate those models to executables. During the translation process, the toolchain reintroduces the VM implementation and optimization details for specific target platforms. As a case study we implement an executable model of a hardware gaming device. We show that our approach to VM building increases understandability, maintainability and extensibility while preserving performance.
Abstract:
Virtual machines (VMs) emulating hardware devices are generally implemented in low-level languages for performance reasons. This results in unmaintainable systems that are difficult to understand. In this paper we report on our experience using the PyPy toolchain to improve the portability and reduce the complexity of whole-system VM implementations. As a case study we implement a VM prototype for a Nintendo Game Boy, called PyGirl, in which the high-level model is separated from low-level VM implementation issues. We shed light on the process of refactoring from a low-level VM implementation in Java to a high-level model in RPython. We show that our whole-system VM written with PyPy is significantly less complex than standard implementations, without substantial loss in performance.
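The "high-level model" idea in the two abstracts above can be illustrated with a toy machine: the hardware is described as plain objects and a fetch-decode-execute loop, with no low-level optimization details. This is not PyGirl's actual code; the opcodes and register names are invented for the sketch:

```python
# Hypothetical high-level whole-system VM model: one accumulator register,
# a program counter, and an interpreter loop, free of implementation details
# (in the PyPy approach, such details would be reintroduced at translation).

class TinyCPU:
    def __init__(self, program):
        self.a = 0           # accumulator register
        self.pc = 0          # program counter
        self.program = program

    def step(self):
        """Fetch, decode, and execute one instruction; False stops the run."""
        op, arg = self.program[self.pc]
        self.pc += 1
        if op == "LOAD":
            self.a = arg
        elif op == "ADD":
            self.a += arg
        elif op == "HALT":
            return False
        else:
            raise ValueError(f"unknown opcode {op!r}")
        return True

    def run(self):
        while self.pc < len(self.program) and self.step():
            pass
        return self.a

cpu = TinyCPU([("LOAD", 2), ("ADD", 3), ("HALT", 0)])
print(cpu.run())  # accumulator holds 5 when the program halts
```

Because the model is ordinary high-level code, it stays easy to read, test, and extend, which is the maintainability argument both abstracts make.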
Abstract:
The isotopic abundance of 85Kr in the atmosphere, currently at the level of 10−11, has increased by orders of magnitude since the dawn of the nuclear age. With a half-life of 10.76 years, 85Kr is of great interest as a tracer for environmental samples such as air, groundwater and ice. Atom Trap Trace Analysis (ATTA) is an emerging method for analyzing rare krypton isotopes at isotopic abundances as low as 10−14 using krypton gas samples of a few microliters. Both the reliability and reproducibility of the method are examined in the present study by an inter-comparison among different instruments. The 85Kr/Kr ratios of 12 samples, in the range of 10−13 to 10−10, were measured independently in three laboratories: a low-level counting laboratory in Bern, Switzerland, and two ATTA laboratories, one in Hefei, China, and another in Argonne, USA. The results agree at the 5% precision level.
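The decay arithmetic that makes 85Kr useful as a tracer follows directly from the half-life quoted above. A minimal sketch, with function names chosen for illustration:

```python
import math

# 85Kr decays with a half-life of 10.76 years (from the abstract), so the
# 85Kr/Kr ratio of an isolated sample falls exponentially; measuring the
# ratio therefore dates the sample relative to its atmospheric source.
HALF_LIFE_YR = 10.76

def ratio_after(initial_ratio, years):
    """85Kr/Kr ratio remaining after `years` of decay in an isolated sample."""
    return initial_ratio * 0.5 ** (years / HALF_LIFE_YR)

def age_from_ratio(initial_ratio, measured_ratio):
    """Invert the decay law to estimate the sample age in years."""
    return HALF_LIFE_YR * math.log(initial_ratio / measured_ratio) / math.log(2)

# One half-life halves the ratio: a 1e-11 source ratio becomes 5e-12.
print(ratio_after(1.0e-11, 10.76))
print(age_from_ratio(1.0e-11, 5.0e-12))
```

The measured ratios in the study (10−13 to 10−10) span roughly ten half-lives of decay from typical atmospheric levels, which is why instrument agreement at the few-percent level matters.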
Abstract:
The reconstruction of low-latitude ocean-atmosphere interactions is one of the major issues of (paleo-)environmental studies. The trade winds, extending over 20° to 30° of latitude in both hemispheres, between the subtropical highs and the intertropical convergence zone, are major components of the atmospheric circulation and little is known about their long-term variability on geological time-scales, in particular in the Pacific sector. We present the modern spatial pattern of eolian-derived marine sediments in the eastern equatorial and subtropical Pacific (10°N to 25°S) as a reference data set for the interpretation of SE Pacific paleo-dust records. The terrigenous silt and clay fractions of 75 surface sediment samples have been investigated for their grain-size distribution and clay-mineral compositions, respectively, to identify their provenances and transport agents. Dust delivered to the southeast Pacific from the semi- to hyper-arid areas of Peru and Chile is rather fine-grained (4-8 µm) due to low-level transport within the southeast trade winds. Nevertheless, wind is the dominant transport agent and eolian material is the dominant terrigenous component west of the Peru-Chile Trench south of ~ 5°S. Grain-size distributions alone are insufficient to identify the eolian signal in marine sediments due to authigenic particle formation on the sub-oceanic ridges and abundant volcanic glass around the Galapagos Islands. Together with the clay-mineral compositions of the clay fraction, we have identified the dust lobe extending from the coasts of Peru and Chile onto Galapagos Rise as well as across the equator into the doldrums. Illite is a very useful parameter to identify source areas of dust in this smectite-dominated study area.
Abstract:
Most implementations of parallel logic programming rely on complex low-level machinery which is arguably difficult to implement and modify. We explore an alternative approach aimed at taming that complexity by raising core parts of the implementation to the source language level for the particular case of and-parallelism. Therefore, we handle a significant portion of the parallel implementation mechanism at the Prolog level with the help of a comparatively small number of concurrency-related primitives which take care of lower-level tasks such as locking, thread management, stack set management, etc. The approach does not eliminate altogether modifications to the abstract machine, but it does greatly simplify them and it also facilitates experimenting with different alternatives. We show how this approach allows implementing both restricted and unrestricted (i.e., non fork-join) parallelism. Preliminary experiments show that the amount of performance sacrificed is reasonable, although granularity control is required in some cases. Also, we observe that the availability of unrestricted parallelism contributes to better observed speedups.
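The restricted (fork-join) flavor of and-parallelism mentioned above can be sketched at the source level in any language with threads: independent goals are forked, run concurrently, and joined before the continuation. This Python sketch is only an analogue of the idea; the goal functions are invented, and the paper's actual mechanism is Prolog-level primitives over an abstract machine:

```python
from concurrent.futures import ThreadPoolExecutor

# Two independent "goals" that share no data, so they may run in parallel
# without synchronization, as in restricted and-parallelism.

def goal_fib(n):
    return n if n < 2 else goal_fib(n - 1) + goal_fib(n - 2)

def goal_sum(n):
    return sum(range(n + 1))

def fork_join(goals):
    """Fork each (function, argument) goal, then join: wait for all results."""
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(g, arg) for g, arg in goals]
        return [f.result() for f in futures]

results = fork_join([(goal_fib, 10), (goal_sum, 100)])
print(results)  # [55, 5050]
```

Unrestricted parallelism, by contrast, would allow joining goals at points other than a single barrier, which is why the abstract treats it as the harder case.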
Abstract:
This study has two distinct parts: 1. an in vivo part (randomized and longitudinal) that aimed to evaluate treatment protocols for dentin hypersensitivity using low-level laser (at different doses), high-power laser, and a desensitizing agent, over periods of 12 and 18 months; and 2. an in vitro part that aimed to analyze the structure loss caused by two different dentifrices (Colgate Total 12 and Colgate Pró Alívio) and to analyze the dentin permeability of the treatments from part 1, combined with the dentifrices, after different abrasion cycles. In the in vivo part, the non-carious cervical lesions of 32 volunteers, previously screened against the eligibility and exclusion criteria, were divided into nine groups (n = 10): G1: Gluma Desensitizer (Heraeus Kulzer); G2: low-level laser at low dose (Photon Lase, DMC) (three buccal irradiation points and one apical point: 30 mW, 10 J/cm², 9 s per point at a wavelength of 810 nm; three sessions were performed at 72-hour intervals); G3: low-level laser at high dose (one cervical point and one apical point: 100 mW, 90 J/cm², 11 s per point at a wavelength of 810 nm; three sessions at 72-hour intervals); G4: low-level laser at low dose + Gluma Desensitizer; G5: low-level laser at high dose + Gluma Desensitizer; G6: Nd:YAG laser (Power LaserTM ST6, Lares Research®), in contact with the tooth surface: 1.0 W, 10 Hz and 100 mJ, ≈ 85 J/cm², at a wavelength of 1064 nm; G7: Nd:YAG laser + Gluma Desensitizer; G8: Nd:YAG laser + low-level laser at low dose; G9: Nd:YAG laser + low-level laser at high dose. Each volunteer's sensitivity level was assessed on a visual analogue pain scale (VAS), using air from the triple syringe and probing with an explorer, 12 and 18 months after treatment.
In part 2, in vitro, unerupted, freshly extracted human third molars were used. All were cleaned and had their roots separated from the crowns. The roots were sectioned into 4 × 4 × 2 mm dentin squares, which were embedded in epoxy resin and polished to a curvature of 0.3 µm, verified by optical profilometry. The specimens were immersed in 17% EDTA solution for 2 min to open the tubules and stored in fetal bovine serum diluted in phosphate-buffered saline. The specimens were randomly divided into 12 groups (n = 10): G1: no surface treatment, no dentifrice; G2: Nd:YAG/no dentifrice; G3: Gluma/no dentifrice; G4: Nd:YAG + Gluma/no dentifrice; G5: no surface treatment/Colgate Total 12; G6: Nd:YAG/Colgate Total 12; G7: Gluma/Colgate Total 12; G8: Nd:YAG + Gluma/Colgate Total 12; G9: no surface treatment/Colgate Pró Alívio; G10: Nd:YAG/Colgate Pró Alívio; G11: Gluma/Colgate Pró Alívio; G12: Nd:YAG + Gluma/Colgate Pró Alívio. Adhesive tape was then applied to both margins of each surface, leaving a central exposed test area of 4 × 1 mm, where the surface treatments and the abrasion cycles corresponding to 1, 7, 30 and 90 days of brushing were performed (52 cycles, 210 s of slurry contact; 361 cycles, 1470 s; 1545 cycles, 6300 s; and 4635 cycles, 18900 s, respectively). After each abrasion stage, optical profilometry analysis was performed. For the permeability and scanning electron microscopy (SEM) analyses, circular dentin samples 6 mm in diameter and 1 mm thick, obtained from the dental crowns, were used. These were randomly divided into the same groups described above, with 120 specimens used for permeability (n = 10) and 36 for SEM (n = 3).
Both analyses were performed after EDTA immersion; after the sensitivity treatments; and after 1, 7, 30 and 90 days of brushing. Statistical analysis showed that, in vivo, all treatments were effective in reducing dentin hypersensitivity. Although patients' sensitivity levels increased numerically, the differences were not statistically significant from 12 months onward. Therefore, up to the 18-month evaluation, we conclude that dentin sensitivity did not increase after its post-treatment reduction. In vitro, all treatments reduced dentin permeability. The Total 12 dentifrice proved more abrasive than Pró Alívio, which caused less structure loss, but neither increased permeability over the brushing periods. Scanning electron micrographs showed smear layer formation occluding the tubules for both dentifrices. In conclusion, all desensitizing agents were effective, despite their different mechanisms of action. The dentifrices are equally suitable for home use because they produce tubular occlusion, and the combination of home and in-office treatments appears to be an effective alternative for treating dentin hypersensitivity.
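The laser parameters in the protocols above are linked by the standard fluence relation, fluence (J/cm²) = power (W) × exposure time (s) / spot area (cm²). A minimal sketch of that arithmetic; the spot area used below is back-calculated from the stated low-dose parameters and is an assumption, not a value reported in the study:

```python
# Dose (fluence) arithmetic behind low-level-laser protocols:
# fluence = power * time / spot area.

def fluence_j_per_cm2(power_w, time_s, spot_area_cm2):
    """Energy density delivered to the irradiated spot."""
    if spot_area_cm2 <= 0:
        raise ValueError("spot area must be positive")
    return power_w * time_s / spot_area_cm2

# Low-dose protocol: 30 mW for 9 s per point. An assumed 0.027 cm^2 spot
# reproduces the stated 10 J/cm^2 (0.030 W * 9 s = 0.27 J over 0.027 cm^2).
print(fluence_j_per_cm2(0.030, 9, 0.027))  # ≈ 10 J/cm^2
```

The same relation shows why the high-dose group reaches 90 J/cm² despite only a modestly longer exposure: the higher power and a smaller effective spot both raise the energy density.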