888 results for In search of lost time


Abstract:

The purpose of this in vitro study was to evaluate the effect of etching time on the tensile bond strength (TBS) of a conventional adhesive bonded to dentin previously irradiated with erbium:yttrium-aluminum-garnet (Er:YAG) and erbium,chromium:yttrium-scandium-gallium-garnet (Er,Cr:YSGG) lasers. Buccal and lingual surfaces of 45 third molars were flattened until the dentin was exposed and randomly assigned to three groups (n = 30) according to the dentin treatment: control (not irradiated), irradiated with the Er:YAG laser (1 W; 250 mJ; 4 Hz; 80.6 J/cm²), or irradiated with the Er,Cr:YSGG laser (4 W; 200 mJ; 20 Hz; 71.4 J/cm²); each group was then divided into three subgroups (n = 10) according to acid etching time (15 s, 30 s, or 60 s). After acid etching, the adhesive was applied, followed by the construction of an inverted cone of composite resin. The samples were immersed in distilled water (37 °C for 24 h) and subjected to the TBS test [50 kilogram-force (kgf), 0.5 mm/min]. Data were analyzed by analysis of variance (ANOVA) and Tukey statistical tests (P ≤ 0.05). Control group samples presented significantly higher TBS values than those of all lased groups. Both irradiated groups exhibited similar TBS values. Samples subjected to the different etching times within each experimental group presented similar TBS. Under the conditions of this in vitro study, we conclude that Er:YAG and Er,Cr:YSGG laser irradiation of dentin weakens the bond strength of the adhesive, and that increased etching time is not able to modify the bond strength of the adhesive to irradiated dentin.

Abstract:

We study the implications of the searches based on H → τ⁺τ⁻ by the ATLAS and CMS collaborations for the parameter space of the two-Higgs-doublet model (2HDM). In the 2HDM, the scalars can decay into a tau pair with a branching ratio larger than that of the Standard Model, leading to constraints on the 2HDM parameter space. We show that in model II, values of tan β > 1.8 are definitively excluded if the pseudoscalar is in the mass range 110 GeV < m_A < 145 GeV. We also discuss the implications for the 2HDM of the recent dimuon search by the ATLAS collaboration for a CP-odd scalar in the mass range 4-12 GeV.

Abstract:

A haptoglobin assay, a highly sensitive method for detecting intravascular hemolysis, was carried out on the sera of 19 patients referred to Hospital Vital Brazil with the cutaneous form of loxoscelism, in order to investigate the occurrence of mild intravascular hemolysis. Data from this series did not show decreased levels of haptoglobin, ruling out intravascular hemolysis in these patients with the cutaneous form of loxoscelism.

Abstract:

Microcystin-LR (MC-LR) is a dangerous toxin found in environmental waters, quantified by high-performance liquid chromatography and/or enzyme-linked immunosorbent assays. Quick, low-cost, on-site analysis is thus required to ensure human safety and to support wide screening programs. This work proposes label-free potentiometric sensors made of solid-contact electrodes coated with a surface-imprinted polymer formed on multi-walled carbon nanotubes (CNTs) incorporated in a polyvinyl chloride membrane. The imprinting effect was checked against non-imprinted materials. The MC-LR-sensitive sensors were evaluated, characterized, and applied successfully to spiked environmental waters. The presented method offers the advantages of low cost, portability, easy operation, and suitability for adaptation to flow methods.

Abstract:

Real-time monitoring applications may be used in a wireless sensor network (WSN) and may generate packet flows with strict quality-of-service requirements in terms of delay, jitter, or packet loss. When strict delays are imposed from source to destination, packets must be delivered at the destination within a hard end-to-end delay (EED) limit in order to be considered useful. Since WSN nodes are scarce in both processing and energy resources, it is desirable that they transport only useful data, as this enhances overall network performance and improves energy efficiency. In this paper, we propose a novel cross-layer admission control (CLAC) mechanism to enhance network performance and increase the energy efficiency of a WSN by avoiding the transmission of potentially useless packets. The CLAC mechanism uses an estimation technique to predict a packet's EED, and forwards a packet only if it is expected to meet the EED deadline defined by the application, dropping it otherwise. The results obtained show that CLAC enhances network performance by increasing the useful packet delivery ratio under high network loads, and improves energy efficiency at every network load.
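The admission decision described in this abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the packet fields and the way the remaining delay is estimated are hypothetical placeholders for whatever estimation technique the mechanism actually uses.

```python
def clac_admit(elapsed_delay, estimated_remaining_delay, eed_deadline):
    """Cross-layer admission control sketch: forward a packet only if its
    predicted end-to-end delay stays within the application's deadline."""
    predicted_eed = elapsed_delay + estimated_remaining_delay
    return predicted_eed <= eed_deadline

# Hypothetical packets: (delay accumulated so far, estimated remaining delay,
# application EED deadline), all in milliseconds.
packets = [(40, 50, 100), (80, 30, 100), (10, 20, 100)]
decisions = [clac_admit(e, r, d) for e, r, d in packets]
# Packets predicted to miss the deadline are dropped rather than forwarded,
# saving the energy their remaining hops would have consumed.
```

The second packet (80 ms elapsed, 30 ms still expected) is dropped at the intermediate node, which is exactly the energy-saving behaviour the mechanism aims for.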

Abstract:

Enterococci are Gram-positive cocci, saprophytes of the human gastrointestinal tract, commensals that can act as opportunistic pathogens. They can cause infections in patients hospitalized for a long time or who have received multiple courses of antibiotic therapy. Enterococcus faecalis and Enterococcus faecium are the most common species in human infections. To evaluate the possibility of rapid detection of these species and their occurrence in the blood of newborns with suspected nosocomial infection, blood samples were collected from 50 newborns with late-onset infections admitted to the Neonatal Care Unit of the University Hospital of the Federal University of Mato Grosso do Sul (UFMS-HU) from September 2010 to January 2011. The samples were subjected to conventional PCR and real-time PCR (qPCR) to search for Enterococcus faecalis and Enterococcus faecium, respectively. The PCR results were compared with the respective blood cultures from 40 patients. No blood cultures were positive for enterococci; however, genomic DNA of Enterococcus faecium was detected in eight blood samples by qPCR, and genomic DNA of Enterococcus faecalis was detected in 22 blood samples by conventional PCR. These findings are important because of the clinical severity of the evaluated patients who were positive by PCR but not by routine microbiological methods.

Abstract:

Faced with the stagnation of uniprocessor technology over the past decade, the major microprocessor manufacturers found in multi-core technology the answer to the market's growing processing needs. For years, software developers watched their applications ride the performance gains delivered by each new generation of sequential processors; but now that processing capacity scales with the number of processors, sequential computations must be decomposed into several concurrent parts that can execute in parallel, so that they can use the additional processing units and complete sooner. Parallel programming implies a paradigm completely distinct from sequential programming. Unlike the sequential computers typified by the Von Neumann model, the heterogeneity of parallel architectures calls for parallel programming models that abstract the programmer away from architectural details and simplify the development of concurrent applications. The most popular parallel programming models encourage programmers to identify concurrency in their program logic and to express it as tasks that can be assigned to distinct processors for simultaneous execution. These tasks are typically spawned at run time and assigned to processors by the underlying runtime system. Since processing requirements tend to vary and are not known a priori, the mapping of tasks to processors must be determined dynamically, in response to unpredictable changes in the execution requirements. As the volume of computation grows, it becomes increasingly infeasible to guarantee its timing constraints on uniprocessor platforms.
As real-time systems begin to adapt to the parallel computing paradigm, there is a growing drive to integrate real-time executions with interactive applications on the same hardware, in a world where technology becomes ever smaller, lighter, more ubiquitous, and more portable. This integration requires scheduling solutions that simultaneously guarantee the timing requirements of real-time tasks and maintain an acceptable level of QoS for the remaining executions. To this end, it is imperative that real-time applications parallelize, so as to minimize their response times and maximize the utilization of the processing resources. This introduces a new dimension to the scheduling problem, which must respond correctly to new, unpredictable execution requirements and quickly derive the task-to-processor mapping that best serves the system's performance criteria. Server-based scheduling makes it possible to reserve a fraction of the processing capacity for the execution of real-time tasks, and to ensure that latency effects on their execution do not affect the reservations stipulated for other executions. For tasks scheduled by their worst-case execution time, or tasks with variable execution times, it is likely that the reserved bandwidth will not be fully consumed. To improve system utilization, capacity-sharing algorithms donate the unused capacity to the execution of other tasks while preserving the isolation guarantees between servers. With proven efficiency in terms of space, time, and communication, the work-stealing mechanism has been gaining popularity as a methodology for scheduling tasks with dynamic and irregular parallelism. The p-CSWS algorithm combines server-based scheduling with capacity-sharing and work-stealing to meet the scheduling needs of open real-time systems.
While server-based scheduling allows processing resources to be shared without delay interference, a new work-stealing policy built on the capacity-sharing mechanism exploits parallelism in a way that improves application response times and system utilization. This thesis proposes an implementation of the p-CSWS algorithm for Linux. In keeping with the modular structure of the Linux scheduler, a new scheduling class is defined to evaluate the applicability of the p-CSWS heuristic under real conditions. Having overcome the obstacles inherent to Linux kernel programming, extensive experimental tests show that p-CSWS is more than an attractive theoretical concept, and that the heuristic exploitation of parallelism proposed by the algorithm benefits the response times of real-time applications as well as the performance and efficiency of the multiprocessor platform.
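As a rough illustration of the work-stealing idea discussed above, the sketch below simulates workers that pop tasks from the tail of their own deque and, when idle, steal from the head of another worker's deque. This is a generic, single-threaded simulation of the classic mechanism, not the p-CSWS algorithm or the thesis's Linux implementation; the function name and task labels are invented for the example.

```python
import random
from collections import deque

def work_stealing_run(task_lists, seed=0):
    """Run every task to completion; idle workers steal from busy ones."""
    rng = random.Random(seed)
    deques = [deque(tasks) for tasks in task_lists]  # one deque per worker
    executed = []
    while any(deques):
        for i, dq in enumerate(deques):
            if dq:
                executed.append(dq.pop())        # owner pops from its own tail
            else:
                victims = [d for d in deques if d]
                if victims:
                    victim = rng.choice(victims)
                    dq.append(victim.popleft())  # thief steals the victim's head
    return executed

done = work_stealing_run([["a1", "a2", "a3"], [], ["c1"]])
# Every task runs exactly once, regardless of which worker executes it.
```

Owners and thieves operate on opposite ends of the deque, which is what keeps contention low in real lock-free implementations; here the single-threaded loop only demonstrates the task-migration behaviour.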

Abstract:

INTRODUCTION: HTLV-1/2 screening among blood donors commonly utilizes an enzyme-linked immunosorbent assay (EIA), followed by a confirmatory method such as Western blot (WB) if the EIA is positive. However, this algorithm yields a high rate of inconclusive results, and is expensive. METHODS: Two qualitative real-time PCR assays were developed to detect HTLV-1 and 2, and a total of 318 samples were tested (152 blood donors, 108 asymptomatic carriers, 26 HAM/TSP patients and 30 seronegative individuals). RESULTS: The sensitivity and specificity of PCR in comparison with WB results were 99.4% and 98.5%, respectively. PCR tests were more efficient for identifying the virus type, detecting HTLV-2 infection and defining inconclusive cases. CONCLUSIONS: Because real-time PCR is sensitive and practical and costs much less than WB, this technique can be used as a confirmatory test for HTLV in blood banks, as a replacement for WB.
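The sensitivity and specificity figures quoted above are standard 2×2 comparisons against the WB reference. The counts below are hypothetical, chosen only to illustrate the formulas; they are not the study's actual contingency table.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP),
    both computed relative to the reference (Western blot) result."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical counts against the WB reference standard.
sens, spec = sensitivity_specificity(tp=95, fn=5, tn=180, fp=20)
# sens = 0.95, spec = 0.90
```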

Abstract:

The value given by commuters to the variability of travel times is empirically analysed using stated preference data from Barcelona (Spain). Respondents are asked to choose between alternatives that differ in terms of cost, average travel time, variability of travel times and departure time. Different specifications of a scheduling choice model are used to measure the influence of various socioeconomic characteristics. Our results show that travel time variability.

Abstract:

Concomitant immunity in the presence of repeated infections (with 15 cercariae) was studied in mice sacrificed on the 20th day after each infection. Comparison of the average numbers of immature worms recovered from mice submitted to reinfection with those of their respective controls (previously uninfected) showed a significantly lower worm recovery rate in the animals with previous infections (concomitant immunity). However, statistically significant differences could not be detected among the various groups of animals when the mice that had accumulated worms in the mature stage were perfused. A theoretical projection based on the accumulation of young worms developing into adults indicates a lower recovery rate of adult worms in the animals with concomitant immunity, but this projection was not corroborated by the experimental data. The visceral hemodynamic alterations that occur in reinfections, owing to the pathology, may favour recirculation of recently arrived worms to other organs at the time of perfusion of the portal system. These results suggest that special care should be taken when investigating concomitant immunity in mice based on distinguishing the immature worms from a challenge infection from the mature worms from a primary infection.