46 results for "Conventional methodologies"
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
Pathogen detection in foods by reliable methodologies is very important to guarantee microbiological safety. However, peculiar characteristics of certain foods, such as the autochthonous microbiota, can directly influence pathogen development and detection. To verify the performance of the official analytical methodologies for the isolation of Listeria monocytogenes and Salmonella in milk, different concentrations of these pathogens were inoculated into raw milk treatments with different levels of mesophilic aerobes, which were then submitted to the traditional isolation procedures for the inoculated pathogens. Listeria monocytogenes was inoculated at 0.2-5.2 log CFU/mL in treatments with 1.8-8.2 log CFU/mL of mesophilic aerobes. Salmonella Enteritidis was inoculated at 0.9-3.9 log CFU/mL in treatments with 3.0-8.2 log CFU/mL. The results indicated that recovery was impaired or impossible in the treatments with high counts of mesophilic aerobes and low levels of the pathogens, indicating interference from the raw milk autochthonous microbiota. This interference was more evident for L. monocytogenes, since recovery of this pathogen was not possible in treatments with mesophilic aerobe counts of 4.0 log CFU/mL or higher and inoculum levels below 2.0 log CFU/mL. For S. Enteritidis, the interference appeared to be less specific. (C) 2007 Elsevier GmbH. All rights reserved.
Abstract:
For centuries, specific instruments or regular toothbrushes have routinely been used to remove tongue biofilm and improve breath odor. Toothbrushes with a tongue scraper on the back of the head have recently been introduced to the market. The present study compared the effectiveness of a manual toothbrush with this new design, i.e., possessing a tongue scraper, and a commercial tongue scraper in improving breath odor and reducing the aerobic and anaerobic microbiota of the tongue surface. Evaluations occurred at 4 time points, when the participants (n=30) had their halitosis quantified with a halimeter and scored according to a 4-point scoring system corresponding to different levels of intensity. Saliva was collected for counts of aerobic and anaerobic microorganisms. Data were analyzed statistically by Friedman's test (p<0.05). When differences were detected, the Wilcoxon test with Bonferroni correction was used for pairwise multiple comparisons. The results confirmed the importance of mechanical cleaning of the tongue, since this procedure improved halitosis and reduced aerobe and anaerobe counts. Regarding the evaluated methods, the toothbrush's tongue scraper and the conventional tongue scraper performed similarly in terms of breath improvement and reduction of the tongue microbiota, and both may be indicated as effective methods for tongue cleaning.
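The statistical procedure described in this abstract — Friedman's test across paired conditions, followed by Bonferroni-corrected Wilcoxon tests for pairwise comparisons — can be sketched in Python with SciPy. The score data below are synthetic placeholders, not the study's measurements, and the condition names are illustrative:

```python
import numpy as np
from scipy import stats
from itertools import combinations

# Hypothetical halitosis scores (0-3 scale) for the same 30 participants
# under three conditions; the real study's data are not reproduced here.
rng = np.random.default_rng(0)
baseline = rng.integers(1, 4, size=30)
toothbrush_scraper = np.clip(baseline - rng.integers(0, 2, size=30), 0, 3)
conventional_scraper = np.clip(baseline - rng.integers(0, 2, size=30), 0, 3)

groups = {"baseline": baseline,
          "toothbrush_scraper": toothbrush_scraper,
          "conventional_scraper": conventional_scraper}

# Friedman test across the repeated-measures conditions (alpha = 0.05)
stat, p = stats.friedmanchisquare(*groups.values())
print(f"Friedman chi2={stat:.2f}, p={p:.4f}")

# If significant, pairwise Wilcoxon tests at a Bonferroni-adjusted threshold
pairs = list(combinations(groups, 2))
alpha_corrected = 0.05 / len(pairs)
if p < 0.05:
    for a, b in pairs:
        w, pw = stats.wilcoxon(groups[a], groups[b])
        verdict = "significant" if pw < alpha_corrected else "n.s."
        print(f"{a} vs {b}: p={pw:.4f} ({verdict})")
```

The Bonferroni adjustment simply divides the significance level by the number of pairwise comparisons, which is the standard conservative correction for this design.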
Abstract:
This in vivo study evaluated the quality of dissociation of maxillary premolar roots obtained by combining variations of vertical and horizontal angulations with X-ray holders (Rinn-XCP), and compared two types of intraoral radiography systems: conventional film (Kodak Insight, Rochester, USA) and digital radiography (Kodak RVG 6100, Kodak, Rochester, USA). The study sample comprised 20 patients with a total of 20 maxillary premolars, radiographed using the paralleling technique (GP), with a 20º variation of the horizontal angle (GM), and with a 25º variation of the horizontal angle combined with a 15º vertical angle (GMV). Each image was independently analyzed by two experienced examiners, who assigned a score to the diagnostic capability of root dissociation and measured the distance between the apexes. Statistical analysis used the Wilcoxon signed-rank, Friedman, and t tests. The mean measured distances between buccal and lingual root apexes were greatest for the GMV, ranging from 2.3 mm to 3.3 mm. A statistically significant difference was found between GM and GMV when compared to GP (p < 0.01). The best diagnostic image for root dissociation was obtained with the GMV. These results support the use of anterior X-ray holders, which offer a better combined deviation (GMV) to dissociate maxillary premolar roots in both radiography systems.
Abstract:
The aim of this study was to evaluate the following acrylic resins: Clássico®, QC-20® and Lucitone®, indicated for thermal polymerization, and Acron MC® and VIPI-WAVE®, indicated for polymerization by microwave energy. The resins were evaluated for surface nanohardness and modulus of elasticity while varying the polymerization time recommended by the manufacturer, and were also compared as to the amount of water absorbed by the samples. The technique used was nanoindentation, with the Nano Indenter XP® (MTS). In the intra-group analysis using the polymerization time recommended by the manufacturer, the thermally polymerized resins ranged from 0.14 to 0.23 GPa in nanohardness and from 2.61 to 3.73 GPa in modulus of elasticity, while the microwave-polymerized resins ranged from 0.15 to 0.22 GPa in nanohardness and from 2.94 to 3.73 GPa in modulus of elasticity. It was concluded that Clássico® presented the highest nanohardness and modulus of elasticity values within its group, while Acron MC® presented the highest values for the same characteristics within its group. The water absorption evaluation showed that all the thermally polymerized resins except Lucitone® presented significant nanohardness differences when submitted to dehydration or rehydration, while only Acron MC® presented no significant differences when submitted to a double polymerization time. Regarding the modulus of elasticity, all the tested materials except Lucitone® showed a significant increase when dehydrated.
Abstract:
A practical method for the structural assignment of 3,4-O-benzylidene-D-ribono-1,5-lactones and analogues using conventional NMR techniques and NOESY measurements in solution is described. 2-O-Acyl-3,4-O-benzylidene-D-ribono-1,5-lactones were prepared in good yields by acylation of Zinner's lactone with acyl chlorides under mildly basic conditions. The structure of 2-O-(4-nitrobenzoyl)-3,4-O-benzylidene-D-ribono-1,5-lactone was determined by single-crystal X-ray diffraction, which supports the results based on spectroscopic data.
Abstract:
A total of 316 samples of nasopharyngeal aspirate from infants up to two years of age with acute respiratory-tract illness were processed for detection of respiratory syncytial virus (RSV) using three different techniques: viral isolation, direct immunofluorescence (IF), and RT-PCR. Considering the three techniques together, 36 samples (11.4%) were positive for RSV. RT-PCR was the most sensitive technique, providing positive findings in 35/316 (11.1%) of the samples, followed by direct immunofluorescence (25/316, 7.9%) and viral isolation (20/315, 6.3%) (p < 0.001). One sample was positive by immunofluorescence but negative by RT-PCR, and 11 (31.4%) were positive only by RT-PCR. We conclude that RT-PCR is more sensitive than IF and viral isolation for detecting RSV in nasopharyngeal aspirate specimens from newborns and infants.
Abstract:
In this paper we discuss the use of photonic crystal fibers (PCFs) as discrete devices for simultaneous wideband dispersion compensation and Raman amplification. The performance of the PCFs in terms of gain, ripple, optical signal-to-noise ratio (OSNR) and required fiber length for complete dispersion compensation is compared with conventional dispersion compensating fibers (DCFs). The main goal is to determine the minimum PCF loss beyond which its performance surpasses a state-of-the-art DCF and justifies practical use in telecommunication systems. (C) 2009 Optical Society of America
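The required fiber length mentioned above follows from the condition that the compensator cancels the span's accumulated dispersion: D_span·L_span + D_comp·L_comp = 0. A minimal sketch with illustrative dispersion coefficients (not values from the paper):

```python
# Required compensating-fiber length for complete dispersion compensation.
# All numeric values below are illustrative textbook-level figures,
# not parameters taken from this paper.
D_SMF = 17.0    # ps/(nm*km), standard single-mode fiber near 1550 nm
L_SMF = 80.0    # km, transmission span length
D_DCF = -100.0  # ps/(nm*km), a typical conventional DCF
D_PCF = -600.0  # ps/(nm*km), a hypothetical highly dispersive PCF design

def compensating_length(d_span, l_span, d_comp):
    """Length that zeroes accumulated dispersion: D_span*L_span + D_comp*L = 0."""
    return -d_span * l_span / d_comp

print(compensating_length(D_SMF, L_SMF, D_DCF))  # 13.6 km of DCF
print(compensating_length(D_SMF, L_SMF, D_PCF))  # ~2.27 km of PCF
```

A more dispersive compensating fiber shortens the required length, which is why a low-loss PCF with large negative dispersion could reduce the total insertion loss and amplification burden relative to a conventional DCF.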
Abstract:
With the increased incidence of cancer and a similarly increased number of surgeries for the insertion of silicone breast implants, it is necessary to assess the effect of such material within the breast tissue, particularly in mammography, because implants reduce the diagnostic power for breast cancer. In this work, we introduce a breast phantom with silicone implants in order to evaluate the influence of the implant on the visibility of the main mammographic findings: fibers, microcalcifications and tumor masses. In the proposed phantom, the breast tissue was simulated using paraffin gel. In the optical density of phantom mammograms with implants, a 23% reduction in breast tissue visibility was seen compared to a phantom without silicone implants. This loss of visibility was due to X-ray beam scattering in the silicone material, which degraded the areas adjacent to the implant. It is expected that the proposed phantom may be used as a device for the establishment of a technical standard for these types of procedures.
Abstract:
The solvent effects on the low-lying absorption spectrum and on the ¹⁵N chemical shielding of pyrimidine in water are calculated using combined and sequential Monte Carlo simulation and quantum mechanical calculations. Special attention is devoted to the solute polarization. This is included by a previously developed iterative procedure in which the solute is electrostatically equilibrated with the solvent. In addition, we verify the simple yet unexplored alternative of combining the polarizable continuum model (PCM) and the hybrid QM/MM method: we use PCM to obtain the average solute polarization and include this in the MM part of the sequential QM/MM methodology, PCM-MM/QM. These procedures are compared and further used in the discrete and the explicit solvent models. The use of the PCM polarization implemented in the MM part seems to generate a very good description of the average solute polarization, leading to very good results for the n-π* excitation energy and the ¹⁵N nuclear chemical shielding of pyrimidine in aqueous environment. The best results obtained here, using the solute pyrimidine surrounded by 28 explicit water molecules embedded in the electrostatic field of the remaining 472 molecules, give statistically converged values for the low-lying n-π* absorption transition in water of 36 900 ± 100 (PCM polarization) and 36 950 ± 100 cm⁻¹ (iterative polarization), in excellent agreement with each other and with the experimental band maximum at 36 900 cm⁻¹. For the ¹⁵N nuclear shielding, the corresponding gas-to-water chemical shifts, obtained using the solute pyrimidine surrounded by 9 explicit water molecules embedded in the electrostatic field of the remaining 491 molecules, give statistically converged values of 24.4 ± 0.8 and 28.5 ± 0.8 ppm, compared with the inferred experimental value of 19 ± 2 ppm. Considering the simplicity of the PCM over the iterative polarization, this is an important aspect, and the computational savings point to the possibility of dealing with larger solute molecules. This PCM-MM/QM approach reconciles the simplicity of the PCM model with the reliability of the combined QM/MM approaches.
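For reference, the transition energies quoted in the abstract (in wavenumbers) can be converted to other common spectroscopic units with the standard factor hc ≈ 1.2398×10⁻⁴ eV·cm:

```python
# Convert the reported n-pi* transition energies from wavenumbers (cm^-1)
# to eV and nm. The wavenumber values are those quoted in the abstract.
HC_EV_CM = 1.23984193e-4  # eV per cm^-1 (the product h*c in eV*cm)

def wavenumber_to_ev(nu_cm):
    return nu_cm * HC_EV_CM

def wavenumber_to_nm(nu_cm):
    return 1e7 / nu_cm  # 1 cm = 1e7 nm

for label, nu in [("PCM polarization", 36900),
                  ("iterative polarization", 36950),
                  ("experimental band maximum", 36900)]:
    print(f"{label}: {nu} cm^-1 = {wavenumber_to_ev(nu):.3f} eV "
          f"= {wavenumber_to_nm(nu):.1f} nm")
```

The calculated 36 900 cm⁻¹ maximum corresponds to roughly 4.58 eV, or a band near 271 nm, placing the n-π* transition in the deep ultraviolet.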
Abstract:
Tomato is among the most consumed vegetables in the world, not only for its culinary versatility but also for its high nutritional value. In recent years, consumers have shown increased concern regarding food origin and safety. Organic tomato production has been a promising alternative, offering the consumer a safer food with respect to environmental, social and nutritional aspects. This study assessed the chemical composition of tomato seeds produced in both conventional and organic systems by instrumental neutron activation analysis (INAA). The results showed significant differences (P <= 0.05) in the mass fractions of Br, Cs, Eu, Fe, K, Mo, Na, Rb and Sm between the two systems, indicating an influence of the crop management adopted in the different tomato production systems.
Abstract:
This letter shows that the matrix can be used for redundancy and observability analysis of metering systems composed of PMU measurements and conventional measurements (power and voltage magnitude measurements). The matrix is obtained via triangular factorization of the Jacobian matrix. Observability analysis and restoration are carried out during the triangular factorization of the Jacobian matrix, and the redundancy analysis is made by exploring the matrix structure. As a consequence, the matrix can be used for metering system planning considering conventional and PMU measurements. These features will be outlined and illustrated by numerical examples.
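The core idea — detecting unobservable state directions through zero pivots encountered during triangular factorization — can be illustrated on a toy measurement Jacobian. The matrix below and the plain Gaussian elimination are a simplified sketch under assumed numbers, not the letter's actual formulation:

```python
import numpy as np

# Toy DC-power-flow-style measurement Jacobian H
# (rows: measurements, cols: bus angle states). Illustrative values only.
H = np.array([
    [1.0, -1.0,  0.0],
    [0.0,  1.0, -1.0],
    [1.0,  0.0, -1.0],   # linearly dependent on the first two rows
])

def observability_by_factorization(H, tol=1e-9):
    """Gaussian elimination on the gain matrix H'H; a zero pivot at
    position k flags an unobservable state direction."""
    G = (H.T @ H).astype(float)
    n = G.shape[0]
    zero_pivots = []
    for k in range(n):
        if abs(G[k, k]) < tol:
            zero_pivots.append(k)
            continue
        for i in range(k + 1, n):
            G[i, k:] -= G[i, k] / G[k, k] * G[k, k:]
    return zero_pivots

pivots = observability_by_factorization(H)
print("zero-pivot (unobservable) state indices:", pivots)
```

With only angle-difference measurements, one zero pivot always appears: the reference angle is undetermined, so the rank deficiency of the gain matrix is exactly one here.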
Abstract:
A round robin program was conducted to assess the ability of three different X-radiographic systems to image internal fatigue cracks in riveted lap joints of composite glass-fiber-reinforced metal laminate. From an engineering perspective, conventional film radiography and direct radiography produced the best results, identifying and characterizing in detail the internal damage on the metallic faying surfaces of fastened glass-fiber-reinforced metal laminate joints. In contrast, computed radiographic images presented large projected geometric distortions and feature shifts due to the angular incidence of the radiation beam, disclosing only partial internal cracking patterns.
Abstract:
In this paper, a comparative analysis of the long-term electric power forecasting methodologies used in some South American countries is presented. The purpose of this study is to determine whether such methodologies have similarities, and also to examine the behavior of the results when they are applied to the Brazilian electric market. The power forecasts were performed for the four main consumption classes (residential, industrial, commercial and rural), which are responsible for approximately 90% of the national consumption. The tool used in this analysis was SAS software. The outcome of this study allowed the identification of various methodological similarities, mainly related to the econometric variables used by these methods, a fact that strongly conditioned the comparative results obtained.
Abstract:
Ti-6Al-4V thin films were grown by magnetron sputtering on a conventional austenitic stainless steel. Five deposition conditions, varying both the deposition chamber pressure and the plasma power, were studied. Highly textured thin films were obtained, their crystallite size … (C) 2008 Elsevier Ltd. All rights reserved.
Abstract:
Modern Integrated Circuit (IC) design is characterized by a strong trend of Intellectual Property (IP) core integration into complex system-on-chip (SOC) architectures. These cores require thorough verification of their functionality to avoid erroneous behavior in the final device. Formal verification methods are capable of detecting any design bug but, due to state explosion, their use remains limited to small circuits. Alternatively, simulation-based verification can explore hardware descriptions of any size, although the corresponding stimulus generation, as well as the functional coverage definition, must be carefully planned to guarantee its efficacy. In general, static input space optimization methodologies have shown better efficiency and results than, for instance, Coverage Directed Verification (CDV) techniques, although the two act on different facets of the monitored system and are not mutually exclusive. This work presents a constrained-random simulation-based functional verification methodology in which, on the basis of the Parameter Domains (PD) formalism, irrelevant and invalid test case scenarios are removed from the input space. To this purpose, a tool to automatically generate PD-based stimuli sources was developed, together with a second tool that generates functional coverage models fitting exactly the PD-based input space. Both the input stimuli and coverage model enhancements resulted in a notable testbench efficiency increase compared to testbenches with traditional stimulation and coverage scenarios: a 22% simulation time reduction when generating stimuli with the PD-based stimuli sources (still with a conventional coverage model), and a 56% reduction when combining the stimuli sources with their corresponding, automatically generated, coverage models.
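A minimal sketch of the idea behind PD-based constrained-random stimulation: stimuli are drawn from explicit parameter domains, invalid scenarios are pruned before simulation, and the coverage model is derived from the same domains so it fits the restricted input space exactly. All names, domains, and constraints below are hypothetical, not the paper's PD models:

```python
import random

# Hypothetical parameter domains for a bus-transaction stimulus.
parameter_domains = {
    "opcode": ["ADD", "SUB", "LOAD", "STORE"],
    "addr":   range(0, 256),
    "burst":  [1, 2, 4, 8],
}

def invalid(stim):
    """Constraint pruning invalid scenarios from the input space
    (e.g., a STORE to a reserved low-address page)."""
    return stim["opcode"] == "STORE" and stim["addr"] < 16

def gen_stimulus(rng):
    """Constrained-random generation: sample the domains, reject invalid."""
    while True:
        stim = {k: rng.choice(list(v)) for k, v in parameter_domains.items()}
        if not invalid(stim):
            return stim

# Coverage model derived from the same domains:
# one bin per (opcode, burst) cross, matching the valid input space.
coverage = {(op, b): 0 for op in parameter_domains["opcode"]
                       for b in parameter_domains["burst"]}

rng = random.Random(42)
for _ in range(500):
    s = gen_stimulus(rng)
    coverage[(s["opcode"], s["burst"])] += 1

hit = sum(1 for c in coverage.values() if c > 0)
print(f"coverage: {hit}/{len(coverage)} bins hit")
```

Because invalid scenarios never reach the simulator and the coverage bins span only the constrained space, no simulation time is spent on unreachable or irrelevant cases, which is the mechanism behind the reported time reductions.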