36 results for Inherent Audiences


Relevance: 10.00%

Abstract:

Laminar and pulsed flows, typical of multi-commuted and multi-pumping flow systems respectively, were evaluated in relation to analytical procedures carried out at high temperatures. As an application, the spectrophotometric determination of total reducing sugars (TRS, hydrolyzed sucrose plus reducing sugars) in sugar-cane juice and molasses was selected. The method involves in-line hydrolysis of sucrose and alkaline degradation of the reducing sugars at about 98 °C. Better results were obtained with pulsed flows, owing to the efficient radial mass transport inherent to the multi-pumping flow system. The proposed system presents favorable characteristics of ruggedness, analytical precision (r.s.d. < 0.013 for typical samples), stability (no measurable baseline drift during 4-h working periods), linearity of the analytical curve (r > 0.992, n = 5, 0.05-0.50% w/v TRS) and sampling rate (65 h⁻¹). Results are in agreement with ion chromatography.
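The linearity figure quoted above (r > 0.992 over 0.05-0.50% w/v TRS) comes from an ordinary least-squares calibration curve. A minimal sketch of that step, with invented absorbance readings rather than data from the paper:

```python
import numpy as np

# Hypothetical calibration data: TRS standards (% w/v) and absorbance readings.
conc = np.array([0.05, 0.15, 0.25, 0.35, 0.50])
absorbance = np.array([0.071, 0.198, 0.330, 0.455, 0.648])

# Ordinary least-squares fit: A = slope * c + intercept
slope, intercept = np.polyfit(conc, absorbance, 1)

# Pearson correlation coefficient r, the linearity figure quoted in such work
r = np.corrcoef(conc, absorbance)[0, 1]

def trs_from_absorbance(a):
    """Invert the calibration to estimate TRS (% w/v) in a sample."""
    return (a - intercept) / slope

print(f"r = {r:.4f}, estimated TRS for A = 0.30: {trs_from_absorbance(0.30):.3f}% w/v")
```
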

Relevance: 10.00%

Abstract:

Resuspended soil and other airborne particles adhering to the leaf surface affect the chemical composition of the plant. A well-defined cleaning procedure is necessary to avoid this problem, providing a correct assessment of the inherent chemical composition of bromeliads. To evaluate the influence of a washing procedure, INAA (instrumental neutron activation analysis) was applied to determine chemical elements in leaves of bromeliads of the species Vriesea carinata, both non-washed and washed with Alconox, EDTA and bi-distilled water. Br, Ce, Hg, La, Sc, Se, Sm and Th showed higher mass fractions in non-washed leaves. The washing procedure removed the exogenous material without leaching chemical elements from inside the tissues.

Relevance: 10.00%

Abstract:

Multi-pumping flow systems exploit pulsed flows delivered by solenoid pumps. Their improved performance relies on the enhanced radial mass transport inherent to the pulsed flow, which is a consequence of the establishment of vortices and thus a tendency towards turbulent mixing. This paper presents several pieces of evidence of turbulent mixing in relation to pulsed flows, such as the recorded peak shape, the establishment of fluidized beds, the exploitation of flow reversal, and the implementation of relatively slow chemical reactions and/or heating of the reaction medium. In addition, the Reynolds number associated with the GO period of a pulsed flow is estimated, and photographic images of dispersing samples flowing under laminar and pulsed flow conditions are presented. (C) 2009 Elsevier B.V. All rights reserved.
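As a rough illustration of the kind of Reynolds-number estimate mentioned above, consider the mean linear velocity during a single GO (dispensing) stroke. All numeric values below are typical-order assumptions for solenoid-pump flow systems, not figures from the paper:

```python
import math

# Illustrative, assumed parameters (not from the paper)
pulse_volume = 25e-9        # m^3 (25 uL dispensed per stroke)
go_time = 0.1               # s, duration of the GO period
tube_diameter = 0.8e-3      # m, common flow-analysis tubing bore
density = 1000.0            # kg/m^3, water
viscosity = 1.0e-3          # Pa*s, water near room temperature

area = math.pi * tube_diameter ** 2 / 4
velocity = pulse_volume / go_time / area              # mean velocity during GO
reynolds = density * velocity * tube_diameter / viscosity
print(f"v = {velocity:.3f} m/s, Re = {reynolds:.0f}")
```

Even though the instantaneous Re here lands in the hundreds, below the classical laminar-turbulent transition, each stroke starts and stops abruptly, and the abstract attributes the improved radial mixing to the vortices generated by that pulsed motion.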

Relevance: 10.00%

Abstract:

Aim: To describe the perceptions and attitudes of registered nurses (RNs) towards adverse events (AEs) in nursing care. Background: The professionals' subjective perspectives should be taken into account for the prevention of AEs in care settings. Method: Schutz's social phenomenology was adopted. Interviews were conducted with nine intensive care unit RNs. Results: The following five descriptive categories emerged: (1) the occurrence of AEs is inherent to the human condition but provokes a feeling of insecurity; (2) the occurrence of AEs indicates the existence of failures in health care systematization; (3) the professionals' attitudes towards AEs should be permeated by ethical principles; (4) the priority regarding AEs should be the mitigation of harm to patients; and (5) decisions regarding the communication of AEs were determined by the severity of the error. Conclusions: The various subjective perspectives related to the occurrence of AEs require a health care systematization with a focus on prevention. Ethical behaviour is essential for patients' safety. Implications for nursing management: Activities aimed at the prevention of AEs should be integrated jointly with both the professionals and the health care institution. A culture of safety, not punishment, and improvement in the quality of care provided to patients should be priorities.

Relevance: 10.00%

Abstract:

Medium carbon steels are mostly used for simple applications; however, new applications have been developed for which good sheet metal formability is required. These types of steels have an inherently low formability. A medium-carbon hot-rolled SAE 1050 steel was selected for this study. It was cold rolled with thickness reductions varying between 7 and 80%. The samples obtained were used to evaluate the strain hardening curve. For samples with a 50 and 80% thickness reduction, an annealing heat treatment was performed to achieve recrystallization. The material was characterized in the "as-received", cold rolled and annealed conditions using several methods: optical metallography, X-ray diffraction (texture), Vickers hardness, and tensile testing. For large thickness reductions, the SAE 1050 steel presented low elongation, less than 2%, and yield strength (YS) and tensile strength (TS) around 1400 MPa. Texture in the "as-received" condition showed strong components on the {001} plane, in the <100>, <210> and <110> directions. After cold rolling, the texture did not present any significant changes for small thickness reductions; however, it changed completely for large ones, where the gamma (<111>//ND), alpha (<110>//RD), and gamma prime (<223>//ND) fibres were strengthened. After annealing, the microstructure of the SAE 1050 steel was characterized by recrystallized ferrite and globular cementite. There was little change in the alpha fibre for the 50% reduction, whereas for the 80% reduction, its intensity increased. Both gamma and gamma prime fibres vanished upon annealing for the 50 and 80% reductions alike. (c) 2008 Elsevier B.V. All rights reserved.

Relevance: 10.00%

Abstract:

Medium carbon steels are mostly used for simple applications; nevertheless, new applications have been developed for which good sheet formability is required. This class of steels has an inherently low formability. A medium-carbon hot-rolled SAE 1050 steel was selected for this study. It was cold rolled with reductions in the 7-80% range. The samples were used to assess the cold work hardening curve. For samples with a 50 and 80% thickness reduction, an annealing heat treatment was performed to obtain recrystallization. The material was characterized in the "as-received", cold rolled and annealed conditions using several methods: optical microscopy, X-ray diffraction (texture), Vickers hardness and tensile testing. The 50% cold rolled and recrystallized material was further studied in terms of sheet metal formability and texture evolution during the actual stamping of a steel toecap, which was used to validate the finite element simulations. (C) 2008 Elsevier B.V. All rights reserved.

Relevance: 10.00%

Abstract:

We present a novel array RLS algorithm with forgetting factor that circumvents the problem of fading regularization, inherent to the standard exponentially weighted RLS, by allowing for time-varying regularization matrices with generic structure. Simulations in finite precision show the algorithm's superiority over alternative algorithms in the context of adaptive beamforming.
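For context, a sketch of the standard exponentially weighted RLS recursion that the abstract contrasts against. The initial regularization term δI is discounted by λ at every step, so its influence decays as λⁿ, which is the "fading regularization" the proposed array algorithm avoids. The toy system-identification setup and all parameter values are illustrative:

```python
import numpy as np

def ewrls(X, d, lam=0.99, delta=1e2):
    """Standard exponentially weighted RLS.

    The initial regularization delta * I enters the underlying cost
    discounted by lam at every step, so it fades as lam**n -- the
    "fading regularization" referred to in the abstract above.
    """
    n_taps = X.shape[1]
    w = np.zeros(n_taps)
    P = np.eye(n_taps) / delta           # inverse of regularized correlation
    for x, dn in zip(X, d):
        pi = P @ x
        k = pi / (lam + x @ pi)          # gain vector
        e = dn - w @ x                   # a priori error
        w = w + k * e
        P = (P - np.outer(k, pi)) / lam  # rank-1 Riccati update
    return w

# Toy system identification: recover a 3-tap FIR filter from noisy data.
rng = np.random.default_rng(0)
w_true = np.array([0.5, -0.3, 0.2])
X = rng.standard_normal((500, 3))
d = X @ w_true + 0.01 * rng.standard_normal(500)
w_hat = ewrls(X, d)
print(np.round(w_hat, 2))
```
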

Relevance: 10.00%

Abstract:

Nowadays, the rising competition for water and environmental resources, with consequent restrictions for farmers, should change the paradigm of irrigation: the goal becomes attaining economic efficiency rather than merely supplying the water requirement of the crop. Therefore, taking into account the social and economic role of bean production in Brazil, as well as the risk inherent to the crop due to its high sensitivity to both water deficit and excess, optimization methods for irrigation management have become more interesting and essential. This study presents a way to determine the optimal water supply, considering different combinations of desired bean yield and level of risk, resulting in a graph relating the former to the latter as a function of different water depths.

Relevance: 10.00%

Abstract:

The weed known commonly as vassourinha de botão (buttonweed) is present in several crops in northern and north-eastern Brazil. Its occurrence is common in sugarcane and soybean crops in the states of Goiás, Tocantins, and Maranhão. However, there is no published information in the literature about its taxonomic classification. Thus, this research aimed to classify this species taxonomically, to develop a classification key based on the morphological characteristics among varieties of Borreria densiflora DC., and to illustrate it and provide a palynological basis for classifying it as a new variety. For the classification process, data from the literature, morphological characteristics, and palynological evidence were considered. In this article, we describe a new variety, B. densiflora DC. var. latifolia E.L. Cabral & Martins. The new variety has a terrestrial habitat and is a simple perennial weed species. These results show the importance of an accurate identification, as well as an understanding of the evolutionary changes inherent to weeds (such as intraspecific variability), the breeding system, genetic potential, and ecological studies. Those factors are essential to the beginning of a long-term weed management strategy.

Relevance: 10.00%

Abstract:

Food foams such as marshmallow, Chantilly and mousses have behavior and stability directly connected with their microstructure, bubble size distribution and interfacial properties. The high interfacial tension inherent to the air/liquid interfaces of foams affects their stability, and thus has a direct impact on processing, storage and product handling. In this work, the interactions of egg albumin with various types of polysaccharides were investigated by drop tensiometry, interfacial rheology and foam stability measurements. The progressive addition of egg albumin and polysaccharide in water induced a drop of the air-water surface tension which was dependent on the pH and the polysaccharide type. At pH 4, i.e. below the isoelectric point of egg albumin (pI = 4.5), the surface tension was decreased from 70 mN/m to 42 mN/m by the presence of the protein, and from 70 mN/m to 43 mN/m, 40 mN/m and 38 mN/m by subsequent addition of xanthan, guar gum and kappa-carrageenan, respectively. At pH 7.5, the surface tension was decreased from 70 mN/m to 43 mN/m by the simultaneous presence of the protein and kappa-carrageenan; however, higher surface tensions of 48 and 50 mN/m were found when xanthan and guar gum, respectively, were added instead of carrageenan. The elasticity of the interface was identified as playing the main role in the stabilization of protein-polysaccharide stabilized interfaces. Foam stability experiments confirmed that egg-albumin/kappa-carrageenan systems at pH below the protein isoelectric point are the most efficient at stabilizing air/water interfaces. These results clearly indicate that protein-polysaccharide coacervation at the air/water interface is an efficient process to increase foam stability. (C) 2009 Elsevier Ltd. All rights reserved.

Relevance: 10.00%

Abstract:

Recent advances in the control of molecular engineering architectures have allowed an unprecedented ability of molecular recognition in biosensing, with a promising impact on clinical diagnosis and environmental control. The availability of large amounts of data from electrical, optical, or electrochemical measurements requires, however, sophisticated data treatment in order to optimize sensing performance. In this study, we show how an information visualization system based on projections, referred to as Projection Explorer (PEx), can be used to achieve high performance for biosensors made with nanostructured films containing immobilized antigens. As a proof of concept, various visualizations were obtained with impedance spectroscopy data from an array of sensors whose electrical response could be specific toward a given antibody (analyte) owing to molecular recognition processes. In addition to discussing the distinct methods for projection and normalization of the data, we demonstrate that an excellent distinction can be made between real samples tested positive for Chagas disease and leishmaniasis, which could not be achieved with conventional statistical methods. Such high performance probably arose from the possibility of treating the data over the whole frequency range. Through a systematic analysis, it was inferred that Sammon's mapping with standardization to normalize the data gives the best results, with which blood serum samples containing 10⁻⁷ mg/mL of the antibody could be distinguished. The method inherent in PEx and the procedures for analyzing the impedance data are entirely generic and can be extended to optimize any type of sensor or biosensor.
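The projection step can be illustrated with a toy version of the pipeline: standardize the spectra (the normalization the study found most effective) and project to 2-D. PCA is used below as a simple stand-in for the projection technique; the study itself obtained its best separation with Sammon's mapping. All data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical impedance spectra: two groups of 20 sensors x 50 frequencies.
# The "positive" group gets a small systematic shift, mimicking the response
# change caused by antibody binding.
base = np.sin(np.linspace(0, 4 * np.pi, 50))
neg = base + 0.05 * rng.standard_normal((20, 50))
pos = base + 0.15 + 0.05 * rng.standard_normal((20, 50))
X = np.vstack([neg, pos])

# Standardization: zero mean, unit variance per frequency channel.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)

# 2-D projection via PCA (SVD of the standardized data matrix).
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
Y = Xs @ Vt[:2].T

# The two groups should separate along the first projected axis.
gap = abs(Y[:20, 0].mean() - Y[20:, 0].mean())
spread = max(Y[:20, 0].std(), Y[20:, 0].std())
print(f"between-group gap = {gap:.2f}, within-group spread = {spread:.2f}")
```
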

Relevance: 10.00%

Abstract:

Radiation dose calculations in nuclear medicine depend on quantification of activity via planar and/or tomographic imaging methods. However, both methods have inherent limitations, and the accuracy of activity estimates varies with object size, background levels, and other variables. The goal of this study was to evaluate the limitations of quantitative imaging with planar and single photon emission computed tomography (SPECT) approaches, with a focus on activity quantification for use in calculating absorbed dose estimates for normal organs and tumors. To do this we studied a series of phantoms of varying complexity of geometry, with three radionuclides whose decay schemes varied from simple to complex. Four aqueous concentrations of 99mTc, 131I, and 111In (74, 185, 370, and 740 kBq/mL) were placed in spheres of four different sizes in a water-filled phantom, with three different levels of activity in the surrounding water. Planar and SPECT images of the phantoms were obtained on a modern SPECT/computed tomography (CT) system. These radionuclide and concentration/background studies were repeated using a cardiac phantom and a modified torso phantom with liver and "tumor" regions containing the radionuclide concentrations and with the same varying background levels. Planar quantification was performed using the geometric mean approach, with attenuation correction (AC), and with and without scatter corrections (SC and NSC). SPECT images were reconstructed using attenuation maps (AM) for AC; scatter windows were used to perform SC during image reconstruction. For spherical sources with corrected data, good accuracy was observed (generally within +/- 10% of known values) for the largest sphere (11.5 mL) and for both planar and SPECT methods with 99mTc and 131I, but results were poorest and deviated from known values for smaller objects, most notably for 111In. SPECT quantification was affected by the partial volume effect in smaller objects and generally showed larger errors than the planar results in these cases for all radionuclides. For the cardiac phantom, results were the most accurate of all of the experiments for all radionuclides. Background subtraction was an important factor influencing these results. The contribution of scattered photons was important in quantification with 131I; if scatter was not accounted for, activity tended to be overestimated using planar quantification methods. For the torso phantom experiments, results show a clear underestimation of activity when compared to the previous experiment with spherical sources for all radionuclides. Despite some variations that were observed as the level of background increased, the SPECT results were more consistent across different activity concentrations. Planar or SPECT quantification on state-of-the-art gamma cameras with appropriate quantitative processing can provide accuracies of better than 10% for large objects and modest target-to-background concentrations; however, when smaller objects are used, in the presence of higher background, and for nuclides with more complex decay schemes, SPECT quantification methods generally produce better results. Health Phys. 99(5):688-701; 2010
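The planar (conjugate-view) quantification referred to above is conventionally based on the geometric mean of anterior and posterior counts with an attenuation correction. A minimal sketch of that standard textbook form, with illustrative numbers rather than values from the study:

```python
import math

def geometric_mean_activity(counts_ant, counts_post, mu, thickness, cal_cps_per_mbq):
    """Conjugate-view (geometric mean) activity estimate:

        A = sqrt(I_a * I_p) * exp(mu * T / 2) / C

    where mu is the effective linear attenuation coefficient, T the body
    thickness along the projection, and C the camera calibration factor.
    All parameter values used below are illustrative assumptions.
    """
    gm = math.sqrt(counts_ant * counts_post)          # geometric mean count rate
    attenuation_correction = math.exp(mu * thickness / 2)
    return gm * attenuation_correction / cal_cps_per_mbq

# Illustrative numbers: Tc-99m-like attenuation ~0.11 cm^-1 in soft tissue,
# 20 cm body thickness, camera calibration 90 cps/MBq.
activity = geometric_mean_activity(1800.0, 700.0, 0.11, 20.0, 90.0)
print(f"estimated activity = {activity:.1f} MBq")
```
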

Relevance: 10.00%

Abstract:

The identification, modeling, and analysis of interactions between nodes of neural systems in the human brain have become a focus of many studies in neuroscience. The complex neural network structure and its correlations with brain functions play a role in all areas of neuroscience, including the comprehension of cognitive and emotional processing. Indeed, understanding how information is stored, retrieved, processed, and transmitted is one of the ultimate challenges in brain research. In this context, in functional neuroimaging, connectivity analysis is a major tool for the exploration and characterization of the information flow between specialized brain regions. In most functional magnetic resonance imaging (fMRI) studies, connectivity analysis is carried out by first selecting regions of interest (ROIs) and then calculating an average BOLD time series (across the voxels in each cluster). Some studies have shown that the average may not be a good choice and have suggested, as an alternative, the use of principal component analysis (PCA) to extract the principal eigen-time series from the ROIs. In this paper, we introduce a novel approach called cluster Granger analysis (CGA) to study connectivity between ROIs. The main aim of this method is to employ multiple eigen-time series in each ROI to avoid the temporal information loss incurred during identification of Granger causality. Such information loss is inherent in averaging (e.g., to yield a single "representative" time series per ROI) and may, in turn, lead to a lack of power in detecting connections. The proposed approach is based on multivariate statistical analysis and integrates PCA and partial canonical correlation in a framework of Granger causality for clusters (sets) of time series. We also describe an algorithm for statistical significance testing based on bootstrapping. Using Monte Carlo simulations, we show that the proposed approach outperforms conventional Granger causality analysis (i.e., using representative time series extracted by signal averaging or first-principal-component estimation from ROIs). The usefulness of the CGA approach with real fMRI data is illustrated in an experiment using human faces expressing emotions. With this data set, the proposed approach suggested the presence of significantly more connections between the ROIs than were detected using a single representative time series in each ROI. (c) 2010 Elsevier Inc. All rights reserved.
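The single-series analysis that CGA generalizes can be sketched as follows: y Granger-causes x if adding y's past to an autoregression of x improves the prediction. This toy version reports a plain residual-variance ratio rather than the formal F-test or the bootstrap procedure described above:

```python
import numpy as np

def granger_xy(x, y, p=2):
    """Does y Granger-cause x? Compare an AR(p) model of x on its own past
    with one that also includes y's past; return the ratio of residual sums
    of squares (values well above 1 mean y's past helps predict x).
    A simplified sketch of the single-series analysis that CGA extends to
    multiple eigen-time series per ROI."""
    n = len(x)
    rows_r, rows_u, target = [], [], []
    for t in range(p, n):
        own = x[t - p:t][::-1]            # x[t-1], ..., x[t-p]
        other = y[t - p:t][::-1]          # y[t-1], ..., y[t-p]
        rows_r.append(own)
        rows_u.append(np.concatenate([own, other]))
        target.append(x[t])
    Xr, Xu, z = np.array(rows_r), np.array(rows_u), np.array(target)
    rss_r = np.sum((z - Xr @ np.linalg.lstsq(Xr, z, rcond=None)[0]) ** 2)
    rss_u = np.sum((z - Xu @ np.linalg.lstsq(Xu, z, rcond=None)[0]) ** 2)
    return rss_r / rss_u

# Toy example: y drives x with a one-sample delay, but not vice versa.
rng = np.random.default_rng(1)
y = rng.standard_normal(600)
x = 0.8 * np.roll(y, 1) + 0.2 * rng.standard_normal(600)
r_yx = granger_xy(x, y)   # should be >> 1: y's past predicts x
r_xy = granger_xy(y, x)   # should be ~ 1: x's past does not predict y
print(round(r_yx, 2), round(r_xy, 2))
```
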

Relevance: 10.00%

Abstract:

Recent studies have demonstrated that spatial patterns of fMRI BOLD activity distribution over the brain may be used to classify different groups or mental states. These studies are based on the application of advanced pattern recognition approaches and multivariate statistical classifiers. Most published articles in this field focus on improving accuracy rates, and many approaches have been proposed to accomplish this task. Nevertheless, a point inherent to most machine learning methods (and still relatively unexplored in neuroimaging) is how the discriminative information can be used to characterize groups and their differences. In this work, we introduce Maximum Uncertainty Linear Discriminant Analysis (MLDA) and show how it can be applied to infer groups' patterns by discriminant hyperplane navigation. In addition, we show that it naturally defines a behavioral score, i.e., an index quantifying the distance between the states of a subject and predefined groups. We validate and illustrate this approach using data from a motor block-design fMRI experiment with 35 subjects. (C) 2008 Elsevier Inc. All rights reserved.
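A plain two-class Fisher LDA conveys the geometric idea: the discriminant direction w defines the separating hyperplane, and the signed projection of a subject's data onto w gives a scalar index analogous to the behavioral score described above. MLDA differs in how it regularizes the within-class scatter (replacing its below-average eigenvalues with the average eigenvalue); the ridge term below is a simpler stand-in, and all data are synthetic:

```python
import numpy as np

def fisher_lda(X1, X2, reg=1e-3):
    """Two-class Fisher LDA: w = Sw^-1 (m1 - m2).

    A plain-LDA sketch; the ridge term `reg` stands in for MLDA's
    eigenvalue-based regularization of the within-class scatter Sw.
    """
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    Sw = np.cov(X1, rowvar=False) + np.cov(X2, rowvar=False)
    Sw += reg * np.eye(Sw.shape[0])        # ridge for invertibility
    w = np.linalg.solve(Sw, m1 - m2)
    b = -w @ (m1 + m2) / 2                 # hyperplane through the midpoint
    return w, b

def score(X, w, b):
    """Signed index along the discriminant: positive -> closer to group 1."""
    return X @ w + b

# Synthetic data: two groups separated along the first feature.
rng = np.random.default_rng(3)
X1 = rng.standard_normal((40, 5)) + np.array([1.0, 0, 0, 0, 0])
X2 = rng.standard_normal((40, 5)) - np.array([1.0, 0, 0, 0, 0])
w, b = fisher_lda(X1, X2)
acc = (np.mean(score(X1, w, b) > 0) + np.mean(score(X2, w, b) < 0)) / 2
print(f"training accuracy = {acc:.2f}")
```
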

Relevance: 10.00%

Abstract:

Although many mathematical models exist predicting the dynamics of transposable elements (TEs), there is a lack of available empirical data to validate these models and their inherent assumptions. Genomes can provide a snapshot of several TE families in a single organism, and these families could have their demographics inferred by coalescent analysis, allowing theories of TE amplification dynamics to be tested. Using the available genomes of the mosquitoes Aedes aegypti and Anopheles gambiae, we show that such an approach is feasible. Our analysis follows four steps: (1) mining the two mosquito genomes currently available in search of TE families; (2) fitting, to selected families found in (1), a phylogenetic tree under the general time-reversible (GTR) nucleotide substitution model with an uncorrelated lognormal (UCLN) relaxed clock and a nonparametric demographic model; (3) fitting a nonparametric coalescent model to the tree generated in (2); and (4) fitting parametric models motivated by ecological theories to the curve generated in (3).
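Step (4) can be sketched with a toy curve: fit a parametric growth model, here logistic, to an inferred copy-number trajectory by least squares. The grid-search fit and the synthetic data below are purely illustrative, not derived from the mosquito genomes:

```python
import numpy as np

def logistic(t, K, r, n0):
    """Logistic growth: one candidate ecological model for TE amplification."""
    return K / (1 + (K / n0 - 1) * np.exp(-r * t))

def fit_logistic(t, n, Ks, rs, n0):
    """Coarse grid-search least-squares fit -- an illustrative stand-in for
    a proper optimizer in step (4) of the pipeline above."""
    best_params, best_sse = None, np.inf
    for K in Ks:
        for r in rs:
            sse = np.sum((n - logistic(t, K, r, n0)) ** 2)
            if sse < best_sse:
                best_params, best_sse = (K, r), sse
    return best_params, best_sse

# Synthetic "TE copy number through time" trajectory with noise.
rng = np.random.default_rng(7)
t = np.linspace(0, 10, 50)
n = logistic(t, K=200.0, r=1.0, n0=5.0) + rng.normal(0, 3, t.size)

(K_hat, r_hat), sse = fit_logistic(
    t, n, Ks=np.arange(150, 251, 5), rs=np.arange(0.5, 1.55, 0.05), n0=5.0
)
print(K_hat, round(r_hat, 2))
```
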