936 results for Target Field Method
Investigation and optimization of parameters affecting the multiply charged ion yield in AP-MALDI MS
Abstract:
Liquid matrix-assisted laser desorption/ionization (MALDI) allows the generation of predominantly multiply charged ions in atmospheric pressure (AP) MALDI ion sources for mass spectrometry (MS) analysis. The charge state distribution of the generated ions and the efficiency of the ion source in generating such ions crucially depend on the desolvation regime of the MALDI plume after desorption in the AP-to-vacuum inlet. Both high temperature and a flow regime with increased residence time of the desorbed plume in the desolvation region promote the generation of multiply charged ions. Without such measures the application of an electric ion extraction field significantly increases the ion signal intensity of singly charged species, while the detection of multiply charged species is less dependent on the extraction field. In general, optimization of high-temperature application facilitates the predominant formation and detection of multiply charged rather than singly charged ion species. In this study an experimental setup and optimization strategy is described for liquid AP-MALDI MS which improves the ionization efficiency of selected ion species up to 14-fold. In combination with ion mobility separation, the method allows the detection of multiply charged peptide and protein ions at analyte solution concentrations as low as 2 fmol/µL (0.5 µL, i.e. 1 fmol, deposited on the target) with very low sample consumption in the low-nL range.
Abstract:
The objective of this study was to test a device developed to improve the functionality, accuracy and precision of the original technique for sweating rate measurements proposed by Schleger and Turner [Schleger AV, Turner HG (1965) Aust J Agric Res 16:92-106]. A device was built for this purpose and tested against the original Schleger and Turner technique. Testing was performed by measuring sweating rates in an experiment involving six Mertolenga heifers subjected to four different thermal levels in a climatic chamber. The device exhibited no functional problems and the results obtained with its use were more consistent than with the Schleger and Turner technique. There was no difference in the reproducibility of the two techniques (same accuracy), but measurements performed with the new device had lower repeatability values, corresponding to lower variability and, consequently, to higher precision. When utilizing this device, there is no need for physical contact between the operator and the animal to keep the filter paper discs in position. This has important advantages: the animals stay quieter, and several animals can be evaluated simultaneously. This is a major advantage because it allows more measurements to be taken in a given period of time, increasing the precision of the observations and reducing the error associated with the time elapsed between measurements (e.g., the changing solar angle during field studies). The new device has higher functional versatility when taking measurements in large-scale studies (many animals) under field conditions. The results obtained in this study suggest that the technique using the device presented here could represent an advantageous alternative to the original technique described by Schleger and Turner.
Abstract:
Sea surface gradients derived from the Geosat and ERS-1 satellite altimetry geodetic missions were integrated with marine gravity data from the National Geophysical Data Center and Brazilian national surveys. Using the least squares collocation method, models of free-air gravity anomaly and geoid height were calculated for the coast of Brazil with a resolution of 2′ × 2′. The integration of satellite and shipborne data showed better statistical results in regions near the coast than using satellite data only, suggesting an improvement over the state-of-the-art global gravity models. Furthermore, these results were obtained with considerably less input information than was used by those reference models. The least squares collocation presented a very low content of high-frequency noise in the predicted gravity anomalies. This may be considered essential to improve the high-resolution representation of the gravity field in regions of ocean-continent transition. (C) 2010 Elsevier Ltd. All rights reserved.
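As an aside on the prediction step: least squares collocation estimates the signal at new points from noisy observations through covariance matrices. A minimal sketch follows, assuming an illustrative Gaussian covariance model and synthetic observations (none of these values come from the study):

```python
# Minimal least-squares collocation (LSC) sketch: predict free-air gravity
# anomalies at new points from scattered observations. The Gaussian
# covariance model and its parameters are illustrative assumptions.
import numpy as np

def gauss_cov(d, c0=100.0, corr_len=20.0):
    """Isotropic Gaussian covariance (mGal^2) as a function of distance (km)."""
    return c0 * np.exp(-(d / corr_len) ** 2)

rng = np.random.default_rng(0)
obs_xy = rng.uniform(0, 100, size=(50, 2))    # observation points (km)
obs_val = rng.normal(0, 10, size=50)          # observed anomalies (mGal)
noise_var = 1.0                               # assumed observation noise variance

grid_x, grid_y = np.meshgrid(np.linspace(0, 100, 21), np.linspace(0, 100, 21))
pred_xy = np.column_stack([grid_x.ravel(), grid_y.ravel()])

# Covariances among observations (C_ll + D) and between predictions and observations (C_sl)
d_oo = np.linalg.norm(obs_xy[:, None, :] - obs_xy[None, :, :], axis=-1)
d_po = np.linalg.norm(pred_xy[:, None, :] - obs_xy[None, :, :], axis=-1)
C_ll = gauss_cov(d_oo) + noise_var * np.eye(len(obs_xy))
C_sl = gauss_cov(d_po)

# LSC prediction: s_hat = C_sl (C_ll + D)^-1 l
pred = C_sl @ np.linalg.solve(C_ll, obs_val)
print(pred.reshape(grid_x.shape)[:3, :3])
```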
Abstract:
Definition of the long-term variation of the geomagnetic virtual dipole moment requires more reliable paleointensity results. Here, we applied a multisample protocol to the study of the 130.5 Ma Ponta Grossa basaltic dikes (southern Brazil), which carry a very stable dual-polarity magnetic component. The magnetic stability of the samples was checked using thermomagnetic curves and by monitoring the magnetic susceptibility evolution through the paleointensity experiments. Twelve sites containing the least alterable samples were chosen for the paleointensity measurements. Although these rocks failed stepwise double-heating experiments, they yielded coherent results in the multisample method for all sites but one. The coherent sites show low to moderate field intensities between 5.7 ± 0.2 and 26.4 ± 0.7 µT (average 13.4 ± 1.9 µT). Virtual dipole moments for these sites range from 1.3 ± 0.04 to 6.0 ± 0.2 × 10^22 A m^2 (average 2.9 ± 0.5 × 10^22 A m^2). Our results agree with the tendency for low dipole moments during the Early Cretaceous, immediately prior to the Cretaceous Normal Superchron (CNS). The available paleointensity database shows a strong variability of the field between 80 and 160 Ma. There seems to be no firm evidence for a Mesozoic Dipole Low, but a long-term tendency does emerge from the data, with the highest dipole moments occurring at the middle of the CNS.
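For context, converting a site-mean paleointensity into a virtual dipole moment uses the standard geocentric dipole relations. The sketch below assumes an example inclination of 50 degrees, which is not one of the paper's site values:

```python
# Virtual dipole moment (VDM) from a site-mean paleointensity and inclination,
# using the standard geocentric dipole relations. The numbers are illustrative.
import numpy as np

MU0 = 4e-7 * np.pi      # vacuum permeability (T m / A)
R_EARTH = 6.371e6       # Earth radius (m)

def vdm(paleointensity_T, inclination_deg):
    """VDM in A m^2 from field intensity (T) and magnetic inclination (deg)."""
    inc = np.radians(inclination_deg)
    # dipole relation: tan(I) = 2 / tan(theta_m), theta_m = magnetic colatitude
    theta_m = np.arctan2(2.0, np.tan(inc))
    return (4 * np.pi * R_EARTH**3 / MU0) * paleointensity_T / np.sqrt(1 + 3 * np.cos(theta_m)**2)

# Example: 13.4 uT site-mean intensity with an assumed inclination of 50 deg
print(f"{vdm(13.4e-6, 50.0):.2e} A m^2")   # on the order of 10^22 A m^2
```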
Abstract:
We introduce a flexible technique for interactive exploration of vector field data through classification derived from user-specified feature templates. Our method is founded on the observation that, while similar features within the vector field may be spatially disparate, they share similar neighborhood characteristics. Users generate feature-based visualizations by interactively highlighting well-accepted and domain-specific representative feature points. Feature exploration begins with the computation of attributes that describe the neighborhood of each sample within the input vector field. Compilation of these attributes forms a representation of the vector field samples in the attribute space. We project the attribute points onto the canonical 2D plane to enable interactive exploration of the vector field using a painting interface. The projection encodes the similarities between vector field points within the distances computed between their associated attribute points. The proposed method runs at interactive rates for an enhanced user experience and is completely flexible, as showcased by the simultaneous identification of diverse feature types.
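A minimal sketch of the attribute-space idea follows; the specific neighborhood attributes and projection used by the authors are not reproduced, so simple magnitude/direction statistics and a PCA projection are used as assumed stand-ins:

```python
# Sketch: describe each vector-field sample by simple neighborhood attributes,
# then project the attribute points to 2D for painting-style exploration.
# The attribute set and PCA projection are assumed stand-ins, not the authors'
# exact choices.
import numpy as np

def neighborhood_attributes(vx, vy, radius=2):
    """Per-sample attributes from an (H, W) vector field: mean magnitude,
    magnitude std, and mean direction (cos, sin) over a square neighborhood."""
    H, W = vx.shape
    mag = np.hypot(vx, vy)
    ang = np.arctan2(vy, vx)
    feats = np.zeros((H * W, 4))
    for i in range(H):
        for j in range(W):
            sl = (slice(max(i - radius, 0), i + radius + 1),
                  slice(max(j - radius, 0), j + radius + 1))
            feats[i * W + j] = [mag[sl].mean(), mag[sl].std(),
                                np.cos(ang[sl]).mean(), np.sin(ang[sl]).mean()]
    return feats

# Synthetic field: a single vortex
y, x = np.mgrid[-1:1:64j, -1:1:64j]
vx, vy = -y, x
attrs = neighborhood_attributes(vx, vy)

# Project attribute points onto a 2D plane with PCA (centered SVD)
centered = attrs - attrs.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
proj2d = centered @ vt[:2].T
print(proj2d.shape)   # (4096, 2): one 2D point per field sample
```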
Abstract:
The most significant radiation field nonuniformity is the well-known Heel effect. This nonuniform beam effect has a negative influence on the results of computer-aided diagnosis of mammograms, which is frequently used for early cancer detection. This paper presents a method to correct all pixels in the mammography image according to the excess or lack of radiation to which they have been submitted as a result of this effect. The current simulation method calculates the intensities at all points of the image plane. In the simulated image, the percentage of radiation received by each point takes the center of the field as reference. In the digitized mammography, the percentages of the optical density of all the pixels of the analyzed image are also calculated. The Heel effect causes a Gaussian distribution around the anode-cathode axis and a logarithmic distribution parallel to this axis. These characteristic distributions are used to determine the center of the radiation field as well as the cathode-anode axis, allowing for the automatic determination of the correlation between these two sets of data. The measurements obtained with our proposed method differ on average by 2.49 mm in the direction perpendicular to the anode-cathode axis and by 2.02 mm parallel to it from those of the commercial equipment. The method eliminates around 94% of the Heel effect in the radiological image, so that the objects reflect their x-ray absorption. To evaluate this method, experimental data were taken from known objects, but the evaluation could also be performed with clinical and digital images.
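A minimal sketch of the correction idea, assuming the characteristic distributions stated above (Gaussian across the anode-cathode axis, logarithmic along it); the field geometry and parameter values are illustrative, not the paper's calibrated ones:

```python
# Sketch of a flat-field (Heel effect) correction: simulate the relative
# intensity at each pixel, normalize to the field center, and divide the
# image by that map. The Gaussian/logarithmic parameters are illustrative.
import numpy as np

def heel_intensity_map(shape, sigma=400.0, k=0.15):
    """Relative intensity: Gaussian across the anode-cathode axis (columns),
    logarithmic fall-off along it (rows), normalized to 1.0 at the center."""
    H, W = shape
    rows = np.arange(H)[:, None] - H / 2    # distance along the anode-cathode axis
    cols = np.arange(W)[None, :] - W / 2    # distance perpendicular to it
    along = 1.0 - k * np.log1p(np.abs(rows) / H)
    across = np.exp(-(cols / sigma) ** 2)
    rel = along * across
    return rel / rel[H // 2, W // 2]

image = np.random.default_rng(1).uniform(0.4, 0.6, size=(512, 512))  # stand-in mammogram
rel = heel_intensity_map(image.shape)
corrected = image / rel     # boost under-exposed pixels, damp over-exposed ones
print(corrected.mean(), rel.min(), rel.max())
```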
Abstract:
A statistical data analysis methodology was developed to evaluate the field emission properties of many samples of copper oxide nanostructured field emitters. This analysis was largely done in terms of Seppen-Katamuki (SK) charts, field strength and emission current. Physical and mathematical models were derived to describe the effect of small electric field perturbations in the Fowler-Nordheim (F-N) equation, and then to explain the trend of the data represented in the SK charts. The field enhancement factor and the emission area parameters proved to be very sensitive to variations in the electric field for most of the samples. We found that the anode-cathode distance is critical in the field emission characterization of samples having a non-rigid nanostructure. (C) 2007 Elsevier B.V. All rights reserved.
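For illustration, a Fowler-Nordheim analysis of the kind underlying SK charts fits ln(I/V²) against 1/V and reads the field enhancement factor from the slope. The work function, gap distance and current data below are assumed example values:

```python
# Fowler-Nordheim plot sketch: fit ln(I/V^2) vs 1/V and estimate the field
# enhancement factor beta from the slope. All numbers are illustrative.
import numpy as np

B_FN = 6.83e9     # F-N constant (eV^-3/2 V m^-1)
phi = 4.5         # assumed work function of the emitter (eV)
d = 100e-6        # assumed anode-cathode distance (m)

# Synthetic I-V data following I = a (beta*V/d)^2 exp(-B phi^1.5 d / (beta V))
beta_true, a = 1200.0, 1e-9
V = np.linspace(800.0, 2000.0, 30)
I = a * (beta_true * V / d) ** 2 * np.exp(-B_FN * phi ** 1.5 * d / (beta_true * V))

# F-N coordinates and linear fit
x, y = 1.0 / V, np.log(I / V ** 2)
slope, intercept = np.polyfit(x, y, 1)

beta_est = -B_FN * phi ** 1.5 * d / slope
print(f"estimated beta = {beta_est:.0f}")   # ~1200
```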
Abstract:
In this communication, we report on the formation of calcium hexahydroxodizincate dihydrate, CaZn2(OH)6·2H2O (CZO), powders under microwave-hydrothermal (MH) conditions. These powders were analyzed by X-ray diffraction (XRD), field-emission gun scanning electron microscopy (FEG-SEM), ultraviolet-visible (UV-vis) absorption spectroscopy and photoluminescence (PL) measurements. XRD patterns confirmed that the pure CZO phase was obtained after MH processing performed at 130 degrees C for 2 h. FEG-SEM micrographs indicated that the morphological modifications as well as the growth of CZO microparticles are governed by Ostwald-ripening and coalescence mechanisms. UV-vis spectra showed that this material has an indirect optical band gap. The pure CZO powders exhibited a yellow PL emission when excited at a wavelength of 350 nm at room temperature. (C) 2009 Elsevier Masson SAS. All rights reserved.
Abstract:
Several protease inhibitors have reached the world market in the last fifteen years, dramatically improving the quality of life and life expectancy of millions of HIV-infected patients. In spite of the tremendous research efforts in this area, resistant HIV-1 variants are constantly decreasing the ability of the drugs to efficiently inhibit the enzyme. As a consequence, inhibitors with novel frameworks are necessary to circumvent resistance to chemotherapy. In the present work, we have created 3D QSAR models for a series of 82 HIV-1 protease inhibitors employing the comparative molecular field analysis (CoMFA) method. Significant correlation coefficients were obtained (q² = 0.82 and r² = 0.97), indicating the internal consistency of the best model, which was then used to evaluate an external test set containing 17 compounds. The predicted values were in good agreement with the experimental results, showing the robustness of the model and its substantial predictive power for untested compounds. The final QSAR model and the information gathered from the CoMFA contour maps should be useful for the design of novel anti-HIV agents with improved potency.
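CoMFA itself runs in commercial software, but the cross-validated q² statistic it reports can be illustrated with a generic latent-variable regression; the random descriptor matrix and the PLS surrogate below are assumptions for illustration only:

```python
# Sketch of the leave-one-out cross-validated q^2 used to judge 3D QSAR models.
# The descriptor matrix is random stand-in data, and ordinary PLS is used as a
# generic surrogate for the CoMFA field analysis.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(42)
X = rng.normal(size=(82, 500))                # stand-in steric/electrostatic field descriptors
y = X[:, :5] @ rng.normal(size=5) + rng.normal(scale=0.3, size=82)  # pIC50-like activities

press, ss = 0.0, np.sum((y - y.mean()) ** 2)
for train, test in LeaveOneOut().split(X):
    model = PLSRegression(n_components=5).fit(X[train], y[train])
    press += float((model.predict(X[test]).ravel()[0] - y[test][0]) ** 2)

q2 = 1.0 - press / ss
print(f"q^2 = {q2:.2f}")
```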
Abstract:
The count intercept is a robust method for the numerical analysis of fabrics (Launeau and Robin, 1996). It counts the number of intersections between a set of parallel scan lines and a mineral phase, which must be identified on a digital image. However, the method is only sensitive to boundaries and therefore supposes the user has some knowledge about their significance. The aim of this paper is to show that a proper grey-level detection of boundaries along scan lines is sufficient to calculate the two-dimensional anisotropy of grain or crystal distributions without any particular image processing. Populations of grains and crystals usually display elliptical anisotropies in rocks. When confirmed by the intercept analysis, a combination of at least 3 mean-length intercept roses, taken on 3 roughly perpendicular sections, allows the calculation of 3-dimensional ellipsoids and the determination of their standard deviation in direction and intensity in 3 dimensions as well. The feasibility of this quick method is attested by numerous examples on theoretical objects deformed by active and passive deformation, on BSE images of synthetic magma flow, on drawings or direct analyses of thin section pictures of sandstones, and on digital images of granites directly taken and measured in the field. (C) 2010 Elsevier B.V. All rights reserved.
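A minimal sketch of the intercept-counting idea on a binary phase image: count boundary crossings along parallel scan lines at several orientations and derive a rose of mean intercept lengths. The rotation-based shortcut below is an assumed simplification, not the published implementation:

```python
# Count-intercept sketch: for a binary phase image, count boundary crossings
# along parallel scan lines at several orientations and compute the mean
# intercept length per direction. Rotating the image stands in for tilting
# the scan lines.
import numpy as np
from scipy.ndimage import rotate

def intercept_rose(binary, angles_deg):
    """Mean intercept length (pixels) of the phase for each scan direction."""
    out = []
    for ang in angles_deg:
        img = rotate(binary.astype(float), ang, order=0, reshape=True) > 0.5
        crossings = np.abs(np.diff(img.astype(int), axis=1)).sum()
        phase_pixels = img.sum()
        out.append(2.0 * phase_pixels / max(crossings, 1))  # two boundaries per intercept
    return np.array(out)

# Synthetic anisotropic fabric: ellipses elongated along x
y, x = np.mgrid[-128:128, -128:128]
binary = ((x % 64 - 32) ** 2 / 400 + (y % 64 - 32) ** 2 / 100) < 1.0

angles = np.arange(0, 180, 15)
rose = intercept_rose(binary, angles)
print(dict(zip(angles.tolist(), np.round(rose, 1))))  # longest intercepts along elongation
```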
Abstract:
We present an efficient numerical methodology for the 3D computation of incompressible multi-phase flows described by conservative phase-field models. We focus here on the case of density-matched fluids with different viscosity (Model H). The numerical method employs adaptive mesh refinement (AMR) in concert with an efficient semi-implicit time discretization strategy and a linear, multi-level multigrid to relax high-order stability constraints and to capture the flow's disparate scales at optimal cost. Only five linear solvers are needed per time step. Moreover, all the adaptive methodology is constructed from scratch to allow a systematic investigation of the key aspects of AMR in a conservative, phase-field setting. We validate the method and demonstrate its capabilities and efficacy with important examples of drop deformation, Kelvin-Helmholtz instability, and flow-induced drop coalescence. (C) 2010 Elsevier Inc. All rights reserved.
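The adaptive multigrid solver itself is beyond a short sketch, but the flavor of a conservative, semi-implicit phase-field update can be shown on a periodic domain with a pseudo-spectral discretization; the double-well potential, mobility and interface width below are illustrative, not the paper's Model H setup:

```python
# Semi-implicit, mass-conserving Cahn-Hilliard step on a periodic 2D grid
# (pseudo-spectral). This only illustrates a conservative phase-field update;
# it is not the paper's adaptive-mesh Model H solver. Parameters are illustrative.
import numpy as np

N, L = 128, 2 * np.pi
eps, mobility, dt = 0.05, 1.0, 1e-4
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
kx, ky = np.meshgrid(k, k)
k2 = kx ** 2 + ky ** 2

rng = np.random.default_rng(3)
c = 0.01 * rng.standard_normal((N, N))    # phase field; its mean is conserved
mean0 = c.mean()

def step(c):
    """One semi-implicit step: nonlinear term explicit, biharmonic term implicit."""
    fprime = c ** 3 - c                   # derivative of the double-well potential
    c_hat = np.fft.fft2(c)
    rhs = c_hat - dt * mobility * k2 * np.fft.fft2(fprime)
    c_hat_new = rhs / (1.0 + dt * mobility * eps ** 2 * k2 ** 2)
    return np.real(np.fft.ifft2(c_hat_new))

for _ in range(200):
    c = step(c)
print(f"mass drift: {abs(c.mean() - mean0):.2e}")   # conservative: mean is unchanged
```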
Abstract:
A myriad of methods are available for virtual screening of small organic compound databases. In this study we have successfully applied a quantitative model of consensus measurements, using a combination of 3D similarity searches (ROCS and EON), Hologram Quantitative Structure-Activity Relationships (HQSAR) and docking (FRED, FlexX, Glide and AutoDock Vina), to retrieve cruzain inhibitors from collected databases. All methods were assessed individually and then combined in Ligand-Based Virtual Screening (LBVS) and Target-Based Virtual Screening (TBVS) consensus scoring, using Receiver Operating Characteristic (ROC) curves to evaluate their performance. Three consensus strategies were used: scaled-rank-by-number, rank-by-rank and rank-by-vote, of which the scaled-rank-by-number strategy was the most successful, as its steep ROC curve indicated higher enrichment power at early retrieval of active compounds from the database. The ligand-based approach produced a robust and predictive HQSAR model that showed superior discrimination between active and inactive compounds, performing better than the ROCS and EON procedures. Overall, the integration of fast computational techniques based on ligand and target structures resulted in a more efficient retrieval of cruzain inhibitors with desired pharmacological profiles, which may be useful to advance the discovery of new trypanocidal agents.
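A minimal sketch of rank-based consensus scoring and ROC evaluation, with randomly generated scores standing in for the ROCS/EON/docking outputs; the exact scaling used by the authors is not reproduced:

```python
# Consensus-scoring sketch: combine per-method scores by scaled-rank-by-number,
# rank-by-rank, and rank-by-vote, then compare ROC AUCs. The score matrix is
# random stand-in data, not the ROCS/EON/docking outputs of the study.
import numpy as np
from scipy.stats import rankdata
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n_cpds, n_methods = 1000, 5
labels = rng.random(n_cpds) < 0.05                   # 5% actives
# higher score = better; actives get a modest boost in each method
scores = rng.normal(size=(n_cpds, n_methods)) + labels[:, None] * 1.0

ranks = np.column_stack([rankdata(-scores[:, m]) for m in range(n_methods)])

# scaled-rank-by-number: average the min-max scaled scores of each method
scaled = (scores - scores.min(0)) / (scores.max(0) - scores.min(0))
scaled_rank_by_number = scaled.mean(axis=1)

rank_by_rank = -ranks.mean(axis=1)                   # lower mean rank = better
rank_by_vote = (ranks <= 0.02 * n_cpds).sum(axis=1)  # votes from each method's top 2%

for name, s in [("scaled-rank-by-number", scaled_rank_by_number),
                ("rank-by-rank", rank_by_rank),
                ("rank-by-vote", rank_by_vote)]:
    print(name, round(roc_auc_score(labels, s), 3))
```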
Abstract:
In this study an optimization method for the design of combined solar and pellet heating systems is presented and evaluated. The paper describes the steps of the method by applying it to an example system. The objective of the optimization was to find the design parameters that give the lowest auxiliary energy use (pellet fuel + auxiliary electricity) and carbon monoxide (CO) emissions for a system with a typical load, a single-family house in Sweden. Weighting factors were applied to the auxiliary energy use and the CO emissions to form a combined target function, and different weighting factors were tested. The results show that extreme weighting factors lead to their own minima; however, it was possible to find factors that ensure low values for both auxiliary energy and CO emissions.
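The combined target function can be illustrated with a small sketch: normalize the two criteria and weight them before comparing candidate designs. The candidate results and weighting factors below are invented placeholders:

```python
# Sketch of the weighted combined target: normalize auxiliary energy and CO
# emissions for a set of candidate designs, then rank by a weighted sum.
# The candidate results and weights are invented placeholders.
import numpy as np

# (auxiliary energy [kWh/yr], CO emissions [kg/yr]) for candidate designs
candidates = np.array([
    [5200.0, 9.1],
    [4800.0, 12.4],
    [5600.0, 7.3],
    [5000.0, 10.0],
])

def combined_target(results, w_energy=0.5, w_co=0.5):
    """Lower is better; each criterion is min-max normalized before weighting."""
    norm = (results - results.min(0)) / (results.max(0) - results.min(0))
    return w_energy * norm[:, 0] + w_co * norm[:, 1]

for w in (0.1, 0.5, 0.9):   # extreme weights favor their own minima
    best = int(np.argmin(combined_target(candidates, w_energy=w, w_co=1 - w)))
    print(f"w_energy={w:.1f} -> best design index {best}")
```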
Abstract:
Internet research methods in nursing science are less developed than in other sciences. We therefore present an approach to conducting nursing research on an internet-based forum. This paper presents LiLEDDA, a six-step forum-based netnographic research method for nursing science. The steps consist of: 1. Literature review and identification of the research question(s); 2. Locating the field(s) online; 3. Ethical considerations; 4. Data gathering; 5. Data analysis and interpretation; and 6. Abstractions and trustworthiness. Traditional research approaches are limiting when studying non-normative and non-mainstream life-worlds and their cultures. We argue that it is timely to develop more up-to-date research methods and study designs applicable to nursing science that reflect social developments and human living conditions that are increasingly online-based.
Abstract:
The study of the new teaching/learning technologies used by Administration instructors in everyday classroom practice is one of the most debated topics in academia. These technologies are apprehended and used by students and instructors, who may or may not incorporate them into their educational practices. Describing and understanding them is fundamental to the development of the organizational field. Describing these technologies in the professor's own words as he or she conducts the class is one of the central aims of this research. Discussing whether the pedagogical conceptions of Administration programs engage with the evaluative and conceptual meaning of what formal education proposes became evident over the course of the dissertation. Problematizing metaphysical thought, dialectical epistemology and the functionalist paradigm through Michel Foucault's episteme was a point of arrival: Archaeology. We discuss how the contradictory and paradoxical forces present in education guarantee, on the one hand, greater possibilities for subjectivities with critical attitudes and, on the other, inhibit a critical and reflective approach through instrumental and utilitarian pedagogies. This multifaceted relationship, permeated by variability, led us to study it through dialectical reason, a methodological approach that enables synthetic construction and an understanding of its contributing factors through the juxtaposition of thesis and antithesis placed in opposing positions. There is, however, no interpretation. The results obtained in the field research were classified into three categories proper to the chosen method: totality, the system of contradiction, and the negation of the negation. In the analysis, we chose not to explain but rather to problematize the dominant paradigm in Administration programs, suggesting the ever-renewed possibility of a critical attitude in the everyday practices of the interviewed instructors. Within this essentially paradoxical context, we grasp the dialectical sense that moves, afflicts and surprises them every day in the classroom: just as these practices limit them, owing to the conformation inherent to education and its doctrinal instruments, they also assure them the possibility of building teaching/learning with its viable, unprecedented outcomes. These dimensions stand as a counterpoint to a dominant functionalist epistemological literature and consider a multifaceted complex which, in the manner of Michel Foucault's episteme, suggests provocations far beyond good and evil.