806 results for computational journalism


Relevance: 20.00%

Abstract:

Dynamic core-shell nanoparticles have received increasing attention in recent years. This paper presents a detailed study of Au-Hg nanoalloys, whose constituent elements show a large difference in cohesive energy. A simple method to prepare Au@Hg particles with precise control over the composition up to 15 atom% mercury is introduced, based on reacting a citrate-stabilized gold sol with elemental mercury. Transmission electron microscopy shows an increase of particle size with increasing mercury content and, together with X-ray powder diffraction, points towards a core-shell structure with a gold core surrounded by an Au-Hg solid-solution layer. The amalgamation process follows pseudo-zero-order reaction kinetics, which indicates slow dissolution of mercury in water as the rate-determining step, followed by fast scavenging by nanoparticles in solution. Once mercury is adsorbed at the surface, slow diffusion of Hg into the particle lattice occurs, to a depth of ca. 3 nm, independent of Hg concentration. Discrete dipole approximation calculations relate the UV-vis spectra to the microscopic details of the nanoalloy structure. Segregation energies and metal distribution in the nanoalloys were modeled by density functional theory calculations. The results indicate slow metal interdiffusion at the nanoscale, which has important implications for synthetic methods aimed at core-shell particles.
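The pseudo-zero-order picture in this abstract (mercury dissolves at a constant rate and is scavenged essentially instantly by the particles, so uptake grows linearly and then plateaus at the dosed composition) can be sketched in a few lines. This is a minimal illustration only; the rate constant and time units are assumptions, not fitted values from the paper.

```python
# Hedged sketch: pseudo-zero-order Hg uptake. Under the abstract's argument,
# slow dissolution of elemental Hg supplies mercury at a constant rate k,
# so the incorporated fraction rises linearly until the dose is exhausted.
# k and hg_total are illustrative values, not data from the study.

def hg_incorporated(t, k=0.5, hg_total=15.0):
    """Atom% Hg taken up after time t under zero-order kinetics.

    d[Hg]/dt = k (constant) while dosed mercury remains, so uptake is
    linear in time and then plateaus at the dosed composition.
    """
    return min(k * t, hg_total)

# Linear growth at short times, plateau at the 15 atom% dose.
profile = [hg_incorporated(t) for t in (0, 10, 20, 40)]
```

The plateau is what distinguishes zero-order supply-limited kinetics from first-order behavior, where uptake would slow gradually rather than stop abruptly.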

Relevance: 20.00%

Abstract:

Background: The reduction in the amount of food available to European avian scavengers as a consequence of restrictive public health policies is a concern for managers and conservationists. Since 2002, the application of several sanitary regulations has limited the availability of feeding resources provided by domestic carcasses, but theoretical studies assessing whether the food resources provided by wild ungulates are enough to cover energetic requirements are lacking. Methodology/Findings: We assessed the food provided by a wild ungulate population in two areas of NE Spain inhabited by three vulture species and developed a P System computational model to assess the effects of the carrion resources provided on their population dynamics. We compared the real population trend with a hypothetical scenario in which only food provided by wild ungulates was available. Simulation testing of the model suggests that wild ungulates constitute an important food resource in the Pyrenees, and the vulture population inhabiting this area could grow even if only the food provided by wild ungulates were available. In the Pre-Pyrenees, by contrast, there is insufficient food to cover the energy requirements of the avian scavenger guild, which would decline sharply if biomass from domestic animals were not available. Conclusions/Significance: Our results suggest that public health legislation can modify scavenger population trends if a large number of domestic ungulate carcasses disappear from the mountains. In this case, the food provided by wild ungulates may not be enough, and supplementary feeding may be necessary if other alternative food resources (e.g. reintroduced wild ungulates) are not available, preferably in European Mediterranean scenarios sharing similar socio-economic conditions where there are low densities of wild ungulates. Managers should anticipate the conservation actions required by assessing food availability and the possible scenarios in order to make the most suitable decisions.
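The food-limitation logic of this abstract can be caricatured as a carrying capacity set by available carrion, toward which the population grows or declines. This is a deliberately simplified logistic sketch, not the paper's P System model, and every number in it (biomass supply, per-bird requirement, growth rate) is an illustrative assumption.

```python
# Hedged sketch: carrion biomass sets a carrying capacity; the vulture
# population relaxes toward it. All figures are invented for illustration.

def carrying_capacity(carrion_kg_per_year, need_kg_per_bird_year):
    """Maximum number of birds the available carrion can sustain."""
    return carrion_kg_per_year / need_kg_per_bird_year

def project(pop, K, r=0.1, years=20):
    """Discrete logistic projection toward the food-set capacity K."""
    for _ in range(years):
        pop += r * pop * (1.0 - pop / K)
    return pop

K = carrying_capacity(200_000, 400)   # 500-bird capacity (assumed figures)
final = project(pop=100.0, K=K)       # grows toward K while food is ample
```

Removing domestic carcasses corresponds to shrinking the biomass term: if K falls below the current population, the same projection produces the sharp decline the abstract describes for the Pre-Pyrenees.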

Relevance: 20.00%

Abstract:

Collision-induced dissociation (CID) of peptides using tandem mass spectrometry (MS) has been used to determine the identity of peptides and other large biological molecules. Mass spectrometry is a useful tool for determining the identity of molecules based on their interaction with electromagnetic fields. If coupled with another method such as infrared (IR) vibrational spectroscopy, MS can provide structural information, but on its own, MS can only provide the mass-to-charge (m/z) ratio of the fragments produced, which may not be enough information to determine the mechanism of the collision-induced dissociation of the molecule. In this case, theoretical calculations provide a useful companion for MS data and yield clues about the energetics of the dissociation. In this study, negative ion electrospray tandem MS was used to study the CID of the deprotonated dipeptide glycine-serine (Gly-Ser). Though negative ion MS is not as popular a choice as positive ion MS, studies by Bowie et al. show that it yields unique clues about molecular structure which complement positive ion spectra, such as characteristic fragmentations like the loss of formaldehyde from the serine residue [2]. Increasing the collision energy in the mass spectrometer alters the flexibility of the dipeptide backbone, enabling isomerizations (reactions not resulting in a fragment loss) and dissociations to take place. The mechanism of the CID of Gly-Ser was studied using two computational methods, B3LYP/6-311+G* and M06-2X/6-311++G**. The main pathway for molecular dissociation was analyzed in 5 conformers in an attempt to verify the initial mechanism proposed by Dr. James Swan after examination of the MS data. The results suggest that the loss of formaldehyde from serine, which Bowie et al. indicate is characteristic of the presence of a serine residue, is an endothermic reaction made possible by the conversion of the translational energy of the ion into internal energy as the ion collides with the inert collision gas. It was also determined that the M06-2X functional's improved description of medium- and long-range correlation makes it more effective than the B3LYP functional at finding elusive transition states; M06-2X also predicts the energy of those transition states more accurately than B3LYP. A second CID mechanism, which passes through intermediates with the same m/z ratios as the main pathway but different structures, including a diketopiperazine intermediate, was also studied. This pathway was analyzed with 3 conformers and the M06-2X functional, owing to its previously determined effectiveness. The results suggest that this second pathway, which passes through the same intermediate masses as the first mechanism, is lower in overall energy and therefore a more likely dissociation pathway.
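The conversion of translational into internal energy invoked above is bounded by the center-of-mass collision energy, E_com = E_lab * m_gas / (m_gas + m_ion). This is a standard CID relation, not a result from this thesis; the masses below are approximate and the collision energies are arbitrary.

```python
# Hedged sketch: fraction of lab-frame kinetic energy available for internal
# excitation in a single ion-gas collision (standard centre-of-mass relation).
# It shows why heavier collision gases deposit more energy per collision.

def com_energy(e_lab_eV, m_ion, m_gas):
    """Centre-of-mass collision energy available for internal excitation."""
    return e_lab_eV * m_gas / (m_gas + m_ion)

M_GLYSER_ANION = 161.1   # approx. m/z of deprotonated Gly-Ser [M-H]- (Da)
e_ar = com_energy(30.0, M_GLYSER_ANION, 39.95)   # argon target
e_he = com_energy(30.0, M_GLYSER_ANION, 4.00)    # helium deposits less
```

Only this center-of-mass fraction of the acceleration energy can drive an endothermic channel such as the formaldehyde loss, which is why raising the collision energy opens pathways that are closed at low energy.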

Relevance: 20.00%

Abstract:

Cold-formed steel (CFS) framing combined with wood sheathing, such as oriented strand board (OSB), forms shear walls that can provide lateral resistance to seismic forces. The ability to accurately predict building deformations in damaged states under seismic excitation is essential for modern performance-based seismic design. However, few static or dynamic tests have been conducted on the non-linear behavior of CFS shear walls. The purpose of this research is therefore to provide and demonstrate a fastener-based computational model of CFS shear walls that incorporates the essential nonlinearities and may eventually lead to improvement of the current seismic design requirements. The approach is based on the understanding that the complex interaction of the fasteners with the sheathing is an important factor in the non-linear behavior of the shear wall. The computational model consists of beam-column elements for the CFS framing and a rigid diaphragm for the sheathing. The framing and sheathing are connected with non-linear zero-length fastener elements to capture the OSB sheathing damage surrounding the fastener area. Employing computational programs such as OpenSees and MATLAB, 4 ft. x 9 ft., 8 ft. x 9 ft. and 12 ft. x 9 ft. shear wall models are created, and monotonic lateral forces are applied to the computer models. The output data are then compared with the available results of physical testing. The results indicate that the OpenSees model can accurately capture the initial stiffness, strength and non-linear behavior of the shear walls.
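The fastener-level nonlinearity this model hinges on can be illustrated with a monotonic backbone curve: each sheathing-to-frame fastener acts as a nonlinear spring whose force saturates as the OSB around it is damaged. The exponential (Foschi-type) backbone and all parameter values below are assumptions for illustration; the thesis itself uses calibrated zero-length elements in OpenSees.

```python
# Hedged sketch: one fastener as a saturating nonlinear spring, and a crude
# wall pushover as the sum of fastener forces. F0, k0 and the fastener count
# are assumed values, not calibrated test data.
import math

def fastener_force(d, F0=1.5, k0=5.0):
    """Monotonic backbone: initial stiffness k0, capped strength F0 (kN, mm)."""
    return F0 * (1.0 - math.exp(-k0 * d / F0))

def wall_resistance(d, n_fasteners=60):
    """Crude pushover: fasteners assumed to share the deformation equally."""
    return n_fasteners * fastener_force(d)

# Nearly linear at small drift, flattening toward n_fasteners * F0.
small = wall_resistance(0.01)   # ~ n * k0 * d in the elastic range
large = wall_resistance(10.0)   # approaches, never exceeds, 60 * 1.5 kN
```

A real fastener-based model distributes different deformations to each fastener through the frame kinematics, which is exactly why the beam-column plus rigid-diaphragm assembly in the abstract is needed rather than this uniform-sharing shortcut.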

Relevance: 20.00%

Abstract:

The group analysed syntactic and phonological phenomena that presuppose the existence of interrelated components within the lexicon, which motivate the assumption that there are sublexicons within the global lexicon of a speaker. This result is confirmed by experimental findings in neurolinguistics. Hungarian-speaking agrammatic aphasics were tested in several ways, the results showing that the sublexicon of closed-class lexical items provides a highly automated complex device for processing surface sentence structure. Analysing Hungarian ellipsis data from a semantic-syntactic perspective, the group established that the lexicon is best conceived of as split into at least two main sublexicons: the store of semantic-syntactic feature bundles and a separate store of sound forms. On this basis they proposed a format for representing open-class lexical items whose meanings are connected via certain semantic relations. They also proposed a new classification of verbs to account for their contribution to the aspectual reading of the sentence, depending on the referential type of the argument, and a new account of the syntactic and semantic behaviour of aspectual prefixes. The partitioned sets of lexical items are sublexicons on phonological grounds; these sublexicons differ in terms of phonotactic grammaticality. The degrees of phonotactic grammaticality raise the problem of psychological reality: how many such degrees are native speakers actually sensitive to? The group developed a hierarchical construction network as an extension of the original General Inheritance Network formalism, and this framework was then used as a platform for the implementation of the grammar fragments.

Relevance: 20.00%

Abstract:

The Bioconductor project is an initiative for the collaborative creation of extensible software for computational biology and bioinformatics. We detail some of the design decisions, software paradigms and operational strategies that have allowed a small number of researchers to provide a wide variety of innovative, extensible software solutions in a relatively short time. The use of an object-oriented programming paradigm, the adoption and development of a software package system, design by contract, distributed development and collaboration with other projects are elements of this project's success. Individually, each of these concepts is useful and important, but combined they have provided a strong basis for rapid development and deployment of innovative and flexible research software for scientific computation. A primary objective of this initiative is the achievement of total remote reproducibility of novel algorithmic research results.

Relevance: 20.00%

Abstract:

In epidemiological work, outcomes are frequently non-normal, sample sizes may be large, and effects are often small. To relate health outcomes to geographic risk factors, fast and powerful methods for fitting spatial models, particularly for non-normal data, are required. We focus on binary outcomes, with the risk surface a smooth function of space. We compare penalized likelihood models, including the penalized quasi-likelihood (PQL) approach, and Bayesian models on the basis of fit, speed, and ease of implementation. A Bayesian model using a spectral basis representation of the spatial surface provides the best tradeoff of sensitivity and specificity in simulations, detecting real spatial features while limiting overfitting, and is computationally more efficient than other Bayesian approaches. One of the contributions of this work is further development of this underused representation. The spectral basis model outperforms the penalized likelihood methods, which are prone to overfitting, but is slower to fit and not as easily implemented. Conclusions based on a real dataset of cancer cases in Taiwan are similar, albeit less conclusive in comparing the approaches. The success of the spectral basis with binary data, and similar results with count data, suggest that it may be generally useful in spatial models and in more complicated hierarchical models.
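The spectral-basis idea can be sketched concretely: represent the smooth spatial surface as a short sum of low-frequency basis functions, so the model has few coefficients and high frequencies (the source of overfitting) are excluded by construction. A minimal illustration with a 2-D cosine basis and a logistic link for binary outcomes; the paper's actual basis, priors, and fitting procedure differ in detail, and the coefficient values below are arbitrary.

```python
# Hedged sketch: a smooth risk surface as a truncated 2-D cosine expansion,
# passed through a logistic link for binary outcomes. Coefficients are
# illustrative assumptions, not fitted values.
import math

def basis(x, y, j, k):
    """Low-frequency 2-D cosine basis function on the unit square."""
    return math.cos(math.pi * j * x) * math.cos(math.pi * k * y)

def surface(x, y, coefs):
    """Smooth surface: linear combination of the first few basis terms."""
    return sum(c * basis(x, y, j, k) for (j, k), c in coefs.items())

def risk(x, y, coefs):
    """Binary-outcome link: logistic transform of the smooth surface."""
    return 1.0 / (1.0 + math.exp(-surface(x, y, coefs)))

coefs = {(0, 0): -1.0, (1, 0): 0.8, (0, 1): 0.5}   # assumed coefficients
p = risk(0.0, 0.0, coefs)   # at the origin the surface is -1.0 + 0.8 + 0.5
```

Fitting then amounts to estimating the handful of coefficients rather than a value at every location, which is where the computational efficiency over other Bayesian spatial representations comes from.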

Relevance: 20.00%

Abstract:

The degree of polarization of a reflected field from active laser illumination can be used for object identification and classification. The goal of this study is to investigate methods for estimating the degree of polarization of reflected fields under active laser illumination, which involves the measurement and processing of two orthogonal field components (complex amplitudes), two orthogonal intensity components, and the total field intensity. We propose to replace interferometric optical apparatuses with a computational approach that estimates the degree of polarization from two orthogonal intensity measurements or from total intensity measurements. Cramér-Rao bounds for each of the three sensing modalities with various noise models are computed. Algebraic estimators and maximum-likelihood (ML) estimators are proposed; an active-set algorithm and an expectation-maximization (EM) algorithm are used to compute the ML estimates. The performances of the estimators are compared with each other and with their corresponding Cramér-Rao bounds. Estimators for four-channel polarimeter (intensity interferometer) sensing perform better than the orthogonal-intensity and total-intensity estimators. Processing the four intensity channels from the polarimeter, however, requires complicated optical devices, alignment, and four CCD detectors. Processing orthogonal-intensity or total-intensity data requires only one or two detectors and a computer, and the bounds and estimator performances demonstrate that reasonable estimates may still be obtained from these reduced modalities. Computational sensing is thus a promising way to estimate the degree of polarization.
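The algebraic route from measured intensities to the degree of polarization is most transparent in the full polarimeter case: form the Stokes vector from intensity differences, then take DoP = sqrt(S1² + S2² + S3²) / S0. A minimal sketch of that standard relation; the thesis's ML estimators for the reduced two-intensity and total-intensity modalities are more involved, and the intensity values below are made up.

```python
# Hedged sketch: degree of polarization from six polarization-analyzed
# intensities via the Stokes parameters (standard relations, illustrative
# intensity values).
import math

def stokes(i0, i90, i45, i135, i_rcp, i_lcp):
    """Stokes parameters (S0, S1, S2, S3) from analyzed intensities."""
    s0 = i0 + i90
    return s0, i0 - i90, i45 - i135, i_rcp - i_lcp

def degree_of_polarization(s0, s1, s2, s3):
    """DoP = sqrt(S1^2 + S2^2 + S3^2) / S0, in [0, 1] for physical fields."""
    return math.sqrt(s1**2 + s2**2 + s3**2) / s0

s = stokes(1.0, 0.0, 0.5, 0.5, 0.5, 0.5)   # horizontally polarized light
dop = degree_of_polarization(*s)           # fully polarized field: DoP = 1
```

With only two orthogonal intensities, S2 and S3 are unobserved, which is exactly why the reduced modalities need statistical estimators and carry the looser Cramér-Rao bounds discussed in the abstract.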

Relevance: 20.00%

Abstract:

Experimental studies on epoxies report that the microstructure consists of highly crosslinked localized regions connected by a dispersed phase of low crosslink density. The various thermo-mechanical properties of epoxies may be affected by this crosslink distribution, but because experiments cannot report the exact number of crosslinked covalent bonds present in the structure, molecular dynamics is used in this work to determine the influence of crosslink distribution on thermo-mechanical properties. Molecular dynamics and molecular mechanics simulations are used to establish well-equilibrated molecular models of the EPON 862-DETDA epoxy system with a range of crosslink densities and various crosslink distributions. Crosslink distributions are varied by forming differently crosslinked localized clusters and then by forming different numbers of crosslinks interconnecting the clusters. Simulations are subsequently used to predict the volume shrinkage, thermal expansion coefficients, and elastic properties of each of the crosslinked systems. The results indicate that elastic properties increase with increasing overall crosslink density and the thermal expansion coefficient decreases with overall crosslink density, both above and below the glass transition temperature. Elastic moduli and coefficients of linear thermal expansion were found to differ between systems with the same overall crosslink density but different crosslink distributions, indicating an effect of the epoxy nanostructure on physical properties. The values of the thermo-mechanical properties for all the crosslinked systems are within the range of values reported in the literature.
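The way a thermal expansion coefficient is typically extracted from this kind of simulation output can be shown compactly: fit volume versus temperature within one regime (glassy or rubbery, i.e. below or above the glass transition) and take the slope relative to a reference volume. The data points below are invented for illustration and are not from the EPON 862-DETDA simulations.

```python
# Hedged sketch: volumetric CTE = (1/V_ref) * dV/dT from a linear V(T) fit.
# The V(T) data here are assumed values, not simulation output.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def volumetric_cte(temps_K, volumes, T_ref):
    """CTE (1/K) from a linear V(T) fit, referenced to the volume at T_ref."""
    slope, intercept = linear_fit(temps_K, volumes)
    v_ref = slope * T_ref + intercept
    return slope / v_ref

# Assumed glassy-regime data: volume in nm^3 at several temperatures.
T = [300.0, 350.0, 400.0]
V = [100.0, 101.0, 102.0]
cte = volumetric_cte(T, V, 300.0)   # slope 0.02 nm^3/K over V_ref = 100
```

Fitting the two temperature regimes separately is what yields the distinct above- and below-Tg coefficients the abstract compares across crosslink distributions.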

Relevance: 20.00%

Abstract:

This dissertation presents an effective quasi one-dimensional (1-D) computational simulation tool and a full two-dimensional (2-D) computational simulation methodology for steady annular/stratified internal condensing flows of pure vapor. These simulation tools are used to investigate internal condensing flows in both gravity-driven and shear-driven environments. Through accurate numerical simulations of the full two-dimensional governing equations, results for laminar/laminar condensing flows inside mm-scale ducts are presented. The methodology has been developed on a MATLAB/COMSOL platform and is currently capable of simulating film-wise condensation for steady (and unsteady) flows. Moreover, a novel 1-D solution technique, capable of simulating condensing flows inside rectangular and circular ducts with different thermal boundary conditions, is also presented. The results obtained from the 2-D scientific tool and the 1-D engineering tool are validated and synthesized with experimental results for gravity-dominated flows inside a vertical tube and an inclined channel, and also for shear/pressure-driven flows inside horizontal channels. Furthermore, these simulation tools are employed to demonstrate key differences in physics between gravity-dominated and shear/pressure-driven flows. A transition map that distinguishes shear-driven, gravity-driven, and “mixed” driven flow zones within the non-dimensional parameter space governing these duct flows is presented, along with film thickness and heat transfer correlations valid in these zones. It has also been shown that internal condensing flows in micrometer-scale ducts experience shear-driven flow, even in different gravitational environments. The full 2-D steady computational tool has been employed to investigate the length of annularity. The result for a shear-driven flow in a horizontal channel shows that, in the absence of any noise or pressure fluctuation at the inlet, the onset of non-annularity is partly due to insufficient shear at the liquid-vapor interface. This result is being further corroborated/investigated by R. R. Naik with the help of the unsteady simulation tool. The condensing flow results and flow physics understanding developed through these simulation tools will be instrumental in the reliable design of modern micro-scale and space-based thermal systems.
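The gravity-driven limit that this work contrasts with shear-driven flow has a classical closed form: the Nusselt solution for a laminar condensate film on a vertical wall, where the film thickness grows as x^(1/4) with distance. This textbook relation is included for orientation only; the dissertation solves the full 2-D governing equations, and the fluid properties below are rough values for water near 100 °C.

```python
# Hedged sketch: classical Nusselt film thickness for gravity-driven laminar
# film condensation on a vertical wall,
#   delta(x) = [4 mu k (Tsat - Tw) x / (g rho_l (rho_l - rho_v) h_fg)]**0.25.
# Property values are approximate (water near 100 C), for illustration only.

def nusselt_film_thickness(x, mu=2.8e-4, k=0.68, dT=5.0,
                           rho_l=958.0, rho_v=0.6, h_fg=2.257e6, g=9.81):
    """Condensate film thickness (m) a distance x (m) below the film start."""
    return (4.0 * mu * k * dT * x /
            (g * rho_l * (rho_l - rho_v) * h_fg)) ** 0.25

d1 = nusselt_film_thickness(0.1)    # film thickens with distance...
d2 = nusselt_film_thickness(0.4)    # ...as x**(1/4): d2 = sqrt(2) * d1
```

In shear-driven (e.g. horizontal micro-scale or microgravity) flows, gravity drops out of this balance and interfacial shear from the vapor core sets the film thickness instead, which is precisely the regime distinction the transition map in the abstract formalizes.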