137 results for "Approximate solutions"


Relevance: 20.00%

Abstract:

In the present study, we examined whether and how brief viewing of positive and negative images influences subsequent understanding of solutions to insight problems. For each trial, participants were first presented with an insight problem and then briefly viewed a task-irrelevant positive, negative, or neutral image (660 ms), which was followed by the solution to the problem. In our behavioral study (Study 1), participants were faster to report that they understood the solutions following positive images, and slower to report it following negative images. A subsequent fMRI study (Study 2) revealed enhanced activity in the angular gyrus and medial prefrontal cortex (MPFC) while viewing solutions following positive, as compared with negative, images. In addition, greater activation of the angular gyrus was associated with more rapid understanding of the solutions. These results suggest that brief viewing of positive images enhances activity in the angular gyrus and MPFC, which facilitates the understanding of solutions to insight problems.

Relevance: 20.00%

Abstract:

With the prospect of exascale computing, computational methods requiring only local data become especially attractive. Consequently, the typical domain decomposition of atmospheric models means horizontally-explicit vertically-implicit (HEVI) time-stepping schemes warrant further attention. In this analysis, Runge-Kutta implicit-explicit schemes from the literature are analysed for their stability and accuracy using a von Neumann stability analysis of two linear systems. Attention is paid to the numerical phase to indicate the behaviour of phase and group velocities. Where the analysis is tractable, analytically derived expressions are considered. For more complicated cases, amplification factors have been numerically generated and the associated amplitudes and phase diagnosed. Analysis of a system describing acoustic waves has necessitated attributing the three resultant eigenvalues to the three physical modes of the system. To do so, a series of algorithms has been devised to track the eigenvalues across the frequency space. The result enables analysis of whether the schemes exactly preserve the non-divergent mode; and whether there is evidence of spurious reversal in the direction of group velocities or asymmetry in the damping for the pair of acoustic modes. Frequency ranges that span next-generation high-resolution weather models to coarse-resolution climate models are considered; and a comparison is made of errors accumulated from multiple stability-constrained shorter time-steps from the HEVI scheme with a single integration from a fully implicit scheme over the same time interval. Two schemes, “Trap2(2,3,2)” and “UJ3(1,3,2)”, both already used in atmospheric models, are identified as offering consistently good stability and representation of phase across all the analyses. Furthermore, according to a simple measure of computational cost, “Trap2(2,3,2)” is the least expensive.
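The von Neumann machinery described above can be sketched on the simplest possible HEVI analogue: a scalar oscillation equation whose two frequencies are treated explicitly and implicitly. The first-order forward/backward Euler pairing below is an illustrative stand-in under assumed notation, not one of the RK IMEX schemes analysed in the abstract:

```python
import numpy as np

def imex_euler_amplification(lam_e, lam_i):
    """Amplification factor of a first-order forward/backward Euler IMEX
    pairing applied to y' = i*omega_e*y + i*omega_i*y, with lam = omega*dt.
    An illustrative stand-in for the multi-stage RK IMEX schemes analysed."""
    return (1 + 1j * lam_e) / (1 - 1j * lam_i)

lam = np.linspace(0.0, 2.0, 201)

# Implicitly treated (e.g. vertically propagating acoustic) modes are damped
# but stable for any frequency: |A| = 1/sqrt(1 + lam^2) <= 1.
A_impl = imex_euler_amplification(0.0, lam)
print(np.abs(A_impl).max())   # 1.0, attained at lam = 0

# Explicit forward Euler amplifies pure oscillations, |A| = sqrt(1 + lam^2),
# which is one reason multi-stage RK IMEX schemes are used in practice.
A_expl = imex_euler_amplification(lam, 0.0)

# Numerical phase: relative phase speed of the implicit treatment (1 = exact).
phase_speed = np.angle(A_impl[1:]) / lam[1:]
```

Diagnosing both |A| and the phase of A across a frequency range is the scalar analogue of the amplitude and phase analysis applied to the acoustic and gravity-wave systems in the study.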

Relevance: 20.00%

Abstract:

Monte Carlo algorithms often aim to draw from a distribution π by simulating a Markov chain with transition kernel P such that π is invariant under P. However, there are many situations for which it is impractical or impossible to draw from the transition kernel P. For instance, this is the case with massive datasets, where it is prohibitively expensive to calculate the likelihood, and also for intractable likelihood models arising from, for example, Gibbs random fields, such as those found in spatial statistics and network analysis. A natural approach in these cases is to replace P by an approximation P̂. Using theory from the stability of Markov chains, we explore a variety of situations where it is possible to quantify how 'close' the chain given by the transition kernel P̂ is to the chain given by P. We apply these results to several examples from spatial statistics and network analysis.

Relevance: 20.00%

Abstract:

The incorporation of cobalt in mixed metal carbonates is a possible route to the immobilization of this toxic element in the environment. However, the thermodynamics of (Ca,Co)CO3 solid solutions are still unclear due to conflicting data from experiment and from the observation of natural occurrences. We report here the results of a computer simulation study of the mixing of calcite (CaCO3) and spherocobaltite (CoCO3), using density functional theory calculations. Our simulations suggest that previously proposed thermodynamic models, based only on the range of observed compositions, significantly overestimate the mutual solubility of the two solids and therefore underestimate the extent of the miscibility gap under ambient conditions. The enthalpy of mixing of the disordered solid solution is strongly positive and moderately asymmetric: calcium incorporation in spherocobaltite is more endothermic than cobalt incorporation in calcite. Ordering of the impurities in (0001) layers is energetically favourable with respect to the disordered solid solution at low temperatures and intermediate compositions, but the ordered phase is still unstable to demixing. We calculate the solvus and spinodal lines in the phase diagram using a sub-regular solution model, and conclude that many Ca1-xCoxCO3 mineral solid solutions (with observed compositions of up to x=0.027, and above x=0.93) are metastable with respect to phase separation. We also calculate solid/aqueous distribution coefficients to evaluate the effect of the strong non-ideality of mixing on the equilibrium with aqueous solution, showing that the thermodynamically-driven incorporation of cobalt in calcite (and of calcium in spherocobaltite) is always very low, regardless of the Co/Ca ratio of the aqueous environment.
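The sub-regular (asymmetric Margules) solution model used for the solvus and spinodal can be illustrated numerically. The interaction parameters below are hypothetical placeholders chosen only to give a strongly positive, moderately asymmetric enthalpy of mixing; they are not the fitted DFT values:

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

# Hypothetical asymmetric Margules parameters (J/mol), illustration only.
W_CALCITE, W_SPHERO = 18_000.0, 24_000.0

def g_mix(x, T):
    """Sub-regular free energy of mixing for a Ca(1-x)Co(x)CO3 solid solution:
    asymmetric excess term plus the ideal configurational entropy term."""
    g_excess = x * (1 - x) * (W_CALCITE * (1 - x) + W_SPHERO * x)
    g_ideal = R * T * (x * np.log(x) + (1 - x) * np.log(1 - x))
    return g_excess + g_ideal

def spinodal_limits(T, n=20001):
    """Composition range where d2G/dx2 < 0 (spinodal instability), or None
    if the solution is stable to small fluctuations at all compositions."""
    x = np.linspace(1e-4, 1 - 1e-4, n)
    d2 = np.gradient(np.gradient(g_mix(x, T), x), x)
    unstable = x[d2 < 0]
    return (unstable.min(), unstable.max()) if unstable.size else None

print(spinodal_limits(300.0))    # wide spinodal gap at ambient temperature
print(spinodal_limits(1500.0))   # None: the gap closes at high temperature
```

With strongly positive mixing enthalpies of this size, only dilute compositions near either end-member are stable at ambient temperature, which is the qualitative picture behind the metastability of the intermediate natural compositions.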

Relevance: 20.00%

Abstract:

It has been shown that CyMe4-BTPhen-functionalized silica-coated maghemite (γ-Fe2O3) magnetic nanoparticles (MNPs) are capable of quantitative separation of Am(III) from Eu(III) in HNO3 solutions. These MNPs also show a small but significant selectivity for Am(III) over Cm(III), with a separation factor of around 2 in 4 M HNO3. The water molecule in the cavity of the BTPhen may also play an important part in the selectivity.
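For readers outside separations chemistry, the reported figure of merit is simple arithmetic over distribution ratios. The numbers below are hypothetical, chosen only to reproduce a separation factor close to 2; they are not measured values from the study:

```python
def distribution_ratio(conc_on_sorbent, conc_in_solution):
    """D = metal concentration taken up by the solid phase divided by the
    concentration remaining in the aqueous phase after contact."""
    return conc_on_sorbent / conc_in_solution

def separation_factor(d_a, d_b):
    """SF(A/B) = D_A / D_B; the abstract reports SF(Am/Cm) of about 2."""
    return d_a / d_b

# Hypothetical distribution ratios, illustration only.
d_am = distribution_ratio(90.0, 10.0)   # D_Am = 9.0
d_cm = distribution_ratio(82.0, 18.0)   # D_Cm ~ 4.6
print(separation_factor(d_am, d_cm))    # close to 2
```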


Relevance: 20.00%

Abstract:

Hydrogels are polymeric materials used in many pharmaceutical and biomedical applications due to their ability to form 3D hydrophilic polymeric networks, which can absorb large amounts of water. In the present work, polyethylene glycols (PEG) were introduced into the hydrogel liquid phase in order to improve the mechanical properties of hydrogels composed of 2-hydroxyethylacrylate and 2-hydroxyethylmethacrylate (HEA–HEMA) synthesized with different co-monomer compositions and equilibrated in water or in 20 % water–PEG 400 and 600 solutions. Thermoanalytical techniques [differential scanning calorimetry (DSC) and thermogravimetry (TG)] were used to evaluate the amount and properties of free and bound water in HEA–HEMA hydrogels. The internal structure and the mechanical properties of the hydrogels were studied using scanning electron microscopy and a friability assay. TG "loss-on-drying" experiments were applied to study the water-retention properties of the hydrogels, whereas the combination of TG and DSC allowed estimation of the total amounts of freezable and non-freezing water in the hydrogels. The results show that the addition of a viscous co-solvent (PEG) to the liquid medium significantly improves the mechanical properties of HEA–HEMA hydrogels and also slightly retards water loss from the hydrogels. A redistribution of free and bound water takes place in the hydrogels equilibrated in mixed solutions containing 20 vol% of PEGs.
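The TG/DSC combination splits total water into freezable and non-freezing fractions by a standard mass balance: TG gives the total water content, and the DSC melting endotherm, divided by the enthalpy of fusion of bulk ice, gives the freezable part. A minimal sketch with hypothetical gel numbers (not data from the study):

```python
DELTA_H_FUSION_ICE = 334.0  # J/g, melting enthalpy of bulk ice

def water_fractions(total_water_frac, melt_enthalpy):
    """Split hydrogel water into freezable and non-freezing mass fractions.

    total_water_frac -- water mass fraction from TG 'loss-on-drying'
    melt_enthalpy    -- DSC melting endotherm in J per g of hydrogel
    """
    freezable = melt_enthalpy / DELTA_H_FUSION_ICE
    non_freezing = total_water_frac - freezable
    return freezable, non_freezing

# Hypothetical HEA-HEMA gel: 55 % water by mass, 120 J/g melting endotherm.
free_w, bound_w = water_fractions(0.55, 120.0)
print(f"freezable: {free_w:.2f}, non-freezing (bound): {bound_w:.2f}")
```

The non-freezing remainder is the water bound tightly enough to the polymer network that it shows no melting transition, which is the "bound water" the abstract refers to.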

Relevance: 20.00%

Abstract:

This paper investigates the feasibility of using approximate Bayesian computation (ABC) to calibrate and evaluate complex individual-based models (IBMs). As ABC evolves, various versions are emerging, but here we only explore the most accessible version, rejection-ABC. Rejection-ABC involves running models a large number of times, with parameters drawn randomly from their prior distributions, and then retaining the simulations closest to the observations. Although well-established in some fields, whether ABC will work with ecological IBMs is still uncertain. Rejection-ABC was applied to an existing 14-parameter earthworm energy budget IBM for which the available data consist of body mass growth and cocoon production in four experiments. ABC was able to narrow the posterior distributions of seven parameters, estimating credible intervals for each. ABC's accepted values produced slightly better fits than the literature values did. The accuracy of the analysis was assessed using cross-validation and coverage, currently the best available tests. Of the seven unnarrowed parameters, ABC revealed that three were correlated with other parameters, while the remaining four were found to be not estimable given the data available. It is often desirable to compare models to see whether all component modules are necessary. Here we used ABC model selection to compare the full model with a simplified version which removed the earthworm's movement and much of the energy budget. We are able to show that inclusion of the energy budget is necessary for a good fit to the data. We show how our methodology can inform future modelling cycles, and briefly discuss how more advanced versions of ABC may be applicable to IBMs.
We conclude that ABC has the potential to represent uncertainty in model structure, parameters and predictions, and to embed the often complex process of optimizing an IBM’s structure and parameters within an established statistical framework, thereby making the process more transparent and objective.
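The rejection-ABC loop described above (draw from the prior, simulate, retain the runs closest to the observations) can be sketched in a few lines. This is a self-contained toy normal-mean model, not the earthworm IBM:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setting: infer the mean of a normal model with known sd = 1.
observed = rng.normal(3.0, 1.0, size=100)   # "data"; true mean is 3.0
s_obs = observed.mean()                      # summary statistic

def simulate_summary(theta):
    """Run the simulator at theta and reduce its output to the summary."""
    return rng.normal(theta, 1.0, size=100).mean()

# Rejection-ABC: draw parameters from the prior, simulate, and retain the
# runs whose summaries fall closest to the observed summary.
n_draws = 20000
prior_draws = rng.uniform(-10.0, 10.0, size=n_draws)
sims = np.array([simulate_summary(t) for t in prior_draws])
distances = np.abs(sims - s_obs)
accepted = prior_draws[distances < np.quantile(distances, 0.01)]  # best 1 %

print(accepted.mean())   # close to the true mean 3.0
```

For an IBM the simulator call is expensive and the output multivariate, but the structure is identical: the accepted parameter values form the approximate posterior sample from which credible intervals are read off.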

Relevance: 20.00%

Abstract:

Contemporary research in generative second language (L2) acquisition has attempted to address observable target-deviant aspects of L2 grammars within a UG-continuity framework (e.g. Lardiere 2000; Schwartz 2003; Sprouse 2004; Prévost & White 1999, 2000). With the aforementioned in mind, the independence of pragmatic and syntactic development, independently observed elsewhere (e.g. Grodzinsky & Reinhart 1993; Lust et al. 1986; Pacheco & Flynn 2005; Serratrice, Sorace & Paoli 2004), becomes particularly interesting. In what follows, I examine the resetting of the Null-Subject Parameter (NSP) for English learners of L2 Spanish. I argue that insensitivity to the associated discourse-pragmatic constraints on the discursive distribution of overt/null subjects accounts for what would otherwise appear to be errors resulting from syntactic deficits. It is demonstrated that despite target-deviant performance, the majority must have native-like syntactic competence, given their knowledge of the Overt Pronoun Constraint (Montalbetti 1984), a principle associated with the Spanish-type setting of the NSP.

Relevance: 20.00%

Abstract:

Bloom filters are a data structure for storing data in a compressed form. They offer excellent space and time efficiency at the cost of some loss of accuracy (so-called lossy compression). This work presents a yes-no Bloom filter, which is a data structure consisting of two parts: the yes-filter, which is a standard Bloom filter, and the no-filter, which is another Bloom filter whose purpose is to represent those objects that were recognised incorrectly by the yes-filter (that is, to recognise the false positives of the yes-filter). By querying the no-filter after an object has been recognised by the yes-filter, we get a chance of rejecting it, which improves the accuracy of data recognition in comparison with a standard Bloom filter of the same total length. A further increase in accuracy is possible if one chooses the objects to include in the no-filter so that it recognises as many false positives as possible but no true positives, thus producing the most accurate yes-no Bloom filter among all yes-no Bloom filters. This paper studies how optimization techniques can be used to maximize the number of false positives recognised by the no-filter, under the constraint that it recognise no true positives. To achieve this aim, an Integer Linear Program (ILP) is proposed for the optimal selection of false positives. In practice the problem size is normally large, making the optimal solution intractable. Given the similarity of the ILP to the Multidimensional Knapsack Problem, an Approximate Dynamic Programming (ADP) model is developed, making use of a reduced ILP for the value function approximation. Numerical results show that the ADP model performs best in comparison with a number of heuristics as well as the CPLEX built-in branch-and-bound solver, and it is what we recommend for use in yes-no Bloom filters.
In the wider context of the study of lossy compression algorithms, our research is an example of how the arsenal of optimization methods can be applied to improving the accuracy of compressed data.
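The two-filter structure itself is straightforward to sketch. In the minimal version below the hash construction and filter sizes are arbitrary illustrations; deciding *which* false positives to register in the no-filter is the optimization problem the paper addresses:

```python
import hashlib

class BloomFilter:
    """Minimal standard Bloom filter over an integer bitset."""
    def __init__(self, m, k):
        self.m, self.k, self.bits = m, k, 0

    def _indexes(self, item):
        # k hash functions derived from salted SHA-256 digests.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item):
        for idx in self._indexes(item):
            self.bits |= 1 << idx

    def __contains__(self, item):
        return all(self.bits >> idx & 1 for idx in self._indexes(item))

class YesNoBloomFilter:
    """Yes-filter for the stored set, no-filter for chosen false positives."""
    def __init__(self, m_yes, m_no, k):
        self.yes = BloomFilter(m_yes, k)
        self.no = BloomFilter(m_no, k)

    def add(self, item):
        self.yes.add(item)

    def add_false_positive(self, item):
        # Only call this for known false positives of the yes-filter;
        # registering a true member here would make the filter reject it.
        self.no.add(item)

    def __contains__(self, item):
        return item in self.yes and item not in self.no

bf = YesNoBloomFilter(m_yes=256, m_no=64, k=3)
for key in ["alpha", "beta", "gamma"]:
    bf.add(key)
# If "delta" were later observed to be a false positive of the yes-filter,
# registering it in the no-filter cancels it on subsequent queries:
bf.add_false_positive("delta")
```

Because the no-filter is itself lossy, it may collide with true members; the ILP/ADP selection of false positives exists precisely to maximise cancelled false positives subject to recognising no true positives.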

Relevance: 20.00%

Abstract:

Approximate Bayesian computation (ABC) is a popular family of algorithms which perform approximate parameter inference when numerical evaluation of the likelihood function is not possible but data can be simulated from the model. They return a sample of parameter values which produce simulations close to the observed dataset. A standard approach is to reduce the simulated and observed datasets to vectors of summary statistics and accept when the difference between these is below a specified threshold. ABC can also be adapted to perform model choice. In this article, we present a new software package for R, abctools, which provides methods for tuning ABC algorithms. This includes recent dimension reduction algorithms to tune the choice of summary statistics, and coverage methods to tune the choice of threshold. We provide several illustrations of these routines on applications taken from the ABC literature.
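Both tuning problems can be illustrated in a language-agnostic way (the sketch below is Python, not the abctools R API): a regression-based reduction of several candidate summaries to one, in the spirit of semi-automatic ABC, and a threshold chosen as a quantile of the distances. All model details are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)

def summaries(data):
    # Candidate summary statistics (deliberately redundant).
    return np.array([data.mean(), data.std(), np.median(data)])

# Pilot simulations: parameters from the prior, plus their summaries.
thetas = rng.uniform(-5.0, 5.0, size=5000)
S = np.array([summaries(rng.normal(t, 1.0, size=50)) for t in thetas])

# Summary-statistic tuning: regress theta on the candidate summaries and
# use the fitted linear combination as a single low-dimensional summary.
X = np.column_stack([np.ones(len(S)), S])
beta, *_ = np.linalg.lstsq(X, thetas, rcond=None)

def reduced_summary(data):
    return beta[0] + summaries(data) @ beta[1:]

observed = rng.normal(1.5, 1.0, size=50)   # true parameter 1.5
d = np.abs(X @ beta - reduced_summary(observed))

# Threshold tuning: pick epsilon as a quantile of the distances, i.e.
# fix the acceptance rate rather than an absolute tolerance.
eps = np.quantile(d, 0.02)
post = thetas[d < eps]
print(post.mean())   # near the true value 1.5
```

abctools additionally assesses such choices with cross-validation and coverage diagnostics; the sketch only shows the mechanics being tuned.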