998 results for Solution mining.
Abstract:
Solution calorimetry offers a reproducible technique for measuring the enthalpy of solution (ΔsolH) of a solute dissolving into a solvent. The ΔsolH of two solutes, propranolol HCl and mannitol, was determined in simulated intestinal fluid (SIF) solutions designed to model the fed and fasted states within the gut, and in Hanks’ balanced salt solution (HBSS) of varying pH. The bile salt and lipid within the SIF solutions formed mixed micelles. Both solutes exhibited endothermic reactions in all solvents. The ΔsolH for propranolol HCl in the SIF solutions differed from those in the HBSS and was lower in the fed-state than the fasted-state SIF solution, revealing an interaction between propranolol and the micellar phase in both SIF solutions. In contrast, for mannitol the ΔsolH was constant in all solutions, indicating minimal interaction between mannitol and the micellar phases of the SIF solutions. In this study, solution calorimetry proved to be a simple method for measuring the enthalpy associated with the dissolution of model drugs in complex biological media such as SIF solutions. In addition, the derived power–time curves allowed the time taken for the powdered solutes to form solutions to be estimated.
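The two derived quantities the abstract mentions (ΔsolH and dissolution time) both come from the calorimeter's power–time curve. A minimal sketch of that post-processing, using an invented exponential heat-flow trace and assumed sample mass and molar mass (none of these numbers are from the study):

```python
import numpy as np

# Hypothetical power-time trace from a solution calorimeter (endothermic
# dissolution event). Heat flow sampled at 1 s intervals; the decay constant,
# sample mass and molar mass below are illustrative, not the study's data.
t = np.arange(0.0, 600.0, 1.0)              # time / s
power = 2.5 * np.exp(-t / 120.0)            # heat flow / mW

# The enthalpy change is the time integral of the power signal
# (trapezoidal rule, written out to stay NumPy-version independent).
heat_mJ = float(np.sum(0.5 * (power[1:] + power[:-1]) * np.diff(t)))

# Convert to a molar enthalpy of solution for an assumed 50 mg sample of
# propranolol HCl (molar mass ~295.8 g/mol).
mass_g, molar_mass = 0.050, 295.8
dsol_H = (heat_mJ / 1000.0) / (mass_g / molar_mass)   # J/mol

# Estimate the time to complete dissolution as the point where 99% of the
# total heat has been exchanged.
cum = np.cumsum(power) / np.sum(power)
t_end = float(t[np.searchsorted(cum, 0.99)])
print(f"dsolH = {dsol_H:.0f} J/mol, dissolution complete by about {t_end:.0f} s")
```

The same two reductions (area under the curve, and the time at which the cumulative heat plateaus) apply whatever the actual shape of the measured trace.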
Abstract:
Steep orography can cause noisy solutions and instability in models of the atmosphere. A new technique for modelling flow over orography is introduced which guarantees curl-free gradients on arbitrary grids, implying that the pressure gradient term is not a spurious source of vorticity. This mimetic property leads to better hydrostatic balance and better energy conservation in test cases using terrain-following grids. Curl-free gradients are achieved by using the covariant components of velocity over orography rather than the usual horizontal and vertical components. In addition, gravity and acoustic waves are treated implicitly without the need for mean and perturbation variables or a hydrostatic reference profile. This enables a straightforward description of the implicit treatment of gravity waves. Results are presented for a resting atmosphere over orography, for which the curl-free pressure gradient formulation is advantageous. Results of gravity waves over orography are insensitive to the placement of terrain-following layers. The model with implicit gravity waves is stable in strongly stratified conditions, with N∆t up to at least 10 (where N is the Brunt-Väisälä frequency). A warm bubble rising over orography is simulated and the curl-free pressure gradient formulation gives much more accurate results for this test case than a model without this mimetic property.
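The mimetic property the abstract relies on (a discrete gradient whose discrete curl vanishes identically) can be demonstrated on a simple structured grid: edge-based gradient components of a node-based scalar telescope to exactly zero circulation around every cell. This toy 2-D sketch illustrates that property only; it is not the paper's arbitrary-grid scheme.

```python
import numpy as np

# Scalar field (e.g. pressure) stored at the nodes of a 2-D grid.
rng = np.random.default_rng(0)
p = rng.standard_normal((6, 6))

# Edge-based discrete gradient components (differences of node values).
gx = p[1:, :] - p[:-1, :]              # along-x edges, shape (5, 6)
gy = p[:, 1:] - p[:, :-1]              # along-y edges, shape (6, 5)

# Circulation of the gradient around each cell: bottom + right - top - left.
# Every node value appears once with each sign, so the sum telescopes to zero,
# meaning the discrete gradient cannot generate spurious vorticity.
curl = gx[:, :-1] + gy[1:, :] - gx[:, 1:] - gy[:-1, :]

print(np.abs(curl).max())              # zero up to round-off
```

The abstract's contribution is obtaining this same cancellation on arbitrary grids over orography via covariant velocity components, where naive horizontal/vertical differencing would not telescope.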
Abstract:
The book is concerned with the rise of the Greek Golden Dawn. Although most literature focuses on demand and supply-side explanations, this book progresses beyond the state of the art by examining the Golden Dawn as an outlier and focusing on political culture as an explanation for its dramatic rise.
Abstract:
For its advocates, corporate social responsibility (CSR) represents a powerful tool through which business and particularly multinationals can play a more direct role in global sustainable development. For its critics, however, CSR rarely goes beyond business as usual, and is often a cover for business practices with negative implications for communities and the environment. This paper explores the relationship between CSR and sustainable development in the context of mining in Namibia. Drawing upon extant literatures on the geographies of responsibility, and referencing in-country empirical case-study research, a critical relational lens is applied to consider their interaction both historically and in the present.
Abstract:
Artisanal and small-scale mining (ASM) is an activity intimately associated with social deprivation and environmental degradation, including deforestation. This paper examines ASM and deforestation using a broadly poststructural political ecology framework. Hegemonic discourses are shown to consistently influence policy direction, particularly in emerging approaches such as Corporate Social Responsibility and the Forest Stewardship Council. A review of alternative discourses reveals that the poststructural method is useful for critiquing the international policy arena but does not inform new approaches. Synthesis of the analysis leads to conclusions that echo a growing body of literature advocating for policies to become increasingly sensitive to local contexts, synergistic between actors at different scales, and integrated across sectors.
Abstract:
This paper addresses the economics of Enhanced Landfill Mining (ELFM) both from a private point of view and from a society perspective. The private potential is assessed using a case study for which an investment model is developed to identify the impact of a broad range of parameters on the profitability of ELFM. We found that variations in Waste-to-Energy (WtE) efficiency, electricity price, CO2 price, WtE investment and operational costs, and ELFM support especially explain the variation in economic profitability measured by the Internal Rate of Return. To overcome site-specific parameters we also evaluated the regional ELFM potential for the densely populated and industrial region of Flanders (north of Belgium). The total number of potential ELFM sites was estimated using a 5-step procedure and a simulation tool was developed to trade off private costs and benefits. The analysis shows that there is a substantial economic potential for ELFM projects at the wider regional level. Furthermore, this paper also reviews the costs and benefits from a broader perspective. The carbon footprint of the case study was mapped in order to assess the project’s net impact in terms of greenhouse gas emissions. The impacts of nature restoration, soil remediation, resource scarcity and reduced import dependence were also valued so that they can be used in future social cost-benefit analysis. Given the complex trade-off between economic, social and environmental issues of ELFM projects, we conclude that further refinement of the methodological framework and the development of integrated decision tools supporting private and public actors are necessary.
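The profitability metric the abstract uses, the Internal Rate of Return, is the discount rate at which a project's net present value reaches zero. A minimal sketch with entirely invented cash flows (an upfront WtE/excavation investment followed by ten years of net revenue; none of these figures are the study's data):

```python
# IRR by bisection on the net present value, for a hypothetical ELFM project.

def npv(rate, cash_flows):
    """Net present value of yearly cash flows (year 0 first)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal Rate of Return: the discount rate at which NPV = 0."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        # Keep the half-interval in which the NPV changes sign.
        if npv(lo, cash_flows) * npv(mid, cash_flows) <= 0:
            hi = mid
        else:
            lo = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Year 0: investment in WtE capacity and excavation; years 1-10: net revenue
# from electricity sales, recovered materials and reclaimed land, minus O&M.
flows = [-100.0] + [18.0] * 10
print(f"IRR = {irr(flows):.2%}")
```

Sensitivity analysis of the kind the paper describes amounts to recomputing this IRR while perturbing the assumptions behind each cash flow (WtE efficiency, electricity price, CO2 price, support levels).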
Shared ownership and affordable housing: a political solution in search of a planning justification?
Abstract:
Traditional dictionary learning algorithms find a sparse representation of high-dimensional data by transforming samples into a one-dimensional (1D) vector. This 1D model loses the inherent spatial structure of the data. An alternative solution is to employ tensor decomposition for dictionary learning on the data's original structural form, a tensor, by learning multiple dictionaries along each mode and the corresponding sparse representation with respect to the Kronecker product of these dictionaries. To learn tensor dictionaries along each mode, all existing methods update each dictionary iteratively in an alternating manner. Although atoms from each mode dictionary jointly contribute to the sparsity of the tensor, existing works ignore correlations between atoms of different mode dictionaries by treating each mode dictionary independently. In this paper, we propose a joint multiple dictionary learning method for tensor sparse coding, which explores atom correlations for sparse representation and updates multiple atoms from each mode dictionary simultaneously. In this algorithm, the Frequent-Pattern Tree (FP-tree) mining algorithm is employed to exploit frequent atom patterns in the sparse representation. Inspired by the idea of K-SVD, we develop a new dictionary update method that jointly updates the elements in each pattern. Experimental results demonstrate that our method outperforms other tensor-based dictionary learning algorithms.
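The Kronecker structure the abstract refers to can be made concrete: a 2-D signal that is sparse with respect to per-mode dictionaries D1 and D2 satisfies X = D1 S D2ᵀ, which in vectorised form is vec(X) = (D2 ⊗ D1) vec(S). A toy numerical check of that identity (illustrating the structure only, not the paper's algorithm):

```python
import numpy as np

rng = np.random.default_rng(1)
D1 = rng.standard_normal((8, 12))      # mode-1 dictionary (8-dim, 12 atoms)
D2 = rng.standard_normal((6, 10))      # mode-2 dictionary (6-dim, 10 atoms)

S = np.zeros((12, 10))                 # sparse core: only 3 active atom pairs
S[2, 4], S[7, 1], S[11, 9] = 1.5, -2.0, 0.7

# Structured (tensor) form of the signal.
X = D1 @ S @ D2.T

# Equivalent flattened 1-D model: the effective dictionary is the Kronecker
# product of the mode dictionaries (column-major vec convention).
x_vec = np.kron(D2, D1) @ S.flatten(order="F")

print(np.allclose(X.flatten(order="F"), x_vec))   # True
```

This equivalence is why atoms from different mode dictionaries jointly determine sparsity: each nonzero in S activates one atom from every mode at once, which is the correlation the proposed joint update exploits.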
Abstract:
Classical regression methods take vectors as covariates and estimate the corresponding vectors of regression parameters. When addressing regression problems with covariates of more complex form, such as multi-dimensional arrays (i.e. tensors), traditional computational models can be severely compromised by ultrahigh dimensionality as well as complex structure. By exploiting the special structure of tensor covariates, the tensor regression model provides a promising solution for reducing the model’s dimensionality to a manageable level, thus leading to efficient estimation. Most existing tensor-based methods independently estimate each individual regression problem based on a tensor decomposition which allows simultaneous projections of an input tensor to more than one direction along each mode. In practice, multi-dimensional data are collected under the same or very similar conditions, so the data share some common latent components but can also have their own independent parameters for each regression task. Therefore, it is beneficial to analyse the regression parameters of all the regressions in a linked way. In this paper, we propose a tensor regression model based on Tucker decomposition, which simultaneously identifies not only the components of parameters common across all the regression tasks, but also the independent factors contributing to each particular regression task. Under this paradigm, the number of independent parameters along each mode is constrained by a sparsity-preserving regulariser. Linked multiway parameter analysis and sparsity modelling further reduce the total number of parameters, with lower memory cost than tensor-based counterparts. The effectiveness of the new method is demonstrated on real data sets.
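The dimensionality reduction the abstract claims comes from constraining the coefficient tensor to a low-rank Tucker form. A sketch for a matrix-valued covariate, with made-up sizes and ranks, showing the prediction ⟨W, X⟩ and the parameter-count saving (an illustration of the structure, not the paper's estimator):

```python
import numpy as np

p1, p2 = 20, 30                        # covariate dimensions
r1, r2 = 3, 2                          # Tucker ranks along each mode
rng = np.random.default_rng(2)

# Tucker-form coefficient tensor: W = U1 @ G @ U2.T. The factors U1, U2 can
# be shared across linked regression tasks, while the core G stays task-specific.
U1 = rng.standard_normal((p1, r1))     # mode-1 factor
U2 = rng.standard_normal((p2, r2))     # mode-2 factor
G = rng.standard_normal((r1, r2))      # core tensor

W = U1 @ G @ U2.T                      # implied full coefficient tensor
X = rng.standard_normal((p1, p2))      # one matrix-valued covariate
y_hat = float(np.sum(W * X))           # linear prediction <W, X>

full = p1 * p2                                   # unconstrained parameters
tucker = p1 * r1 + p2 * r2 + r1 * r2             # Tucker parameters
print(full, tucker)                              # 600 vs 126
```

The sparsity-preserving regulariser the abstract mentions would then shrink columns of U1 and U2 toward zero, pruning the per-task independent components further.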
Abstract:
Guest Editorial
Abstract:
Social networks have gained remarkable attention in the last decade. Accessing social network sites such as Twitter, Facebook, LinkedIn and Google+ through the internet and Web 2.0 technologies has become more affordable. People are becoming more interested in, and reliant on, social networks for information, news and the opinions of other users on diverse subject matters. This heavy reliance on social network sites causes them to generate massive data characterised by three computational issues, namely size, noise and dynamism. These issues often make social network data very complex to analyse manually, resulting in the pertinent use of computational means of analysing them. Data mining provides a wide range of techniques for detecting useful knowledge from massive datasets, such as trends, patterns and rules [44]. Data mining techniques are used for information retrieval, statistical modelling and machine learning. These techniques employ data pre-processing, data analysis, and data interpretation processes in the course of data analysis. This survey discusses different data mining techniques used in mining diverse aspects of social networks over the decades, going from historical techniques to up-to-date models, including our novel technique named TRCM. All the techniques covered in this survey are listed in Table 1, including the tools employed as well as the names of their authors.
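One of the pattern-detection primitives such surveys cover is frequent-itemset mining over transactions. A brute-force sketch on a handful of hypothetical hashtag sets (real systems use Apriori or FP-growth for scale; the data and the exhaustive search here are for exposition only, and this is not TRCM):

```python
from itertools import combinations

# Hypothetical tweets, each reduced to its set of hashtags.
tweets = [
    {"election", "policy", "debate"},
    {"election", "policy"},
    {"election", "debate"},
    {"sports", "policy"},
    {"election", "policy", "sports"},
]

def frequent_itemsets(transactions, min_support):
    """Return every itemset occurring in at least min_support transactions."""
    items = sorted(set().union(*transactions))
    result = {}
    for k in range(1, len(items) + 1):
        for candidate in combinations(items, k):
            support = sum(set(candidate) <= t for t in transactions)
            if support >= min_support:
                result[candidate] = support
    return result

patterns = frequent_itemsets(tweets, min_support=3)
print(patterns[("election", "policy")])   # the pair co-occurs in 3 tweets
```

The size, noise and dynamism issues the survey highlights are exactly what breaks this naive version: the candidate space grows exponentially in the number of items, motivating the specialised algorithms the survey reviews.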
Abstract:
There is something peculiar about aesthetic testimony. It seems more difficult to gain knowledge of aesthetic properties based solely upon testimony than it is in the case of other types of property. In this paper, I argue that we can provide an adequate explanation at the level of the semantics of aesthetic language, without defending any substantive thesis in epistemology or about aesthetic value/judgement. If aesthetic predicates are given a non-invariantist semantics, we can explain the supposed peculiar difficulty with aesthetic testimony.
Abstract:
Re-establishing nutrient cycling is often a key goal of mine-site restoration. This goal can be achieved by applying fertilisers (particularly P) in combination with seeding N-fixing legumes. However, the effect of this strategy on other key restoration goals, such as the establishment and growth of non-leguminous species, has received little attention. We investigated the effects of P-application rates, either singly or in combination with seeding seven large understorey legume species, on jarrah forest restoration after bauxite mining. Five years after P application and seeding, legume species richness, density and cover were higher in the legume-seeded treatment. However, the increased establishment of legumes did not lead to increased soil N. Increasing P-application rates from 0 to 80 kg P ha−1 did not affect legume species richness, but significantly reduced legume density and increased legume cover: cover was maximal (∼50%) where 80 kg P ha−1 had been applied with large legume seeds. Increasing P application had no effect on the species richness of non-legume species, but increased the density of weeds and native ephemerals. Cover of non-legume species decreased with increasing P-application rates and was lower in plots where large legumes had been seeded compared with non-seeded plots. There was a significant legume × P interaction on weed and ephemeral density: at 80 kg P ha−1 the decline in density of these groups was greatest where legumes were seeded. In addition, the decline in cover for non-legume species with increasing P was greatest when legumes were seeded. Applying 20 kg P ha−1 significantly increased tree growth compared with tree growth in unfertilised plots, but growth was not increased further at 80 kg P ha−1, and tree growth was not affected by seeding large legumes.
Taken together, these data indicate that 80 kg P ha−1 in combination with seeding large legumes maximised vegetation cover at five years but could be suboptimal for re-establishing a jarrah forest community that, like unmined forest, contains a diverse community of slow-growing re-sprouter species. The species richness and cover of non-legume understorey species, especially the resprouters, were highest in plots that received either 0 or 20 kg P ha−1 and where large legumes had not been seeded. Therefore, our findings suggest that moderating P-fertiliser application and legume seeding could be the best strategy for fulfilling the multiple restoration goals of establishing vegetation cover while at the same time maximising tree growth and the species richness of the restored forest.