Abstract:
Realizing scalable performance on high-performance computing systems is not straightforward for single-phenomenon codes (such as computational fluid dynamics [CFD]). The task is magnified considerably when the target software involves interactions among a range of phenomena that have distinctive solution procedures employing different discretization methods. Addressing the key issues of retaining data integrity and preserving the ordering of the calculation procedures is a significant problem. A strategy for parallelizing this multiphysics family of codes is described for software that exploits finite-volume discretization methods on unstructured meshes with iterative solution procedures. A mesh partitioning-based SPMD approach is used. However, because different variables use distinct discretization schemes, distinct partitions are required; techniques for addressing this issue using the mesh-partitioning tool JOSTLE are described. In this contribution, the strategy is tested for a variety of test cases under a wide range of conditions (e.g., problem size, number of processors, asynchronous/synchronous communications) using several strategies for mapping the mesh partition onto the processor topology.
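As a rough illustration of the halo (overlap) exchange that a mesh-partitioned SPMD solver of this kind performs on every iteration, a minimal sketch follows; the one-dimensional chain of partitions, the array names, and the use of mpi4py are illustrative assumptions, not details taken from the paper.

# Hypothetical sketch: halo exchange along a 1-D chain of mesh partitions (SPMD).
# Partition layout, variable names and mpi4py usage are illustrative only.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

n_owned = 1000                       # cells owned by this partition
phi = np.zeros(n_owned + 2)          # one halo cell at each end

for _ in range(10):                  # outer iterative solution loop
    left, right = rank - 1, rank + 1
    if left >= 0:                    # swap halo values with the left neighbour
        comm.Sendrecv(sendbuf=phi[1:2], dest=left,
                      recvbuf=phi[0:1], source=left)
    if right < size:                 # swap halo values with the right neighbour
        comm.Sendrecv(sendbuf=phi[-2:-1], dest=right,
                      recvbuf=phi[-1:], source=right)
    # local update sweep over owned cells using the refreshed halo values
    phi[1:-1] = 0.5 * (phi[:-2] + phi[2:])

The same pattern generalizes to unstructured partitions, where the halo lists come from the mesh-partitioning tool rather than from neighbouring ranks in a chain.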
Abstract:
Parallel computing is now widely used in numerical simulation, particularly for application codes based on finite difference and finite element methods. A popular and successful technique for parallelizing such codes on large distributed-memory systems is to partition the mesh into sub-domains that are then allocated to processors. The code then executes in parallel, using the SPMD methodology with message passing for inter-processor interactions. To improve the parallel efficiency of an imbalanced structured-mesh CFD code, a new dynamic load balancing (DLB) strategy has been developed in which the processor partition range limits of just one of the partitioned dimensions are allowed to be non-coincidental rather than coincidental. This ‘local’ change of partition limits allows greater flexibility in obtaining a balanced load distribution, as the workload increase or decrease on a processor is no longer restricted by a ‘global’ (coincidental) limit change. The automatic implementation of this generic DLB strategy within an existing parallel code is presented in this chapter, along with some preliminary results.
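A minimal sketch of the underlying idea of recomputing partition range limits in one dimension from a measured per-plane workload is given below; the cost model, function name, and proportional-splitting rule are illustrative assumptions. Note that the sketch shows only global (coincidental) rebalancing in one dimension, whereas the chapter's contribution is precisely to let those limits differ locally between processors.

# Hypothetical sketch: recompute 1-D partition range limits so that each
# processor receives a near-equal share of the measured workload.  The cost
# model and splitting rule are illustrative, not the chapter's DLB algorithm.
import numpy as np

def rebalance_limits(plane_cost, n_procs):
    """Return index limits [0, l1, ..., n] along one partitioned dimension."""
    cumulative = np.cumsum(plane_cost)
    targets = cumulative[-1] * np.arange(1, n_procs) / n_procs
    cuts = np.searchsorted(cumulative, targets)
    return [0, *cuts.tolist(), len(plane_cost)]

# e.g. 100 mesh planes whose cost has grown on one side of the domain
print(rebalance_limits(np.linspace(1.0, 3.0, 100), n_procs=4))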
Abstract:
In the past decade, finite volume (FV) methods have increasingly been used for the solution of solid mechanics problems. This contribution describes a cell vertex finite volume discretisation approach to the solution of geometrically nonlinear (GNL) problems. These problems, which may well have linear material properties, are subject to large deformation. This requires a distinct formulation, which is described in this paper together with the solution strategy for GNL problems. The competitive performance of this procedure against the conventional finite element (FE) formulation is illustrated for a three-dimensional axially loaded column.
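For orientation, one standard set of large-deformation kinematic measures on which geometrically nonlinear formulations are commonly built is sketched below; the specific strain measures and stress conjugates adopted in the paper may differ.

% Common large-deformation kinematics: deformation gradient and
% Green-Lagrange strain, with a linear (St. Venant-Kirchhoff) material law.
\mathbf{F} = \frac{\partial \mathbf{x}}{\partial \mathbf{X}}, \qquad
\mathbf{E} = \tfrac{1}{2}\left(\mathbf{F}^{\mathsf{T}}\mathbf{F} - \mathbf{I}\right), \qquad
\mathbf{S} = \mathbb{C} : \mathbf{E}.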
Abstract:
The first phase in the design, development, and implementation of a comprehensive computational model of a copper stockpile leach process is presented. The model accounts for transport phenomena through the stockpile, reaction kinetics for the important mineral species, oxygen and bacterial effects on the leach reactions, plus heat, energy, and acid balances for the overall leach process. The paper describes the formulation of the leach process model and its implementation in PHYSICA+, a computational fluid dynamics (CFD) software environment. The model draws on a number of phenomena to represent the competing physical and chemical features active in the process. These phenomena are essentially represented by a three-phase (solid-liquid-gas) multi-component transport system; novel algorithms and procedures are required to solve the model equations, including a methodology for dealing with multiple chemical species with different reaction rates in ore represented by multiple particle size fractions. Some initial validation results and application simulations are shown to illustrate the potential of the model.
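As a sketch of how a bulk leach source term might be assembled by summing contributions over particle size fractions, consider the following; the surface-area-controlled rate law, the names, and the numbers are illustrative assumptions rather than the kinetics used in the stockpile model.

# Hypothetical sketch: assemble a bulk leach rate by summing a surface-area-
# controlled contribution over particle size fractions.  Rate law, names and
# values are illustrative only, not the kinetics of the stockpile model.
def bulk_leach_rate(mass_fraction, radius, k_surface, acid_conc):
    """Sum rate contributions (per unit ore mass) over size fractions."""
    total = 0.0
    for w, r in zip(mass_fraction, radius):
        specific_area = 3.0 / r                  # sphere area/volume ratio, 1/m
        total += w * k_surface * specific_area * acid_conc
    return total

print(bulk_leach_rate(mass_fraction=[0.5, 0.3, 0.2],
                      radius=[0.005, 0.02, 0.05],   # particle radii in metres
                      k_surface=1.0e-6, acid_conc=10.0))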
Abstract:
The purpose of the present study was to use attenuated total reflectance-Fourier transform infrared spectroscopy (ATR-FTIR) and target factor analysis (TFA) to investigate the permeation of model drugs and formulation components through Carbosil® membrane and human skin. Diffusion studies of saturated solutions of methyl paraben (MP), ibuprofen (IBU), and caffeine (CF) in 50:50 water/ethanol were performed on Carbosil® membrane. The spectroscopic data were analysed by target factor analysis, and evolution profiles of the signal for each component (i.e. the drug, water, ethanol, and membrane) over time were obtained. The data were successfully deconvoluted, as correlations between factors from the data and reference spectra of the components were above 0.8 in all cases, and good reproducibility of the evolution profiles was obtained over three runs. From the evolution profiles it was observed that water diffused through the Carbosil® membrane better than ethanol, confirming the hydrophilic properties of the membrane used. IBU diffused more slowly than MP and CF. The evolution profile of CF was very similar to that of water, probably because of the high solubility of CF in water, indicating that the two compounds were diffusing concurrently. The second part of the work studied the evolution profiles of the components of a commercial topical gel containing 5% (w/w) ibuprofen as it permeated through human skin. Although this system was much more complex, the data were still successfully deconvoluted and the different components of the formulation identified, except for benzyl alcohol; this exception might be attributed to the low concentrations of benzyl alcohol used in topical formulations.
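The target-testing step at the heart of TFA can be sketched as projecting a reference spectrum onto the leading abstract factors of the data matrix and checking how faithfully it is reproduced; the array shapes, function name, and acceptance threshold below are illustrative assumptions, not the study's actual processing.

# Hypothetical sketch of target testing in target factor analysis: project a
# reference spectrum onto the leading factors of the data matrix and measure
# the correlation with its reproduction.  Shapes and threshold are illustrative.
import numpy as np

def target_test(D, target, n_factors):
    """D: (n_spectra, n_wavenumbers) time-resolved data; target: reference spectrum."""
    _, _, Vt = np.linalg.svd(D, full_matrices=False)
    V = Vt[:n_factors].T                 # abstract spectral factors
    reproduced = V @ (V.T @ target)      # least-squares projection onto factor space
    corr = np.corrcoef(reproduced, target)[0, 1]
    return reproduced, corr

rng = np.random.default_rng(0)
D = rng.random((50, 400))                # 50 spectra recorded over time
reference = D.mean(axis=0)               # stand-in for a measured reference spectrum
_, r = target_test(D, reference, n_factors=4)
print(f"correlation between reference and its reproduction: {r:.2f}")  # e.g. accept above 0.8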
Abstract:
This article examines the concepts, definitions, policies, and practices of heritage in a contemporary context. Within recent years, there have been significant shifts in our understandings and applications of heritage concepts and policies in the modern world. ‘Heritage’ emerged as a buzz word in international policy arenas in the 1980s and early 1990s, and has since weathered the vagaries of turbulent definitional and governance–nomenclature storms, as traditional debates about ‘what it is and what it is not’ reverberate around academia and state agencies alike. Policy and funding structures for heritage are determined by the classifications used to define them in various countries. Typically, reference is made to ‘built heritage’, ‘natural heritage’, and ‘intangible heritage’, loosely reflecting buildings, landscapes, and culture. Aspects of heritage are used by the cultural and tourism industries to add economic value, through heritage tourism sites, museums, and other activities. The cultural tourism product is often anchored around notions of heritage, and in postmodern, post-tourist societies, boundaries between culture, (travel) space, and identities are increasingly blurred. Issues of authenticity become important in the representation of heritage, and questions are asked about the validity of nostalgia versus realism. The role of heritage is examined in the context of identity formulation at individual and nation-state levels, and the political aspects of this are also discussed. Finally, heritage conservation is assessed through an examination of UNESCO’s World Heritage Site listing and protection strategy. In a changing world, new constructs of heritage, identity, authenticity, and representation will continue to emerge as meanings are constantly renegotiated over time and space.
Abstract:
For structural health monitoring, it is impractical to identify a large structure with complete measurement because of the limited number of sensors and the difficulty of field instrumentation. Furthermore, it is not desirable to identify a large number of unknown parameters in a full system because of numerical difficulty in convergence. A novel substructural strategy is presented for the identification of stiffness matrices and damage assessment with incomplete measurement. The substructural approach is employed to identify large systems in a divide-and-conquer manner. In addition, the concept of model condensation is invoked to avoid the need for complete measurement, and the recovery process to obtain the full set of parameters is formulated. The efficiency of the proposed method is demonstrated numerically on multi-storey shear buildings subjected to random force. A fairly large structural system with 50 DOFs was identified with good results, taking into consideration the effects of noisy signals and the limited number of sensors. Two variations of the method were applied, depending on whether the sensors could be repositioned. The proposed strategy was further substantiated experimentally using an eight-storey steel plane frame model subjected to shaker and impulse hammer excitations. Both numerical and experimental results show that the proposed substructural strategy gives reasonably accurate identification in terms of locating and quantifying structural damage.
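One common form of model condensation, static (Guyan) condensation, is sketched below purely to illustrate the idea of eliminating unmeasured degrees of freedom; the condensation scheme actually used in the paper may differ.

% Static (Guyan) condensation onto the measured ("master", m) DOFs,
% eliminating the unmeasured ("slave", s) DOFs; shown for illustration only.
\mathbf{K} =
\begin{bmatrix}
\mathbf{K}_{mm} & \mathbf{K}_{ms} \\
\mathbf{K}_{sm} & \mathbf{K}_{ss}
\end{bmatrix}
\;\longrightarrow\;
\mathbf{K}_{R} = \mathbf{K}_{mm} - \mathbf{K}_{ms}\,\mathbf{K}_{ss}^{-1}\,\mathbf{K}_{sm}.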
Abstract:
An innovative methodology has been used for the formulation development of Cyclosporine A (CyA) nanoparticles. In the present study, the static mixer technique, a novel method for producing nanoparticles, was employed. The formulation optimum was calculated by the modified Shepard's method (MSM), an advanced data analysis technique not previously adopted in pharmaceutical applications. Controlled precipitation was achieved by rapidly injecting the organic CyA solution into an aqueous protective solution by means of a static mixer. Furthermore, the computer-based MSM was implemented for data analysis, visualization, and application development. For the optimization studies, the gelatin/lipoid S75 amounts and the organic/aqueous phase were selected as independent variables, with the obtained particle size as the dependent variable. The predicted optimum formulation was characterized by cryo-TEM microscopy, particle size measurements, stability, and in vitro release. The produced nanoparticles contain the drug in an amorphous state and decreased amounts of stabilizing agents. The dissolution rate of the lyophilized powder was significantly enhanced in the first 2 h. MSM proved capable of interpreting the data in detail and of predicting the optimum formulation with high accuracy. The static mixer technique proved capable of producing CyA nanoparticulate formulations.
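For reference, basic Shepard (inverse-distance-weighted) interpolation of a response such as particle size over two formulation variables is sketched below; the modified Shepard's method used in the study augments this with locally supported weights and nodal functions, which are omitted here, and the data points shown are purely illustrative.

# Hypothetical sketch: basic Shepard (inverse-distance-weighted) interpolation
# of particle size over two formulation variables.  The modified Shepard's
# method adds local weights and nodal functions, omitted here; data are made up.
import numpy as np

def shepard(x_query, x_data, y_data, p=2.0):
    d = np.linalg.norm(x_data - x_query, axis=1)
    if np.any(d == 0.0):                    # query coincides with a data point
        return float(y_data[np.argmin(d)])
    w = 1.0 / d**p                          # inverse-distance weights
    return float(np.sum(w * y_data) / np.sum(w))

# particle size (nm) measured at three (gelatin amount, lipoid S75 amount) points
X = np.array([[1.0, 0.5], [2.0, 1.0], [3.0, 0.2]])
y = np.array([180.0, 150.0, 210.0])
print(shepard(np.array([2.0, 0.6]), X, y))  # interpolated size at a new composition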
Abstract:
The links between fuel poverty and poor health are well documented, yet there is no statutory requirement on local authorities to develop fuel poverty strategies, which consequently tend to be patchy nationally and to differ substantially in quality. Fuel poverty is framed from the perspective of income, even though interventions can improve health. The current public health agenda calls for more partnership-based, cost-effective strategies founded on sound evidence. Fuel poverty represents a key area in which there is currently little local evidence quantifying and qualifying the health gain arising from strategic interventions. As a result, this initial study sought to apply the principles of a health impact assessment to Luton’s Affordable Warmth Strategy, exploring the potential to identify the health impact arising – as a baseline for future research – in the context of the public health agenda. A national strategy would help ensure the promotion of targeted fuel poverty strategies.
Abstract:
EXECUTIVE SUMMARY
Aims
1. The aims of this strategy are:
• to ensure that a full range of education and training related to the adult end of life care pathway is available across South East London to meet the needs of our health and social care workforce;
• to enable those responsible for end of life care education and training commissioning to procure comprehensively from a full range of education providers in a systematic and strategic manner.
Background
2. The work that underpins this strategy was begun by the South East London Cancer Network via its Palliative and End of Life Care Coordinating Group and then developed by way of the Marie Curie Delivering Choice Programme’s Education and Training work stream.
Abstract:
In the biological sciences, stereological techniques are frequently used to infer changes in structural parameters (volume fraction, for example) between samples from different populations or subject to differing treatment regimes. Non-homogeneity of these parameters is virtually guaranteed, both between experimental animals and within the organ under consideration. A two-stage strategy is then desirable: the first stage involves unbiased estimation of the required parameter, separately for each experimental unit, the latter being defined as a subset of the organ for which homogeneity can reasonably be assumed. In the second stage, these point estimates are used as data inputs to a hierarchical analysis of variance, to distinguish treatment effects from variability between animals, for example. Techniques are therefore required for unbiased estimation of parameters from potentially small numbers of sample profiles. This paper derives unbiased estimates of linear properties in one special case—the sampling of spherical particles by transmission microscopy, when the section thickness is not negligible and the resulting circular profiles are subject to lower truncation. The derivation uses the general integral equation formulation of Nicholson (1970); the resulting formulae are simplified algebraically, and their efficient computation is discussed. Bias arising from variability in slice thickness is shown to be negligible in typical cases. The strategy is illustrated using data on the effects of exposure of the common mussel to hydrocarbons on the secondary lysosomes in its digestive cells. Prolonged exposure, at 30 μg l⁻¹ total oil-derived hydrocarbons, is seen to increase the average volume of a lysosome and the volume fraction that lysosomes occupy, but to reduce their number.
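For orientation, the classical special case of this family of integral equations (zero section thickness and no truncation) is Wicksell's relation between the sphere-radius and profile-radius densities, reproduced below; Nicholson's (1970) formulation, as used in the paper, generalizes it to finite slice thickness and lower-truncated profiles.

% Classical Wicksell relation (zero section thickness, no truncation):
% N_A(r) is the number of circular profiles per unit area with radius r,
% N_V(R) the number of spheres per unit volume with radius R.
N_A(r) \;=\; 2r \int_{r}^{\infty} \frac{N_V(R)}{\sqrt{R^{2}-r^{2}}}\,\mathrm{d}R .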