Abstract:
This study investigates the influence of process parameters on the fluidised hot melt granulation of lactose and PEG 6000, and on the subsequent tablet pressing of the granules. Granulation experiments were performed to assess the effect of granulation time and binder content of the feed on the resulting granule properties: mass mean granule size, size distribution, granule fracture stress, and granule porosity. These data were correlated using the granule growth regime model. The dominant granule growth mechanisms in this melt granulation system were found to be nucleation followed by steady growth at PEG contents of 10–20% w/w. At binder contents greater than 20% w/w, however, the granulation moved into the "over-wet massing" regime, in which discrete granule formation could not be obtained. The granules produced in the melt fluidised bed process were subsequently pressed into tablets using an industrial tablet press. The physical properties of the tablets (fracture stress, disintegration time, and friability) were assessed against industry standards. These analyses indicated that the particle size and binder content of the initial granules influenced the mechanical properties of the tablets; in particular, a decrease in initial granule size resulted in an increase in the fracture stress of the tablets formed.
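For concreteness, the following is a minimal sketch of how a mass mean granule size can be computed from sieve-analysis data of the kind reported in this abstract. It is not the authors' analysis code, and the sieve apertures and retained masses are hypothetical example values.

```python
# Illustrative sketch (hypothetical data, not the authors' code):
# mass mean granule size from a sieve analysis.

def mass_mean_size(sieve_apertures_um, mass_retained_g):
    """Mass mean diameter = sum over size classes of
    (mass fraction in class) * (mid-point diameter of class).

    sieve_apertures_um: aperture sizes in descending order (n values)
    mass_retained_g: mass caught between adjacent sieves (n - 1 values)
    """
    # mid-point diameter of each size class between adjacent sieves
    midpoints = [(a + b) / 2.0
                 for a, b in zip(sieve_apertures_um, sieve_apertures_um[1:])]
    total = sum(mass_retained_g)
    return sum(m / total * d for m, d in zip(mass_retained_g, midpoints))

# Example: a 1000/710/500/355/250 um sieve stack (hypothetical masses)
apertures = [1000, 710, 500, 355, 250]
masses = [12.0, 25.0, 30.0, 8.0]
print(f"mass mean size: {mass_mean_size(apertures, masses):.0f} um")
```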
Abstract:
This paper evaluates how long-term records could and should be used in conservation policy and practice. Traditionally, long-term ecological records (those spanning more than 50 years) have seen extremely limited use in biodiversity conservation. There are several reasons why such records tend to be discounted, including a perception of poor temporal and spatial resolution and the inaccessibility of long temporal records to non-specialists. Probably more important, however, is the perception that even where suitable temporal records are available, their role is purely descriptive, simply demonstrating what has occurred before in Earth's history, and of little use in the actual practice of conservation. This paper asks why this is the case and whether there is a place for the temporal record in conservation management. Key conservation initiatives related to extinctions, the identification of regions of greatest diversity/threat, climate change, and biological invasions are addressed. Examples are highlighted of how a temporal record can add information of direct practical applicability to these issues, including (i) the identification of species at the end of their evolutionary lifespan and therefore most at risk of extinction, (ii) the setting of realistic goals and targets for conservation 'hotspots', and (iii) the identification of management tools for the maintenance or restoration of a desired biological state. For climate-change conservation strategies, the use of long-term ecological records to test the predictive power of species envelope models is highlighted, along with the potential of fossil records for examining the impact of sea-level rise. It is also argued that a long-term perspective is essential for the management of biological invasions, not least in determining when an invasive is not an invasive. The paper concludes that inclusion of a long-term ecological perspective can often provide a more scientifically defensible basis for conservation decisions than one based only on contemporary records. The pivotal issue of this paper is not whether long-term records are of interest to conservation biologists, but how they can actually be used in conservation practice and policy.
Abstract:
The divide-and-conquer approach of local model (LM) networks is a common engineering approach to the identification of complex nonlinear dynamical systems. The global representation is obtained as the weighted sum of locally valid, simpler sub-models defined over small regions of the operating space. Constructing such networks requires determining an appropriate partitioning of the operating space and the parameters of the LMs. This paper focuses on the structural aspect of LM networks, comparing the computational requirements and modelling performance of the Johansen and Foss (J&F) and LOLIMOT tree-construction algorithms. Several useful and important modifications to each algorithm are proposed. Modelling performance is evaluated using real data from a pilot plant of a pH neutralization process. The results show that while J&F achieves a more accurate nonlinear representation of the pH process, LOLIMOT requires significantly less computational effort.
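To illustrate the LM-network idea described above, here is a minimal sketch of a global prediction formed as a weighted sum of locally valid linear models with normalised Gaussian validity functions (in the spirit of LOLIMOT-style networks). This is not the J&F or LOLIMOT implementation itself; the centres, widths, and local parameters below are hypothetical.

```python
# Minimal local-model-network sketch (hypothetical parameters).
import numpy as np

def lm_network(x, centres, widths, local_params):
    """x: (d,) operating point; centres, widths: (M, d);
    local_params: (M, d + 1) = offset plus linear coefficients
    for each of the M local models."""
    # unnormalised Gaussian validity of each local model at x
    act = np.exp(-0.5 * np.sum(((x - centres) / widths) ** 2, axis=1))
    phi = act / act.sum()                        # normalised weights, sum to 1
    y_local = local_params[:, 0] + local_params[:, 1:] @ x  # local linear outputs
    return phi @ y_local                         # weighted sum -> global output

# Two local models partitioning a 1-D operating space (hypothetical values)
centres = np.array([[0.0], [1.0]])
widths = np.array([[0.4], [0.4]])
local_params = np.array([[0.0, 1.0],    # y ~ x near the first region
                         [1.0, -0.5]])  # y ~ 1 - 0.5x near the second
print(lm_network(np.array([0.5]), centres, widths, local_params))
```

Tree-construction algorithms such as J&F and LOLIMOT determine the number of local models and their regions of validity by recursively splitting the operating space; the sketch above shows only how a fixed partition produces the global output.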
Abstract:
This paper engages with contemporary discussions of the commodification of policing and security. It suggests that the existing literature on these trends has been geared primarily towards commercial security providers and has failed to address the processes by which public policing models are commodified and marketed both within, and through, the transnational policing community. Drawing upon evidence from the police change process in Northern Ireland, we argue that a Northern Irish Policing Model (NIPM) has emerged in the aftermath of the Independent Commission on Policing (ICP) reforms and is increasingly branded and promoted on the global stage. Furthermore, we suggest that the NIPM is not monolithic but segmented, targeted towards a number of different 'consumers' both domestically and transnationally. Reflecting these diverse markets, the NIPM draws upon two seemingly incongruous constituent elements: the 'best practice' lessons of policing transition, as embodied in the ICP reforms, and the legacy of counter-terrorism expertise drawn from the preceding decades of conflict. The discussion concludes by asking which of these components of the NIPM is in the ascendancy.
Abstract:
Self-compacting concrete (SCC) flows into place and around obstructions under its own weight to fill the formwork completely and compact itself without segregation or blocking. Eliminating the need for mechanical compaction leads to better-quality concrete and substantially improved working conditions. This investigation aimed to demonstrate the applicability of genetic programming (GP) for modelling and formulating the fresh and hardened properties of SCC containing pulverised fuel ash (PFA), based on experimental data. Twenty-six mixes were made with water-to-binder ratios (W/B) of 0.38–0.72, cement contents of 183–317 kg/m³, PFA contents of 29–261 kg/m³, and superplasticizer dosages of 0–1% by mass of powder. The SCC properties modelled by GP were the slump flow, the J-Ring combined with the Orimet, the J-Ring combined with the cone, and the compressive strength at 7, 28 and 90 days. The GP models were constructed from training and testing data sets drawn from the experimental results obtained in this study. Predictions of the GP models were compared with the experimental results and found to be accurate. GP showed strong potential as a feasible tool for modelling the fresh properties and compressive strength of SCC containing PFA, producing analytical predictions of these properties as functions of the mix ingredients. The results showed that the GP models not only accurately predict the slump flow, the J-Ring combined with the Orimet, the J-Ring combined with the cone, and the compressive strengths used in the training process, but also effectively predict these properties for new mixes designed within the practical range of mix ingredients.
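To make the symbolic-regression idea behind GP concrete, the following is a minimal, hypothetical sketch of evolving an expression over mix ingredients to fit a measured property. The function set, terminal names (wb, cement, pfa, sp), and data pairs are invented, and a full GP run would evolve the population with crossover and mutation; this sketch only keeps the best of many random trees.

```python
# Hypothetical symbolic-regression sketch, not the authors' GP setup.
import random, operator

OPS = [(operator.add, '+'), (operator.sub, '-'), (operator.mul, '*')]
VARS = ['wb', 'cement', 'pfa', 'sp']  # mix ingredients as terminals

def random_tree(depth=3):
    """Random expression tree: leaves are variables or constants."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(VARS + [round(random.uniform(-1, 1), 2)])
    op = random.choice(OPS)
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, env):
    """Recursively evaluate a tree for one mix (env maps variable -> value)."""
    if isinstance(tree, tuple):
        (fn, _), left, right = tree
        return fn(evaluate(left, env), evaluate(right, env))
    return env[tree] if isinstance(tree, str) else tree

def mse(tree, data):
    """Fitness: mean squared error against measured values."""
    return sum((evaluate(tree, env) - y) ** 2 for env, y in data) / len(data)

# Invented training pairs: scaled mix proportions -> scaled slump flow
data = [({'wb': 0.38, 'cement': 0.30, 'pfa': 0.26, 'sp': 0.01}, 0.65),
        ({'wb': 0.72, 'cement': 0.18, 'pfa': 0.05, 'sp': 0.00}, 0.50)]

# Crude search: best of many random trees (GP would also apply
# crossover and mutation between generations).
best = min((random_tree() for _ in range(500)), key=lambda t: mse(t, data))
print('best-of-random MSE:', mse(best, data))
```

The appeal of GP for this application, as the abstract notes, is that the evolved tree is itself an analytical formula in the mix ingredients, unlike a black-box regression model.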