873 results for Requirements engineering process
Abstract:
The biological reactions during the settling and decant periods of Sequencing Batch Reactors (SBRs) are generally ignored as they are not easily measured or described by modelling approaches. However, important processes are taking place, and in particular when the influent is fed into the bottom of the reactor at the same time (one of the main features of the UniFed process), the inclusion of these stages is crucial for accurate process predictions. Due to the vertical stratification of both liquid and solid components, a one-dimensional hydraulic model is combined with a modified ASM2d biological model to allow the prediction of settling velocity, sludge concentration, soluble components and biological processes during the non-mixed periods of the SBR. The model is calibrated on a full-scale UniFed SBR system with tracer breakthrough tests, depth profiles of particulate and soluble compounds and measurements of the key components during the mixed aerobic period. This model is then validated against results from an independent experimental period with considerably different operating parameters. In both cases, the model is able to accurately predict the stratification and most of the biological reactions occurring in the sludge blanket and the supernatant during the non-mixed periods. Together with a correct description of the mixed aerobic period, a good prediction of the overall SBR performance can be achieved.
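The non-mixed periods described above hinge on a one-dimensional description of hindered settling. As a minimal sketch of that idea (not the calibrated UniFed model), the following Python fragment evolves solids concentrations in a stack of layers using a Vesilind-type settling velocity with illustrative parameter values.

import numpy as np

# Minimal 1-D layered settling sketch (illustrative parameters, not the
# calibrated UniFed model): Vesilind settling velocity and a flux-limited
# explicit update of layer solids concentrations during an unmixed period.

def vesilind(X, v0=6.0, rh=0.4):
    # v0 [m/h], rh [m3/kg], X [kg/m3]; returns settling velocity [m/h]
    return v0 * np.exp(-rh * X)

def settle(X, dz=0.2, dt=0.001, t_end=1.0):
    # X: solids concentration per layer, ordered top (index 0) to bottom
    X = X.astype(float).copy()
    for _ in range(int(t_end / dt)):
        flux = vesilind(X) * X                       # downward solids flux [kg/m2/h]
        flux_down = np.minimum(flux[:-1], flux[1:])  # limited by the receiving layer
        dX = np.zeros_like(X)
        dX[:-1] -= flux_down / dz                    # solids leaving each layer
        dX[1:] += flux_down / dz                     # solids entering the layer below
        X += dt * dX
    return X

# ten layers, initially a uniform mixed-liquor concentration of 4 kg/m3
profile = settle(np.full(10, 4.0))
print(np.round(profile, 2))   # supernatant clears at the top, a blanket forms below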
Abstract:
Accurate habitat mapping is critical to landscape ecological studies such as those required for developing and testing Montreal Process indicator 1.1e, fragmentation of forest types. This task poses a major challenge to remote sensing, especially in mixed-species, variable-age forests such as the dry eucalypt forests of subtropical eastern Australia. In this paper, we apply an innovative approach that uses a small section of one-metre resolution airborne data to calibrate a moderate spatial resolution model (30 m resolution; scale 1:50 000) based on Landsat Thematic Mapper data to estimate canopy structural properties in St Marys State Forest, near Maryborough, south-eastern Queensland. The approach applies an image-processing model that assumes each image pixel is significantly larger than individual tree crowns and gaps to estimate crown-cover percentage, stem density and mean crown diameter. These parameters were classified into three discrete habitat classes to match the ecology of four exudivorous arboreal species (yellow-bellied glider Petaurus australis, sugar glider P. breviceps, squirrel glider P. norfolcensis, and feathertail glider Acrobates pygmaeus), and one folivorous arboreal marsupial, the greater glider Petauroides volans. These species were targeted due to their known ecological preference for old trees with hollows, and differences in their home range requirements. The overall mapping accuracy, visually assessed against transects (n = 93) interpreted from a digital orthophoto and validated in the field, was 79% (KHAT statistic = 0.72). The KHAT statistic serves as an indicator of the extent to which the percentage correct values of the error matrix are due to 'true' agreement versus 'chance' agreement. This means that we are able to reliably report on the effect of habitat loss on target species, especially those with a large home range size (e.g. yellow-bellied glider). However, the classified habitat map failed to accurately capture the spatial patterning (e.g. patch size and shape) of stands with a trace or sub-dominance of senescent trees. This outcome makes the reporting of the effects of habitat fragmentation more problematic, especially for species with a small home range size (e.g. feathertail glider). With further model refinement and validation, however, this moderate-resolution approach offers an important, cost-effective advancement in mapping the age of dry eucalypt forests in the region.
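The KHAT statistic reported above is Cohen's kappa computed from the classification error matrix. A minimal sketch of that calculation is given below; the 3x3 matrix is a made-up example, not the error matrix from this study.

import numpy as np

# Overall accuracy and KHAT (Cohen's kappa) from an error (confusion) matrix.
# The 3-class matrix below is a made-up example, not the study's data.
error_matrix = np.array([[30, 3, 1],
                         [4, 25, 2],
                         [2, 3, 23]])

n = error_matrix.sum()
p_observed = np.trace(error_matrix) / n            # proportion correct
row_tot = error_matrix.sum(axis=1)
col_tot = error_matrix.sum(axis=0)
p_chance = (row_tot * col_tot).sum() / n**2        # agreement expected by chance

kappa = (p_observed - p_chance) / (1 - p_chance)   # 'true' vs 'chance' agreement
print(f"overall accuracy = {p_observed:.2f}, KHAT = {kappa:.2f}")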
Abstract:
In this paper we use sensor-annotated abstraction hierarchies (Reising & Sanderson, 1996, 2002a,b) to show that unless appropriately instrumented, configural displays designed according to the principles of ecological interface design (EID) might be vulnerable to misinterpretation when sensors become unreliable or are unavailable. Building on foundations established in Reising and Sanderson (2002a), we use a pasteurization process control example to show how sensor-annotated AHs help the analyst determine the impact of different instrumentation engineering policies on a configural display that is part of an ecological interface. Our analyses suggest that configural displays showing higher-order properties of a system are especially vulnerable under some conservative instrumentation configurations. However, sensor-annotated AHs can be used to indicate where corrective instrumentation might be placed. We argue that if EID is to be effectively employed in the design of displays for complex systems, then the information needs of the human operator need to be considered while instrumentation requirements are being formulated. Rasmussen's abstraction hierarchy, and particularly its extension to the analysis of information captured by sensors and derived from sensors, may therefore be a useful adjunct to upstream instrumentation design. (C) 2002 Elsevier Science Ltd. All rights reserved.
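One way to picture a sensor-annotated abstraction hierarchy is as a dependency graph in which higher-order display variables are derived from lower-level measured ones. The sketch below is a simplified, hypothetical fragment (the node names and dependencies are assumptions, not the paper's pasteurization analysis) that flags which higher-order variables remain derivable when a sensor is unavailable.

# Illustrative sketch only: propagate sensor availability up a simplified
# abstraction-hierarchy fragment to see which higher-order display variables
# can still be derived when a sensor fails. Node names are hypothetical.
derived_from = {
    "flow_in":        [],                       # measured directly (sensor)
    "flow_out":       [],                       # measured directly (sensor)
    "temp_out":       [],                       # measured directly (sensor)
    "mass_balance":   ["flow_in", "flow_out"],  # derived, higher-order
    "energy_balance": ["flow_in", "temp_out"],  # derived, higher-order
}

def available(node, sensors_ok, graph):
    deps = graph[node]
    if not deps:                                # leaf: needs its own sensor
        return node in sensors_ok
    return all(available(d, sensors_ok, graph) for d in deps)

working_sensors = {"flow_in", "temp_out"}       # flow_out sensor unavailable
for var in ("mass_balance", "energy_balance"):
    status = "derivable" if available(var, working_sensors, derived_from) else "unsupported"
    print(var, status)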
Abstract:
The mechanism underlying segregation in liquid fluidized beds is investigated in this paper. A binary fluidized bed system that is not at a stable equilibrium condition is modelled in the literature as forming a mixed part, corresponding to the stable mixture, at the bottom of the bed and a pure layer of the excess component always floating on the mixed part. On the basis of this model, (i) comprehensive criteria for binary particles of any type to mix or segregate, and (ii) a mixing/segregation regime map in terms of the size ratio and density ratio of the particles for a given fluidizing medium, are established in this work. Therefore, knowing the properties of given particles, a second type of particle can be chosen in order to avoid or to promote segregation according to the particular process requirements. The model is then extended to multicomponent fluidized beds and validated against experimental results observed for ternary fluidized beds. (C) 2002 Elsevier Science B.V. All rights reserved.
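The mixing/segregation criteria and regime map are developed in the paper itself. Purely as an illustrative first-pass comparison, and not as the criterion established here, the sketch below contrasts Stokes-law terminal velocities of two assumed particle species in the same fluidizing liquid and reports the size and density ratios that form the coordinates of such a regime map.

# Illustrative only: Stokes-law terminal velocities of two particle species in
# the same fluidizing liquid. This is NOT the mixing/segregation criterion
# derived in the paper; all property values are assumed for the example.
g = 9.81          # m/s2
mu = 1.0e-3       # Pa.s (water)
rho_f = 1000.0    # kg/m3 (water)

def stokes_vt(d, rho_p):
    # Terminal velocity of a small sphere, valid only at low Reynolds number.
    return (rho_p - rho_f) * g * d**2 / (18 * mu)

# species A: small and dense; species B: larger and lighter (assumed values)
d_a, rho_a = 60e-6, 2500.0
d_b, rho_b = 100e-6, 1300.0
vt_a, vt_b = stokes_vt(d_a, rho_a), stokes_vt(d_b, rho_b)
print(f"v_t(A) = {vt_a*1000:.2f} mm/s, v_t(B) = {vt_b*1000:.2f} mm/s")

# size ratio and density ratio: the coordinates of the regime map in the paper
print(f"size ratio d_B/d_A = {d_b/d_a:.2f}, density ratio rho_B/rho_A = {rho_b/rho_a:.2f}")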
Abstract:
The oscillatory baffled reactor (OBR) can be used to produce particles with controlled size and morphology, in batch or continuous flow. This is due to the effect of the superimposed oscillations, which radially mix the fluid while still allowing plug-flow (or close to plug-flow) behaviour in a continuous system. This mixing, combined with a close-to-constant level of turbulence intensity in the reactor, leads to tight droplet and subsequent product particle size distributions. By applying population balance equations together with experimental droplet size distributions, the breakage rates of droplets can be determined, which is a useful tool for understanding product engineering in OBRs. (C) 2002 Elsevier Science B.V. All rights reserved.
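As an illustration of how a discretized population balance can be combined with droplet size distributions, the sketch below evolves droplet counts in geometric volume classes under a simple first-order binary breakage kernel; the kernel form and rate constant are assumptions for demonstration, not the breakage rates determined in this work.

import numpy as np

# Illustrative discretized population balance for droplet breakage only:
# geometric volume classes (each class twice the volume of the one below) and
# a first-order binary breakage kernel. The kernel form and rate constant are
# assumptions for demonstration, not the breakage rates determined here.
n_bins = 8
v = 2.0 ** np.arange(n_bins)            # class volumes (arbitrary units)
N = np.zeros(n_bins)
N[-1] = 1000.0                          # all droplets start in the largest class

def breakage_rate(volume, k=0.5):
    # assumed first-order breakage frequency, increasing with droplet volume
    return k * volume ** (1.0 / 3.0)

dt, t_end = 0.01, 5.0
for _ in range(int(t_end / dt)):
    b = breakage_rate(v) * N            # droplets breaking per unit time, per class
    b[0] = 0.0                          # smallest droplets assumed not to break
    dN = -b
    dN[:-1] += 2.0 * b[1:]              # each event yields two equal daughters one class down
    N = N + dt * dN

print("number-mean droplet volume:", round(float((N * v).sum() / N.sum()), 2))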
Abstract:
A technique based on laser light diffraction is shown to be successful in collecting on-line experimental data. Time series of floc size distributions (FSD) under different shear rates (G) and calcium additions were collected. The steady-state mass mean diameter decreased with increasing shear rate G and increased when calcium additions exceeded 8 mg/l. A so-called population balance model (PBM) was used to describe the experimental data. This kind of model describes both aggregation and breakage through birth and death terms. A discretised PBM was used since analytical solutions of the integro-partial differential equations do not exist. Despite the complexity of the model, only two parameters need to be estimated: the aggregation rate and the breakage rate. The model seems, however, to lack flexibility, and the description of the floc size distribution (FSD) in time is not accurate.
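A minimal sketch of the two-parameter structure described above (a single aggregation rate and a single breakage rate in a discretised PBM) is given below; the constant kernels, the linear size grid and the parameter values are illustrative assumptions, not the model or estimates of this study.

import numpy as np

# Illustrative discretised aggregation-breakage population balance with the
# two-parameter structure described above: one aggregation rate (beta0) and
# one breakage rate (S). Class i holds flocs made of i primary particles;
# kernels and parameter values are illustrative, not the estimated ones.
n_max = 12
beta0, S = 1e-3, 0.05
N = np.zeros(n_max + 1)
N[1] = 1000.0                            # start fully dispersed (primary particles)

def derivatives(N):
    dN = np.zeros_like(N)
    # aggregation births (Smoluchowski, constant kernel)
    for k in range(2, n_max + 1):
        dN[k] += 0.5 * beta0 * sum(N[i] * N[k - i] for i in range(1, k))
    # aggregation deaths, restricted to pairs that stay on the grid (mass conserving)
    for i in range(1, n_max + 1):
        dN[i] -= beta0 * N[i] * N[1:n_max - i + 1].sum()
    # binary breakage: a class-k floc splits into two near-equal fragments
    for k in range(2, n_max + 1):
        dN[k] -= S * N[k]
        dN[k // 2] += S * N[k]
        dN[k - k // 2] += S * N[k]
    return dN

dt = 0.05
for _ in range(2000):
    N = N + dt * derivatives(N)

sizes = np.arange(n_max + 1)
mass_mean = (N * sizes**2).sum() / (N * sizes).sum()
print("mass-mean floc size after simulation:", round(float(mass_mean), 2))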
Abstract:
Complete biological nutrient removal (BNR) in a single-tank, sequencing batch reactor (SBR) process is demonstrated here at full scale on a typical domestic wastewater. The unique feature of the UniFed process is the introduction of the influent into the settled sludge blanket during the settling and decant periods of the SBR operation. This achieves suitable conditions for denitrification and anaerobic phosphate release, which is critical to successful biological phosphorus removal. It also achieves a selector effect, which helps in generating a compact, well-settling biomass in the reactor. The results of this demonstration show that it is possible to achieve well over 90% removal of COD, nitrogen and phosphorus in such a process. Effluent quality achieved over a six-month operating period directly after commissioning was: 29 mg/l COD, 0.5 mg/l NH4-N, 1.5 mg/l NOx-N and 1.5 mg/l PO4-P (50%-iles of daily samples). During an 8-day, intensive sampling period, the effluent BOD5 was
Abstract:
BP Refinery (Bulwer Island) Ltd (BP), located on the eastern Australian coast, is currently undergoing a major expansion as part of the Queensland Clean Fuels Project. The associated wastewater treatment plant upgrade will provide a better quality of treated effluent than is currently possible with the existing infrastructure, and one of a sufficiently high standard to meet not only the requirements of imposed environmental legislation but also BP's environmental objectives. A number of challenges were faced when considering the upgrade, particularly: cost constraints and limited plot space, highly variable wastewater, toxicity issues, and limited available hydraulic head. Sequencing Batch Reactor (SBR) technology was chosen for the lagoon upgrade for the following reasons: SBR technology allowed a retro-fit of the existing earthen lagoon without the need for any additional substantial concrete structures, a dual lagoon system allowed partial treatment of wastewaters during construction, SBRs give substantial process flexibility, SBRs have the ability to easily modify process parameters without any physical modifications, and there were significant cost benefits. This paper presents the background to this application, outlines the laboratory studies carried out on the wastewater, and details the full-scale design issues and methods for providing a cost-effective, efficient treatment system using the existing lagoon system.
Abstract:
A growing demand for efficient air quality management calls for the development of technologies capable of meeting the stringent requirements now being applied in areas of chemical, biological and medical activities. Currently, filtration is the most effective process available for removal of fine particles from carrier gases. Purification of gaseous pollutants is associated with adsorption, absorption and incineration. In this paper we discuss a new technique for highly efficient simultaneous purification of gaseous and particulate pollutants from carrier gases, and investigate the utilization of Nuclear Magnetic Resonance (NMR) imaging for the study of the dynamic processes associated with gas-liquid flow in porous media. Our technique involves the passage of contaminated carrier gases through a porous medium submerged into a liquid, leading to the formation of narrow and tortuous pathways through the medium. The wet walls of these pathways result in outstanding purification of gaseous, liquid and solid alien additives. NMR imaging was successfully used to map the gas pathways inside the porous medium submerged into the liquid layer. (C) 2002 Elsevier Science B.V. All rights reserved.
Abstract:
Within the development of motor vehicles, crash safety (e.g. occupant protection, pedestrian protection, low-speed damageability) is one of the most important attributes. In order to fulfil the increased requirements within the framework of shorter cycle times and rising pressure to reduce costs, car manufacturers keep intensifying the use of virtual development tools such as those in the domain of Computer Aided Engineering (CAE). For crash simulations, the explicit finite element method (FEM) is applied. The accuracy of the simulation process is highly dependent on the accuracy of the simulation model, including the midplane mesh. One of the roughest approximations typically made concerns the actual part thickness, which in reality can vary locally; however, for reasons of complexity, a constant thickness value is almost always defined throughout the entire part. On the other hand, correct consideration of thickness is a key enabler of precise fracture analysis within FEM. Thus, the availability of per-element thickness information, which does not exist explicitly in the FEM model, can significantly contribute to an improved crash simulation quality, especially regarding fracture prediction. Even though the thickness is not explicitly available from the FEM model, it can be inferred from the original CAD geometric model through geometric calculations. This paper proposes and compares two thickness estimation algorithms based on ray tracing and nearest neighbour 3D range searches. A systematic quantitative analysis of the accuracy of both algorithms is presented, as well as a thorough identification of the particular geometric arrangements under which their accuracy can be compared. These results enable the identification of each technique's weaknesses and hint towards a new, integrated approach to the problem that linearly combines the estimates produced by each algorithm.
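As a schematic illustration of the two estimation strategies compared in the paper, the sketch below computes a thickness estimate for a toy flat plate sampled as two offset point sets: once by casting a ray along the surface normal onto the opposite surface, and once by a nearest-neighbour search against samples of that surface. The geometry, sampling and values are made up for the example; the actual algorithms operate on CAD geometry and FE midplane meshes.

import numpy as np

# Schematic comparison of two thickness-estimation ideas on a toy flat plate:
#  (1) ray casting along the surface normal onto the opposite surface, and
#  (2) a nearest-neighbour search against samples of the opposite surface.
# The plate, sampling and normal are made up; real inputs are CAD/FE data.
true_t = 2.5                                          # plate thickness [mm]
xs, ys = np.meshgrid(np.linspace(0, 10, 21), np.linspace(0, 10, 21))
top = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)])
bottom = top.copy()
bottom[:, 2] = -true_t                                # samples of the opposite surface

query = np.array([5.2, 4.9, 0.0])                     # a point on the top surface
normal = np.array([0.0, 0.0, 1.0])                    # outward surface normal there

# (1) ray tracing: cast along -normal and intersect the (planar) bottom surface
#     at z = -true_t; for general geometry this would be a ray/triangle test.
ray_dir = -normal
t_hit = (-true_t - query[2]) / ray_dir[2]
thickness_ray = t_hit                                 # distance to the hit point

# (2) nearest neighbour: distance to the closest sample of the opposite surface
d = np.linalg.norm(bottom - query, axis=1)
thickness_nn = d.min()                                # overestimates for sparse sampling

print(f"ray estimate = {thickness_ray:.2f} mm, NN estimate = {thickness_nn:.2f} mm")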
Abstract:
The aim of this study was to characterize the trajectory of answerability in Brazil. In light of studies based on the historical neo-institutionalist approach, formal institutional changes adopted at the federal level between 1985 and 2014, and which favor the typical requirements of answerability (information and justification), were identified and analyzed through the content analysis technique. The conclusion is that the trajectory of answerability in contemporary Brazil can be characterized as continuous, occurring primarily through the layering strategy, and that its leitmotif, since its origin, has consisted of matters of a financial and budgetary nature. Nevertheless, a more recent influence of deeper democratic themes on this trajectory has been observed.
Abstract:
Nowadays, the Portuguese insurance industry operates in a market with a much more aggressive structure than a few decades ago. Markets and the economy have become globalised since the last decade of the 20th century, and market forces have gradually shifted: power is now mainly on the demand side. In order to meet the new requirements, the insurance industry must develop a strong strategic ability to respond to the constant changes of the new international economic order. One of the basic aspects of this strategic development will focus on the ability to predict the future. We introduce the subject by briefly describing the sector, its organisational structure in the Portuguese market, and the challenges arising from the development of the European Union. We then analyse the economic and financial structure of the sector. From this point of view, we consider the possibility of designing models that could explain the evolution of insurance demand, claims and technical reserves. Such models (even if based on the past) would resolve, at least partly, one of the greatest difficulties experienced by insurance companies when estimating the budget. Thus, we examine the existence of variables that explain the previous points and that are capable of forming a basis for designing models that are simple but efficient and can be used for strategic planning.
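A minimal sketch of the kind of explanatory model alluded to above, regressing an insurance aggregate (for example, premium volume) on candidate explanatory variables by ordinary least squares, is shown below; the variable names and synthetic data are illustrative assumptions, not the sector data analysed here.

import numpy as np

# Illustrative only: ordinary least squares fit of an insurance aggregate
# (e.g., premium volume) on candidate explanatory variables. The variable
# names and synthetic data are assumptions, not the sector data discussed.
rng = np.random.default_rng(0)
n_years = 20
gdp_growth = rng.normal(2.0, 1.0, n_years)            # %
inflation = rng.normal(3.0, 1.5, n_years)              # %
premiums = 50 + 4.0 * gdp_growth - 1.5 * inflation + rng.normal(0, 2, n_years)

X = np.column_stack([np.ones(n_years), gdp_growth, inflation])
coef, *_ = np.linalg.lstsq(X, premiums, rcond=None)
print("intercept, gdp_growth, inflation coefficients:", np.round(coef, 2))

# the fitted model can then be used with forecast inputs for budget estimation
forecast = coef @ np.array([1.0, 2.5, 2.0])
print("premium forecast for the assumed scenario:", round(float(forecast), 1))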
Abstract:
In this paper we present VERITAS, a tool that focuses on verification, one of the most important processes in knowledge maintenance during the development of knowledge-based systems (KBS). The verification and validation (V&V) process is part of a wider process called knowledge maintenance, in which an enterprise systematically gathers, organizes, shares, and analyzes knowledge to accomplish its goals and mission. The V&V process determines whether the software requirements specifications have been correctly and completely fulfilled. The methodologies proposed in software engineering have proved inadequate for the validation and verification of KBS, since KBS present some particular characteristics. VERITAS is an automatic tool developed for KBS verification that is able to detect a large number of knowledge anomalies. It addresses many relevant aspects considered in real applications, such as the use of rule-triggering selection mechanisms and temporal reasoning.
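One classical knowledge anomaly that verification tools of this kind check for is circularity among rules. The sketch below detects rule cycles in a toy rule base by building a dependency graph from antecedents to consequents; it is an illustrative example of such a check, not the detection machinery implemented in VERITAS.

# Illustrative anomaly check (not VERITAS itself): detect circular rule chains
# in a toy rule base by searching for cycles in the literal-dependency graph.
rules = {
    "r1": (["a"], "b"),        # if a then b
    "r2": (["b"], "c"),        # if b then c
    "r3": (["c"], "a"),        # if c then a  -> closes a cycle a -> b -> c -> a
    "r4": (["d"], "e"),
}

# build one edge antecedent -> consequent for every rule
edges = {}
for antecedents, consequent in rules.values():
    for lit in antecedents:
        edges.setdefault(lit, set()).add(consequent)

def has_cycle_from(node, visiting, visited):
    visiting.add(node)
    for nxt in edges.get(node, ()):
        if nxt in visiting:
            return True
        if nxt not in visited and has_cycle_from(nxt, visiting, visited):
            return True
    visiting.discard(node)
    visited.add(node)
    return False

cyclic = any(has_cycle_from(n, set(), set()) for n in list(edges))
print("circular rule chain detected:", cyclic)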