862 results for Integrated user model
Abstract:
The cost and risk associated with mineral exploration in Australia increase significantly as companies move into deeper regolith-covered terrain. The ability to map the bedrock and the depth of weathering within an area has the potential to decrease this risk and increase the effectiveness of exploration programs. This paper is the second in a trilogy concerning the Grant's Patch area of the Eastern Goldfields. The recent development of the VPmg potential field inversion program, in conjunction with the acquisition of high-resolution gravity data over an area with extensive drilling, provided an opportunity to evaluate three-dimensional gravity inversion as a bedrock and regolith mapping tool. An apparent density model of the study area was constructed, with the ground represented as adjoining 200 m by 200 m vertical rectangular prisms. During inversion VPmg incrementally adjusted the density of each prism until the free-air gravity response of the model replicated the observed data. For the Grant's Patch study area, this image of the apparent density values proved easier to interpret than the Bouguer gravity image. A regolith layer was introduced into the model and realistic fresh-rock densities assigned to each basement prism according to its interpreted lithology. With the basement and regolith densities fixed, the VPmg inversion algorithm adjusted the depth to fresh basement until the misfit between the calculated and observed gravity response was minimised. The resulting geometry of the bedrock/regolith contact largely replicated the base of weathering indicated by drilling, with predicted depth-of-weathering values from gravity inversion typically within 15% of those logged during RAB and RC drilling.
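Because gravity is linear in density, the incremental property inversion described above can be illustrated with a small least-squares sketch. The point-mass approximation, grid geometry and density values below are purely illustrative and are not VPmg's actual prism formulation:

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def sensitivity_matrix(stations, prism_centres, prism_volume):
    """Vertical gravity (mGal) per unit density for each station/prism pair,
    approximating each prism by a point mass at its centre (z positive up)."""
    A = np.zeros((len(stations), len(prism_centres)))
    for j, c in enumerate(prism_centres):
        d = stations - c                                   # station-to-centre offsets
        r = np.linalg.norm(d, axis=1)
        A[:, j] = G * prism_volume * d[:, 2] / r**3 * 1e5  # downward attraction, m/s^2 -> mGal
    return A

# Illustrative geometry: a 5 x 5 grid of 200 m x 200 m x 100 m prisms below surface stations.
xs, ys = np.meshgrid(np.arange(5) * 200.0, np.arange(5) * 200.0)
prism_centres = np.column_stack([xs.ravel(), ys.ravel(), np.full(xs.size, -50.0)])
stations = np.column_stack([xs.ravel(), ys.ravel(), np.full(xs.size, 1.0)])
prism_volume = 200.0 * 200.0 * 100.0

A = sensitivity_matrix(stations, prism_centres, prism_volume)
true_density = np.full(25, 2670.0)
true_density[12] = 2900.0                  # one anomalously dense prism
observed = A @ true_density                # synthetic "observed" gravity data

# Adjusting each prism's density until the model response matches the data
# reduces, for this linear toy problem, to a least-squares solve.
recovered, *_ = np.linalg.lstsq(A, observed, rcond=None)
print(np.round(recovered[10:15]))
```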
Abstract:
The Brisbane River and Moreton Bay Study, an interdisciplinary study of Moreton Bay and its major tributaries, was initiated to address water quality issues which link sewage and diffuse loading with environmental degradation. Runoff and deposition of fine-grained sediments into Moreton Bay, followed by resuspension, have been linked with increased turbidity and significant loss of seagrass habitat. Sewage-derived nutrient enrichment, particularly nitrogen (N), has been linked to algal blooms by sewage plume maps. Blooms of a marine cyanobacterium, Lyngbya majuscula, in Moreton Bay have resulted in significant impacts on human health (e.g., contact dermatitis) and ecological health (e.g., seagrass loss), and the availability of dissolved iron from acid sulfate soil runoff has been hypothesised as a contributing factor. The impacts of catchment activities resulting in runoff of sediments, nutrients and dissolved iron on the health of the Moreton Bay waterways are addressed. The Study, established in 1994 by six local councils in association with two state departments, forms a regional component of a national and state program to achieve ecologically sustainable use of the waterways by protecting and enhancing their health, while maintaining economic and social development. The Study framework illustrates a unique integrated approach to water quality management whereby scientific research, community participation and strategy development proceeded in parallel. This collaborative effort resulted in a water quality management strategy which focuses on the integration of socioeconomic and ecological values of the waterways. This work has led to significant cost savings in infrastructure by providing a clear focus on initiatives towards achieving healthy waterways. The Study's Stage 2 initiatives form the basis for this paper.
Abstract:
Management are keen to maximize the life span of an information system because of the high cost, organizational disruption, and risk of failure associated with the re-development or replacement of an information system. This research investigates the effects that various factors have on an information system's life span by examining how those factors affect the system's stability. The research builds on a previously developed two-stage model of information system change, whereby an information system is either in a stable state of evolution, in which its functionality is evolving, or in a state of revolution, in which it is being replaced because it is not providing the functionality expected by its users. A case study surveyed a number of systems within one organization. The aim was to test whether a relationship existed between the base value of the volatility index (a measure of the stability of an information system) and certain system characteristics. Data relating to some 3000 user change requests covering 40 systems over a 10-year period were obtained. The following factors were hypothesized to have significant associations with the base value of the volatility index: language level (generation of the language of construction), system size, system age, and the timing of changes applied to a system. Significant associations were found in the hypothesized directions, except that the timing of user changes was not associated with any change in the value of the volatility index. Copyright (C) 2002 John Wiley & Sons, Ltd.
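The volatility index itself is not defined in this abstract, so the sketch below uses a purely hypothetical placeholder measure and synthetic per-system data, only to illustrate how associations between such an index and system characteristics might be tested:

```python
import numpy as np
import pandas as pd
from scipy import stats

# Hypothetical per-system data; the real study used ~3000 change requests across 40 systems.
rng = np.random.default_rng(0)
systems = pd.DataFrame({
    "language_level": rng.integers(3, 5, size=40),   # generation of construction language
    "size_kloc": rng.lognormal(3.0, 0.8, size=40),   # system size
    "age_years": rng.uniform(1, 10, size=40),        # system age
})
# Placeholder "volatility index": change requests per year, scaled by system size.
systems["volatility_index"] = (rng.poisson(75, size=40)
                               / systems["age_years"] / systems["size_kloc"])

# Test whether each characteristic is associated with the index (Spearman rank correlation).
for col in ["language_level", "size_kloc", "age_years"]:
    rho, p = stats.spearmanr(systems[col], systems["volatility_index"])
    print(f"{col:>15}: rho = {rho:+.2f}, p = {p:.3f}")
```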
Abstract:
The majority of the world's population now resides in urban environments and information on the internal composition and dynamics of these environments is essential to enable preservation of certain standards of living. Remotely sensed data, especially the global coverage of moderate spatial resolution satellites such as Landsat, the Indian Resource Satellite and Système Pour l'Observation de la Terre (SPOT), offer a highly useful data source for mapping the composition of these cities and examining their changes over time. The utility and range of applications for remotely sensed data in urban environments could be improved with a more appropriate conceptual model relating urban environments to the sampling resolutions of imaging sensors and processing routines. Hence, the aim of this work was to take the Vegetation-Impervious surface-Soil (VIS) model of urban composition and match it with the most appropriate image processing methodology to deliver information on VIS composition for urban environments. Several approaches were evaluated for mapping the urban composition of Brisbane city (south-east Queensland, Australia) using Landsat 5 Thematic Mapper data and 1:5000 aerial photographs. The methods evaluated were: image classification; interpretation of aerial photographs; and constrained linear mixture analysis. Over 900 reference sample points on four transects were extracted from the aerial photographs and used as a basis to check the output of the classification and mixture analysis. Distinctive zonations of VIS related to urban composition were found in the per-pixel classification and aggregated air-photo interpretation; however, significant spectral confusion also resulted between classes. In contrast, the VIS fraction images produced from the mixture analysis enabled distinctive densities of commercial, industrial and residential zones within the city to be clearly defined, based on their relative amount of vegetation cover. The soil fraction image served as an index for areas being (re)developed. The logical match of a low (L)-resolution spectral mixture analysis approach with the moderate spatial resolution image data ensured that the processing model matched the spectrally heterogeneous nature of the urban environments at the scale of Landsat Thematic Mapper data.
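A minimal sketch of the constrained linear mixture analysis step, assuming illustrative endmember reflectances and using a standard weighted sum-to-one trick rather than the study's actual implementation:

```python
import numpy as np
from scipy.optimize import lsq_linear

def unmix_vis(pixel, endmembers, weight=1e3):
    """Estimate Vegetation/Impervious/Soil fractions for one pixel by constrained
    least squares: fractions are bounded to [0, 1] and pushed toward summing to one
    by appending a heavily weighted sum-to-one equation."""
    n_end = endmembers.shape[1]
    A = np.vstack([endmembers, weight * np.ones(n_end)])
    b = np.append(pixel, weight * 1.0)
    return lsq_linear(A, b, bounds=(0.0, 1.0)).x

# Illustrative endmember reflectances for 6 TM bands (columns: vegetation, impervious, soil).
endmembers = np.array([
    [0.04, 0.12, 0.10],
    [0.06, 0.14, 0.14],
    [0.04, 0.15, 0.18],
    [0.45, 0.18, 0.25],
    [0.22, 0.20, 0.32],
    [0.10, 0.18, 0.28],
])
pixel = 0.5 * endmembers[:, 0] + 0.3 * endmembers[:, 1] + 0.2 * endmembers[:, 2]
print(np.round(unmix_vis(pixel, endmembers), 2))   # approximately [0.5, 0.3, 0.2]
```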
Abstract:
This paper proposes a template for modelling complex datasets that integrates traditional statistical modelling approaches with more recent advances in statistics and modelling through an exploratory framework. Our approach builds on the well-known and long-standing traditional idea of 'good practice in statistics' by establishing a comprehensive framework for modelling that focuses on exploration, prediction, interpretation and reliability assessment, a relatively new idea that allows individual assessment of predictions. The integrated framework we present comprises two stages. The first involves the use of exploratory methods to help visually understand the data and identify a parsimonious set of explanatory variables. The second encompasses a two-step modelling process, in which non-parametric methods such as decision trees and generalized additive models are used to identify important variables and their modelling relationship with the response before a final predictive model is considered. We focus on fitting the predictive model using parametric, non-parametric and Bayesian approaches. This paper is motivated by a medical problem where interest focuses on developing a risk stratification system for morbidity of 1,710 cardiac patients given a suite of demographic, clinical and preoperative variables. Although the methods we use are applied specifically to this case study, these methods can be applied across any field, irrespective of the type of response.
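A minimal sketch of the two-step idea, a non-parametric screen followed by a parametric predictive model, using placeholder data and variable names rather than the study's actual clinical variables:

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Placeholder data standing in for the demographic/clinical/preoperative variables.
rng = np.random.default_rng(1)
X = pd.DataFrame(rng.normal(size=(1710, 8)), columns=[f"var{i}" for i in range(8)])
y = (X["var0"] + 0.5 * X["var3"] + rng.normal(size=1710) > 0).astype(int)  # morbidity flag

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Step 1: non-parametric screen -- rank candidate variables by decision-tree importance.
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_train, y_train)
keep = X.columns[np.argsort(tree.feature_importances_)[::-1][:3]]

# Step 2: parametric predictive model fitted on the shortlisted variables.
clf = LogisticRegression().fit(X_train[keep], y_train)
print("selected:", list(keep), "| test accuracy:", clf.score(X_test[keep], y_test))
```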
Abstract:
In the previous two papers in this three-part series, we have examined visual pigments, ocular media transmission, and colors of the coral reef fish of Hawaii. This paper first details aspects of the light field and background colors at the microhabitat level on Hawaiian reefs and does so from the perspective and scale of fish living on the reef. Second, information from all three papers is combined in an attempt to examine trends in the visual ecology of reef inhabitants. Our goal is to begin to see fish the way they appear to other fish. Observations resulting from the combination of results in all three papers include the following. Yellow and blue colors on their own are strikingly well matched to backgrounds on the reef such as coral and bodies of horizontally viewed water. These colors, therefore, depending on context, may be important in camouflage as well as conspicuousness. The spectral characteristics of fish colors are correlated to the known spectral sensitivities in reef fish single cones and are tuned for maximum signal reliability when viewed against known backgrounds. The optimal positions of spectral sensitivity in a modeled dichromatic visual system are generally close to the sensitivities known for reef fish. Models also predict that both UV-sensitive and red-sensitive cone types are advantageous for a variety of tasks. UV-sensitive cones are known in some reef fish; red-sensitive cones have yet to be found. Labroid colors, which appear green or blue to us, may be matched to the far-red component of chlorophyll reflectance for camouflage. Red cave/hole-dwelling reef fish are relatively poorly matched to the background they are often viewed against, but this may be visually irrelevant. The model predicts that the task of distinguishing green algae from coral is optimized with a relatively long wavelength visual pigment pair. Herbivorous grazers whose visual pigments are known possess the longest sensitivities so far found. Labroid complex colors are highly contrasting complementary colors close up but, because of the spatial addition that results from low visual resolution, combine at distance to match background water colors remarkably well. Therefore, they are effective for simultaneous communication and camouflage.
Abstract:
This paper describes a coupled knowledge-based system (KBS) for the design of liquid-retaining structures, which can handle both the symbolic knowledge processing based on engineering heuristics in the preliminary synthesis stage and the extensive numerical crunching involved in the detailed analysis stage. The prototype system is developed by employing a blackboard architecture and a commercial shell, VISUAL RULE STUDIO. Its present scope covers the design of three types of liquid-retaining structures, namely, a rectangular shape with one compartment, a rectangular shape with two compartments and a circular shape. Through custom-built interactive graphical user interfaces, the user is directed throughout the design process, which includes preliminary design, load specification, model generation, finite element analysis, code compliance checking and member sizing optimization. It is also integrated with various relational databases that provide the system with sectional properties, moment and shear coefficients and final member details. This system can act as a consultant to assist novice designers in the design of liquid-retaining structures, with increased efficiency, optimized design output and automated record keeping. The design of a typical example of a liquid-retaining structure is also illustrated. (C) 2003 Elsevier B.V. All rights reserved.
Abstract:
Glycogen-accumulating organisms (GAO) have the potential to directly compete with polyphosphate-accumulating organisms (PAO) in EBPR systems, as both are able to take up VFA anaerobically and grow on the intracellular storage products aerobically. Under anaerobic conditions GAO hydrolyse glycogen to gain energy and reducing equivalents to take up VFA and to synthesise polyhydroxyalkanoate (PHA). In the subsequent aerobic stage, PHA is oxidised to gain energy for glycogen replenishment (from PHA) and for cell growth. This article describes a complete anaerobic and aerobic model for GAO based on the understanding of their metabolic pathways. The anaerobic model has been developed and reported previously, while the aerobic metabolic model was developed in this study. It is based on the assumption that acetyl-CoA and propionyl-CoA go through the catabolic and anabolic processes independently. Experimental validation shows that the integrated model can predict the anaerobic and aerobic results very well. It was found in this study that at pH 7 the maximum acetate uptake rate of GAO was slower than that reported for PAO in the anaerobic stage. On the other hand, the net biomass production per C-mol acetate added is about 9% higher for GAO than for PAO. This would indicate that PAO and GAO each have certain competitive advantages during different parts of the anaerobic/aerobic process cycle. (C) 2002 Wiley Periodicals, Inc.
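A toy sketch of the anaerobic part of such a metabolic model, with Monod-type acetate uptake fuelled by glycogen hydrolysis and the carbon stored as PHA; the yields and rates below are placeholders, not the calibrated values from this work:

```python
import numpy as np
from scipy.integrate import solve_ivp

def anaerobic_phase(t, y, q_max=0.2, K_s=0.01, y_gly=0.5, y_pha=1.3):
    """Toy anaerobic GAO kinetics: acetate uptake (Monod) drives glycogen
    consumption and PHA formation. All parameters are illustrative placeholders."""
    acetate, glycogen, pha = y
    uptake = q_max * acetate / (K_s + acetate)     # specific acetate uptake rate
    return [-uptake, -y_gly * uptake, y_pha * uptake]

# Initial acetate, glycogen and PHA pools (arbitrary C-mol units), 3 h anaerobic period.
sol = solve_ivp(anaerobic_phase, (0.0, 3.0), [0.10, 0.30, 0.05], max_step=0.05)
print("final acetate, glycogen, PHA:", np.round(sol.y[:, -1], 3))
```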
Abstract:
In the context of an effort to develop methodologies to support the evaluation of interactive systems, this paper investigates an approach to detecting graphical user interface bad smells. Our approach consists of detecting user interface bad smells through model-based reverse engineering from source code. Models are used to define which widgets are present in the interface, when particular graphical user interface (GUI) events can occur, under which conditions, which system actions are executed, and which GUI state is generated next.
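A minimal sketch of how a widget/event model recovered by reverse engineering might be scanned for bad smells; the two smells checked here are illustrative examples, not the paper's catalogue:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Event:
    name: str
    condition: Optional[str]                  # guard under which the event may occur
    actions: list = field(default_factory=list)

@dataclass
class Widget:
    name: str
    events: list = field(default_factory=list)

def find_smells(widgets):
    """Flag two illustrative smells: events that trigger no system action,
    and widgets that expose no events at all (dead UI)."""
    smells = []
    for w in widgets:
        if not w.events:
            smells.append(f"{w.name}: widget never triggers behaviour")
        for e in w.events:
            if not e.actions:
                smells.append(f"{w.name}.{e.name}: event has no effect")
    return smells

ui = [Widget("SaveButton", [Event("click", "form.valid", ["persist()"])]),
      Widget("HelpIcon", [Event("click", None, [])]),
      Widget("LegacyPanel")]
print(find_smells(ui))
```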
Abstract:
Purpose – The purpose of this paper is to propose a generic model of an Integrated Management System of Quality, Environment and Safety (IMS-QES) that can be adapted and progressively extended to assimilate various management systems, notably ISO 9001 for Quality, ISO 14001 for Environment and OHSAS 18001 for Occupational Health and Safety. Design/methodology/approach – The model was designed in the real environment of a Portuguese organization and 160 employees were surveyed. The response rate was 86 percent. The conceived model was implemented in a first phase for the integration of the Quality, Environment and Safety Management Systems. Findings – Among the main findings of the survey the paper highlights: the elimination of conflicts between individual systems with resource optimization; creation of added value to the business by eliminating several types of waste; the integrated management of sustainability components in a global market; the improvement of partnerships with suppliers of goods and services; and a reduction in the number of internal and external audits. Originality/value – This case study is one of the first Portuguese empirical studies of IMS-QES, and the paper argues that it can be useful in the creation of a Portuguese guideline for integration of Quality Management Systems, Environmental Management Systems and Occupational Health and Safety Management Systems, among others.
Abstract:
Real Cloud and Ubiquitous Manufacturing systems require effective and permanently available resources, together with adequate capacity and scalability. One of the most important problems for the management of applications over cloud-based platforms, which are expected to support efficient scalability and resource coordination following the SaaS implementation model, is their interoperability. Even though application dashboards need to easily incorporate those new applications, their interoperability still remains a major problem to overcome. The possibility of extending these dashboards with efficiently integrated communicational cloud-based services (cloudlets) therefore represents relevant added value and contributes to solving the interoperability problem. Following the architecture for integration of enriched existing cloud services, as instances of manufacturing resources, this paper: a) proposes a cloud-based web platform to support a dashboard integrating communicational services, and b) describes an experiment to support the argument that effective and efficient interoperability, especially in dynamic environments, can be achieved only with human intervention.
Abstract:
In this paper a realistic directional channel model that is an extension of the COST 273 channel model is presented. The model uses a strategy based on clusters of scatterers and visibility region generation, with increased realism due to the introduction of terrain and clutter information. New approaches for path-loss prediction and line-of-sight modeling are considered, affecting the implementation of the cluster path gain model. The new model was implemented using terrain, clutter, street and user mobility information for the city of Lisbon, Portugal. Some of the model's outputs are presented, mainly path loss and small/large-scale fading statistics.
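For orientation, a generic log-distance path-loss sketch with separate line-of-sight and non-line-of-sight exponents and lognormal shadowing; this is a textbook placeholder, not the terrain- and clutter-aware prediction used in the extended COST 273 model:

```python
import numpy as np

def path_loss_db(d_m, f_mhz=2000.0, los=True, rng=None):
    """Log-distance path loss with lognormal shadowing (illustrative only).
    d_m: distance(s) in metres; f_mhz: carrier frequency in MHz."""
    n = 2.0 if los else 3.5                        # path-loss exponent
    sigma = 3.0 if los else 8.0                    # shadowing standard deviation (dB)
    fspl_1m = 20 * np.log10(f_mhz) - 27.55         # free-space loss at the 1 m reference
    shadow = (rng or np.random.default_rng()).normal(0.0, sigma, size=np.shape(d_m))
    return fspl_1m + 10 * n * np.log10(np.asarray(d_m, dtype=float)) + shadow

distances = np.logspace(1, 3, 5)                   # 10 m to 1 km
print(np.round(path_loss_db(distances, los=False), 1))
```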
Abstract:
This paper presents a new integrated model for variable-speed wind energy conversion systems, considering more accurate dynamics of the wind turbine, rotor, generator, power converter and filter. Pulse width modulation by space vector modulation associated with sliding mode is used for controlling the power converters. Also, power factor control is introduced at the output of the power converters. Comprehensive performance simulation studies are carried out with matrix, two-level and multilevel power converter topologies in order to adequately assess the system performance. Conclusions are duly drawn.
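A small sketch of the space vector modulation dwell-time calculation for a two-level converter, using the standard textbook relations; the reference voltage, DC-link voltage and switching period below are illustrative, not values from the paper:

```python
import numpy as np

def svm_dwell_times(v_ref, theta, v_dc, t_s=1.0e-4):
    """Dwell times for two-level space vector modulation: active vectors T1, T2
    and the zero vector T0 over one switching period t_s."""
    sector = int(theta // (np.pi / 3)) % 6          # 60-degree sector containing the reference
    theta_in = theta - sector * np.pi / 3           # angle measured inside that sector
    m = np.sqrt(3) * v_ref / v_dc                   # modulation index (<= 1 in the linear region)
    t1 = t_s * m * np.sin(np.pi / 3 - theta_in)
    t2 = t_s * m * np.sin(theta_in)
    return sector, t1, t2, t_s - t1 - t2            # T0 fills the remainder of the period

print(svm_dwell_times(v_ref=300.0, theta=np.deg2rad(75.0), v_dc=650.0))
```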
Abstract:
A new approach is proposed, based on a tool-assisted methodology for creating new products in the automobile industry from previously defined processes and experiences, inspired by a set of best practices or principles: it is based on high-level models or specifications; it is centred on a component-based architecture; and it is based on generative programming techniques. This approach follows in essence the MDA (Model Driven Architecture) philosophy with some specific characteristics. We propose a repository that keeps related information, such as models, applications, design information, generated artifacts and even information concerning the development process itself (e.g., generation steps, tests and integration milestones). Generically, this methodology takes the users' requirements for a new product (e.g., functional, non-functional, product specification) as its main inputs and produces a set of artifacts (e.g., design parts, process validation output) as its main output, which will be integrated into the engineering design tool (e.g., a CAD system), facilitating the work.
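A toy sketch of the generative step, emitting a source artifact from a hypothetical high-level component specification and recording provenance in the spirit of the proposed repository; every name here is illustrative:

```python
from string import Template
from datetime import datetime, timezone

# Hypothetical high-level component specification (the "model").
spec = {"component": "DoorController", "ports": ["lock", "unlock", "status"]}

header = Template("// Generated from model '$name' on $stamp\n")
method = Template("void $port();\n")

def generate_interface(spec):
    """Toy generative-programming step: emit a C-style interface from the spec
    and keep generation metadata alongside the artifact."""
    stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
    code = header.substitute(name=spec["component"], stamp=stamp)
    code += "".join(method.substitute(port=p) for p in spec["ports"])
    return {"artifact": code, "model": spec, "generated_at": stamp}

print(generate_interface(spec)["artifact"])
```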
Abstract:
This paper studies the persistence of the G7's stock market volatility using the GARCH, IGARCH and FIGARCH models. The data set consists of the daily returns of the S&P/TSX 60, CAC 40, DAX 30, MIB 30, NIKKEI 225, FTSE 100 and S&P 500 indexes over the period 1999-2009. The results evidence long memory in volatility, which is more pronounced in Germany, Italy and France. On the other hand, Japan appears to be the country where this phenomenon is least evident; nevertheless, persistence prevails, though with lower intensity.
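A minimal sketch of fitting GARCH and FIGARCH models with the Python arch package, using a synthetic return series in place of the actual index data; the FIGARCH option and its parameter naming assume a recent version of the package:

```python
import numpy as np
import pandas as pd
from arch import arch_model

# Placeholder return series standing in for the daily index returns (1999-2009).
rng = np.random.default_rng(7)
returns = pd.Series(rng.standard_t(df=6, size=2500) * 0.8)

# Plain GARCH(1,1); persistence is summarised by alpha[1] + beta[1].
garch = arch_model(returns, vol="GARCH", p=1, q=1, dist="t").fit(disp="off")
print("GARCH persistence:", garch.params[["alpha[1]", "beta[1]"]].sum())

# FIGARCH captures long memory through the fractional-differencing parameter d
# (exposed as vol="FIGARCH" in recent arch versions; parameter name assumed to be "d").
figarch = arch_model(returns, vol="FIGARCH", p=1, q=1, dist="t").fit(disp="off")
print("FIGARCH d:", figarch.params["d"])
```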