894 results for Model-Based Design
Abstract:
Certification of an ISO 14001 Environmental Management System (EMS) is currently an important requirement for enterprises wishing to sell their products in a global market. The system's structure is based on environmental impact evaluation (EIE). However, if an erroneous or inadequate methodology is applied, the entire process may be jeopardized. Many methodologies have been developed for carrying out EIEs; some are fairly complex and unsuitable for EMS implementation in an organizational context, principally when small and medium-sized enterprises (SMEs) are involved. The proposed methodology for EIE is part of a model for implementing EMS. The methodological approach used was a qualitative exploratory research method based upon sources of evidence such as document analyses, semi-structured interviews and participant observations. By adopting a cooperative implementation model based on the theory of systems engineering, difficulties relating to implementation of the sub-system were overcome, thus encouraging SMEs to implement EMS. (C) 2007 Elsevier Ltd. All rights reserved.
Abstract:
Test templates and a test template framework are introduced as useful concepts in specification-based testing. The framework can be defined using any model-based specification notation and used to derive tests from model-based specifications; in this paper, it is demonstrated using the Z notation. The framework formally defines test data sets and their relation to the operations in a specification and to other test data sets, providing structure to the testing process. Flexibility is preserved, so that many testing strategies can be used. Important application areas of the framework are discussed, including refinement of test data, regression testing, and test oracles.
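A loose illustration of the idea (not the paper's Z-based machinery): test data can be derived by partitioning an operation's valid input space into templates and picking a representative from each. The operation and partition below are invented for the sketch.

```python
# Illustrative sketch of template-style test derivation: partition the
# input space of a trivial operation (abs) into sub-domains ("templates")
# and pick one representative per template. Invented example, not the
# paper's Z-based framework.

def templates_for_abs():
    """Each template is a predicate describing one sub-domain of the input."""
    return {
        "negative": lambda x: x < 0,
        "zero":     lambda x: x == 0,
        "positive": lambda x: x > 0,
    }

def derive_test_set(representatives):
    """Pick one representative per template, checking it against its predicate."""
    tests = {}
    for name, pred in templates_for_abs().items():
        x = representatives[name]
        assert pred(x), f"{x!r} does not belong to template {name!r}"
        tests[name] = x
    return tests

tests = derive_test_set({"negative": -5, "zero": 0, "positive": 3})
```

The structure mirrors the framework's central point: the test data sets and their relation to the operation are stated explicitly, so testing strategies can vary while the derivation stays systematic.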
Abstract:
Background: Brain injury is responsible for significant morbidity and mortality in trauma patients, but controversy still exists over optimal fluid management for these patients. This study aimed to investigate the effects of acute hemodilution with hydroxyethyl starch (HES) or lactated Ringer's solution (LR) on intracranial pressure (ICP) and cerebral perfusion pressure (CPP) in dogs submitted to a cryogenic brain injury model. Methods: Design: prospective laboratory animal study. Setting: research laboratory in a teaching hospital. Subjects: thirty-five male mongrel dogs. Interventions: animals were assigned to five groups: control, or hemodilution with LR or HES 6% to a hematocrit target of 27% or 35%. Results: ICP and CPP levels were measured after cryogenic brain injury. Hemodilution increased ICP, which in turn decreased CPP, when the hematocrit target was 27%. However, no differences in ICP or CPP were observed between the crystalloid and colloid solutions used for hemodilution. Conclusions: Hemodilution to a low hematocrit level increases ICP and decreases CPP in dogs submitted to cryogenic brain injury. These results suggest that excessive hemodilution to a hematocrit below 30% should be avoided in traumatic brain injury patients.
Abstract:
The superior cervical ganglion (SCG) in mammals varies in structure according to developmental age, body size, gender, lateral asymmetry, the size and nuclear content of neurons, and the complexity and synaptic coverage of their dendritic trees. In small and medium-sized mammals, neuron number and size increase from birth to adulthood and, in phylogenetic studies, vary with body size. However, recent studies on larger animals suggest that body weight does not, in general, accurately predict neuron number. We have applied design-based stereological tools at the light-microscopic level to assess the volumetric composition of ganglia and to estimate the numbers and sizes of neurons in SCGs from rats, capybaras and horses. Using transmission electron microscopy, we have obtained design-based estimates of the surface coverage of dendrites by postsynaptic apposition zones and model-based estimates of the numbers and sizes of synaptophysin-labelled axo-dendritic synaptic disks. Linear regression analysis of log-transformed data has been undertaken in order to establish the nature of the relationships between numbers and SCG volume (V(scg)). For SCGs (five per species), the allometric relationship for neuron number (N) is N = 35,067 × V(scg)^0.781 and that for synapses is N = 20,095,000 × V(scg)^1.328, V(scg) thus being a good predictor of neuron number but a poor predictor of synapse number. Our findings thus reveal the nature of SCG growth in terms of its main ingredients (neurons, neuropil, blood vessels) and show that larger mammals have SCG neurons exhibiting more complex arborizations and greater numbers of axo-dendritic synapses.
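Allometric fits of this kind come from ordinary least squares on log-transformed data; a minimal sketch of how such an exponent arises, using invented numbers rather than the paper's measurements:

```python
import math

# Minimal sketch of a log-log (allometric) fit N = a * V**b via ordinary
# least squares on log-transformed data. The data points are invented for
# illustration, not taken from the paper.
volumes = [0.5, 2.0, 8.0, 32.0]           # ganglion volumes (arbitrary units)
counts  = [30000, 90000, 270000, 810000]  # neuron counts (illustrative)

x = [math.log(v) for v in volumes]
y = [math.log(n) for n in counts]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
    sum((xi - mx) ** 2 for xi in x)       # slope = allometric exponent
a = math.exp(my - b * mx)                 # intercept back-transformed to a

# Here the synthetic counts triple for every quadrupling of volume,
# so the fitted exponent is b = log(3)/log(4), roughly 0.79.
```

The slope of the log-log regression is the allometric exponent, and the back-transformed intercept is the multiplicative constant, which is how relationships such as N = 35,067 × V(scg)^0.781 are obtained.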
Abstract:
A mixture model incorporating long-term survivors has been adopted in the field of biostatistics where some individuals may never experience the failure event under study. The surviving fractions may be considered as cured. In most applications, the survival times are assumed to be independent. However, when the survival data are obtained from a multi-centre clinical trial, it is conceivable that the environmental conditions and facilities shared within a clinic affect the proportion cured as well as the failure risk for the uncured individuals. This necessitates a long-term survivor mixture model with random effects. In this paper, the long-term survivor mixture model is extended for the analysis of multivariate failure time data using the generalized linear mixed model (GLMM) approach. The proposed model is applied to analyse a numerical data set from a multi-centre clinical trial of carcinoma as an illustration. Some simulation experiments are performed to assess the applicability of the model based on the average biases of the estimates obtained. Copyright (C) 2001 John Wiley & Sons, Ltd.
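A cure-mixture survival function has the form S(t) = pi + (1 - pi)·S_u(t), where pi is the cured fraction and S_u the survival of the uncured; a minimal sketch, assuming an exponential S_u purely for illustration:

```python
import math

def mixture_survival(t, pi_cured, hazard):
    """Population survival under a long-term survivor (cure) mixture:
    S(t) = pi + (1 - pi) * S_u(t). An exponential S_u(t) = exp(-hazard*t)
    is assumed here purely for illustration."""
    return pi_cured + (1.0 - pi_cured) * math.exp(-hazard * t)

# Unlike an ordinary survival curve, the mixture plateaus at the cured
# fraction instead of decaying to zero.
early = mixture_survival(0.0, 0.3, 0.5)
late  = mixture_survival(50.0, 0.3, 0.5)
```

The plateau at pi is what distinguishes long-term survivor models from standard survival models; the GLMM extension in the paper lets both pi and the uncured failure risk vary with clinic-level random effects.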
Abstract:
A dynamic modelling methodology, which combines on-line variable estimation and parameter identification with physical laws to form an adaptive model for rotary sugar drying processes, is developed in this paper. In contrast to conventional rate-based models using empirical transfer coefficients, the heat and mass transfer rates in the new model are estimated from on-line measurements. Furthermore, a set of improved sectional solid transport equations with localized parameters is developed in this work to replace the global correlation for the computation of solid retention time. Since a number of key model variables and parameters are identified on-line using measurement data, the model is able to closely track the dynamic behaviour of rotary drying processes within a broad range of operational conditions. This adaptive model is validated against experimental data obtained from a pilot-scale rotary sugar dryer. The proposed modelling methodology can be easily incorporated into nonlinear model-based control schemes to form a unified modelling and control framework.
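As a hedged sketch of the kind of on-line parameter identification such an adaptive model relies on (the paper's actual estimator is not specified here), recursive least squares with exponential forgetting can track a drifting transfer coefficient:

```python
# Hedged sketch of on-line parameter identification: scalar recursive
# least squares with exponential forgetting, tracking a coefficient k in
# y = k * u from streaming (u, y) measurements. A generic illustration,
# not the paper's exact algorithm.

def rls_with_forgetting(pairs, k0=0.0, p0=100.0, lam=0.95):
    """Track k in y = k*u; lam < 1 discounts old data so k can drift."""
    k, p = k0, p0
    for u, y in pairs:
        gain = p * u / (lam + u * p * u)
        k += gain * (y - k * u)        # correct estimate with new measurement
        p = (p - gain * u * p) / lam   # update covariance, forgetting old data
    return k

# The true coefficient drifts from 2.0 to 3.0 mid-stream; because old
# data is forgotten, the estimate follows the drift.
stream = [(1.0, 2.0)] * 20 + [(1.0, 3.0)] * 60
k_hat = rls_with_forgetting(stream)
```

The forgetting factor is the design choice that makes the model adaptive: with lam = 1 the estimator would average over all history and could not track changing drying conditions.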
Briefing: Factored material properties and limit state loads - unlikely extreme or impossible pretense
Abstract:
In the limit state design (LSD) method, each design criterion is formally stated and assessed using a performance function. The performance function defines the relationship between the design parameters and the design criterion. In practice, LSD involves factoring up loads and factoring down calculated strengths and material parameters. This provides a convenient way to carry out routine probability-based design. The factors are statistically calculated to produce a design with an acceptably low probability of failure. Hence the ultimate load and the design material properties are mathematical concepts that have no physical interpretation. They may be physically impossible. Similarly, the appropriate analysis model is also defined by the performance function and may not describe the real behaviour at the perceived physical equivalent limit condition. These points must be understood to avoid confusion in the discussion and application of partial factor LSD methods.
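The factoring described above can be sketched as a simple check; the factor values below are illustrative, not taken from any particular design code:

```python
# Minimal sketch of a partial-factor limit state check: the load is
# factored up and the material strength factored down, and the criterion
# is evaluated on these notional (possibly physically impossible) design
# values. Factor values are illustrative, not from any code of practice.

def limit_state_ok(load, strength, gamma_load=1.5, gamma_material=1.15):
    design_load = load * gamma_load               # factored-up load
    design_strength = strength / gamma_material   # factored-down strength
    return design_strength >= design_load

ok    = limit_state_ok(load=100.0, strength=200.0)  # 173.9 >= 150.0
fails = limit_state_ok(load=100.0, strength=160.0)  # 139.1 <  150.0
```

The point made in the abstract is visible here: `design_load` and `design_strength` are statistical constructs for achieving a target failure probability, not physical quantities the structure will ever experience.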
Abstract:
We report the first steps of a collaborative project between the University of Queensland, Polyflow, Michelin, SK Chemicals, and RMIT University on the simulation, validation and application of a recently introduced constitutive model designed to describe branched polymers. Whereas much progress has been made on predicting the complex flow behaviour of many polymers, in particular linear ones, it sometimes appears difficult to predict simultaneously shear thinning and extensional strain hardening behaviour using traditional constitutive models. Recently, a new viscoelastic model based on molecular topology was proposed by McLeish and Larson (1998). We explore the predictive power of a differential multi-mode version of the pom-pom model for the flow behaviour of two commercial polymer melts: a (long-chain branched) low-density polyethylene (LDPE) and a (linear) high-density polyethylene (HDPE). The model responses are compared to elongational recovery experiments published by Langouche and Debbaut (1999), and to start-up of simple shear flow and stress relaxation after simple and reverse step strain experiments carried out in our laboratory.
Abstract:
Low concentrate density from wet drum magnetic separators in dense medium circuits can cause operating difficulties due to an inability to obtain the required circulating medium density and, indirectly, high medium solids losses. The literature is almost silent on the processes controlling concentrate density. However, the common name for the region through which concentrate is discharged, the 'squeeze pan gap', implies that some extrusion process is thought to be at work. There is no model of magnetics recovery in a wet drum magnetic separator which includes as inputs all significant machine and operating variables. A series of trials, in both factorial experiments and single variable experiments, was done using a purpose-built rig which featured a small industrial scale (700 mm lip length, 900 mm drum diameter) wet drum magnetic separator. A substantial data set of 191 trials was generated in this work. The results of the factorial experiments were used to identify the variables having a significant effect on magnetics recovery. It is proposed, based both on the experimental observations of the present work and on observations reported in the literature, that the process controlling magnetic separator concentrate density is one of drainage. Such a process should be able to be defined by an initial moisture, a drainage rate and a drainage time, the latter being defined by the volumetric flowrate and the volume within the drainage zone. The magnetics can be characterised by an experimentally derived ultimate drainage moisture. A model based on these concepts and containing adjustable parameters was developed. This model was then fitted to a randomly chosen 80% of the data, and validated by application to the remaining 20%. The model is shown to be a good fit to data over concentrate solids content values from 40% solids to 80% solids and for both magnetite and ferrosilicon feeds. (C) 2003 Elsevier Science B.V. All rights reserved.
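The proposed drainage process, an initial moisture decaying toward an ultimate drainage moisture over the drainage time, can be sketched with an assumed first-order form and a least-squares fit of its rate parameter; the model form and numbers below are illustrative, not the paper's:

```python
import math

# Hedged sketch of the drainage idea: moisture decays from an initial
# value toward the magnetics' ultimate drainage moisture over the
# drainage time. A first-order form and all numbers are assumed for
# illustration; the paper's actual model is not reproduced here.

def moisture(t, m0, m_ult, k):
    """Moisture after drainage time t, rate constant k."""
    return m_ult + (m0 - m_ult) * math.exp(-k * t)

def fit_k(times, observed, m0, m_ult):
    """Fit the single adjustable parameter k by least squares (grid search)."""
    best_k, best_err = None, float("inf")
    for i in range(1, 501):
        k = i * 0.01
        err = sum((moisture(t, m0, m_ult, k) - y) ** 2
                  for t, y in zip(times, observed))
        if err < best_err:
            best_k, best_err = k, err
    return best_k

# Synthetic "data" generated with k = 0.8, then recovered by the fit.
times = [0.5 * i for i in range(10)]
data = [moisture(t, m0=0.60, m_ult=0.20, k=0.8) for t in times]
k_hat = fit_k(times, data, m0=0.60, m_ult=0.20)
```

In the paper's scheme the drainage time itself is not free: it is set by the volumetric flowrate and the volume of the drainage zone, leaving only the adjustable parameters to be fitted to the 80% calibration split.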
Abstract:
This paper presents a new model, based on thermodynamics and molecular interactions between molecules, to describe the vapour-liquid phase equilibria and surface tension of a pure component. The model assumes that the bulk fluid can be characterised as a set of parallel layers. Because of this molecular structure, we call the model the molecular layer structure theory (MLST). Each layer has two energetic components. One is the interaction energy of one molecule of that layer with all surrounding layers. The other is the intra-layer Helmholtz free energy, which accounts for the internal energy and the entropy of that layer. The equilibrium between two coexisting phases is derived from the minimum of the grand potential, and the surface tension is calculated as the excess Helmholtz energy of the system. We test this model with a number of components (argon, krypton, ethane, n-butane, iso-butane, ethylene and sulphur hexafluoride), and the results are very satisfactory. (C) 2002 Elsevier Science B.V. All rights reserved.
Abstract:
This article recommends a new way to improve Refugee Status Determination (RSD) procedures by proposing a network society communicative model based on active involvement and dialogue among all implementing partners. This model, named after proposals from Castells, Habermas, Apel, Chimni, and Betts, would be mediated by the United Nations High Commissioner for Refugees (UNHCR), whose role would be modeled after the practice of the International Committee of the Red Cross (ICRC).
Abstract:
This paper presents the development of a solar photovoltaic (PV) model in PSCAD/EMTDC (Power System Computer Aided Design), including a mathematical model study. An additional algorithm was implemented in MATLAB in order to calculate several parameters required by the developed PSCAD model. The entire simulation study was performed with the PSCAD/MATLAB simulation tools. A real database of irradiance, cell temperature and PV power generation was used to support the evaluation of the implemented PV model.
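As a hedged sketch of the kind of PV performance model such studies parameterise (the paper's PSCAD model equations are not reproduced here), output power can be scaled with irradiance and derated with cell temperature:

```python
# Hedged sketch of a simplified PV performance model: power scales with
# irradiance and is derated linearly with cell temperature. All
# coefficients are illustrative assumptions, not the parameters of the
# PSCAD/MATLAB implementation described in the paper.

def pv_power(irradiance, cell_temp,
             p_stc=250.0,     # rated module power at STC (W), illustrative
             g_stc=1000.0,    # standard test condition irradiance (W/m^2)
             t_stc=25.0,      # standard test condition cell temp (degC)
             gamma=-0.004):   # power temperature coefficient (1/degC)
    """Module power from irradiance (W/m^2) and cell temperature (degC)."""
    return p_stc * (irradiance / g_stc) * (1.0 + gamma * (cell_temp - t_stc))

p_cool = pv_power(800.0, 25.0)  # 800 W/m^2 at STC temperature: 200.0 W
p_hot  = pv_power(800.0, 50.0)  # same irradiance, hot cell: derated to 180.0 W
```

This is why the paper's evaluation database pairs irradiance and cell temperature with measured PV generation: both inputs are needed before model output can be compared with real production.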
Abstract:
E-learning frameworks are conceptual tools for organizing networks of e-learning services. Most frameworks cover areas that go beyond the scope of e-learning, from course to financial management, and neglect the typical everyday activities of teachers and students at schools, such as the creation, delivery, resolution and evaluation of assignments. This paper presents the Ensemble framework, an e-learning framework exclusively focused on the teaching-learning process through the coordination of pedagogical services. The framework presents an abstract data, integration and evaluation model based on content and communications specifications. These specifications underpin the implementation of networks in specialized domains with complex evaluations. In this paper we specialize the framework for two such domains: computer programming and computer-aided design (CAD). For each domain we highlight two Ensemble hotspots: data and evaluation procedures. In the former, we formally describe the exercise and present possible extensions; in the latter, we describe the automatic evaluation procedures.
Abstract:
The participation of the Fraunhofer Institute for Manufacturing Engineering and Automation IPA (Stuttgart, Germany) and the companies User Interface Design GmbH (Ludwigsburg, Germany) and MLR System GmbH (Ludwigsburg, Germany) enabled the research and findings presented in this paper; we would like to mention by name Birgit Graf and Theo Jacobs (Fraunhofer IPA), as well as Peter Klein and Christiane Hartmann (User Interface Design GmbH).
Abstract:
Final Master's project carried out at the Laboratório Nacional de Engenharia Civil (LNEC) for the degree of Master in Civil Engineering from the Instituto Superior de Engenharia de Lisboa, within the scope of the cooperation protocol between ISEL and LNEC.