973 results for DEFINITION
Abstract:
We analyze the irreversibility and the entropy production in nonequilibrium interacting particle systems described by a Fokker-Planck equation through a suitable master equation representation. The irreversible character is provided either by nonconservative forces or by contact with heat baths at distinct temperatures. The expression for the entropy production is deduced from a general definition, related to the probability of a trajectory in phase space and that of its time reversal, which makes no a priori reference to the dissipated power. Our formalism is applied to calculate the heat conductance in a simple system consisting of two Brownian particles, each in contact with a heat reservoir. We also show the connection between the definition of the entropy production rate and the Jarzynski equality.
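A trajectory-based definition of this kind is conventionally written as follows (a standard form from stochastic thermodynamics, shown here for orientation; the symbols are illustrative, not quoted from the paper):

```latex
\Pi \;=\; \lim_{t \to \infty} \frac{k_B}{t}
\left\langle \ln \frac{\mathcal{P}[\gamma]}{\mathcal{P}[\tilde{\gamma}]} \right\rangle \;\ge\; 0
```

where \(\gamma\) is a trajectory in phase space, \(\tilde{\gamma}\) its time reversal, and the average is taken over trajectories; \(\Pi\) vanishes when the dynamics satisfies detailed balance, so no reference to the dissipated power is needed a priori.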
Abstract:
Cross sections of (120)Sn(alpha,alpha)(120)Sn elastic scattering have been extracted from the alpha-particle-beam contamination of a recent (120)Sn((6)He,(6)He)(120)Sn experiment. Both reactions are analyzed using systematic double-folding potentials in the real part and smoothly varying Woods-Saxon potentials in the imaginary part. The potential extracted from the (120)Sn((6)He,(6)He)(120)Sn data may be used as the basis for the construction of a simple global (6)He optical potential. The comparison of the (6)He and alpha data shows that the halo nature of the (6)He nucleus leads to a clear signature in the reflection coefficients eta(L): the relevant angular momenta L with eta(L) >> 0 and eta(L) << 1 are shifted to larger L with a broader distribution. This signature is not present in the alpha-scattering data and can thus be used as a new criterion for the definition of a halo nucleus.
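For orientation, the smoothly varying Woods-Saxon form used for the imaginary part is conventionally written as (parameter names are the standard ones, not values fitted in this work):

```latex
W(r) \;=\; \frac{-\,W_0}{1 + \exp\!\left[(r - R_W)/a_W\right]}
```

with depth \(W_0\), radius \(R_W\), and diffuseness \(a_W\) varied smoothly with energy, as opposed to the microscopically derived double-folding potential used in the real part.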
Abstract:
One important issue implied by the finite nature of real-world networks regards the identification of their more external (border) and internal nodes. The present work proposes a formal and objective definition of these properties, founded on the recently introduced concept of node diversity. It is shown that this feature does not exhibit any relevant correlation with several well-established complex networks measurements. A methodology for the identification of the borders of complex networks is described and illustrated with respect to theoretical (geographical and knitted networks) as well as real-world networks (urban and word association networks), yielding interesting results and insights in both cases.
Abstract:
In the Hammersley-Aldous-Diaconis process, infinitely many particles sit in R and at most one particle is allowed at each position. A particle at x, whose nearest neighbor to the right is at y, jumps at rate y - x to a position uniformly distributed in the interval (x, y). The basic coupling between trajectories with different initial configurations induces a process with different classes of particles. We show that the invariant measures for the two-class process can be obtained as follows. First, a stationary M/M/1 queue is constructed as a function of two homogeneous Poisson processes, the arrivals with rate lambda and the (attempted) services with rate rho > lambda. Then put first class particles at the instants of departures (effective services) and second class particles at the instants of unused services. The procedure is generalized for the n-class case by using n - 1 queues in tandem with n - 1 priority types of customers. A multi-line process is introduced; it consists of a coupling (different from Liggett's basic coupling) having as invariant measure the product of Poisson processes. The definition of the multi-line process involves the dual points of the space-time Poisson process used in the graphical construction of the reversed process. The coupled process is a transformation of the multi-line process, and its invariant measure is the transformation described above of the product measure.
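The queueing construction above can be sketched in code: two homogeneous Poisson processes are simulated on a finite window [0, T], and each attempted service either serves a queued customer (a departure, yielding a first class point) or finds the queue empty (an unused service, yielding a second class point). This is only an illustrative finite-window simulation, not the authors' construction of the stationary measure, which uses a stationary queue on the whole line.

```python
import random

def two_class_points(lam, rho, T, seed=0):
    """Place first/second class points from an M/M/1 queue driven by two
    Poisson processes on [0, T]: arrivals at rate lam, attempted services
    at rate rho > lam (stability condition)."""
    rng = random.Random(seed)

    def poisson_times(rate):
        # Homogeneous Poisson process: i.i.d. exponential inter-arrival times.
        t, out = 0.0, []
        while True:
            t += rng.expovariate(rate)
            if t > T:
                return out
            out.append(t)

    arrivals = poisson_times(lam)
    services = poisson_times(rho)

    queue, ai = 0, 0
    first, second = [], []
    for s in services:
        # Register all customers that arrived before this service attempt.
        while ai < len(arrivals) and arrivals[ai] <= s:
            queue += 1
            ai += 1
        if queue > 0:
            queue -= 1
            first.append(s)    # departure (effective service) -> first class
        else:
            second.append(s)   # unused service -> second class
    return first, second
```

Since both point sets are subsets of the service process, they are disjoint and ordered; for rho = 2*lam roughly half the attempted services go unused in equilibrium.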
Abstract:
Background -: Beta-2 adrenergic receptor gene polymorphisms Gln27Glu, Arg16Gly and Thr164Ile were suggested to have an effect in heart failure. We evaluated these polymorphisms relative to clinical characteristics and prognosis of a large cohort of patients with heart failure of different etiologies. Methods -: We studied 501 patients with heart failure of different etiologies. Mean age was 58 years (standard deviation 14.4 years), and 298 (60%) were men. Polymorphisms were identified by polymerase chain reaction-restriction fragment length polymorphism. Results -: During the mean follow-up of 12.6 months (standard deviation 10.3 months), 188 (38%) patients died. Distribution of genotypes of polymorphism Arg16Gly differed relative to body mass index (chi(2) = 9.797; p = 0.04). Overall, the probability of survival was not significantly predicted by genotypes of Gln27Glu, Arg16Gly, or Thr164Ile. Allele and haplotype analysis also did not disclose any significant difference regarding mortality. Exploratory analysis through classification trees pointed towards a potential association between the Gln27Glu polymorphism and mortality in older individuals. Conclusion -: In this study sample, we were not able to demonstrate an overall influence of the polymorphisms Gln27Glu and Arg16Gly of the beta-2 receptor gene on prognosis. Nevertheless, the Gln27Glu polymorphism may have a potential predictive value in older individuals.
Abstract:
Support for interoperability and interchangeability of software components which are part of a fieldbus automation system relies on the definition of open architectures, most of them involving proprietary technologies. Concurrently, standard, open and non-proprietary technologies, such as XML, SOAP, Web Services and the like, have greatly evolved and been diffused in the computing area. This article presents a FOUNDATION fieldbus (TM) device description technology named Open-EDD, based on XML and other related technologies (XSLT, DOM using the Xerces implementation, OO, XML Schema), proposing an open and non-proprietary alternative to the EDD (Electronic Device Description). This initial proposal includes defining Open-EDDML as the programming language of the technology in the FOUNDATION fieldbus (TM) protocol, implementing a compiler and a parser, and finally, integrating and testing the new technology using field devices and a commercial fieldbus configurator. This study attests that this new technology is feasible and can be applied to other configurators or HMI applications used in fieldbus automation systems. (c) 2008 Elsevier B.V. All rights reserved.
Abstract:
The Learning Object (OA) is any digital resource that can be reused to support learning with specific functions and objectives. OA specifications are commonly offered in the SCORM model without considering activities in groups. This deficiency was overcome by the solution presented in this paper, which specifies OAs for e-learning group activities based on the SCORM model. The solution allows the creation of dynamic objects which include content and software resources for collaborative learning processes. This results in a generalization of the OA definition and a contribution to e-learning specifications.
Abstract:
Ecological niche modelling combines species occurrence points with environmental raster layers in order to obtain models for describing the probabilistic distribution of species. The process to generate an ecological niche model is complex. It requires dealing with a large amount of data, use of different software packages for data conversion, for model generation and for different types of processing and analyses, among other functionalities. A software platform that integrates all requirements under a single and seamless interface would be very helpful for users. Furthermore, since biodiversity modelling is constantly evolving, new requirements are constantly being added in terms of functions, algorithms and data formats. This evolution must be accompanied by any software intended to be used in this area. In this scenario, a Service-Oriented Architecture (SOA) is an appropriate choice for designing such systems. According to SOA best practices and methodologies, the design of a reference business process must be performed prior to the architecture definition. The purpose is to understand the complexities of the process (business process in this context refers to the ecological niche modelling problem) and to design an architecture able to offer a comprehensive solution, called a reference architecture, that can be further detailed when implementing specific systems. This paper presents a reference business process for ecological niche modelling, as part of a major work focused on the definition of a reference architecture based on SOA concepts that will be used to evolve the openModeller software package for species modelling. The basic steps that are performed while developing a model are described, highlighting important aspects, based on the knowledge of modelling experts. In order to illustrate the steps defined for the process, an experiment was developed, modelling the distribution of Ouratea spectabilis (Mart.) Engl. (Ochnaceae) using openModeller. 
As a consequence of the knowledge gained with this work, many desirable improvements on the modelling software packages have been identified and are presented. Also, a discussion on the potential for large-scale experimentation in ecological niche modelling is provided, highlighting opportunities for research. The results obtained are very important for those involved in the development of modelling tools and systems, for requirement analysis and to provide insight on new features and trends for this category of systems. They can also be very helpful for beginners in modelling research, who can use the process and the experiment example as a guide to this complex activity. (c) 2008 Elsevier B.V. All rights reserved.
Abstract:
This paper contains a new proposal for the definition of the fundamental operation of query under the Adaptive Formalism, one capable of locating functional nuclei from descriptions of their semantics. To demonstrate the method`s applicability, an implementation of the query procedure constrained to a specific class of devices is shown, and its asymptotic computational complexity is discussed.
Abstract:
This paper addresses the use of optimization techniques in the design of a steel riser. Two methods are used: the genetic algorithm, which imitates the process of natural selection, and the simulated annealing, which is based on the process of annealing of a metal. Both of them are capable of searching a given solution space for the best feasible riser configuration according to predefined criteria. Optimization issues are discussed, such as problem codification, parameter selection, definition of objective function, and restrictions. A comparison between the results obtained for economic and structural objective functions is made for a case study. Optimization method parallelization is also addressed. [DOI: 10.1115/1.4001955]
Abstract:
The common practice of reconciliation is based on the definition of the mine call factor (MCF) and its application to resource or grade control estimates. The MCF expresses the difference, as a ratio or percentage, between the predicted grade and the grade reported by the plant. Its application therefore allows future estimates to be corrected. This practice is named reactive reconciliation. However, the use of generic factors applied across differing time scales and material types often disguises the causes of the error responsible for the discrepancy. The root causes of any given variance can only be identified by analyzing the information behind that variance and then changing methodologies and processes. This practice is named prognostication, or proactive reconciliation: an iterative process resulting in constant recalibration of the inputs and the calculations. Prognostication allows personnel to adjust processes so that results align within acceptable tolerance ranges, rather than only correcting model estimates. This study analyses the reconciliation practices performed at a gold mine in Brazil and suggests a new sampling protocol based on prognostication concepts.
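As a minimal numerical illustration of reactive reconciliation (the ratio convention below is one common choice; the abstract deliberately leaves the exact definition as "a ratio or percentage", and conventions vary between operations):

```python
def mine_call_factor(predicted_grade, plant_grade):
    """MCF as the ratio of the grade reported by the plant to the
    predicted grade (times 100 when quoted as a percentage).
    Note: some operations use the inverse ratio."""
    return plant_grade / predicted_grade

def reconciled_estimate(next_prediction, mcf):
    # Reactive reconciliation: scale the next model estimate by the factor.
    return next_prediction * mcf
```

For example, a block predicted at 2.0 g/t that the plant reports at 1.8 g/t gives an MCF of 0.9, so a future estimate of 2.5 g/t would be corrected to 2.25 g/t. Proactive reconciliation would instead ask why the 10% shortfall arose, for instance in the sampling protocol, before applying any factor.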
Abstract:
This work deals with the problem of minimizing the waste of space that occurs in a rotational placement of a set of irregular bi-dimensional items inside a bi-dimensional container. This problem is approached with a heuristic based on Simulated Annealing (SA) with adaptive neighborhood. The objective function is evaluated in a constructive approach, where the items are placed sequentially. The placement is governed by three different types of parameters: the sequence of placement, the rotation angle and the translation. The rotation and the translation applied to each polygon are cyclic continuous parameters, while the sequence of placement defines a combinatorial problem. Thus it is necessary to control cyclic continuous and discrete parameters simultaneously. The approaches described in the literature deal with only one type of parameter (sequence of placement or translation). In the proposed SA algorithm, the sensitivity of each continuous parameter is evaluated at each iteration, increasing the number of accepted solutions. The sensitivity of each parameter is associated with its probability distribution in the definition of the next candidate.
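The adaptation of the continuous parameters can be illustrated with a minimal SA sketch (our simplification, with hypothetical update factors: here a parameter's proposal width is widened when its moves are accepted and narrowed when they are rejected, as a crude stand-in for the sensitivity-driven probability distributions described above):

```python
import math
import random

def adaptive_sa(f, x0, widths, iters=5000, T0=1.0, seed=0):
    """Minimize f over continuous parameters with a per-parameter
    adaptive neighborhood. One parameter is perturbed per iteration;
    its proposal width adapts to the acceptance outcome."""
    rng = random.Random(seed)
    x, w = list(x0), list(widths)
    fx = f(x)
    for k in range(iters):
        T = T0 / (1 + k)               # simple cooling schedule
        i = rng.randrange(len(x))      # pick one parameter to perturb
        cand = list(x)
        cand[i] += rng.uniform(-w[i], w[i])
        fc = f(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / max(T, 1e-12)):
            x, fx = cand, fc
            w[i] = min(w[i] * 1.05, 10.0)   # accepted: widen neighborhood
        else:
            w[i] = max(w[i] * 0.95, 1e-6)   # rejected: narrow neighborhood
    return x, fx
```

In the placement problem the continuous parameters would additionally be wrapped cyclically (angles modulo 2*pi) and interleaved with moves on the discrete placement sequence, which this sketch omits.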
Abstract:
The paper discusses the effect of stress triaxiality on the onset and evolution of damage in ductile metals. A series of tests including shear tests and experiments on smooth and pre-notched tension specimens was carried out for a wide range of stress triaxialities. The underlying continuum damage model is based on a kinematic definition of damage tensors. The modular structure of the approach is accomplished by the decomposition of strain rates into elastic, plastic and damage parts. Free energy functions with respect to fictitious undamaged configurations as well as damaged ones are introduced separately, leading to elastic material laws which are affected by increasing damage. In addition, a macroscopic yield condition and a flow rule are used to adequately describe the plastic behavior. Numerical simulations of the experiments are performed and good correlation of tests and numerical results is achieved. Based on experimental and numerical data the damage criterion formulated in stress space is quantified. Different branches of this function are taken into account corresponding to different damage modes depending on stress triaxiality and Lode parameter. In addition, identification of material parameters is discussed in detail. (C) 2007 Elsevier Ltd. All rights reserved.
Abstract:
Simulated annealing (SA) is an optimization technique that can process cost functions with varying degrees of nonlinearity, discontinuity and stochasticity. It can handle arbitrary boundary conditions and constraints imposed on these cost functions. The SA technique is applied here to the problem of robot path planning. Three situations are considered: the path represented as a polyline, as a Bezier curve, and as a spline interpolated curve. In the proposed SA algorithm, the sensitivity of each continuous parameter is evaluated at each iteration, increasing the number of accepted solutions. The sensitivity of each parameter is associated with its probability distribution in the definition of the next candidate. (C) 2010 Elsevier Ltd. All rights reserved.
Abstract:
The practicability of estimating directional wave spectra based on a vessel`s 1st order response has been recently addressed by several researchers. Different alternatives regarding statistical inference methods and possible drawbacks that could arise from their application have been extensively discussed, with an apparent preference for estimations based on Bayesian inference algorithms. Most of the results on this matter, however, rely exclusively on numerical simulations or at best on few and sparse full-scale measurements, comprising a questionable basis for validation purposes. This paper discusses several issues that have recently been debated regarding the advantages of Bayesian inference and different alternatives for its implementation. Among those are the definition of the best set of input motions, the number of parameters required for guaranteeing smoothness of the spectrum in frequency and direction and how to determine their optimum values. These subjects are addressed in the light of an extensive experimental campaign performed with a small-scale model of an FPSO platform (VLCC hull), which was conducted in an ocean basin in Brazil. Tests involved long and short crested seas with variable levels of directional spreading and also bimodal conditions. The calibration spectra measured in the tank by means of an array of wave probes configured the paradigm for estimations. Results showed that a wide range of sea conditions could be estimated with good precision, even those with somewhat low peak periods. Some possible drawbacks that have been pointed out in previous works concerning the viability of employing large vessels for such a task are then refuted. 
Also, it is shown that a second parameter for smoothing the spectrum in frequency may indeed increase the accuracy in some situations, although the criterion usually proposed for estimating the optimum values (ABIC) demands large computational effort and does not seem adequate for practical on-board systems, which require expeditious estimations. (C) 2009 Elsevier Ltd. All rights reserved.