94 results for Engineering, Industrial | Engineering, System Science | Operations Research
Abstract:
Reusable and evolvable Software Engineering Environments (SEEs) are essential to software production and have increasingly become a necessity. From another perspective, software architectures and reference architectures have played a significant role in determining the success of software systems. In this paper we present a reference architecture for SEEs, named RefASSET, which is based on concepts coming from the aspect-oriented approach. This architecture is specialized to the software testing domain, and the development of tools for that domain is discussed. This and other case studies have pointed out that the use of aspects in RefASSET provides a better separation of concerns, resulting in reusable and evolvable SEEs. (C) 2011 Elsevier Inc. All rights reserved.
Abstract:
Ubiquitous computing aims at providing services to users in everyday environments such as the home. One research theme in this area is that of building capture and access applications which support information to be recorded (captured) during a live experience toward automatically producing documents for review (accessed). The recording demands instrumented environments with devices such as microphones, cameras, sensors and electronic whiteboards. Since each experience is usually related to many others (e.g. several meetings of a project), there is a demand for mechanisms supporting the automatic linking among documents relative to different experiences. In this paper we present original results relative to the integration of our previous efforts in the Infrastructure for Capturing, Accessing, Linking, Storing and Presenting information (CALiSP).
Abstract:
A novel global optimization method based on an Augmented Lagrangian framework is introduced for continuous constrained nonlinear optimization problems. At each outer iteration k the method requires the epsilon_k-global minimization of the Augmented Lagrangian with simple constraints, where epsilon_k -> epsilon. Global convergence to an epsilon-global minimizer of the original problem is proved. The subproblems are solved using the alphaBB method. Numerical experiments are presented.
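The outer loop described above can be sketched on a toy equality-constrained problem. Note the hedges: the paper solves its subproblems with the alphaBB method, whereas the closed-form inner solve below works only for this quadratic toy instance; all names and values are illustrative.

```python
def augmented_lagrangian(rho=10.0, lam=0.0, iters=50, tol=1e-8):
    # Toy problem: minimize f(x) = x^2  subject to  h(x) = x - 1 = 0.
    # Classical augmented Lagrangian outer loop:
    #   L(x) = x^2 + lam*(x - 1) + (rho/2)*(x - 1)^2
    # which is quadratic in x, so the inner minimization has a
    # closed-form solution from dL/dx = 0.
    x = 0.0
    for _ in range(iters):
        x = (rho - lam) / (2.0 + rho)  # exact inner minimizer
        h = x - 1.0                    # constraint violation
        if abs(h) < tol:
            break
        lam += rho * h                 # first-order multiplier update
        rho *= 2.0                     # tighten penalty while infeasible
    return x, lam

x_star, lam_star = augmented_lagrangian()
```

The iterates approach the KKT pair x = 1, lambda = -2 of the toy problem as the penalty parameter grows.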
Abstract:
In the late seventies, Megiddo proposed a way to use an algorithm for the problem of minimizing a linear function a_0 + a_1 x_1 + ... + a_n x_n subject to certain constraints to solve the problem of minimizing a rational function of the form (a_0 + a_1 x_1 + ... + a_n x_n)/(b_0 + b_1 x_1 + ... + b_n x_n) subject to the same set of constraints, assuming that the denominator is always positive. Using a rather strong assumption, Hashizume et al. extended Megiddo's result to include approximation algorithms. Their assumption essentially asks for the existence of good approximation algorithms for optimization problems with possibly negative coefficients in the (linear) objective function, which is rather unusual for most combinatorial problems. In this paper, we present an alternative extension of Megiddo's result for approximations that avoids this issue and applies to a large class of optimization problems. Specifically, we show that, if there is an alpha-approximation for the problem of minimizing a nonnegative linear function subject to constraints satisfying a certain increasing property, then there is an alpha-approximation (1/alpha-approximation) for the problem of minimizing (maximizing) a nonnegative rational function subject to the same constraints. Our framework applies to covering problems and network design problems, among others.
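Reductions of this flavour are easiest to see in the classic Dinkelbach variant: repeatedly call a linear-objective oracle on num - t*den and refine the ratio guess t. The sketch below is that textbook variant on a tiny finite feasible set, not the paper's framework; the instance and names are illustrative.

```python
def dinkelbach_min_ratio(feasible, num, den, tol=1e-12, iters=100):
    # Minimize num(x)/den(x) over a finite feasible set, with
    # num(x) >= 0 and den(x) > 0 everywhere (as assumed above).
    # At ratio guess t, the *linear* subproblem min num(x) - t*den(x)
    # has optimal value 0 exactly when t is the optimal ratio.
    t = num(feasible[0]) / den(feasible[0])   # initial ratio guess
    x = feasible[0]
    for _ in range(iters):
        x = min(feasible, key=lambda v: num(v) - t * den(v))
        if abs(num(x) - t * den(x)) < tol:
            break                              # t is the optimal ratio
        t = num(x) / den(x)                    # refine the guess
    return x, t

# Toy instance: minimize (1 + x1 + 2*x2) / (1 + 3*x1 + x2)
# over three feasible points.
pts = [(0, 0), (1, 0), (0, 1)]
num = lambda v: 1 + v[0] + 2 * v[1]
den = lambda v: 1 + 3 * v[0] + v[1]
best_x, best_t = dinkelbach_min_ratio(pts, num, den)
```

Here the ratios of the three points are 1, 0.5 and 1.5, so the method settles on the point (1, 0) with ratio 0.5.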
Abstract:
In this note we discuss the convergence of Newton's method for minimization. We present examples in which the Newton iterates satisfy the Wolfe conditions and the Hessian is positive definite at each step, and yet the iterates converge to a non-stationary point. These examples answer a question posed by Fletcher in his 1987 book Practical Methods of Optimization.
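For contrast with the counterexamples, a plain Newton iteration on a well-behaved convex function does converge to the minimizer. The sketch below (toy function, purely illustrative) shows the iteration in question:

```python
def newton_minimize(fp, fpp, x0, iters=50, tol=1e-12):
    # Plain Newton iteration for unconstrained 1-D minimization:
    #   x_{k+1} = x_k - f'(x_k)/f''(x_k).
    # A positive definite Hessian at every iterate *usually* leads to a
    # stationary point; the counterexamples above show that the Wolfe
    # conditions plus positive definiteness alone do not guarantee it.
    x = x0
    for _ in range(iters):
        step = fp(x) / fpp(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# f(x) = x^4 + x^2 is strictly convex with unique minimizer x = 0.
x_min = newton_minimize(lambda x: 4*x**3 + 2*x,
                        lambda x: 12*x**2 + 2, x0=1.0)
```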
Abstract:
Two Augmented Lagrangian algorithms for solving KKT systems are introduced. The algorithms differ in the way in which penalty parameters are updated. Possibly infeasible accumulation points are characterized. It is proved that feasible limit points that satisfy the Constant Positive Linear Dependence constraint qualification are KKT solutions. Boundedness of the penalty parameters is proved under suitable assumptions. Numerical experiments are presented.
Abstract:
Social and economic development is closely associated with technological innovation and a well-developed biotechnological industry. In the last few years, Brazil's scientific production has been steadily increasing; however, the number of patents is lagging behind, and technological and translational research require governmental incentive and reinforcement. The Cell and Molecular Therapy Center (NUCEL) was created to carry out translational research, addressing concrete problems found in the biomedical and veterinary areas and actively searching for solutions. It employs a genetic engineering approach to generate cell lines over-expressing recombinant proteins to be transferred to local biotech companies, aiming to further a national competence for the local production of biopharmaceuticals of widespread use and life-saving importance. To this end, mammalian cell engineering technologies were used to generate cell lines over-expressing several different recombinant proteins of biomedical and biotechnological interest, namely recombinant human Amylin/IAPP for diabetes treatment, human FVIII and FIX clotting factors for hemophilia, human and bovine FSH for fertility and reproduction, and human bone repair proteins (BMPs). Expression of some of these proteins is also being sought with the baculovirus/insect cell system (BEVS) which, in many cases, is able to deliver high-yield production of recombinant proteins with biological activity comparable to that of mammalian systems, but in a much more cost-effective manner. Transfer of some of these recombinant products to local biotech companies has been pursued by taking advantage of Sao Paulo State Foundation (FAPESP) and Federal Government (FINEP, CNPq) incentives for joint Research, Development and Innovation partnership projects.
Abstract:
Complex networks have been characterised by their specific connectivity patterns (network motifs), but their building blocks can also be identified and described by node-motifs: a combination of local network features. One technique to identify single node-motifs was presented by Costa et al. (L. D. F. Costa, F. A. Rodrigues, C. C. Hilgetag, and M. Kaiser, Europhys. Lett., 87, 1, 2009). Here, we first suggest improvements to the method, including how its parameters can be determined automatically. Such automatic routines make high-throughput studies of many networks feasible. Second, the new routines are validated on different network series. Third, we provide an example of how the method can be used to analyse network time series. In conclusion, we provide a robust method for systematically discovering and classifying characteristic nodes of a network. In contrast to classical motif analysis, our approach can identify individual components (here: nodes) that are specific to a network. Such special nodes, like hubs before them, might be found to play critical roles in real-world networks.
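To give a minimal flavour of node-level features, the sketch below computes a two-component feature vector (degree, clustering coefficient) per node. The actual node-motif method combines a richer feature set, so this is illustrative only.

```python
def node_features(adj):
    # Per-node feature vector (degree, clustering coefficient): two of
    # the simplest local measurements that a node-motif approach could
    # combine.  `adj` maps each node to the set of its neighbours.
    feats = {}
    for v, nbrs in adj.items():
        k = len(nbrs)
        # Count edges among the neighbours of v (each pair once).
        links = sum(1 for u in nbrs for w in nbrs
                    if u < w and w in adj[u])
        cc = 2 * links / (k * (k - 1)) if k > 1 else 0.0
        feats[v] = (k, cc)
    return feats

# Small graph: a triangle {0, 1, 2} plus a pendant node 3 attached to 0.
adj = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1}, 3: {0}}
feats = node_features(adj)
```

Nodes with identical feature vectors (here nodes 1 and 2) would fall into the same node class, while the hub 0 and the pendant 3 stand out.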
Abstract:
The objective of this study was to estimate the first-order intrinsic kinetic constant (k_1) and the liquid-phase mass transfer coefficient (k_c) in a bench-scale anaerobic sequencing batch biofilm reactor (ASBBR) fed with glucose. A dynamic heterogeneous mathematical model, considering two phases (liquid and solid), was developed through mass balances in the liquid and solid phases. The model was adjusted to experimental data obtained from the ASBBR applied to the treatment of glucose-based synthetic wastewater with approximately 500 mg L^-1 of glucose, operating in 8 h batch cycles, at 30 °C and 300 rpm. The values of the parameters obtained were 0.8911 min^-1 for k_1 and 0.7644 cm min^-1 for k_c. The model was validated, utilizing the estimated parameters, with data obtained from the ASBBR operating in 3 h batch cycles, with a good representation of the experimental behavior. The solid-phase mass transfer flux was found to be the limiting step of the overall glucose conversion rate.
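A lumped, illustrative version of such a two-phase model can be integrated with explicit Euler. These are not the paper's exact balance equations: the volumetric transfer coefficient `kca` below is a hypothetical stand-in, since combining the reported k_c (in cm/min) with an interfacial area per unit volume is system-specific.

```python
def simulate_asbbr(k1=0.8911, kca=0.5, c_l0=500.0, dt=0.01, t_end=180.0):
    # Simplified lumped two-phase substrate model (illustrative only):
    # glucose transfers from the liquid phase to the biofilm (solid)
    # phase at rate kca*(c_l - c_s) [kca in 1/min, hypothetical value]
    # and is consumed there by first-order kinetics with k1 [1/min].
    c_l, c_s = c_l0, 0.0        # liquid and solid-phase concentrations
    t = 0.0
    while t < t_end:
        flux = kca * (c_l - c_s)        # liquid -> solid transfer
        c_l += dt * (-flux)
        c_s += dt * (flux - k1 * c_s)   # reaction inside the biofilm
        t += dt
    return c_l, c_s

c_liquid, c_solid = simulate_asbbr()   # state after one 3 h batch cycle
```

With these rates the substrate is essentially depleted well within the cycle, consistent with mass transfer into the solid phase being the rate-limiting step in this parameter regime.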
Abstract:
We describe a one-time signature scheme based on the hardness of the syndrome decoding problem, and prove it secure in the random oracle model. Our proposal can be instantiated on general linear error correcting codes, rather than restricted families like alternant codes for which a decoding trapdoor is known to exist. (C) 2010 Elsevier Inc. All rights reserved.
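For readers unfamiliar with one-time signatures, the classic hash-based Lamport scheme below illustrates the keygen/sign/verify shape shared by any OTS. It is only a conceptual analogue: the paper's construction is code-based (syndrome decoding), not hash-based.

```python
import hashlib
import secrets

def lamport_keygen(bits=8):
    # Secret key: one random pair of 32-byte strings per message bit.
    # Public key: the SHA-256 hashes of those strings.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32))
          for _ in range(bits)]
    pk = [(hashlib.sha256(a).digest(), hashlib.sha256(b).digest())
          for a, b in sk]
    return sk, pk

def lamport_sign(sk, msg_bits):
    # Reveal one preimage per bit; the key must never be reused.
    return [pair[bit] for pair, bit in zip(sk, msg_bits)]

def lamport_verify(pk, msg_bits, sig):
    return all(hashlib.sha256(s).digest() == pair[bit]
               for s, pair, bit in zip(sig, pk, msg_bits))

sk, pk = lamport_keygen()
bits = [1, 0, 1, 1, 0, 0, 1, 0]     # an 8-bit message digest
sig = lamport_sign(sk, bits)
ok = lamport_verify(pk, bits, sig)
```

Flipping any message bit invalidates the signature, since the revealed preimage no longer matches the published hash for that bit position.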
Abstract:
The crack formation during the photodegradation of polypropylene (PP) plates (1 mm thickness), with (PPOx) and without (PP) pro-oxidant, has been investigated. The plates were produced by extrusion in an industrial production line and were exposed to ultraviolet radiation in the laboratory for periods of up to 480 h. The samples were investigated by infrared spectroscopy (FTIR), optical light microscopy, differential scanning calorimetry (DSC) and X-ray diffraction (XRD). The results showed that the extent of the photodegradation process is greater for the PPOx than for the PP samples. For both samples, cracks were formed at the surface perpendicularly to the flow lines. However, the crack frequency differed between the two samples and between the two sides of each sample. The crack frequency was correlated with chain orientation, A_110; it was shown that lower degrees of orientation resulted in lower crack frequencies. POLYM. ENG. SCI., 48:365-372, 2008. (c) 2007 Society of Plastics Engineers.
Abstract:
The effect of ultraviolet exposure on the biodegradation of extruded poly(propylene) without (PP) and with 0.3 (wt/wt) (PPOx) pro-oxidant additive was studied. After UV exposure the samples were submitted to biodegradation (weight loss) in prepared soils. The samples before and after UV exposure were analyzed using differential scanning calorimetry, Fourier transform infrared spectroscopy, size exclusion chromatography, and optical microscopy. Exposure to UV radiation led to more intense degradation of PPOx than of PP; the amount of carbonyl groups was larger for the PPOx samples than for PP, as was the decrease in T_m and in the molecular weight. The samples exposed to UV radiation showed some level of fragmentation after 56 days when placed in the prepared soil; the samples which were exposed to UV for 480 h presented only a small weight loss. POLYM. ENG. SCI., 49:123-128, 2009. (C) 2008 Society of Plastics Engineers.
Abstract:
Several MPC applications implement a control strategy in which some of the system outputs are controlled within specified ranges or zones, rather than at fixed set points [J.M. Maciejowski, Predictive Control with Constraints, Prentice Hall, New Jersey, 2002]. This means that these outputs will be treated as controlled variables only when the predicted future values lie outside the boundary of their corresponding zones. The zone control is usually implemented by selecting an appropriate weighting matrix for the output error in the control cost function. When an output prediction is inside its zone, the corresponding weight is zeroed, so that the controller ignores this output. When the output prediction lies outside the zone, the error weight is made equal to a specified value and the distance between the output prediction and the boundary of the zone is minimized. The main problem of this approach, as far as stability of the closed loop is concerned, is that each time an output is switched from the status of non-controlled to the status of controlled, or vice versa, a different linear controller is activated. Thus, throughout the continuous operation of the process, the control system keeps switching from one controller to another. Even if a stabilizing control law is developed for each of the control configurations, switching among stable controllers does not necessarily produce a stable closed-loop system. Here, a stable MPC is developed for the zone control of open-loop stable systems. Focusing on the practical application of the proposed controller, it is assumed that in the control structure of the process system there is an upper optimization layer that defines optimal targets for the system inputs. The performance of the proposed strategy is illustrated by simulation of a subsystem of an industrial FCC system. (C) 2008 Elsevier Ltd. All rights reserved.
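The weight-switching rule described above reduces, per output, to a simple piecewise cost term. A minimal sketch (names, zone limits and weight are illustrative, not the paper's formulation):

```python
def zone_error(y_pred, zone_lo, zone_hi, weight=1.0):
    # Zone-control cost term for one predicted output: inside the zone
    # the weight is effectively zeroed (the controller ignores this
    # output); outside, the squared distance to the nearest zone
    # boundary is penalized with the specified weight.
    if zone_lo <= y_pred <= zone_hi:
        return 0.0
    boundary = zone_lo if y_pred < zone_lo else zone_hi
    return weight * (y_pred - boundary) ** 2

# A prediction inside the zone [2, 5] contributes nothing to the cost;
# one unit above the upper boundary contributes weight * 1^2.
inside = zone_error(3.0, 2.0, 5.0)
above = zone_error(6.0, 2.0, 5.0)
```

The discontinuous switch between the two branches is exactly what changes the active linear controller, which is why closed-loop stability requires the extra care the abstract discusses.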
Diagnostic errors and repetitive sequential classifications in on-line process control by attributes
Abstract:
The procedure of on-line process control by attributes, known as Taguchi's on-line process control, consists of inspecting the m-th item (a single item) at every m produced items and deciding, at each inspection, whether the fraction of conforming items was reduced or not. If the inspected item is non-conforming, the production is stopped for adjustment. As the inspection system can be subject to diagnosis errors, we develop a probabilistic model that classifies the examined item repeatedly until a conforming or b non-conforming classifications are observed. The first event that occurs (a conforming classifications or b non-conforming classifications) determines the final classification of the examined item. Properties of an ergodic Markov chain were used to obtain the expression of the average cost of the control system, which can be optimized by three parameters: the sampling interval of the inspections (m); the number of repeated conforming classifications (a); and the number of repeated non-conforming classifications (b). The optimum design is compared with two alternative approaches. The first one consists of a simple preventive policy: the production system is adjusted at every n produced items (no inspection is performed). The second classifies the examined item repeatedly a fixed number of times, r, and considers it conforming if most classification results are conforming. Results indicate that the current proposal performs better than the procedure that fixes the number of repeated classifications and classifies the examined item as conforming if most classifications were conforming. On the other hand, depending on the degree of errors and the costs, the preventive policy can be, on average, more economical than the alternatives that require inspection. A numerical example illustrates the proposed procedure. (C) 2009 Elsevier B.V. All rights reserved.
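The repeated-classification rule (stop at a conforming or b non-conforming results) has a simple recursive probability. The sketch below computes the chance that a conforming item is finally classified as conforming, given a per-inspection error rate; it is illustrative and is not the paper's Markov-chain cost expression.

```python
from functools import lru_cache

def p_classified_conforming(p, a, b):
    # Probability that `a` conforming classifications are observed
    # before `b` non-conforming ones, when each single classification
    # comes out "conforming" independently with probability p.
    @lru_cache(maxsize=None)
    def rec(i, j):      # i conforming, j non-conforming seen so far
        if i == a:
            return 1.0
        if j == b:
            return 0.0
        return p * rec(i + 1, j) + (1 - p) * rec(i, j + 1)
    return rec(0, 0)

# A conforming item misclassified 10% of the time (p = 0.9):
single = p_classified_conforming(0.9, 1, 1)   # one-shot classification
repeat = p_classified_conforming(0.9, 2, 2)   # a = b = 2 repeated rule
```

The repeated rule raises the probability of a correct final classification from 0.9 to 0.972 in this example, at the price of extra inspections; trading that improvement against inspection cost is what the optimization over (m, a, b) captures.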
Abstract:
This work presents a method for predicting resource availability in opportunistic grids by means of use pattern analysis (UPA), a technique based on non-supervised learning methods. The prediction method is based on the assumption that there exist several classes of computational resource use patterns, which can be used to predict resource availability. Trace-driven simulations validate this basic assumption and also provide the parameter settings for the accurate learning of resource use patterns. Experiments made with an implementation of the UPA method show the feasibility of its use in the scheduling of grid tasks with very little overhead. The experiments also demonstrate the method's superiority over other predictive and non-predictive methods. An adaptive prediction method is suggested to deal with the lack of training data at initialization. Further adaptive behaviour is motivated by experiments which show that, in some special environments, reliable resource use patterns may not always be detected. Copyright (C) 2009 John Wiley & Sons, Ltd.
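The use-pattern idea can be given a minimal flavour with plain k-means over fixed-length availability traces. The sketch below (toy traces, naive deterministic initialization) is only a stand-in for the UPA method, not its actual algorithm.

```python
def kmeans(traces, k=2, iters=20):
    # Minimal k-means over fixed-length availability traces (one value
    # per hour, 1.0 = machine fully idle/available).  Each resulting
    # centroid is a candidate "use pattern" that could be used to
    # predict future availability of resources assigned to its cluster.
    centroids = [list(t) for t in traces[:k]]        # naive init
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for t in traces:
            d = [sum((a - b) ** 2 for a, b in zip(t, c))
                 for c in centroids]
            groups[d.index(min(d))].append(t)        # nearest centroid
        for i, g in enumerate(groups):
            if g:
                centroids[i] = [sum(col) / len(g) for col in zip(*g)]
    return centroids

# Two obvious patterns: daytime-busy desktops vs always-idle servers.
day_busy = [(0.0, 0.0, 1.0, 1.0), (0.1, 0.0, 0.9, 1.0)]
always_idle = [(1.0, 1.0, 1.0, 1.0), (0.9, 1.0, 1.0, 0.9)]
cents = kmeans(day_busy + always_idle, k=2)
```

On this toy data the two centroids separate cleanly into a low-availability daytime pattern and a high-availability pattern, which is the kind of class structure the prediction method assumes.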