931 results for Software Product Lines


Relevance: 30.00%

Abstract:

Generalized hypercompetitiveness in world markets has created the need to offer better products to potential and actual clients in order to gain an advantage over competitors. To ensure the production of an adequate product, enterprises need to work on the efficiency and efficacy of their business processes (BPs) by means of the construction of Interactive Information Systems (IISs, including Interactive Multimedia Documents), so that those processes are carried out more fluidly and correctly. The construction of the correct IIS is a major task that can only be successful if the needs of every stakeholder are taken into account. Their requirements must be defined with precision and extensively analyzed, and the system must consequently be accurately designed in order to minimize implementation problems, so that the IIS is produced on schedule and with as few mistakes as possible. The main contribution of this thesis is the proposal of Goals, a software (engineering) construction process which aims at defining the tasks to be carried out in order to develop software. This process defines the stakeholders, the artifacts, and the techniques that should be applied to achieve correctness of the IIS. Complementarily, this process suggests two methodologies to be applied in the initial phases of the software engineering lifecycle: Process Use Cases for the requirements phase, and MultiGoals for the analysis and design phases. Process Use Cases is a UML-based (Unified Modeling Language), goal-driven and use-case-oriented methodology for the definition of functional requirements. It uses an information-oriented strategy to identify BPs while constructing the enterprise's information structure, and finishes with the identification of use cases within the design of these BPs. This approach provides a useful tool for both Business Process Management and Software Engineering activities. MultiGoals is a UML-based, use-case-driven and architecture-centric methodology for the analysis and design of IISs with support for multimedia. It proposes the analysis of user tasks as the basis for the design of: (i) the user interface; (ii) the system behaviour, which is modeled by means of patterns that can combine multimedia and standard information; and (iii) the database and media contents. This thesis presents the theory behind these approaches, accompanied by examples from a real project that support the understanding of the techniques used.

Relevance: 30.00%

Abstract:

In hydrocarbon exploration, the great enigma is the location of the deposits. Great efforts are made to identify and locate them more accurately while improving the cost-effectiveness of oil extraction. Seismic methods are the most widely used because they are indirect, i.e., they probe the subsurface layers without invading them. A seismogram is a representation of the Earth's interior and its structures through a conveniently arranged display of the data obtained by seismic reflection. A major problem with this representation is the intensity and variety of noise present in the seismogram, such as surface-borne noise that contaminates the relevant signals and may mask the desired information carried by waves scattered in deeper regions of the geological layers. A tool based on 1D and 2D wavelet transforms was developed to suppress these noises. The program, written in Java, separates seismic images according to their directions (horizontal, vertical, mixed or local) and the wavelength bands that form them, using Daubechies wavelets, auto-resolution and the tensor product of wavelet bases. In addition, an option was developed to process a single image using the tensor product of two one-dimensional wavelets or the tensor product of a one-dimensional wavelet by the identity. In the latter case, the two-dimensional signal is decomposed by a wavelet in a single direction. This decomposition makes it possible to stretch the two-dimensional wavelets along a given direction, correcting scale effects through the application of auto-resolutions. In other words, the treatment of a seismic image is improved by using 1D and 2D wavelets at different stages of auto-resolution. Improvements were also implemented in the display of the images associated with the decompositions at each auto-resolution step, facilitating the choice of images containing the signals of interest for noise-free image reconstruction. The program was tested with real data and produced good results.
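
As a rough illustration of the directional separation described above (the original tool is written in Java; this Python sketch assumes the PyWavelets library, and the function names are made up):

```python
import numpy as np
import pywt

def separate_directions(image, wavelet="db4"):
    """Split an image into an approximation plus horizontal, vertical
    and diagonal ('mixed') detail bands with a single-level 2D DWT."""
    cA, (cH, cV, cD) = pywt.dwt2(image, wavelet)
    return {"approx": cA, "horizontal": cH, "vertical": cV, "mixed": cD}

def suppress_band(image, band="vertical", wavelet="db4"):
    """Reconstruct the image with one directional band zeroed out,
    a crude stand-in for suppressing surface-borne noise."""
    cA, (cH, cV, cD) = pywt.dwt2(image, wavelet)
    if band == "horizontal":
        cH = np.zeros_like(cH)
    elif band == "vertical":
        cV = np.zeros_like(cV)
    elif band == "mixed":
        cD = np.zeros_like(cD)
    return pywt.idwt2((cA, (cH, cV, cD)), wavelet)

# Single-direction decomposition (a 1D wavelet "tensored" with the
# identity): apply the DWT along one axis of the image only.
def decompose_single_direction(image, axis=0, wavelet="db4"):
    return pywt.dwt(image, wavelet, axis=axis)  # (approx, detail) pair
```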

Relevance: 30.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 30.00%

Abstract:

This paper presents a hybrid approach, mixing the time and frequency domains, for transmission line modelling. The proposed methodology handles a steady fundamental signal mixed with fast and slow transients, including impulsive and oscillatory behaviour. A transmission line model is developed based on a lumped-element representation and state-space techniques. The methodology is an easy and practical procedure for modelling a three-phase transmission line directly in the time domain, without the explicit use of inverse transforms. It takes into account the frequency-dependent parameters of the line, considering the soil and skin effects; a fitting method is applied to include this dependence in the state matrices. Furthermore, the accuracy of the developed model is verified in the frequency domain by a simple methodology based on the line's distributed parameters and the transfer function relating the input/output signals of the lumped-parameter representation. In addition, the article proposes the use of a fast and robust analytic integration procedure to solve the state equations, enabling transient and steady-state simulations. The results are compared with those obtained with the commercial software Microtran (EMTP), considering a three-phase transmission line typical of the Brazilian transmission system.
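
As a hedged illustration of the lumped-element, state-space idea with analytic (exact) integration of the state equations — a single pi-like section with assumed parameters, not the paper's frequency-fitted cascaded model:

```python
import numpy as np
from scipy.linalg import expm

R, L, C = 0.05, 1e-3, 1e-8           # per-section parameters (assumed)
dt = 1e-7                            # time step

# States: x = [i_L, v_C]; input u = source voltage at the sending end.
A = np.array([[-R / L, -1.0 / L],
              [ 1.0 / C,  0.0    ]])
B = np.array([[1.0 / L], [0.0]])

Ad = expm(A * dt)                    # exact state-transition matrix
Bd = np.linalg.solve(A, (Ad - np.eye(2)) @ B)   # zero-order-hold input

x = np.zeros((2, 1))
t = np.arange(0, 2e-4, dt)
v_C = np.empty_like(t)
for k, tk in enumerate(t):
    u = np.array([[np.sin(2 * np.pi * 60 * tk)]])   # 60 Hz source
    x = Ad @ x + Bd @ u              # analytic update, no numeric ODE solver
    v_C[k] = x[1, 0]                 # receiving-end capacitor voltage
```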

Relevance: 30.00%

Abstract:

Motion estimation is responsible for most of the data reduction in digital video encoding. It is also the most computationally demanding step. H.264 is the newest standard for video compression and was designed to double the compression ratio achieved by previous standards. It was developed by the ITU-T Video Coding Experts Group (VCEG) together with the ISO/IEC Moving Picture Experts Group (MPEG) as the product of a partnership effort known as the Joint Video Team (JVT). H.264 introduces novelties that improve motion estimation efficiency, such as the adoption of variable block sizes, quarter-pixel precision and multiple reference frames. This work defines a hardware/software architecture for motion estimation using a full-search algorithm, variable block sizes and mode decision. It considers the use of reconfigurable devices, soft processors and development tools for embedded systems such as Quartus II, SOPC Builder, Nios II and ModelSim.
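
For reference, a minimal software sketch of full-search block matching with the sum of absolute differences (SAD); the work itself targets hardware, and all parameters below are illustrative:

```python
import numpy as np

def full_search(cur, ref, bx, by, block=16, radius=8):
    """Return the motion vector (dx, dy) minimizing the SAD between the
    block at (bx, by) in `cur` and candidate blocks in `ref`."""
    h, w = cur.shape
    target = cur[by:by + block, bx:bx + block].astype(np.int32)
    best, best_mv = None, (0, 0)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            x, y = bx + dx, by + dy
            if x < 0 or y < 0 or x + block > w or y + block > h:
                continue  # candidate falls outside the reference frame
            cand = ref[y:y + block, x:x + block].astype(np.int32)
            sad = np.abs(target - cand).sum()  # sum of absolute differences
            if best is None or sad < best:
                best, best_mv = sad, (dx, dy)
    return best_mv, best
```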

Relevance: 30.00%

Abstract:

This work presents a design method for building software components, in a rigorous fashion, from the functional software model down to the assembly-code level. The method is based on the B method, which was developed with the support and interest of British Petroleum (BP). One goal of this methodology is to contribute to solving an important problem known as the Verifying Compiler. In addition, this work describes a formal model of the Z80 microcontroller and of a real system from the petroleum field. The formal Z80 model was developed and documented because it is a key component for verification down to the assembly level. To refine the methodology, it was applied to a petroleum production test system, which is presented in this work. Part of the technique is performed manually, but most of its activities can be automated by a specific compiler; the formal models of the microcontroller and of the production test system should provide relevant knowledge and experience for the design of such a compiler. In summary, this work aims to improve the viability of one of the most stringent criteria for formal verification: speeding up the verification process, reducing design time and increasing the quality and reliability of the final software product. All these qualities are very important for systems that involve serious risks or need high confidence, which is very common in the petroleum industry.
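
To give a flavour of what formally modelling an instruction set involves (the thesis uses the B method; the toy Python model below covers only a simplified subset of one Z80 instruction's flag semantics):

```python
from dataclasses import dataclass, field

@dataclass
class Z80State:
    a: int = 0                      # accumulator (8 bits)
    flags: dict = field(default_factory=lambda: {"Z": 0, "C": 0, "S": 0})

def add_a_n(s: Z80State, n: int) -> Z80State:
    """Semantics of ADD A,n restricted to the Z, C and S flags
    (the real instruction also updates H, P/V and N)."""
    total = s.a + (n & 0xFF)
    result = total & 0xFF
    return Z80State(
        a=result,
        flags={
            "Z": int(result == 0),          # zero flag
            "C": int(total > 0xFF),         # carry out of bit 7
            "S": int(result & 0x80 != 0),   # sign flag (bit 7)
        },
    )

# A property a B-style proof obligation might state, checked here by
# brute force: the carry flag is set exactly when the true sum overflows.
assert all(
    add_a_n(Z80State(a=a), n).flags["C"] == int(a + n > 0xFF)
    for a in range(256) for n in range(256)
)
```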

Relevance: 30.00%

Abstract:

An EtOH extract of the leaves of Casearia sylvestris afforded a new clerodane diterpene, casearin X, together with the known compounds casearins B, D, L, and O, and caseargrewiin F. Casearin X degraded to the corresponding dialdehyde when stored in CDCl(3). The isolated diterpenes were cytotoxic to human cancer cell lines, with caseargrewiin F being the most active and the new clerodane, casearin X, the second most active, with IC(50) values comparable to the positive control doxorubicin. All isolated diterpenes showed lower activities against normal human cells than against cancer cell lines, which might indicate a selective action on cancer cells. The casearin X dialdehyde was not cytotoxic to cancer cells, indicating that the presence of these CO groups at C(18) and C(19) is incompatible with the cytotoxic activity.

Relevance: 30.00%

Abstract:

We analyze some fluid flow models using the graphical software F(C): Funções Complexas. We describe the equations that express the complex potential, as well as the complex velocity, for each model. The models studied are uniform flow, flow with a source, flow with a sink, combined flow, circular flow and flow around an obstacle. We present the concept of domain coloring and how to read the resulting plots. Each model is presented through examples, including geometric representations of the streamlines and equipotential curves, as well as plots of the complex potential and the complex velocity.
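
For example, the classical complex potential for uniform flow of speed U past a circular obstacle of radius R is W(z) = U(z + R²/z), with complex velocity dW/dz = U(1 − R²/z²). A minimal NumPy sketch (the original work uses F(C), so this is only illustrative):

```python
import numpy as np

U, R = 1.0, 1.0

def potential(z):
    return U * (z + R**2 / z)       # W(z) for flow past a cylinder

def velocity(z):
    return U * (1 - R**2 / z**2)    # dW/dz, the complex velocity

# Streamlines are level curves of the imaginary part of W.
x, y = np.meshgrid(np.linspace(-3, 3, 400), np.linspace(-3, 3, 400))
z = x + 1j * y
z[np.abs(z) < R] = np.nan           # mask the interior of the obstacle
psi = potential(z).imag             # stream function, ready to contour
```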

Relevance: 30.00%

Abstract:

This paper presents an approach to integrating an artificial intelligence (AI) technique, namely rule-based processing, into mobile agents. In particular, it focuses on designing and implementing an appropriately small inference engine to reduce migration costs. The main goal is to combine two lines of agent research: first, the engineering-oriented approach to mobile agent architectures, and second, the AI-related approach to inference engines driven by rules expressed in a restricted subset of first-order predicate logic (FOPL). Beyond size reduction, the main functions of this type of engine were isolated, generalized and implemented as dynamic components, making possible not only their migration with the agent but also their dynamic migration and loading on demand. A set of classes for representing and exchanging knowledge between rule-based systems is also proposed.
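
A minimal forward-chaining engine of the kind the paper argues can be kept small enough to travel with an agent might look like the following sketch (the rule and fact representations here are assumptions, not the paper's classes):

```python
from typing import Callable

# A rule pairs a list of condition predicates over the fact set with a
# function producing new facts when all conditions hold.
Rule = tuple[list[Callable[[set], bool]], Callable[[set], set]]

def forward_chain(facts: set, rules: list[Rule]) -> set:
    """Apply rules until no new fact can be derived (a fixed point)."""
    changed = True
    while changed:
        changed = False
        for conditions, conclude in rules:
            if all(cond(facts) for cond in conditions):
                new = conclude(facts) - facts
                if new:
                    facts |= new
                    changed = True
    return facts

# Example: agent(X) and mobile(X) => migratable(X), ground over "a1".
rules: list[Rule] = [
    ([lambda f: ("agent", "a1") in f,
      lambda f: ("mobile", "a1") in f],
     lambda f: {("migratable", "a1")}),
]
print(forward_chain({("agent", "a1"), ("mobile", "a1")}, rules))
```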

Relevance: 30.00%

Abstract:

This work aims to implement an intelligent computational tool to identify non-technical losses and select their most relevant features, using information from a database of industrial consumer profiles from a power company. The solution to this problem is neither trivial nor merely of regional interest: minimizing non-technical losses safeguards investments in product quality and in the maintenance of power systems, in the competitive environment introduced after the privatization period on the national scene. The WEKA software is applied to this objective, comparing various classification techniques and optimizing them with intelligent algorithms, so that applications on smart grids can be automated. © 2012 IEEE.
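
The paper itself uses WEKA; as a rough Python analogue, the following sketch pairs feature selection with a classifier on synthetic stand-in data:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Stand-in for industrial consumer profiles labeled fraud / no fraud;
# real non-technical-loss data is heavily imbalanced, hence the weights.
X, y = make_classification(n_samples=2000, n_features=20,
                           n_informative=6, weights=[0.9], random_state=0)

model = make_pipeline(
    SelectKBest(f_classif, k=8),          # keep the most relevant features
    RandomForestClassifier(n_estimators=200, random_state=0),
)
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"mean ROC AUC: {scores.mean():.3f}")
```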

Relevance: 30.00%

Abstract:

Transactional memory (TM) is a new synchronization mechanism devised to simplify parallel programming, thereby helping programmers to unleash the power of current multicore processors. Although software implementations of TM (STM) have been extensively analyzed in terms of runtime performance, little attention has been paid to an equally important constraint faced by nearly all computer systems: energy consumption. In this work we conduct a comprehensive study of energy and runtime tradeoff sin software transactional memory systems. We characterize the behavior of three state-of-the-art lock-based STM algorithms, along with three different conflict resolution schemes. As a result of this characterization, we propose a DVFS-based technique that can be integrated into the resolution policies so as to improve the energy-delay product (EDP). Experimental results show that our DVFS-enhanced policies are indeed beneficial for applications with high contention levels. Improvements of up to 59% in EDP can be observed in this scenario, with an average EDP reduction of 16% across the STAMP workloads. © 2012 IEEE.
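
A back-of-the-envelope computation shows why slowing a transaction that is doomed to abort can improve EDP (illustrative arithmetic under the common assumption that dynamic power scales roughly with f³; this is not the paper's model):

```python
# Dynamic power ~ V^2 * f, and voltage tracks frequency, so assume
# P ~ f^3. If a conflicting transaction will be aborted and retried,
# running that loser at a lower DVFS state spends far less energy,
# while the winner still defines the critical-path delay.
f_hi, f_lo = 1.0, 0.5                  # normalized frequencies
p_hi, p_lo = f_hi**3, f_lo**3          # normalized powers (assumption)

delay = 1.0                            # winner's execution time sets delay
loser_work = 1.0                       # work the loser burns before abort

energy_base = p_hi * delay + p_hi * loser_work          # both full speed
energy_dvfs = p_hi * delay + p_lo * loser_work / f_lo   # slow the loser

print(f"EDP baseline:  {energy_base * delay:.3f}")      # 2.000
print(f"EDP with DVFS: {energy_dvfs * delay:.3f}")      # 1.250
```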

Relevance: 30.00%

Abstract:

Perhaps because of its origins in a production scheduling software package called Optimised Production Technology (OPT), plus its idea of focusing on system constraints, many believe that the Theory of Constraints (TOC) has a vocation for optimal solutions. Those who assess TOC from this perspective point out that it guarantees an optimal solution only in certain circumstances. In opposition to this view, and based on a numeric example of a production mix problem, this paper uses TOC's own assumptions to show why TOC should not be compared with methods that seek optimal or best solutions, but rather with methods that seek sufficiently good solutions, which is what is possible in non-deterministic environments. Moreover, we extend the relevant literature on the product mix decision by introducing a heuristic, based on the single work identified, that aims at achieving feasible solutions from the TOC point of view. The proposed heuristic is tested on 100 production mix problems and the results are compared with the responses obtained using Integer Linear Programming. The results show that the heuristic gives good results on average, but its performance falls sharply in some situations. © 2013 Copyright Taylor and Francis Group, LLC.
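
A common textbook version of the TOC product-mix heuristic — not necessarily the exact variant the paper proposes — ranks products by contribution margin per bottleneck minute and fills the bottleneck's capacity in that order. All numbers below are made up:

```python
products = {
    # name: (margin per unit, bottleneck minutes per unit, demand)
    "P1": (60.0, 10.0, 100),
    "P2": (45.0,  5.0, 120),
    "P3": (30.0,  2.0, 200),
}
capacity = 1500.0                       # bottleneck minutes available

# Rank by contribution margin per minute on the constraint.
ranking = sorted(products.items(),
                 key=lambda kv: kv[1][0] / kv[1][1], reverse=True)

mix, remaining = {}, capacity
for name, (margin, minutes, demand) in ranking:
    qty = min(demand, int(remaining // minutes))
    mix[name] = qty
    remaining -= qty * minutes

throughput = sum(products[n][0] * q for n, q in mix.items())
print(mix, f"total margin: {throughput:.0f}")
```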

Relevance: 30.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)