963 results for Reliability of Wind Power Plants
Abstract:
In an integrated crop-livestock system, the different crops grown under no-tillage alter the physical attributes of the soil, with effects on forage yield and plant composition. The objective of this work was to identify the soil attributes best correlated with the yield and protein content of Brachiaria brizantha. Linear and spatial correlations were analyzed between the dry matter (DM) yield and crude protein (CP) content of Brachiaria brizantha (cv. Marandu) and selected physical attributes of a Latossolo Vermelho distroférrico (Oxisol) at three depths (0-0.10, 0.10-0.20, and 0.20-0.30 m). For this purpose, a geostatistical grid (between 20°18'05"S and 20°18'28"S, and between 52°39'02"W and 52°40'28"W) containing 124 sampling points was installed over an area of 4,000 m² for data collection. The yield and crude protein content of Brachiaria brizantha did not vary randomly; they followed well-defined spatial patterns, with spherical semivariograms and spatial dependence ranges of approximately 40 to 50 m. Increases in mechanical penetration resistance and in soil moisture in the surface layer under Brachiaria brizantha reduce the crude protein content and the dry matter yield. Total porosity in the 0.20-0.30 m layer is an important indicator of soil physical quality and is a reliable estimator of Brachiaria brizantha dry matter yield.
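The spherical semivariograms mentioned above follow the standard spherical model of geostatistics; as a reference, a minimal sketch of that model is given below (the notation — c_0 for the nugget, c for the partial sill, a for the range, h for the lag distance — is the textbook convention, not values reported in this abstract):

    % Standard spherical semivariogram model (textbook form, not fitted values)
    \gamma(h) =
    \begin{cases}
      c_0 + c\left(\frac{3h}{2a} - \frac{h^3}{2a^3}\right), & 0 < h \le a,\\
      c_0 + c, & h > a,
    \end{cases}
    \qquad \gamma(0) = 0

For the data described here, the fitted range a would lie roughly between 40 and 50 m.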
Abstract:
Java Card technology allows the development and execution of small applications embedded in smart cards. A Java Card application is composed of an external card client and of an on-card application that implements the services available to the client by means of an Application Programming Interface (API). Usually, these applications manipulate and store important information, such as cash and confidential data of their owners. Thus, it is necessary to adopt rigor when developing a smart card application in order to improve its quality and trustworthiness. The use of formal methods in the development of these applications is a way to meet these quality requirements. The B method is one of many formal methods for system specification. Development in B starts with the functional specification of the system, continues with the application of optional refinements to the specification and, from the last refinement level, code can be generated for some programming language. The B formalism has good tool support, and its application to Java Card is well suited, since the specification and development of APIs is one of the major applications of B. The BSmart method proposed here aims to promote the rigorous development of Java Card applications up to code generation, based on the refinement of their formal specification described in the B notation. This development is supported by the BSmart tool, composed of programs that automate each stage of the method, and by a library of B modules and Java Card classes that model primitive types, essential Java Card API classes, and reusable data structures.
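As an illustration of the kind of artifact such a development ultimately targets, the sketch below shows a minimal Java Card applet skeleton; the applet name, instruction code and on-card state are hypothetical examples and are not taken from the BSmart work or its generated code.

    // Minimal, hypothetical Java Card applet skeleton: an on-card application that
    // serves APDU commands issued by an external card client.
    import javacard.framework.APDU;
    import javacard.framework.Applet;
    import javacard.framework.ISO7816;
    import javacard.framework.ISOException;

    public class WalletApplet extends Applet {

        private static final byte INS_CREDIT = (byte) 0x30; // hypothetical instruction code
        private short balance;                               // hypothetical on-card state

        private WalletApplet() {
            balance = 0;
        }

        // Called by the Java Card runtime when the applet is installed on the card.
        public static void install(byte[] bArray, short bOffset, byte bLength) {
            new WalletApplet().register();
        }

        // Dispatches incoming APDU commands from the external client.
        public void process(APDU apdu) {
            if (selectingApplet()) {
                return;
            }
            byte[] buffer = apdu.getBuffer();
            switch (buffer[ISO7816.OFFSET_INS]) {
                case INS_CREDIT:
                    // P1 carries the (hypothetical) amount to credit.
                    balance = (short) (balance + (short) (buffer[ISO7816.OFFSET_P1] & 0xFF));
                    break;
                default:
                    ISOException.throwIt(ISO7816.SW_INS_NOT_SUPPORTED);
            }
        }
    }

In the BSmart setting, a class of this kind would be obtained from the last refinement level of a B specification rather than written by hand.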
Abstract:
OBJECTIVE: To evaluate the quality of the service provided to cardiac surgery patients during the hospital stay in a SUS (Brazilian Unified Health System) service, identifying patients' expectations and perceptions, and to relate service quality to gender, age group, and cardiopulmonary bypass. METHODS: 82 patients (52.4% female and 47.6% male) undergoing elective cardiac surgery via median sternotomy were studied, aged 31 to 83 years (mean 60.4 ± 13.2 years), from March to September 2006. Service quality was assessed at two moments: expectations in the preoperative period and perceptions of the care received on the 6th postoperative day, using a modified SERVQUAL scale (SERVQUAL-Card). The score was obtained as the difference between the sums of the perception and expectation ratings, by means of statistical analysis. RESULTS: The SERVQUAL-Card scale was statistically validated, showing an adequate internal consistency index. The most frequent procedure was myocardial revascularization, 55 (67.0%); 72 (87.8%) were first cardiac surgeries and 69 (84.1%) used cardiopulmonary bypass (CPB). High values were found for both expectations and perceptions, with significant results (P<0.05). A significant relationship was observed between service quality and gender for empathy (P=0.04), and between service quality and age group for reliability (P=0.02). No significant relationship was observed between CPB and service quality. CONCLUSION: Service quality was satisfactory. Patients showed high expectations of the hospital medical service. Women showed higher perceived quality in empathy, and younger patients in reliability. The use of CPB was not related to service quality in this sample. The data suggest that the quality of this health service can be monitored through the periodic application of the SERVQUAL scale.
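The score described here — the difference between the summed perception and expectation ratings — is the usual SERVQUAL gap score; a minimal sketch in standard notation is given below (P_i and E_i denote the perception and expectation ratings of item i among n items; the symbols are generic, not taken from the study):

    % SERVQUAL gap score: Q > 0 means perceptions exceeded expectations
    Q = \sum_{i=1}^{n} \left( P_i - E_i \right)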
Abstract:
The process of choosing the best components to build systems has become increasingly complex. It becomes even more critical when many combinations of components must be considered in the context of an architectural configuration. These circumstances occur mainly when we have to deal with systems involving critical requirements, such as timing constraints in distributed multimedia systems, network bandwidth in mobile applications, or reliability in real-time systems. This work proposes a process for the dynamic selection of architectural configurations based on the system's non-functional requirement criteria, which can be used during a dynamic adaptation. The proposal uses MAUT (Multi-Attribute Utility Theory) for decision making over a finite set of possibilities involving multiple criteria to be analyzed. Additionally, a metamodel was proposed that can be used to describe the application's requirements in terms of non-functional requirement criteria and their expected values, in order to drive the selection of the desired configuration. As a proof of concept, a module that performs the dynamic choice of configurations, MoSAC, was implemented following a component-based development (CBD) approach, performing the selection of architectural configurations based on the proposed multi-criteria selection process. This work also presents a case study in which an application was developed in the context of Digital TV to evaluate the time taken by the module to return a valid configuration to be used in a middleware with self-adaptive features, the AdaptTV middleware.
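Since the selection relies on MAUT, a minimal sketch of the additive utility form commonly used with that theory is given below (the weights w_i and single-attribute utilities u_i are standard MAUT notation and illustrative only; the actual criteria and weights of MoSAC are not stated in this abstract):

    % Additive MAUT model: the architectural configuration a maximizing U(a) is selected
    U(a) = \sum_{i=1}^{n} w_i \, u_i\big(x_i(a)\big), \qquad \sum_{i=1}^{n} w_i = 1, \quad 0 \le u_i \le 1

Here x_i(a) would be the value that configuration a delivers for the i-th non-functional criterion (e.g., latency or bandwidth).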
Abstract:
Using formal methods, developers can increase software's trustworthiness and correctness; furthermore, they can concentrate on the functional requirements of the software. However, there is considerable resistance to adopting this software development approach, the main reason being the scarcity of adequate, easy-to-use, and useful tools. Developers typically write code and test it, and these tests usually consist of executing the program and checking its output against its requirements. This, however, is not always an exhaustive discipline. With formal methods, on the other hand, one can investigate the system's properties further. Unfortunately, specification languages do not always have tools like animators or simulators, and sometimes there are no friendly graphical user interfaces. On the other hand, specification languages usually have a compiler, which normally generates a Labeled Transition System (LTS). This work proposes an application that provides graphical animation for formal specifications using the LTS as input. The application initially supports the languages B, CSP, and Z; however, using an LTS in a specified XML format, it is possible to animate further languages. Additionally, the tool provides trace visualization, showing the choices the user made as a graphical tree. The intention is to improve the comprehension of a specification by providing information about errors and by animating it, as developers do for programming languages such as Java and C++.
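As an illustration of the input consumed by such an animator, the sketch below gives a minimal Java representation of a labeled transition system; the class and method names are hypothetical and do not come from the tool described here, and the XML format it mentions is not reproduced.

    // Minimal, hypothetical in-memory representation of a Labeled Transition System (LTS):
    // a set of states, an initial state, and transitions labeled with event names.
    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    public class LabeledTransitionSystem {

        /** A transition from one state to another, labeled with an event name. */
        public record Transition(String source, String label, String target) {}

        private final String initialState;
        private final Map<String, List<Transition>> outgoing = new HashMap<>();

        public LabeledTransitionSystem(String initialState) {
            this.initialState = initialState;
        }

        public void addTransition(String source, String label, String target) {
            outgoing.computeIfAbsent(source, s -> new ArrayList<>())
                    .add(new Transition(source, label, target));
        }

        /** The choices offered to the user when animating from a given state. */
        public List<Transition> enabledTransitions(String state) {
            return outgoing.getOrDefault(state, List.of());
        }

        public String getInitialState() {
            return initialState;
        }
    }

During animation, the tool would repeatedly show the enabled transitions of the current state, let the user pick one, and record the picks as a trace in the graphical tree.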
Abstract:
The development of smart card applications requires a high level of reliability, and formal methods provide means for this reliability to be achieved. The BSmart method and tool contribute to the development of smart card applications with the support of the B method, generating Java Card code from B specifications. For development with BSmart to be effectively rigorous without overloading the user, it is important to have a library of reusable components built in B; the goal of KitSmart is to provide this support. The first research on the composition of this library was an undergraduate project at the Universidade Federal do Rio Grande do Norte, carried out by Thiago Dutra in 2006. That first version of the kit resulted in a B specification of the Java Card primitive types byte, short and boolean, and in the creation of reusable components for application development. This work improves KitSmart by adding a specification of the Java Card API in B and a guide for the creation of new components. The Java Card API in B, besides being available for application development, is also useful as documentation for each API class. The reusable components correspond to modules that manipulate specific structures, such as date and time, which are not available in B or Java Card. These components for Java Card are generated from formally verified B specifications. The guide contains a quick reference on how to specify certain structures and on how some situations were adapted from object orientation to the B method. This work was evaluated through a case study carried out with the BSmart tool, which makes use of the KitSmart library; in this case study, it is possible to see the contribution of the components to a B specification. The kit should be useful for B method users and Java Card application developers.
An approach for verifying exceptional behavior based on design rules and tests
Abstract:
Checking the conformity between a system's implementation and its design rules is an important activity for ensuring that the architectural patterns defined for the system do not degrade relative to what is actually implemented in the source code. Especially for systems that require a high level of reliability, it is important to define specific design rules for exceptional behavior. Such rules describe how exceptions should flow through the system by defining which elements are responsible for catching exceptions thrown by other system elements. However, current approaches to automatically checking design rules do not provide suitable mechanisms to define and verify design rules related to the exception handling policy of applications. This work proposes a practical approach to preserve the exceptional behavior of an application or family of applications, based on the definition and automatic runtime checking of exception handling design rules for systems developed in Java or AspectJ. To support this approach, a tool called VITTAE (Verification and Information Tool to Analyze Exceptions) was developed in the context of this work; it extends the JUnit framework and automates the testing of exceptional design rules. A case study was conducted with the primary objective of evaluating the effectiveness of the proposed approach on a software product line. In addition, an experiment was carried out to compare the proposed approach with one based on a tool called JUnitE, which also tests exception handling code using JUnit tests. The results showed how exception handling design rules evolve across different versions of a system and that VITTAE can aid in detecting defects in exception handling code.
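To make the idea of a testable exception handling design rule concrete, the sketch below shows a plain JUnit 4 test for a hypothetical rule stating that a facade layer must translate low-level persistence failures into a domain-level exception; all class names are invented for the example, and this is not the VITTAE API, which is not shown in the abstract.

    // Illustrative JUnit 4 test of a hypothetical exception-handling design rule.
    import static org.junit.Assert.fail;

    import org.junit.Test;

    public class ExceptionFlowDesignRuleTest {

        /** Hypothetical domain-level exception that callers of the facade may observe. */
        static class AccountServiceException extends Exception {
            AccountServiceException(String msg, Throwable cause) { super(msg, cause); }
        }

        /** Hypothetical facade required, by design rule, to wrap lower-level failures. */
        static class AccountFacade {
            void withdraw(String account, int amount) throws AccountServiceException {
                try {
                    throw new IllegalStateException("simulated persistence failure");
                } catch (RuntimeException e) {
                    throw new AccountServiceException("withdraw failed", e);
                }
            }
        }

        @Test
        public void facadeMustWrapPersistenceFailures() {
            try {
                new AccountFacade().withdraw("acc-1", 100);
                fail("expected a domain-level exception");
            } catch (AccountServiceException expected) {
                // Design rule satisfied: the low-level failure was caught and translated.
            } catch (RuntimeException leaked) {
                fail("design rule violated: a low-level exception leaked through the facade");
            }
        }
    }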
Abstract:
When the identification of crosscutting concerns is performed from the beginning of development, during the activities of requirements engineering, there are many gains in terms of quality, cost and efficiency throughout the software development lifecycle. This early identification supports the evolution of requirements, detects possible flaws in the requirements specification, improves traceability among requirements, provides better software modularity and prevents possible rework. However, despite these advantages, the identification of crosscutting concerns during requirements engineering faces several difficulties, such as the lack of systematization and of tools that support it. Furthermore, it is difficult to justify why some concerns are identified as crosscutting or not, since this identification is most often made without any methodology that systematizes and grounds it. In this context, this work proposes an approach based on Grounded Theory, called GT4CCI, for systematizing and grounding the process of identifying crosscutting concerns in the requirements document, in the initial stages of the software development process. Grounded Theory is a well-established methodology for qualitative data analysis. Through the use of GT4CCI it is possible to better understand, track and document concerns, adding gains in terms of quality, reliability and modularity throughout the entire software lifecycle.
Abstract:
Component-based development revolutionized the software development process, facilitating maintenance and providing greater reliability and reuse. Nevertheless, even with all the advantages of developing with components, their composition remains an important concern. Verification through informal tests is not enough to achieve safe composition, because such tests are not based on formal semantic models with which a system's behaviour can be described precisely. In this context, formal methods provide ways to accurately specify systems through mathematical notations, providing, among other benefits, more safety. The formal method CSP enables the specification of concurrent systems and the verification of properties intrinsic to them, as well as refinement between different models. Some approaches impose constraints using CSP to check the behaviour of compositions between components, assisting in their verification in advance. Hence, aiming to assist this process, and considering that the software market increasingly requires automation to reduce work and provide agility for business, this work presents a tool that automates the verification of composition between components, in which all the complexity of the formal language is kept hidden from users. Thus, through a simple interface, the tool BST (BRIC-Tool-Suport) helps to create and compose components, predicting in advance undesirable behaviors in the system, such as deadlocks.
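For reference, the deadlock check alluded to above is commonly phrased, in CSP's stable-failures model, as a refinement assertion against the most nondeterministic deadlock-free process; the sketch below uses textbook notation (ignoring termination) and is not taken from the BST tool itself:

    % A process P over alphabet \Sigma is deadlock-free iff it failures-refines DF:
    DF = \bigsqcap_{x \in \Sigma} (x \rightarrow DF), \qquad DF \sqsubseteq_{F} P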
Abstract:
The current Internet has been suffering from several problems in terms of scalability, performance, mobility, etc., due to the steep increase in the number of users and the emergence of new services with new demands, thus giving rise to the Future Internet. New proposals on content-oriented networks, such as the Entity Title Architecture (ETArch), provide new services for this type of scenario, implemented over the software-defined networking paradigm. However, the transport model of ETArch is equivalent to the best-effort model of the current Internet, which limits the reliability of its communications. In this work, ETArch is redesigned following the resource over-provisioning paradigm to achieve advanced resource allocation integrated with OpenFlow. As a result, the SMART framework (Support of Mobile Sessions with High Transport Resource Demand) allows the network to define the qualitative requirements of sessions semantically, in order to manage Quality of Service control and thus maintain the best possible Quality of Experience. The evaluation of the data and control planes took place on the testbed island of the OFELIA project, showing support for mobile multimedia applications with high transport resource demands, with QoS and QoE guaranteed through a restricted signalling scheme in comparison with legacy ETArch.
Abstract:
In this work, some complex systems are studied using two distinct procedures. In the first part, we studied the use of the wavelet transform in the analysis and characterization of (multi)fractal time series. We tested the reliability of the Wavelet Transform Modulus Maxima (WTMM) method with respect to the multifractal formalism through the calculation of the singularity spectrum of time series whose fractality is well known a priori. Next, we used the WTMM method to study the fractality of lung crackle sounds, a biological time series. Since crackle sounds are due to the opening of initially closed pulmonary airways (bronchi, bronchioles and alveoli), we can obtain information on the airway-opening cascade of the whole lung. As this phenomenon is associated with the architecture of the pulmonary tree, which displays fractal geometry, the analysis and fractal characterization of this noise may provide important parameters for comparing healthy lungs with those affected by disorders that alter the geometry of the lung tree, such as obstructive and parenchymal degenerative diseases, as occurs, for example, in pulmonary emphysema. In the second part, we study a site percolation model on square lattices in which the percolating cluster grows governed by a control rule, corresponding to a method of automatic search. In this percolation model, which has characteristics of self-organized criticality, the method does not use the automated search of Leath's algorithm; it uses the following control rule: p_{t+1} = p_t + k(R_c − R_t), where p is the percolation probability, k is a kinetic parameter with 0 < k < 1, and R is the fraction of percolating finite square lattices of side L (L x L). This rule yields a time series corresponding to the dynamical evolution of the system, in particular of the percolation probability p, and we performed a scaling analysis of the signal obtained in this way. The model enables the study of the automatic search method for site percolation on square lattices, evaluating the dynamics of its parameters as the system approaches the critical point. It shows that both the time elapsed until the system reaches the critical point and t_cor, the time required for the system to lose its correlations, scale inversely with k, the kinetic parameter of the control rule. We further verify that the system afterwards exhibits two different time scales: one in which it shows 1/f noise, indicating that it is strongly correlated, and another in which it shows white noise, indicating that the correlations have been lost. Over large time intervals the dynamics of the system shows ergodicity.
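A compact sketch of the automatic-search idea described above is given below, interpreting p as the site-occupation probability of each sampled lattice and estimating R_t by Monte Carlo over independent L x L configurations; the lattice size, sample count, k and R_c values are illustrative, not those of the dissertation.

    // Sketch of the control rule p_{t+1} = p_t + k (R_c - R_t) for site percolation on a
    // square lattice. R_t is estimated as the fraction of random L x L samples that
    // percolate from top to bottom, tested with a union-find connectivity check.
    import java.util.Random;

    public class PercolationSearch {

        // Union-find "find" with path halving.
        static int find(int[] parent, int i) {
            while (parent[i] != i) { parent[i] = parent[parent[i]]; i = parent[i]; }
            return i;
        }

        static void union(int[] parent, int a, int b) {
            parent[find(parent, a)] = find(parent, b);
        }

        // Does one random L x L site configuration with occupation probability p percolate?
        static boolean percolates(int L, double p, Random rng) {
            boolean[] open = new boolean[L * L];
            int[] parent = new int[L * L + 2];        // two virtual nodes: top and bottom
            for (int i = 0; i < parent.length; i++) parent[i] = i;
            int top = L * L, bottom = L * L + 1;
            for (int row = 0; row < L; row++) {
                for (int col = 0; col < L; col++) {
                    int site = row * L + col;
                    if (rng.nextDouble() >= p) continue;   // site stays closed
                    open[site] = true;
                    if (row == 0) union(parent, site, top);
                    if (row == L - 1) union(parent, site, bottom);
                    if (col > 0 && open[site - 1]) union(parent, site, site - 1);
                    if (row > 0 && open[site - L]) union(parent, site, site - L);
                }
            }
            return find(parent, top) == find(parent, bottom);
        }

        public static void main(String[] args) {
            int L = 64, samples = 200, steps = 300;    // illustrative values
            double k = 0.1, rc = 0.5, p = 0.2;         // rc: target fraction of percolating samples
            Random rng = new Random(42);
            for (int t = 0; t < steps; t++) {
                int hits = 0;
                for (int s = 0; s < samples; s++) if (percolates(L, p, rng)) hits++;
                double rt = (double) hits / samples;   // R_t: fraction of percolating lattices
                p += k * (rc - rt);                    // control rule drives p toward criticality
                p = Math.max(0.0, Math.min(1.0, p));
                System.out.printf("t=%d  p=%.4f  R=%.3f%n", t, p, rt);
            }
        }
    }

The time series of p produced by such a loop is the kind of signal whose scaling and correlation properties are analyzed in the work.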
Abstract:
Due to its high resolution, Ground Penetrating Radar (GPR) has been used to image subsurface sedimentary deposits. Because GPR and seismic methods share some principles of image construction, the classic seismostratigraphic interpretation method has also been applied as an attempt to interpret GPR data. Notwithstanding some advances in a few particular contexts, the adaptation of seismostratigraphic tools and concepts from seismic to GPR is unsuitable, because the meaning given to the termination criteria in seismic stratigraphy does not represent the appropriate geologic record at the GPR scale. Essentially, the open question lies in proposing an interpretation method for GPR data that allows not only relating sedimentary product and process at the GPR scale, but also identifying or proposing depositional environments and correlating these results with the well-known cornerstones of sequence stratigraphy. The goal of this dissertation is to propose an interpretation methodology for GPR data able to perform this task, at least for siliciclastic deposits. To this end, the proposed GPR interpretation method is based both on seismostratigraphic concepts and on the bounding-surface hierarchy tool of Miall (1988). As a consequence of this joint use, the results of GPR interpretation can be associated with sedimentary facies in a genetic context, so that it is possible to: (i) individualize radar facies and correlate them to sedimentary facies by using depositional models; (ii) characterize a given depositional system; and (iii) determine its stratigraphic framework, highlighting how it evolved through geologic time. To illustrate its use, the proposed methodology was applied to a GPR data set from the Galos area, which is part of the Galinhos spit, located in Rio Grande do Norte state, northeastern Brazil. This spit presents high lateral variation of sedimentary facies, its sedimentary record containing 4th- to 6th-order cycles caused by high-frequency sea-level oscillation. The interpretation process was carried out in the following phases: (i) identification of a vertical facies succession; (ii) characterization of radar facies and their associated sedimentary products; (iii) recognition of the associated sedimentary processes in a genetic context; and, finally, (iv) proposal of an evolutionary model for the Galinhos spit. This model proposes that the Galinhos spit is a barrier island constituted, from base to top, of the following sedimentary facies: tidal channel facies, tidal flat facies, shore facies, and aeolian facies (dunes). The tidal channel facies, at the base, consists of lateral accretion bars and channel-fill deposits. This basal facies is laterally truncated by the tidal flat facies. In the foreshore zone, the tidal flat facies is covered by the shore facies, which records a marine transgression. Finally, at the top of the stratigraphic column, aeolian dunes were deposited due to subaerial exposure caused by a marine regression.
Abstract:
The gravity inversion method is a mathematical process that can be used to estimate the basement relief of a sedimentary basin. However, the inverse problem in potential-field methods has neither a unique nor a stable solution, so additional information (other than gravity measurements) must be supplied by the interpreter to transform this problem into a well-posed one. This dissertation presents the application of a gravity inversion method to estimate the basement relief of the onshore Potiguar Basin. The density contrast between sediments and basement is assumed to be known and constant. The proposed methodology consists of discretizing the sedimentary layer into a grid of juxtaposed rectangular prisms whose thicknesses correspond to the depth to basement, which is the parameter to be estimated. To stabilize the inversion, I introduce constraints in accordance with the known geologic information. The method minimizes an objective function that requires the model not only to be smooth and close to the seismic-derived model, which is used as a reference model, but also to honor well-log constraints; the latter are introduced through the use of logarithmic barrier terms in the objective function. The inversion process was applied so as to simulate different phases of the exploration development of a basin, with the gravity inversion applied in distinct scenarios: the first used only gravity data and a plain reference model; the second was divided into two cases, in which either borehole log information or the seismic model was incorporated into the process. Finally, I incorporated the basement depth generated by seismic interpretation into the inversion as a reference model and imposed depth constraints from boreholes using the primal logarithmic barrier method. As a result, the estimated basement relief in every scenario satisfactorily reproduced the basin framework, and the incorporation of the constraints led to improved basement depth definition. The joint use of surface gravity data, seismic imaging and borehole logging information makes the process more robust and improves the estimate, providing a result closer to the actual basement relief. In addition, it is worth remarking that the result obtained in the first scenario already provided a very coherent basement relief when compared to the known basin framework, which is significant given the differences in cost and environmental impact between gravimetric and seismic surveys, as well as well drilling.
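A minimal sketch of the type of objective function described above — a data misfit plus smoothness, closeness to the seismic reference model, and logarithmic barrier terms enforcing borehole depth bounds — is given below; the operators, weights and symbols are generic placeholders rather than the exact formulation used in the dissertation:

    % p: vector of prism thicknesses (depths to basement); p_ref: seismic-derived reference model;
    % R: roughness (smoothness) operator; (l_j, u_j): depth bounds at borehole j; mu: barrier parameter.
    \Phi(\mathbf{p}) = \left\| \mathbf{d}^{\mathrm{obs}} - \mathbf{g}(\mathbf{p}) \right\|^{2}
      + \alpha_{1} \left\| \mathbf{R}\,\mathbf{p} \right\|^{2}
      + \alpha_{2} \left\| \mathbf{p} - \mathbf{p}_{\mathrm{ref}} \right\|^{2}
      - \mu \sum_{j} \Big[ \ln\big(p_{j} - l_{j}\big) + \ln\big(u_{j} - p_{j}\big) \Big]

As the barrier parameter mu is driven toward zero over the iterations, the estimated depths are free to approach, but never violate, the borehole bounds.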
Abstract:
This work presents geophysical and geological results obtained in a dune field located on the east coast of Rio Grande do Norte State, with the aim of recognizing the depositional geometries of the aeolian bodies for future geologic modeling of aeolian petroleum reservoirs. The research, carried out in a blowout region situated in the Nisia Floresta municipality, included the characterization of external geometries with GPS and the analysis of internal geometry with GPR. The data were integrated in the GoCAD software, where three-dimensional characterization and interpretation of the studied deposits were possible. The interpretation of the GPR profiles allowed the identification of: first-order bounding surfaces, which separate the aeolian deposits from the Barreiras Formation rocks; second-order bounding surfaces, which bound dune generations; and third-order bounding surfaces, corresponding to reactivation surfaces. This classification was based on, and adapted from, the proposals of Brookfield (1977) and Kocurek (1996). Four radar facies were recognized: radar facies 1, progradational reflectors correlated with dune foresets; radar facies 2, plane-parallel reflectors related to sand sheets; radar facies 3, plane-parallel reflectors associated with reworking of the blowout dune crest; and radar facies 4, mounded reflectors associated with vegetated sand mounds or objects buried in the subsurface. The GPR and GPS methods were also employed in the monitoring of dune fields subject to human activities at Buzios Beach, where construction along the blowout region and tourism are changing the natural evolution of the deposits, possibly causing negative impacts on the coastal zone. Data obtained in Dunas Park, an environmental conservation unit, were compared with information from Buzios Beach; there is a greater tendency toward erosion at Buzios, specifically in the blowout corridor and blowout dune.
Abstract:
This master's thesis, conceived and carried out from the encounter and the dialogical relations established between geography and history, and between space and time, took the urban environment as its general theme and, as its empirical reference, the urban space of Caicó, set in the semi-arid lands of the Sertão do Seridó Potiguar, more precisely in the mid-south of the state of Rio Grande do Norte. Within this space, by searching for fragments of memory drawn from various historical sources, we sought to investigate the urban transformations that took place in the city during the 1950s and 1960s. These transformations, occurring in the urban space of Caicó at the peak of the cotton economy, both reflected and conditioned the urban modernization projects that representatives of the local elites had conceived with the aim of building an ideal city in the isolated lands of the Seridó: modern, civilized, progressive, and the regional capital of the Seridó. For this construction, new urban facilities and services moved from the imaginary plane into real space, installed in several districts of the city and transforming the urban landscape. In this context, cinemas, shops, a radio station, electric power, educational institutions, telephony, a nursing home, a cotton processing plant, banks, urban, hygienic and sanitary policies, modes of behaviour, and distinct forms of urban sociability, among other geographical elements, were established through the pursuit of certain ideals of social progress and a spirit of urban modernity, in new spaces and practices that were invented, woven, and tried out on the rocky ground and along the watercourse, amid an urban everyday life marked by ruptures and by the persistence of certain old customs and habits and of certain weathered landscapes and surroundings.