Abstract:
The aim of this study was to determine the prevalence and antimicrobial susceptibility of Staphylococcus spp. from periodontally diseased and periodontally healthy patients, and to correlate them with host factors, the local environment and traits of the diseases. To this end, thirty adults aged 19 to 55 years were selected. They had received no periodontal treatment, and no antibiotic or antimicrobial had been administered during the three previous months. From these individuals, periodontally healthy sites and sites with chronic gingivitis and/or periodontitis were analyzed. Eighteen subgingival dental biofilm samples were collected with sterile paper points, six from each randomly selected tooth, representing the conditions mentioned. They were transported to the Oral Microbiology laboratory, plated onto Mannitol Salt Agar (MSA) and incubated at 37°C in air for 48 h. Staphylococcus spp. were identified by colony morphology, Gram stain, catalase reaction, susceptibility to bacitracin and coagulase activity. After identification, the strains were submitted to antibiotic susceptibility testing with 12 antimicrobials, based on the Kirby-Bauer technique. To establish the relation between the presence and infection levels of coagulase-negative staphylococci (CNS) and host factors, the local environment and traits of the diseases, Chi-square, Mann-Whitney and Kruskal-Wallis tests were used at a 95% confidence level. CNS were harbored by 86.7% of the subjects, in 11.7% of the periodontal sites. The prevalence was 12.1% in healthy sites, 11.7% in chronic gingivitis, 13.5% in slight chronic periodontitis and 6.75% in moderate chronic periodontitis, and no CNS were isolated from sites with advanced chronic periodontitis, with no significant difference among them (p = 0.672). There was no significant difference in the presence or infection levels of CNS in relation to host factors, the local environment or traits of the diseases. Among the 74 CNS isolates, the greatest resistance was observed to penicillin (55.4%), erythromycin (32.4%), tetracycline (12.16%) and clindamycin (9.4%); 5.3% of the isolates were resistant to oxacillin and methicillin. No resistance was observed to ciprofloxacin, rifampicin or vancomycin. It was concluded that staphylococci are found in low numbers, in similar proportion, in healthy and diseased periodontal sites. However, a trend toward reduced staphylococcal occurrence in more advanced stages of the disease was observed. This low prevalence was not related to any of the variables analyzed. The antibiotic susceptibility profile demonstrates high resistance to penicillin and low resistance to methicillin, with significant resistance to erythromycin, tetracycline and clindamycin.
Abstract:
In most cultures, dreams are believed to predict the future on occasion. Several neurophysiological studies indicate that the function of sleep and dreams is to consolidate and transform memories, in a cyclical process of creation, selection and generalization of conjectures about reality. The aim of the research presented here was to investigate the possible adaptive role of anticipatory dreams. We sought to determine the relationship between dreaming and waking in a context in which the adaptive success of the individual was genuinely at stake, so as to mobilize oneiric activity more strongly. We used the entrance examination of the Federal University of Rio Grande do Norte (UFRN) as a significant waking event whose performance could be independently quantified. Through a partnership with UFRN, we contacted by e-mail 3000 candidates for the 2009 examination. In addition, 150 candidates were approached personally. Candidates who agreed to participate in the study (n = 94) completed questionnaires specific to the examination and were asked to describe their dreams during the examination period. The performance of each candidate in the entrance examination was provided to the researchers by UFRN. A total of 45 participants reported dreams related to the examination. Our results show a positive correlation between performance on the examination and anticipatory dreams about the event, both in the comparison of performance on the objective and discursive tests and in final approval: in the group that did not dream about the exam the overall approval rate, 22.45%, was similar to that found in the selection process as a whole, 22.19%, while for the group that dreamed about the examination that rate was 35.56%. The occurrence of anticipatory dreams reflects increased concern during waking (psychobiological mobilization) related to the future event, as indicated by higher scores of fear and apprehension, and by greater changes in daily life and in mood and sleep patterns, in the group that reported test-related dreams. Furthermore, the data suggest a role for dreams in the determination of environmentally relevant waking behavior, simulating possible scenarios of success (dreams of approval) and failure (nightmares) so as to maximize the adaptive success of the individual.
Abstract:
The next generation of computers is expected to rely on architectures with multiple processors and/or multicore processors. In this context there are challenges related to interconnection features, operating frequency, on-chip area, power dissipation, performance and programmability. The interconnection and communication mechanism considered ideal for this type of architecture is the network-on-chip, owing to its scalability, reusability and intrinsic parallelism. Communication in networks-on-chip is accomplished by transmitting packets that carry data and instructions representing requests and responses between the processing elements interconnected by the network. Packets are transmitted as in a pipeline between the routers of the network, from the source to the destination of the communication, even allowing simultaneous communications between different source-destination pairs. Building on this fact, we propose to turn the entire communication infrastructure of the network-on-chip, using its routing, arbitration and storage mechanisms, into a high-performance parallel processing system. In this proposal, the packets are formed by the instructions and data that represent the applications, and they are executed by the routers as they are transmitted, exploiting the pipeline and the parallelism of simultaneous transmissions. Traditional processors are not used; instead, simple cores only control access to memory. An implementation of this idea is called IPNoSys (Integrated Processing NoC System), which has its own programming model and a routing algorithm that guarantees the execution of all instructions in the packets, preventing deadlock, livelock and starvation. The architecture provides mechanisms for input and output, interrupts and operating system support. As a proof of concept, a programming environment and a simulator for this architecture were developed in SystemC, which allow various parameters to be configured and several results to be obtained in order to evaluate it.
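To make the execute-while-routing idea concrete, here is a minimal sketch (ours, not from the dissertation; the Packet and Router classes and the tiny instruction set are hypothetical simplifications) of a packet whose instructions are consumed, one per hop, by the routers it traverses:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical sketch of IPNoSys-style "execute while routing": a packet
// carries instructions, and each router on the path executes the
// instruction at the packet's head before forwarding the packet.
public class NocSketch {
    record Instruction(String op, int operand) {}

    static class Packet {
        final Deque<Instruction> instructions = new ArrayDeque<>();
        int accumulator; // partial result carried along the path
    }

    static class Router {
        final int id;
        Router(int id) { this.id = id; }

        // Execute one instruction, then the packet moves to the next hop.
        void process(Packet p) {
            Instruction i = p.instructions.poll();
            if (i == null) return; // nothing left: packet reached a sink
            switch (i.op()) {
                case "ADD" -> p.accumulator += i.operand();
                case "MUL" -> p.accumulator *= i.operand();
                default -> { /* unknown ops would simply be routed onward */ }
            }
            System.out.printf("router %d: %s %d -> acc=%d%n",
                    id, i.op(), i.operand(), p.accumulator);
        }
    }

    public static void main(String[] args) {
        Packet p = new Packet();
        p.instructions.add(new Instruction("ADD", 5));
        p.instructions.add(new Instruction("MUL", 3));
        p.instructions.add(new Instruction("ADD", 1));
        // A 3-hop path: the computation finishes as the packet travels.
        for (int hop = 0; hop < 3; hop++) new Router(hop).process(p);
        System.out.println("result delivered at destination: " + p.accumulator);
    }
}
```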
Abstract:
The Nelson-Oppen combination method allows several decision procedures, each designed for a specific theory, to be combined to reason about more comprehensive theories, through the principle of equality propagation. Theorem provers based on this model benefit from its modular nature and can evolve more easily and incrementally. Difference logic is a subtheory of linear arithmetic. It is formed by constraints of the form x − y ≤ c, where x and y are variables and c is a constant. Difference logic is very common in several problems, such as digital circuits, scheduling and temporal systems, and is predominant in several other cases. Difference logic can also be modeled using graph theory, which allows several efficient and well-known graph algorithms to be used. A decision procedure for difference logic is capable of reasoning over thousands of constraints. The main purpose of a decision procedure for difference logic is to report whether a set of difference-logic constraints is satisfiable (the variables can assume values that make the set consistent) or not. Moreover, to work in a Nelson-Oppen combination model, the decision procedure needs other functionalities, such as generation of variable equalities, proofs of inconsistency, premises, etc. This work presents a decision procedure for the theory of difference logic within an architecture based on the Nelson-Oppen combination method. The work was carried out by integrating the procedure into the haRVey prover, where its operation could be observed. Implementation details and experimental tests are reported.
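The graph modeling mentioned above is the standard one: each constraint x − y ≤ c becomes an edge from y to x with weight c, and the constraint set is satisfiable exactly when this graph has no negative-weight cycle. A minimal Java sketch (ours, not haRVey code) using the Bellman-Ford algorithm:

```java
// Standard graph check for difference logic: a constraint x - y <= c
// becomes an edge y -> x with weight c; the constraint set is
// satisfiable iff the graph has no negative-weight cycle.
public class DifferenceLogic {
    record Edge(int from, int to, int weight) {}

    static boolean satisfiable(int vars, Edge[] edges) {
        // Distances from an implicit source connected to every vertex
        // with weight 0, so initializing everything to 0 suffices.
        int[] dist = new int[vars];
        // Bellman-Ford: after vars-1 rounds the distances are final...
        for (int round = 0; round < vars - 1; round++)
            for (Edge e : edges)
                if (dist[e.from()] + e.weight() < dist[e.to()])
                    dist[e.to()] = dist[e.from()] + e.weight();
        // ...so any further relaxation exposes a negative cycle.
        for (Edge e : edges)
            if (dist[e.from()] + e.weight() < dist[e.to()])
                return false; // negative cycle: unsatisfiable
        return true; // assigning each variable x the value dist[x] works
    }

    public static void main(String[] args) {
        // x0 - x1 <= 2, x1 - x2 <= 3, x2 - x0 <= -6: the cycle sums to -1,
        // so the set is inconsistent.
        Edge[] edges = {
            new Edge(1, 0, 2), new Edge(2, 1, 3), new Edge(0, 2, -6)
        };
        System.out.println(satisfiable(3, edges)); // prints false
    }
}
```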
Abstract:
The use of middleware technology in various types of systems, in order to abstract low-level details related to the distribution of application logic, is increasingly common. Among the many systems that can benefit from these components, we highlight distributed systems, where communication between software components located on different physical machines must be supported. An important issue related to communication between distributed components is the provision of mechanisms for managing quality of service. This work presents a metamodel for modeling component-based middleware that provides an application with the abstraction of communication between the components involved in a data stream, regardless of their location. Another feature of the metamodel is the possibility of self-adaptation of the communication mechanism, either by updating the values of its configuration parameters or by replacing it with another mechanism, in case the specified quality-of-service restrictions are not being met. To this end, the communication state is monitored (applying techniques such as a feedback control loop) and the related performance metrics are analyzed. The Model-Driven Development (MDD) paradigm was used to generate the implementation of a middleware that serves as a proof of concept of the metamodel, and of the configuration and reconfiguration policies related to the dynamic adaptation processes. In this sense, the metamodel associated with the configuration of a communication was defined. The MDD application also comprises the definition of the following transformations: from the architectural model of the middleware to Java code, and from the configuration model to XML.
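As a rough illustration of the feedback control loop mentioned above (a sketch of ours; the Channel interface, the latency metric and the thresholds are hypothetical, not the metamodel's actual API):

```java
// Hypothetical sketch of the monitor/analyze/adapt cycle: a loop watches
// a QoS metric and either retunes the current communication mechanism or
// swaps it for another one when the restriction is violated.
public class QosControlLoop {
    interface Channel {
        double measuredLatencyMs();          // monitored metric
        void setBufferSize(int packets);     // tunable configuration parameter
    }

    private Channel channel;
    private final double maxLatencyMs;      // QoS restriction
    private int bufferSize = 64;

    QosControlLoop(Channel channel, double maxLatencyMs) {
        this.channel = channel;
        this.maxLatencyMs = maxLatencyMs;
    }

    // One iteration of the feedback loop.
    void iterate(Channel fallback) {
        double latency = channel.measuredLatencyMs();   // monitor
        if (latency <= maxLatencyMs) return;            // analyze: QoS is met
        if (bufferSize > 8) {                           // adapt: retune first...
            bufferSize /= 2;
            channel.setBufferSize(bufferSize);
        } else {                                        // ...then replace
            channel = fallback;
        }
    }
}
```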
Abstract:
The process of choosing the best components to build systems has become increasingly complex. It becomes even more critical when many combinations of components must be considered in the context of an architectural configuration. These circumstances occur mainly in systems involving critical requirements, such as timing constraints in distributed multimedia systems, network bandwidth in mobile applications or reliability in real-time systems. This work proposes a process for the dynamic selection of architectural configurations based on the system's non-functional requirements, which can be used during a dynamic adaptation. The proposal uses MAUT (Multi-Attribute Utility Theory) for decision making over a finite set of possibilities involving multiple criteria to be analyzed. Additionally, a metamodel is proposed that can describe the application's requirements in terms of non-functional criteria and their expected values, expressing them so as to drive the selection of the desired configuration. As a proof of concept, a module that performs the dynamic choice of configurations, MoSAC, was implemented. This module follows a component-based development (CBD) approach and selects architectural configurations through the proposed multi-criteria selection process. This work also presents a case study in which an application was developed in the context of Digital TV to evaluate the time spent by the module to return a valid configuration to be used in a middleware with self-adaptive features, the AdaptTV middleware.
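A minimal sketch of the MAUT-style additive scoring that underlies this kind of selection (ours; MoSAC's actual interfaces are not given in the abstract, so all names here are hypothetical): each configuration gets a weighted sum of normalized per-criterion utilities, and the highest-scoring one is chosen.

```java
import java.util.List;
import java.util.Map;

// Hypothetical MAUT-style selection: score each architectural
// configuration by a weighted sum of normalized utilities, one per
// non-functional criterion, and pick the best.
public class MautSelector {
    record Config(String name, Map<String, Double> utilities) {}

    // Weights sum to 1; utilities are normalized to [0, 1].
    static Config select(List<Config> configs, Map<String, Double> weights) {
        Config best = null;
        double bestScore = Double.NEGATIVE_INFINITY;
        for (Config c : configs) {
            double score = 0.0;
            for (var w : weights.entrySet())
                score += w.getValue() * c.utilities().getOrDefault(w.getKey(), 0.0);
            if (score > bestScore) { bestScore = score; best = c; }
        }
        return best;
    }

    public static void main(String[] args) {
        var weights = Map.of("latency", 0.6, "bandwidth", 0.4);
        var a = new Config("A", Map.of("latency", 0.9, "bandwidth", 0.3));
        var b = new Config("B", Map.of("latency", 0.4, "bandwidth", 0.9));
        // A scores 0.66, B scores 0.60, so A is selected.
        System.out.println(select(List.of(a, b), weights).name());
    }
}
```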
Abstract:
This work proposes a model-based approach for pointcut management in the presence of evolution in aspect-oriented systems. The proposed approach, called conceptual-views-based pointcuts, is motivated by the shortcomings of traditional pointcut definitions, which generally refer directly to the software's structure and/or behavior, thereby creating a strong coupling between the pointcut definitions and the base code. This coupling causes the problem known as the pointcut fragility problem and hinders the evolution of aspect-oriented systems. The problem arises because all the pointcuts of each aspect must be reviewed after any software change or evolution, to ensure that they remain valid. Our approach focuses on defining pointcuts over a conceptual model, which describes the system's structure at a more abstract level. The conceptual model consists of classifications (called conceptual views) of the business model elements based on common characteristics, and of relationships between these views. Pointcut definitions are thus created against the conceptual model rather than referencing the base model directly. Moreover, the conceptual model contains a set of relationships that allows automatic verification of whether the classifications in the conceptual model remain valid after a software change. To this end, development with the conceptual-views-based pointcuts approach is supported by a conceptual framework called CrossMDA2 and by an MDA-based development process, both also proposed in this work. As a proof of concept, we present two versions of a case study, setting up an evolution scenario that shows how conceptual-views-based pointcuts help to detect and minimize pointcut fragility. To evaluate the proposal, the Goal/Question/Metric (GQM) technique is used together with metrics for analyzing the efficiency of the pointcut definitions.
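To illustrate the fragility problem (example ours, in AspectJ annotation style; CustomerDao and its methods are hypothetical): a pointcut that enumerates concrete base-code elements must be revisited whenever those elements change, which is precisely the coupling that conceptual views aim to avoid.

```java
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;

// Illustration (ours; requires the AspectJ weaver) of the coupling
// behind pointcut fragility: the pointcut expression enumerates concrete
// methods of the base code, so renaming save(), or adding a new
// persistence method, silently invalidates the advice.
@Aspect
public class AuditAspect {
    // Fragile: tied directly to the current structure of the base code.
    // A conceptual-view-based definition would instead quantify over a
    // stable abstraction (e.g., "all persistence operations") and check,
    // after each change, that the view's classification still holds.
    @Before("call(* CustomerDao.save(..)) || call(* CustomerDao.delete(..))")
    public void audit(JoinPoint jp) {
        System.out.println("audit: " + jp.getSignature());
    }
}
```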
Abstract:
The use of increasingly complex software applications is demanding greater investment in the development of such systems to ensure applications of better quality. Therefore, new techniques are being used in Software Engineering to make the development process more effective. Among these approaches, we highlight formal methods, which use formal languages, strongly based on mathematics, with well-defined semantics and syntax. One of these languages is Circus, which can be used to model concurrent systems. It was developed from the union of concepts from two other specification languages: Z, which specifies systems with complex data, and CSP, which is normally used to model concurrent systems. Circus has an associated refinement calculus, which can be used to develop software in a precise and stepwise fashion. Each step is justified by the application of a refinement law (possibly with the discharge of proof obligations). Sometimes the same laws can be applied in the same manner in different developments, or even in different parts of a single development. A strategy to optimize this calculus is to formalize such applications as a refinement tactic, which can then be used as a single transformation rule. CRefine was developed to support the Circus refinement calculus; however, before the work presented here, it did not provide support for refinement tactics. The aim of this work is to provide tool support for refinement tactics. To that end, we developed a new module in CRefine which automates the process of defining and applying refinement tactics formalized in the tactic language ArcAngelC. Finally, we validate the extension by applying the new module in a case study that used refinement tactics in a refinement strategy for the verification of SPARK Ada implementations of control systems; in this work, we apply our module in the first two phases of this strategy.
Abstract:
PLCs (Programmable Logic Controllers) perform control operations: they receive information from the environment, process it and modify that same environment according to the results produced. They are commonly used in industry in several applications, from mass transport to the petroleum industry. As the complexity of these applications increases, and as many of them are safety-critical, the need arises to ensure that they are reliable. Testing and simulation are the de facto methods used in industry to do so, but they can leave flaws undiscovered. Formal methods can provide more confidence in an application's safety, since they permit its mathematical verification. We make use of the B Method, which has been successfully applied to the formal verification of industrial systems, is supported by several tools, and handles decomposition, refinement and verification of correctness with respect to the specification. The method we developed and present in this work automatically generates B models from PLC programs and verifies them against safety constraints manually derived from the system requirements. The scope of our method is the set of PLC programming languages of the IEC 61131-3 standard, although we are also able to verify programs not fully compliant with the standard. Our approach aims to ease the integration of formal methods in industry by reducing the effort needed to formally verify PLCs.
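Schematically, verifying such a generated model reduces to the standard B proof obligations (standard B theory, not specific to this work): for an operation with precondition $P$ and generalized substitution $S$ to preserve an invariant $I$ (here encoding the safety constraints), one must discharge

$$I \land P \Rightarrow [S]\,I$$

where $[S]\,I$ denotes the weakest precondition of $S$ with respect to $I$.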
Abstract:
Web services are software components accessible via the Internet that provide functionality to be used by applications. Today, it is natural to reuse third-party services to compose new services. This composition process can occur in two styles, called orchestration and choreography. A choreography represents a collaboration between services that know their partners in the composition, in order to achieve the desired functionality. An orchestration, on the other hand, has a central process (the orchestrator) that coordinates all the operations of the application. Our work is placed in this latter context, proposing an abstract model for running service orchestrations. For this purpose, a graph reduction machine is defined for the implementation of service orchestrations specified in a variant of the PEWS composition language. Moreover, a prototype of this machine is built in Java as a proof of concept.
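As a toy illustration of graph reduction over orchestration terms (ours; actual PEWS terms are richer than this), consider sequential and parallel composition nodes being rewritten step by step until only completed service calls remain:

```java
// Toy sketch of a reduction machine for orchestration terms: composite
// nodes are rewritten until the whole term is Done. Requires Java 21
// (pattern matching for switch).
public class ReductionSketch {
    sealed interface Term permits Call, Seq, Par, Done {}
    record Call(String service) implements Term {}
    record Seq(Term first, Term second) implements Term {}
    record Par(Term left, Term right) implements Term {}
    record Done() implements Term {}

    // One reduction step over the term graph.
    static Term reduce(Term t) {
        return switch (t) {
            case Call c -> {
                System.out.println("invoke " + c.service());
                yield new Done(); // pretend the service call completed
            }
            case Seq s -> s.first() instanceof Done
                    ? s.second()                          // sequence moves on
                    : new Seq(reduce(s.first()), s.second());
            case Par p -> {                               // reduce both branches
                if (p.left() instanceof Done && p.right() instanceof Done)
                    yield new Done();
                yield new Par(reduce(p.left()), reduce(p.right()));
            }
            case Done d -> d;
        };
    }

    public static void main(String[] args) {
        // (call A) ; (call B || call C)
        Term t = new Seq(new Call("A"), new Par(new Call("B"), new Call("C")));
        while (!(t instanceof Done)) t = reduce(t);
    }
}
```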
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior
Abstract:
A self-adaptive software system is able to change its structure and/or behavior at runtime in response to changes in its requirements, environment or components. One way to achieve self-adaptation is to use sequences of actions (known as adaptation plans), which are typically defined at design time. This is the approach adopted by Cosmos, a framework to support the configuration and management of resources in distributed environments. In order to deal with the variability inherent to self-adaptive systems, such as the appearance of new components that allow the establishment of configurations not envisioned at development time, this dissertation aims to give Cosmos the capability of generating adaptation plans at runtime. To this end, it was necessary to reengineer the Cosmos framework to allow its integration with a mechanism for the dynamic generation of adaptation plans, and our work focused on this reengineering. Among the changes made to Cosmos, we highlight the changes to the metamodel used to represent components and applications, which was redefined based on an architectural description language. These changes were propagated to the implementation of a new Cosmos prototype, which was then used to develop a case study application as a proof of concept. Another effort was to make Cosmos more attractive by integrating it with another platform, in the case of this dissertation the OSGi platform, which is well known and accepted by industry.
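A minimal sketch of what generating an adaptation plan at runtime can look like (ours; the abstract does not describe Cosmos's actual mechanism, so all names are hypothetical): diff the current and target configurations and emit the reconciling sequence of actions.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Set;

// Hypothetical runtime plan generation: compare the current and target
// configurations and produce the action sequence that reconciles them.
public class PlanGenerator {
    enum Kind { STOP, REMOVE, ADD, START }
    record Action(Kind kind, String component) {}

    static List<Action> plan(Set<String> current, Set<String> target) {
        List<Action> actions = new ArrayList<>();
        for (String c : current)          // retire components no longer wanted
            if (!target.contains(c)) {
                actions.add(new Action(Kind.STOP, c));
                actions.add(new Action(Kind.REMOVE, c));
            }
        for (String c : target)           // introduce newly required ones
            if (!current.contains(c)) {
                actions.add(new Action(Kind.ADD, c));
                actions.add(new Action(Kind.START, c));
            }
        return actions;
    }

    public static void main(String[] args) {
        // Plan to replace httpV1 with httpV2 while keeping the logger.
        System.out.println(plan(Set.of("logger", "httpV1"),
                                Set.of("logger", "httpV2")));
    }
}
```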
Abstract:
Due to the constantly increasing use of wireless networks in domestic, business and industrial environments, new challenges have emerged. The prototyping of new protocols in these environments is typically restricted to simulation, which requires a double implementation: one in the simulation environment, where an initial proof of concept is performed, and another in a real environment. When real environments are used, it is not trivial to create a testbed for high-density wireless networks, given the need for numerous real devices as well as attenuators and power reducers to try to reduce the physical space required by such laboratories. In this context, the LVWNet (Linux Virtual Wireless Network) project was originally designed to create completely virtual testbeds for IEEE 802.11 networks on the Linux operating system. This work extends the current LVWNet project with the following features: the ability to interact with real wireless hardware; initial mobility support, positioning the nodes in a coordinate space measured in meters, with losses computed from free-space attenuation; and some scalability gains, through a dedicated protocol that allows nodes to communicate without an intermediate host and through dynamic node registration, allowing new nodes to be inserted into a network already in operation.
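The free-space attenuation computation mentioned above is presumably the standard Friis path-loss formula; a sketch of that computation (ours, not LVWNet code):

```java
// Standard free-space path loss in dB: FSPL = 20 log10(4*pi*d*f / c).
// The formula is the textbook one; its use here is our illustration.
public class Fspl {
    static double fsplDb(double meters, double hertz) {
        final double c = 299_792_458.0; // speed of light, m/s
        return 20.0 * Math.log10(4.0 * Math.PI * meters * hertz / c);
    }

    public static void main(String[] args) {
        // A 2.4 GHz link at 100 m sees roughly 80 dB of free-space loss.
        System.out.printf("%.1f dB%n", fsplDb(100.0, 2.4e9));
    }
}
```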
Abstract:
In this work, the use of coconut fiber (coir) and bamboo shafts as reinforcement of soil-cement was studied, in order to obtain an alternative material for fence posts on rural properties. The main objective was to study the effect of the addition of reinforcement to the soil-cement matrix. The effect of humidity on the mechanical properties was also analyzed. The soil-cement mortar was composed of a mixture, in equal parts, of soil and river sand, with 14% cement and 10% water by weight. As reinforcement, different combinations of (a) coconut fiber with 15 mm mean length (0.3%, 0.6% and 1.2% by weight) and (b) bamboo shafts, also in increasing quantities (2, 4 and 8 shafts per specimen), were used. For each combination, 6 specimens were made and submitted to a three-point flexural test after 28 days of curing. In order to evaluate the effect of humidity, 1 specimen from each coconut-fiber-reinforced combination was immersed in water for 24 hours prior to the flexural test. The results indicated that the addition of the reinforcement negatively affected the mechanical resistance but, on the other hand, increased the tenacity and ductility of the material.
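For reference (standard relation for the three-point flexural test, not stated in the abstract): the flexural strength of a rectangular specimen is

$$\sigma_f = \frac{3FL}{2bd^2}$$

where $F$ is the load at fracture, $L$ the support span, $b$ the specimen width and $d$ its thickness.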
Abstract:
The problem treated in this dissertation is to establish boundedness for the iterates of an iterative algorithm in