140 results for Prova Rústica Tiradentes


Relevance:

10.00%

Publisher:

Abstract:

The formalization of logical systems in natural deduction brings many metatheoretical advantages, among which the normalization proof is always highlighted. Until very recently, modal logic systems were not routinely formalized in natural deduction, though some formulations and normalization proofs are known. This work presents some important known systems of modal logic in natural deduction, along with normalization procedures for them; mainly, however, it presents a hierarchy of modal logic systems in natural deduction, from K to S5, together with an outline of a normalization proof for the system K, which serves as a model for normalization in the other systems.
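As an illustration (a common Prawitz-style formulation, offered as a sketch rather than necessarily the one adopted in this work), the □-introduction rule that characterizes K in natural deduction restricts the boxing of a formula to derivations whose open assumptions are all boxed:

```latex
% Box-introduction for K: A may be boxed only when every open
% assumption it depends on is itself boxed.
\[
\frac{\Gamma \vdash A}{\Box\Gamma \vdash \Box A}\;(\Box\text{-I}),
\qquad \Box\Gamma \;=\; \{\,\Box B \mid B \in \Gamma\,\}
\]
```

Normalization then amounts to eliminating the maximal formulas that arise when such an introduction is immediately followed by the corresponding elimination.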

Relevance:

10.00%

Publisher:

Abstract:

This study intends to show how consciousness, or Being-for-itself, presents itself as Nothingness, transcendence and freedom in Jean-Paul Sartre's O Ser e o Nada (Being and Nothingness). The starting point is Sartre's conception of consciousness as nothingness, empty of any content, connected to objects that possess a transphenomenal existence. Consciousness is thus analyzed as transcending the object that it is not, in its revealer-revealed condition: it unveils a concrete world that exists independently of it, functioning as the revealing intentionality that there are beings rather than nothingness, in accordance with the ontological proof defended by Sartre. On this view, every consciousness is always consciousness of something, a glance at the world, which prevents consciousness from being taken as a thing in the world. In order to live its original negation of the world while apprehending that same world for the purpose of knowledge, consciousness must be divided in two: first-degree consciousness, or the pre-reflective cogito, which makes reflection itself possible, since it is consciousness's own transphenomenality of being, different from everything connected to its existence and being only its consequence; and the cogito, responsible for positing first-degree consciousness, that is, consciousness aware of itself, certain that it knows. From this point, the unfolding of consciousness, or Being-for-itself, is developed as Anguish, Freedom and transcendence. The question, then, is how consciousness, which in Sartre's thought is originally nothing, can become the Freedom that presents itself in the field of transcendence; in other words, how do these three internal structures interweave to form consciousness in Sartre?
First, a conduct of human reality, the inquiry, is reviewed, through which it becomes possible to understand how Nothingness exists as the mold of every kind of negation. Next, it is shown that human reality, connected to the For-itself (Para-si) and determined as nothingness, is nonetheless proposed as Freedom. From this point, it becomes possible to glimpse how Freedom is lived by the For-itself in the shape of anguish, and how the For-itself nihilates itself, drawing anguish from its own freedom. Freedom is then the very mechanism used by the For-itself to modify its original anguish. The path the Being-for-itself must build to attain its goal is projected in transcendence: the construction of a possible that brings a being back to itself. It is also shown how anguish can occupy the moment of choice, turning the instant of decision into a stage of anguish, grounded in the instability of the Being-for-itself, since nothingness persists in the field of the possibilities that human beings preserve in their essence insofar as they are essentially Freedom. Finally, anguish is studied as the very consciousness of Freedom, and bad faith as the attempt to flee Freedom and seek shelter, contradicting the fact that life is made of continual choices.

Relevance:

10.00%

Publisher:

Abstract:

The dental record is a collection of documents produced by the professional for diagnostic and therapeutic purposes, in which the information concerning the oral and general health of patients is registered. The proper completion and filing of these documents, in compliance with ethical and legal requirements, enables the dentist to contribute to justice in cases of human identification and makes them an essential element of evidence in ethical, administrative, civil and criminal proceedings against dentists. Given this fact, and understanding these requirements and the importance of adequate record-keeping, this research assessed the knowledge of dentists in the city of Natal (RN) regarding the elaboration of the dental record, investigating the concepts and importance attributed to it, identifying the documents most used and filed by these professionals, and inquiring into the legal value of the filed documents and how long they are kept. The sample consisted of 124 dentists who answered a questionnaire after being randomly selected from a list of professionals registered in the Regional Council of Dentistry, RN Section. The analysis of the results showed that the majority of the participants (52.3%) attribute clinical importance to the dental record, followed by legal and forensic-dentistry importance; 59.3% of the professionals surveyed do not satisfactorily distinguish, or do not observe differences, between the dental record and the clinical chart, X-rays, dental certificates, prescriptions, referrals and receipts; among the documents in common use by general practitioners and specialists, the service contract and the free and informed consent form are the least used.
It was also verified that only 13.1% of the sample obtain the patient's signature on the clinical chart, which would lend it greater credibility in court. Likewise, copies of dental certificates and prescriptions reviewed and signed by the patients are filed by only 13.5% and 9.4% of the surveyed professionals, respectively, and 50% of the sample keep these documents filed for an indeterminate period of time; that is, these professionals retain custody of the record and do not intend to discard it, although 85.5% of the sample do not recognize the true owner of the record. It is concluded that a large proportion of dentists are unaware of the importance of dental documentation and neglect its elaboration, leaving themselves exposed to the various penalties foreseen in the legislation.

Relevance:

10.00%

Publisher:

Abstract:

The aim of this study was to determine the prevalence and antimicrobial susceptibility of Staphylococcus spp. from periodontally healthy patients and patients with periodontal disease, and to correlate them with host factors, the local environment and traits of the diseases. Thirty adults from 19 to 55 years old were selected; none had received periodontal treatment, and no antibiotic or antimicrobial had been administered in the three previous months. In these individuals, periodontally healthy sites and sites with chronic gingivitis and/or periodontitis were analyzed. Eighteen subgingival dental biofilm samples were collected with sterile paper points, six from each randomly selected tooth, representing the conditions mentioned. The samples were transported to the Oral Microbiology laboratory, plated onto Mannitol Salt Agar (MSA) and incubated at 37°C in air for 48 h. Staphylococcus spp. were identified by colonial morphology, Gram stain, catalase reaction, susceptibility to bacitracin and coagulase activity. After identification, the strains were submitted to susceptibility testing with 12 antimicrobials, based on the Kirby-Bauer technique. To establish the relation between the presence of coagulase-negative staphylococci (CNS), their infection levels, host factors, the local environment and traits of the diseases, the chi-square, Mann-Whitney and Kruskal-Wallis tests were used at a confidence level of 95%. CNS were harbored by 86.7% of the subjects, in 11.7% of the periodontal sites. The prevalence was 12.1% in healthy sites, 11.7% in chronic gingivitis, 13.5% in slight chronic periodontitis and 6.75% in moderate chronic periodontitis; no CNS were isolated from sites with advanced chronic periodontitis. There was no significant difference among these groups (p = 0.672), nor in the presence and infection levels of CNS in relation to host factors, the local environment and traits of the diseases.
Among the 74 CNS isolates, the greatest resistance was observed to penicillin (55.4%), erythromycin (32.4%), tetracycline (12.16%) and clindamycin (9.4%); 5.3% of the isolates were resistant to oxacillin and methicillin. No resistance was observed to ciprofloxacin, rifampicin or vancomycin. It was concluded that staphylococci are found in low numbers, in similar proportion, in healthy and diseased periodontal sites, although a trend toward reduced staphylococcal occurrence in the more advanced stages of the disease was observed. This low prevalence was not related to any of the variables analyzed. The antibiotic susceptibility profile demonstrates high resistance to penicillin and low resistance to methicillin, with significant resistance to erythromycin, tetracycline and clindamycin.

Relevance:

10.00%

Publisher:

Abstract:

In most cultures, dreams are believed to occasionally predict the future. Several neurophysiological studies indicate that the function of sleep and dreams is to consolidate and transform memories, in a cyclical process of creation, selection and generalization of conjectures about reality. The aim of the research presented here was to investigate the possible adaptive role of anticipatory dreams. We sought to determine the relationship between dreaming and waking life in a context in which the adaptive success of the individual was genuinely at stake, so as to mobilize oneiric activity more strongly. We used the entrance examination of the Federal University of Rio Grande do Norte (UFRN) as a significant waking event whose performance could be independently quantified. Through a partnership with UFRN, we contacted 3,000 candidates for the 2009 examination by e-mail; in addition, 150 candidates were approached personally. Candidates who agreed to participate in the study (n = 94) completed questionnaires specific to the examination and were asked to describe their dreams during the examination period. Each candidate's performance in the entrance examination was provided to the researcher by UFRN. A total of 45 participants reported dreams related to the examination. Our results show a positive correlation between performance on the examination and anticipatory dreams of the event, both in the comparison of performance on the objective and discursive tests and in final approval: in the group that did not dream about the exam, the general approval rate, 22.45%, was similar to that found in the selection process as a whole, 22.19%, while for the group that dreamed about the examination the rate was 35.56%.
The occurrence of anticipatory dreams reflects increased concern during waking (psychobiological mobilization) related to the future event, as indicated by higher scores of fear and apprehension and greater changes in daily life and in patterns of mood and sleep in the group that reported test-related dreams. Furthermore, the data suggest a role for dreams in shaping environmentally relevant waking behavior, simulating possible scenarios of success (dreams of approval) and failure (nightmares) so as to maximize the adaptive success of the individual.

Relevance:

10.00%

Publisher:

Abstract:

The next generation of computers is expected to rely on architectures with multiple processors and/or multicore processors. This raises challenges related to interconnection features, operating frequency, on-chip area, power dissipation, performance and programmability. Networks-on-chip are considered the ideal interconnection and communication mechanism for this type of architecture, due to their scalability, reusability and intrinsic parallelism. Communication in networks-on-chip is accomplished by transmitting packets that carry data and instructions representing requests and responses between the processing elements interconnected by the network. Packets are transmitted as in a pipeline between the routers of the network, from the source to the destination of the communication, even allowing simultaneous communications between different source-destination pairs. Based on this fact, this work proposes transforming the entire communication infrastructure of a network-on-chip, with its routing, arbitration and storage mechanisms, into a high-performance parallel processing system. In this proposal, the packets are formed by the instructions and data that represent the applications, which are executed in the routers as they are transmitted, exploiting the pipelined and parallel nature of the transmissions. Traditional processors are not used; instead, simple cores only control access to memory. An implementation of this idea is called IPNoSys (Integrated Processing NoC System), which has its own programming model and a routing algorithm that guarantees the execution of all instructions in the packets, preventing deadlock, livelock and starvation. The architecture provides mechanisms for input and output, interrupts and operating system support.
As a proof of concept, a programming environment and a simulator for this architecture were developed in SystemC, allowing the configuration of various parameters and the collection of several results for its evaluation.
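The execute-while-routing idea can be sketched as follows. This is a toy illustration only: the instruction set, packet layout and function names are invented here and do not reflect the actual IPNoSys packet format.

```python
def run_packet(instructions, operands):
    """Execute a packet hop by hop: each router consumes the next
    instruction, applies it to the packet's operand stack, and forwards
    the shortened packet to the next router on the path."""
    alu = {'add': lambda a, b: a + b, 'mul': lambda a, b: a * b}
    stack = list(operands)
    hops = 0
    while instructions:
        instr, *instructions = instructions  # this router consumes one instruction
        b, a = stack.pop(), stack.pop()
        stack.append(alu[instr](a, b))
        hops += 1                            # one instruction per router hop
    return stack[-1], hops
```

When the packet reaches its destination, the program has already been fully executed: computation and communication coincide.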

Relevance:

10.00%

Publisher:

Abstract:

The Nelson-Oppen combination method allows several decision procedures, each designed for a specific theory, to be combined to reason about more comprehensive theories, through the principle of equality propagation. Theorem provers based on this model benefit from its modular character and can evolve more easily and incrementally. Difference logic is a subtheory of linear arithmetic, formed by constraints of the form x − y ≤ c, where x and y are variables and c is a constant. Difference logic is very common in many problems, such as digital circuits, scheduling and temporal systems, and is predominant in several other cases. Difference logic can also be modeled using graph theory, which allows several well-known, efficient graph algorithms to be used. A decision procedure for difference logic must be able to reason over thousands of constraints. Its main goal is to report whether a set of difference logic constraints is satisfiable (the variables can assume values that make the set consistent) or not. Moreover, to work in a Nelson-Oppen combination model, the decision procedure needs other functionalities, such as the generation of variable equalities, proofs of inconsistency, premises, etc. This work presents a decision procedure for the theory of difference logic within an architecture based on the Nelson-Oppen combination method. The work was carried out by integrating the procedure into the haRVey prover, where its operation could be observed. Implementation details and experimental tests are reported.
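The standard graph encoding mentioned above can be sketched in a few lines: each constraint x − y ≤ c becomes an edge y → x with weight c, and the conjunction is satisfiable exactly when the graph has no negative cycle (detectable with Bellman-Ford). This is a minimal sketch of the textbook encoding, not the haRVey integration; the function name is illustrative.

```python
def dl_satisfiable(constraints, variables):
    """Decide a conjunction of difference constraints x - y <= c.
    constraints: list of (x, y, c) triples; returns a model (dict from
    variable to value) or None when the set is inconsistent."""
    # Graph encoding: x - y <= c becomes an edge y -> x with weight c.
    edges = [(y, x, c) for (x, y, c) in constraints]
    source = object()                      # virtual source reaching all nodes
    edges += [(source, v, 0) for v in variables]
    dist = {v: float('inf') for v in variables}
    dist[source] = 0
    for _ in range(len(variables)):        # |V| - 1 Bellman-Ford passes
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    for u, v, w in edges:                  # one extra pass: any improvement
        if dist[u] + w < dist[v]:          # exposes a negative cycle
            return None
    return {v: dist[v] for v in variables}
```

The shortest-path distances themselves form a satisfying assignment, since dist[x] ≤ dist[y] + c holds for every edge.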

Relevance:

10.00%

Publisher:

Abstract:

The use of middleware technology in various types of systems, in order to abstract low-level details related to the distribution of application logic, is increasingly common. Among the systems that can benefit from these components, we highlight distributed systems, where communication between software components located on different physical machines must be supported. An important issue in the communication between distributed components is the provision of mechanisms for managing quality of service. This work presents a metamodel for modeling component-based middleware that provides an application with an abstraction of the communication between the components involved in a data stream, regardless of their location. Another feature of the metamodel is the possibility of self-adaptation of the communication mechanism, either by updating the values of its configuration parameters or by replacing it with another mechanism, when the specified quality-of-service restrictions are not being met. To this end, the state of the communication is monitored (applying techniques such as a feedback control loop) and the related performance metrics are analyzed. The Model-Driven Development (MDD) paradigm was used to generate the implementation of a middleware that serves as a proof of concept of the metamodel, as well as the configuration and reconfiguration policies related to the dynamic adaptation processes. In this context, the metamodel associated with the process of configuring a communication was defined. The MDD application also comprises the definition of the following transformations: from the architectural model of the middleware to Java code, and from the configuration model to XML.
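The two adaptation actions described above (parameter tuning versus mechanism replacement, driven by a feedback loop) can be sketched as follows. All class and parameter names here are illustrative assumptions, not the metamodel's actual vocabulary.

```python
class CommChannel:
    """Toy communication mechanism with one tunable configuration parameter."""
    def __init__(self, buffer_size=8):
        self.buffer_size = buffer_size

class QosMonitor:
    """Feedback control loop: observe a metric, compare it against the
    QoS restriction, and either retune or replace the mechanism."""
    def __init__(self, channel, max_latency_ms):
        self.channel = channel
        self.max_latency_ms = max_latency_ms

    def control_step(self, measured_latency_ms):
        if measured_latency_ms <= self.max_latency_ms:
            return 'ok'                     # restriction met, no action
        if self.channel.buffer_size < 64:
            self.channel.buffer_size *= 2   # first try reconfiguration
            return 'reconfigured'
        self.channel = CommChannel()        # tuning exhausted: replace
        return 'replaced'
```

Each `control_step` call corresponds to one iteration of the monitor-analyze-adapt cycle.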

Relevance:

10.00%

Publisher:

Abstract:

The process of choosing the best components to build systems has become increasingly complex. It becomes more critical when many combinations of components must be considered in the context of an architectural configuration. These circumstances occur mainly in systems with critical requirements, such as timing constraints in distributed multimedia systems, network bandwidth in mobile applications or reliability in real-time systems. This work proposes a process for the dynamic selection of architectural configurations based on the non-functional requirements of the system, which can be used during a dynamic adaptation. The proposal uses Multi-Attribute Utility Theory (MAUT) for decision making over a finite set of possibilities involving multiple criteria to be analyzed. Additionally, a metamodel is proposed to describe the application's requirements in terms of non-functional criteria and their expected values, so that the desired configuration can be selected. As a proof of concept, a module that performs the dynamic choice of configurations, MoSAC, was implemented using a component-based development (CBD) approach, selecting architectural configurations through the proposed multi-criteria selection process. This work also presents a case study in which an application was developed in the context of Digital TV to evaluate the time spent by the module to return a valid configuration to be used in a middleware with self-adaptive features, the AdaptTV middleware.

Relevance:

10.00%

Publisher:

Abstract:

This work proposes a model-based approach for pointcut management in the presence of evolution in aspect-oriented systems. The proposed approach, called conceptual-views-based pointcuts, is motivated by the shortcomings of traditional pointcut definitions, which generally refer directly to the software's structure and/or behavior, thereby creating a strong coupling between the pointcut definitions and the base code. This coupling causes what is known as the pointcut fragility problem and hinders the evolution of aspect-oriented systems: whenever the software changes or evolves, all the pointcuts of each aspect must be reviewed to ensure that they remain valid. Our approach focuses on defining pointcuts over a conceptual model, which describes the system's structure at a more abstract level. The conceptual model consists of classifications (called conceptual views) of the business model elements, based on common characteristics, and of relationships between these views. Pointcut definitions are thus created over the conceptual model rather than referencing the base model directly. Moreover, the conceptual model contains a set of relationships that allows it to be verified automatically whether the classifications in the conceptual model remain valid after a software change. All development using the conceptual-views-based pointcuts approach is supported by a framework called CrossMDA2 and a development process based on MDA, both also proposed in this work. As a proof of concept, we present two versions of a case study, setting up an evolution scenario that shows how the use of conceptual-views-based pointcuts helps to detect and minimize pointcut fragility. The proposal is evaluated with the Goal/Question/Metric (GQM) technique, together with metrics for analyzing the efficiency of the pointcut definitions.
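The indirection that protects pointcuts from base-code changes can be sketched as follows. This is only an illustration of the idea, assuming invented names; it is not the CrossMDA2 mechanism.

```python
# A conceptual view classifies base-model elements under a concept;
# pointcuts quantify over the concept, never over class names.
conceptual_views = {
    'persistent_entity': {'Order', 'Customer', 'Invoice'},
}

def pointcut_matches(view_name, class_name, views=conceptual_views):
    """A view-based pointcut: matches any element classified in the view."""
    return class_name in views.get(view_name, set())

def rename_class(old, new, views=conceptual_views):
    """Evolution step: only the view mapping is updated, so every
    pointcut written against the view remains valid."""
    for members in views.values():
        if old in members:
            members.remove(old)
            members.add(new)
```

Renaming a base class updates one mapping entry instead of invalidating every pointcut that mentioned the old name.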

Relevance:

10.00%

Publisher:

Abstract:

The use of increasingly complex software applications is demanding greater investment in their development, to ensure applications of better quality. Therefore, new techniques are being used in Software Engineering to make the development process more effective. Among these approaches, we highlight Formal Methods, which use formal languages strongly based on mathematics, with well-defined semantics and syntax. One of these languages is Circus, which can be used to model concurrent systems. It was developed from the union of concepts from two other specification languages: Z, which specifies systems with complex data, and CSP, which is normally used to model concurrent systems. Circus has an associated refinement calculus, which can be used to develop software in a precise and stepwise fashion. Each step is justified by the application of a refinement law (possibly with the discharge of proof obligations). Sometimes, the same laws are applied in the same manner in different developments, or even in different parts of a single development. A strategy to optimize this calculus is to formalize such applications as a refinement tactic, which can then be used as a single transformation rule. CRefine was developed to support the Circus refinement calculus; however, before the work presented here, it did not provide support for refinement tactics. The aim of this work is to provide tool support for refinement tactics. For that, we developed a new module in CRefine which automates the process of defining and applying refinement tactics formalized in the tactic language ArcAngelC. Finally, we validate the extension by applying the new module in a case study that used refinement tactics in a refinement strategy for the verification of SPARK Ada implementations of control systems; we apply our module in the first two phases of that strategy.
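The idea of packaging a sequence of law applications as a single transformation rule can be sketched as follows. The "laws" here are toy string rewrites, not Circus laws, and the combinator is only an illustration of sequential tactic composition as found in languages like ArcAngelC.

```python
def tactic(*laws):
    """Compose refinement laws into one reusable transformation rule:
    the tactic succeeds only if every law applies in sequence."""
    def apply_tactic(term):
        for law in laws:
            term = law(term)
            if term is None:          # law not applicable: the tactic fails
                return None
        return term
    return apply_tactic

# Two toy "laws" over string terms (illustrative only):
strengthen = lambda t: t.replace('abort', 'skip') if 'abort' in t else None
introduce_seq = lambda t: t + '; skip'
```

A tactic either transforms the whole term or fails atomically, just as a single law would.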

Relevance:

10.00%

Publisher:

Abstract:

PLCs (Programmable Logic Controllers) perform control operations: they receive information from the environment, process it and modify that same environment according to the results produced. They are commonly used in industry in several applications, from mass transit to the petroleum industry. As the complexity of these applications increases, and as many of them are safety-critical, the need arises to ensure that they are reliable. Testing and simulation are the de facto methods used in industry to do so, but they can leave flaws undiscovered. Formal methods can provide more confidence in an application's safety, since they permit its mathematical verification. We make use of the B Method, which has been successfully applied in the formal verification of industrial systems, is supported by several tools, and can handle decomposition, refinement and the verification of correctness with respect to the specification. The method we developed and present in this work automatically generates B models from PLC programs and verifies them against safety constraints manually derived from the system requirements. The scope of our method is the PLC programming languages of the IEC 61131-3 standard, although we are also able to verify programs not fully compliant with the standard. Our approach aims to ease the integration of formal methods in industry by reducing the effort of performing formal verification on PLCs.
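The kind of property being verified can be illustrated by an explicit-state sketch: run the PLC scan cycle from every reachable state under every input combination and check a safety constraint on each successor. The B Method discharges such obligations by proof rather than enumeration; the controller and all names below are invented for illustration.

```python
from itertools import product

class TankController:
    """Toy scan-cycle program: open the inlet valve when the level is low,
    the outlet valve when it is high. Sensor inputs: (low, high)."""
    initial = (False, False)              # state: (inlet_open, outlet_open)
    n_inputs = 2

    def scan(self, state, inputs):
        low, high = inputs
        return (low and not high, high and not low)

def check_safety(program, safe, inputs=(False, True)):
    """Explore every reachable state under all input combinations,
    checking the safety constraint on each successor state."""
    reachable, frontier = set(), {program.initial}
    while frontier:
        state = frontier.pop()
        if state in reachable:
            continue
        reachable.add(state)
        for ins in product(inputs, repeat=program.n_inputs):
            nxt = program.scan(state, ins)
            if not safe(nxt):
                return False, nxt          # counterexample state
            frontier.add(nxt)
    return True, None
```

The safety constraint "the inlet and outlet valves are never open together" corresponds to an invariant clause of the generated B machine.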

Relevance:

10.00%

Publisher:

Abstract:

Web services are software components accessible via the Internet that provide functionality to be used by applications. Today, it is natural to reuse third-party services to compose new services. This composition process can occur in two styles, called orchestration and choreography. A choreography represents a collaboration between services which know their partners in the composition, in order to achieve the desired functionality, while an orchestration has a central process (the orchestrator) that coordinates all the application's operations. Our work is placed in the latter context: it proposes an abstract model for running service orchestrations. For this purpose, a graph reduction machine is defined for the execution of service orchestrations specified in a variant of the PEWS composition language. Moreover, a prototype of this machine (in Java) is built as a proof of concept.
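The evaluation of composition operators can be sketched with a small recursive reducer over orchestration terms. This is only an illustration of the reduction idea under invented operator names; the actual machine works on a shared graph and handles the full PEWS operator set.

```python
def reduce_term(term, invoke):
    """Reduce an orchestration term: leaves are service invocations,
    'seq' runs left then right, 'par' runs both and pairs the results."""
    if not isinstance(term, tuple):        # leaf: invoke the service
        return invoke(term)
    op, left, right = term
    if op == 'seq':                        # sequential composition
        reduce_term(left, invoke)
        return reduce_term(right, invoke)
    if op == 'par':                        # parallel composition
        return (reduce_term(left, invoke), reduce_term(right, invoke))
    raise ValueError('unknown operator: %s' % op)
```

Each reduction step rewrites a node whose children are already values, which is the essence of graph reduction even when the graph is represented as a tree, as here.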

Relevance:

10.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior

Relevance:

10.00%

Publisher:

Abstract:

A self-adaptive software system is able to change its structure and/or behavior at runtime in response to changes in its requirements, environment or components. One way to achieve self-adaptation is the use of sequences of actions (known as adaptation plans), which are typically defined at design time. This is the approach adopted by Cosmos, a framework to support the configuration and management of resources in distributed environments. In order to deal with the variability inherent to self-adaptive systems, such as the appearance of new components that allow the establishment of configurations that were not envisioned at development time, this dissertation aims to give Cosmos the capability of generating adaptation plans at runtime. To this end, it was necessary to reengineer the Cosmos framework so that it could be integrated with a mechanism for the dynamic generation of adaptation plans, and our work focused on this reengineering. Among the changes made to Cosmos, we highlight the changes to the metamodel used to represent components and applications, which was redefined based on an architectural description language. These changes were propagated to the implementation of a new Cosmos prototype, which was then used to develop a case study application as a proof of concept. A further effort was to make Cosmos more attractive by integrating it with another platform; in the case of this dissertation, the OSGi platform, which is well known and accepted by industry.
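The shift from design-time to runtime plans can be sketched in its simplest form: derive the plan from the difference between the current configuration and the target one. This is a deliberately minimal illustration with invented names, not the Cosmos plan-generation mechanism.

```python
def adaptation_plan(current, target):
    """Generate an adaptation plan at runtime as the sequence of
    remove/add actions that turns the current set of deployed
    components into the target configuration."""
    plan = [('remove', c) for c in sorted(current - target)]
    plan += [('add', c) for c in sorted(target - current)]
    return plan
```

Because the plan is computed from whatever components exist when adaptation is triggered, configurations unknown at development time can still be reached.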