939 results for "protocolos de sincronização" (synchronization protocols)


Relevance: 10.00%

Abstract:

The regeneration of bone defects with loss of substance remains a therapeutic challenge in the medical field. There are basically four types of grafts: autologous, allogeneic, xenogeneic and isogenic. There is a consensus that autologous bone is the most suitable material for this purpose, but there are limitations to its use, especially the limited amount available from the donor site. Studies show that the components of the extracellular matrix (ECM) are generally conserved between different species and are well tolerated even by xenogeneic recipients. Thus, several studies have searched for a substitute for autogenous bone grafts using the technique of decellularization. To obtain these scaffolds, the tissue must undergo a cell-removal process that causes minimal adverse effects on the composition, biological activity and mechanical integrity of the remaining extracellular matrix. There is, however, no agreement among researchers about the best decellularization protocol, since each treatment interferes differently with the biochemical composition, ultrastructure and mechanical properties of the extracellular matrix, affecting the type of immune response to the material. The scarcity of research on bone tissue decellularization is a further obstacle to reaching a consensus protocol. The present study aimed to evaluate the influence of decellularization methods on the production of biological scaffolds from skeletal organs of mice, for later use in grafting. This was a laboratory study carried out in two distinct stages. In the first stage, 12 mouse hemi-calvariae were evaluated, divided into three groups (n = 4) and submitted to three different decellularization protocols (SDS [Group I], trypsin [Group II], Triton X-100 [Group III]). The aim was to identify the protocol that combines the most efficient cell removal with the best structural preservation of the bone extracellular matrix. To this end, we performed a quantitative analysis of the number of remaining cells and a descriptive analysis of the scaffolds, both by microscopy. In the second stage, a study was conducted to evaluate the in vitro adhesion of mouse bone-marrow mesenchymal cells cultured on these previously decellularized scaffolds. Manual counting of cells on the scaffolds showed complete cell removal in Group II, practically complete cell removal in Group I, and cell remnants in Group III. The findings revealed a significant difference only between Groups II and III (p = 0.042). Better maintenance of the collagen structure was obtained with Triton X-100, whereas decellularization with trypsin was responsible for the largest structural changes in the scaffolds. After culture, adhesion of mesenchymal cells was only observed in specimens decellularized with trypsin. Due to its potential for total cell removal and its ability to allow cell adherence, the trypsin-based protocol (Group II) was considered the most suitable for future experiments involving grafting of decellularized bone scaffolds.

Relevance: 10.00%

Abstract:

The evaluation criteria for cases treated with dental implants are based on clinical and radiographic examinations. In this context, it is important to conduct research to determine the prognosis of different types of prosthetic rehabilitation and to identify the main problems affecting this type of treatment. Thus, the objective of this study was to assess the prosthetic condition of individuals rehabilitated with implant-supported prostheses. In this cross-sectional study, 153 patients were examined, accounting for a sample of 509 implants. Failures were recorded by clinical and radiographic examination. The results showed that screw fracture (0.2%), screw loss (0.4%) and screw loosening (3.3%) were the least frequent failures. Fractures of structural components such as resin (12.4%), porcelain (5.5%) and metal (1.5%), loss of the resin covering the screw (23.8%) and loss of retention of implant-supported overdentures (18.6%) had a higher occurrence. Misfit between the abutment and the implant (6.9%), and especially between the prosthesis and the abutment (25.4%), had a high prevalence and, when related to other parameters, showed a significant association, particularly with cemented prostheses (OR = 6.79). It can be concluded that, to minimize the appearance of failures, protocols must be observed from diagnosis to the placement and follow-up of implant-supported prostheses, particularly with respect to the technical steps of prosthesis fabrication and to care in radiographically evaluating the fit between their components.

Relevance: 10.00%

Abstract:

Context-aware applications are typically dynamic and use services provided by several sources with different quality levels. Context information quality is expressed in terms of Quality of Context (QoC) metadata, such as precision, correctness, refreshment, and resolution. Service quality, in turn, is expressed via Quality of Service (QoS) metadata such as response time, availability and error rate. In order to ensure that an application is using services and context information that meet its requirements, it is essential to continuously monitor this metadata. For this purpose, a QoS and QoC monitoring mechanism is needed that meets the following requirements: (i) support for measuring and monitoring QoS and QoC metadata; (ii) support for synchronous and asynchronous operation, enabling the application both to periodically gather the monitored metadata and to be asynchronously notified whenever a given metadata item becomes available; (iii) the use of ontologies to represent information, in order to avoid ambiguous interpretation. This work presents QoMonitor, a module for QoS and QoC metadata monitoring that meets the above-mentioned requirements. The architecture and implementation of QoMonitor are discussed. To support asynchronous communication, QoMonitor uses two protocols: JMS and Light-PubSubHubbub. In order to illustrate QoMonitor in the development of ubiquitous applications, it was integrated into OpenCOPI (Open COntext Platform Integration), a middleware platform that integrates several context-provision middleware systems. To validate QoMonitor we used two applications as proof of concept: an oil and gas monitoring application and a healthcare application. This work also presents a validation of QoMonitor in terms of the performance of both synchronous and asynchronous requests.
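The dual synchronous/asynchronous access pattern described above can be sketched as follows. This is a minimal illustration in Python, not QoMonitor's actual API: the class and method names are hypothetical, and a real monitor would sit behind JMS or Light-PubSubHubbub rather than in-process callbacks.

```python
import threading
from collections import defaultdict

class MetadataMonitor:
    """Hypothetical sketch: latest QoS/QoC metadata values, accessible
    by synchronous polling (get) or asynchronous notification (subscribe)."""

    def __init__(self):
        self._values = {}                      # latest value per metadata key
        self._subscribers = defaultdict(list)  # callbacks per metadata key
        self._lock = threading.Lock()

    def update(self, key, value):
        # Called by measurement probes; notifies asynchronous subscribers.
        with self._lock:
            self._values[key] = value
            callbacks = list(self._subscribers[key])
        for cb in callbacks:
            cb(key, value)

    def get(self, key):
        # Synchronous access: the application polls the latest measurement.
        with self._lock:
            return self._values.get(key)

    def subscribe(self, key, callback):
        # Asynchronous access: callback fires whenever the metadata changes.
        with self._lock:
            self._subscribers[key].append(callback)

monitor = MetadataMonitor()
events = []
monitor.subscribe("response_time", lambda k, v: events.append((k, v)))
monitor.update("response_time", 120)  # e.g. 120 ms measured by a probe
print(monitor.get("response_time"))   # -> 120
print(events)                         # -> [('response_time', 120)]
```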

Relevance: 10.00%

Abstract:

Peri- and postoperative uveitis is the main problem of cataract extraction surgery in dogs and is considered the most important factor for immediate and late surgical success. Several pre- and postoperative protocols using steroidal and non-steroidal anti-inflammatory agents have been employed in an attempt to control surgically induced uveitis. The aim of the present study was to evaluate the postoperative inflammatory reaction, clinically and through intraocular pressure (IOP), after phacoemulsification surgery for cataract extraction in dogs, with and without piggyback intraocular lens (IOL) implantation. Twenty-five dogs with cataracts were divided into two groups: G1 (with IOL implantation) and G2 (without IOL implantation). The surgical technique adopted was unilateral bimanual phacoemulsification. Clinical evaluations and IOP measurements were performed before the surgical procedure (0) and at 3, 7, 14, 21, 28 and 60 days after surgery. Dogs in group G1 showed visibly more intense clinical signs of uveitis than those in G2. However, IOP showed no significant difference between the two groups, nor between operated and contralateral eyes. The use of two human IOLs in piggyback in dogs is feasible, but gives rise to more inflammation and complications in the postoperative period.

Relevance: 10.00%

Abstract:

In recent years, several middleware platforms for Wireless Sensor Networks (WSNs) have been proposed. Most of these platforms do not consider how to integrate components from generic middleware architectures. Many requirements must be considered in a middleware design for WSNs; in particular, it should be possible to modify the middleware source code without changing its external behavior. It is therefore desirable to have a generic middleware architecture able to offer an optimal configuration according to the requirements of the application. Adopting middleware based on a component model is a promising approach, because it allows better abstraction, low coupling, modularization and built-in management features. Another problem of current middleware is the treatment of interoperability between sensor networks and external networks, such as the Web. Most current middleware lacks the functionality to expose the data provided by the WSN on the World Wide Web, so that these data can be treated as Web resources and accessed through protocols already adopted on the Web. This work therefore presents Midgard, a component-based middleware specifically designed for WSNs, which adopts the microkernel and REST architectural patterns. The microkernel pattern complements the component model, since the microkernel can be understood as a component that encapsulates the core system and is responsible for initializing core services only when needed, as well as removing them when they are no longer needed. REST, in turn, defines a standardized way of communication between different applications based on Web standards, and makes it possible to treat WSN data as Web resources, allowing them to be accessed through protocols already adopted on the World Wide Web. The main goals of Midgard are: (i) to provide easy Web access to data generated by the WSN, exposing such data as Web resources following the principles of the Web of Things paradigm; and (ii) to provide the WSN application developer with the ability to instantiate only the specific services required by the application, thus generating a customized middleware and saving node resources. Midgard allows the WSN to be used as a set of Web resources while providing a cohesive and weakly coupled software architecture, addressing interoperability and customization. In addition, Midgard provides two services needed by most WSN applications: (i) configuration and (ii) inspection and adaptation. New services can be implemented by third parties and easily incorporated into the middleware, thanks to its flexible and extensible architecture. According to the assessment, Midgard provides interoperability between the WSN and external networks, such as the Web, as well as between different applications within a single WSN. We also assessed memory consumption, application image size, the size of messages exchanged in the network, response time, overhead and scalability. During the evaluation, Midgard proved to satisfy its goals and was shown to be scalable without consuming resources prohibitively.
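The Web of Things idea of exposing sensor readings as REST resources can be sketched as below. The resource paths, node identifiers and readings are illustrative assumptions, not Midgard's actual interface, and a real deployment would serve this routing logic through an HTTP server on (or in front of) the sensor nodes.

```python
import json

# Hypothetical in-memory resource tree: each sensor node exposes its
# readings as Web resources, e.g. GET /nodes/7/temperature.
sensors = {
    "7": {"temperature": 23.5, "humidity": 61.0},
    "12": {"temperature": 22.1},
}

def handle(method, path):
    """Resolve a REST-style request against the WSN resource tree."""
    parts = [p for p in path.split("/") if p]
    if method != "GET":
        return 405, json.dumps({"error": "method not allowed"})
    if len(parts) == 1 and parts[0] == "nodes":
        return 200, json.dumps(sorted(sensors))   # list known node ids
    if len(parts) == 3 and parts[0] == "nodes":
        node, metric = parts[1], parts[2]
        if node in sensors and metric in sensors[node]:
            return 200, json.dumps({metric: sensors[node][metric]})
    return 404, json.dumps({"error": "not found"})

status, body = handle("GET", "/nodes/7/temperature")
print(status, body)   # -> 200 {"temperature": 23.5}
```

Because every reading is addressable by a plain URL and returned as JSON, any Web client can consume WSN data without knowing the underlying sensor protocol, which is the interoperability point the abstract makes.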

Relevance: 10.00%

Abstract:

This dissertation aims at extending the JCircus tool, a translator of formal specifications into code that receives a Circus specification as input and translates it into Java code. Circus is a formal language whose syntax is based on the syntaxes of Z and CSP. The code generated by JCircus uses JCSP, a Java API that implements CSP primitives. As JCSP does not implement all of CSP's primitives, the translation strategy from Circus to Java is not trivial: some CSP primitives, such as parallelism, external choice, communication and multi-synchronization, are only partially implemented. As an additional contribution, this dissertation also develops a tool for testing JCSP programs, called JCSPUnit, which will be included in the new version of JCircus. The extended version of JCircus will be called JCircus 2.0.
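JCSP provides CSP channels in Java; as a language-neutral illustration of the channel-communication primitive mentioned above, the following Python sketch approximates a CSP channel with a bounded queue. Note the approximation: a true CSP channel is a fully synchronized rendezvous between writer and reader, which a queue with capacity one only imitates.

```python
import queue
import threading

# A bounded queue standing in for a CSP channel between two processes.
channel = queue.Queue(maxsize=1)
results = []

def producer():
    for value in range(3):
        channel.put(value)   # write end of the channel (blocks when full)
    channel.put(None)        # poison pill: signals termination to the reader

def consumer():
    while True:
        value = channel.get()  # read end of the channel (blocks when empty)
        if value is None:
            break
        results.append(value)

threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)   # -> [0, 1, 2]
```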

Relevance: 10.00%

Abstract:

With the advance of the Cloud Computing paradigm, a single service offered by a cloud platform may not be enough to meet all application requirements. To fulfill such requirements, instead of a single service, a composition of services aggregating services provided by different cloud platforms may be necessary. In order to generate aggregated value for the user, this composition of services provided by several Cloud Computing platforms requires a solution in terms of platform integration, which encompasses the manipulation of a wide range of non-interoperable APIs and protocols from different platform vendors. In this scenario, this work presents Cloud Integrator, a middleware platform for composing services provided by different Cloud Computing platforms. Besides providing an environment that facilitates the development and execution of applications using such services, Cloud Integrator works as a mediator, providing mechanisms for building applications through the composition and selection of semantic Web services that take into account metadata about the services, such as QoS (Quality of Service) and prices. Moreover, the proposed middleware platform provides an adaptation mechanism that can be triggered in case of failure or quality degradation of one or more services used by the running application, in order to ensure its quality and availability. Through a case study consisting of an application that uses services provided by different cloud platforms, Cloud Integrator is evaluated in terms of the efficiency of its service composition, selection and adaptation processes, as well as the potential of using this middleware in heterogeneous cloud computing scenarios.
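The metadata-driven service selection described above can be sketched as follows. The candidate services, the metadata fields and the cheapest-feasible-service policy are illustrative assumptions, not Cloud Integrator's actual selection algorithm.

```python
# Hypothetical candidates: each service advertises QoS metadata and a price.
candidates = [
    {"name": "storage-a", "response_time": 120, "availability": 0.999, "price": 0.10},
    {"name": "storage-b", "response_time": 80,  "availability": 0.990, "price": 0.15},
    {"name": "storage-c", "response_time": 200, "availability": 0.999, "price": 0.05},
]

def select(services, max_response_time, min_availability):
    """Keep services meeting the QoS requirements, then pick the cheapest.
    (Real engines typically weight several QoS dimensions at once.)"""
    feasible = [s for s in services
                if s["response_time"] <= max_response_time
                and s["availability"] >= min_availability]
    return min(feasible, key=lambda s: s["price"], default=None)

best = select(candidates, max_response_time=150, min_availability=0.995)
print(best["name"])   # -> storage-a
```

Adaptation fits the same shape: if the selected service fails or its quality degrades, the engine re-runs the selection over the remaining candidates and rebinds the running application to the new choice.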

Relevance: 10.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 10.00%

Abstract:

Web services are software units that allow access to one or more resources, supporting the deployment of business processes on the Web. They use well-defined interfaces based on standard Web protocols, making communication possible between entities implemented on different platforms. Due to these features, Web services can be integrated into service compositions to form more robust, loosely coupled applications. Web services are subject to failures, unwanted situations that may compromise the business process partially or completely. Failures can occur both in the design of compositions and in their execution. As a result, it is essential to create mechanisms that make the execution of service compositions more robust and that handle failures. Specifically, we propose support for fault recovery in service compositions described in the PEWS language and executed on PEWS-AM, a graph reduction machine. To support failure recovery in PEWS-AM, we extend the PEWS language specification and adapt the graph translation and reduction rules of this machine. These contributions were made both at the abstract machine level and at the implementation level.
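The kind of failure handling proposed above, recovering when a composition step fails at execution time, can be sketched generically. This retry-then-fallback policy is only an illustration of the idea, not the recovery semantics of PEWS-AM; the function and service names are hypothetical.

```python
# Hedged sketch: invoke one step of a service composition, retrying on
# transient failures and switching to a declared fallback service if the
# retries are exhausted.
def invoke_with_recovery(operation, fallback=None, retries=2):
    for _ in range(retries + 1):
        try:
            return operation()
        except RuntimeError:
            continue       # transient failure: retry the same service
    if fallback is not None:
        return fallback()  # recovery: bind to an alternative service
    raise RuntimeError("composition step failed after recovery attempts")

calls = {"n": 0}
def flaky_service():
    # Hypothetical service that fails twice before succeeding.
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("timeout")
    return "ok"

print(invoke_with_recovery(flaky_service))   # -> ok (on the 3rd attempt)
```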

Relevance: 10.00%

Abstract:

Due to the constantly increasing use of wireless networks in domestic, business and industrial environments, new challenges have emerged. The prototyping of new protocols in these environments is typically restricted to simulation environments, which require a double implementation: one in the simulation environment, where an initial proof of concept is performed, and another in a real environment. Moreover, when real environments are used, it is not trivial to create a testbed for high-density wireless networks, given the need for a large amount of real equipment, as well as attenuators and power reducers to try to reduce the physical space required by such laboratories. In this context, the LVWNet (Linux Virtual Wireless Network) project was originally designed to create completely virtual testbeds for IEEE 802.11 networks on the Linux operating system. This work extends the current LVWNet project, adding features such as the ability to interact with real wireless hardware; initial mobility support, based on positioning the nodes in a coordinate space measured in meters, with loss calculations due to free-space attenuation; and some scalability gains, achieved by creating a dedicated protocol that allows communication between nodes without an intermediate host, together with dynamic node registration, which allows new nodes to be inserted into a network already in operation.
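The free-space attenuation mentioned above is conventionally computed with the free-space path loss formula derived from the Friis transmission equation; a minimal sketch follows (the 10 m / 2.4 GHz figures are just an example, not values from LVWNet).

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB:
    FSPL = 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    c = 299_792_458.0  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

# e.g. two IEEE 802.11 nodes 10 m apart on a 2.4 GHz channel
loss = fspl_db(10, 2.4e9)
print(round(loss, 1))   # -> roughly 60 dB
```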

Relevance: 10.00%

Abstract:

Objective: To analyze the effects of aspirin and acetic acid solutions, in vivo, on the liver of healthy rabbits, verifying the histolytic effect, the anatomopathological result of the lesions and possible hepatic biochemical alterations. Methods: Eighty rabbits were used, divided into two experimental protocols (1 and 2), each subdivided into five groups, and submitted to midline laparotomy with injection of 0.4 ml of aspirin solution (2.5 and 5.0%), acetic acid solution (2.5 and 5.0%) or saline; the animals were sacrificed after 24 hours (protocol 1) or 14 days (protocol 2); weight, clinical course, biochemical measurements, the abdominal and thoracic cavities, and liver microscopy were evaluated. Results: No alterations were observed in clinical course, weight or biochemical measurements, except for elevation of AST and ALT in the 24-hour group (protocol 1). Macroscopy showed, in the treated animals of both groups, a localized hepatic lesion in the infiltrated area, corresponding to necrosis (24 hours) and fibrosis (14 days). Conclusion: Both solutions (aspirin and acetic acid) caused localized destruction of the organ, replaced by fibrosis after 14 days.

Relevance: 10.00%

Abstract:

OBJECTIVE: To evaluate equipment conditions and dosimetry in computed tomography departments, using head, abdomen and lumbar spine protocols for adult patients (on three different scanners) and for pediatric patients up to one and a half years of age (on one of the evaluated scanners). MATERIALS AND METHODS: The computed tomography dose index (CTDI) and the multiple-slice average dose (MSAD) were estimated in examinations of adult patients on three different scanners. Entrance surface doses and absorbed doses in head examinations were also estimated for adult and pediatric patients on one of the evaluated scanners. RESULTS: Mechanical quality control tests were performed, demonstrating that the scanners satisfy the usage specifications established by current standards. The dosimetry results showed that multiple-slice average dose values exceeded the reference levels by up to 109.0%, with considerable variation among the scanners evaluated in this study. The absorbed doses obtained with pediatric protocols are lower than those for adult patients, with a reduction of up to 51.0% in the thyroid. CONCLUSION: This study evaluated the operating conditions of three CT scanners, establishing which parameters should be addressed for the implementation of a quality control program in the institutions where this research was carried out.
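For reference, the CT dose indices mentioned above are derived from phantom measurements with standard weightings; a minimal sketch using the usual IEC formulas follows (the input readings are illustrative, not this study's measurements).

```python
def ctdi_w(ctdi_center, ctdi_periphery):
    """Weighted CT dose index (mGy): 1/3 of the central phantom reading
    plus 2/3 of the peripheral reading."""
    return ctdi_center / 3 + 2 * ctdi_periphery / 3

def ctdi_vol(ctdi_w_value, pitch):
    """Volume CTDI (mGy): the weighted index corrected for helical pitch."""
    return ctdi_w_value / pitch

w = ctdi_w(30.0, 36.0)   # illustrative phantom readings in mGy
print(w)                 # -> 34.0
print(ctdi_vol(w, 1.0))  # -> 34.0
```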

Relevance: 10.00%

Abstract:

INTRODUCTION: Genotyping methods for the hepatitis C virus have been widely discussed. The objective of this work was to compare the reverse hybridization and direct sequencing methodologies for hepatitis C virus genotyping. METHODS: Ninety-one plasma samples from patients treated at the Botucatu Medical School of Universidade Estadual Paulista were used. Genotyping by reverse hybridization was performed using the INNO-LiPA® v.1.0 commercial kit. Direct sequencing was carried out on an automated sequencer using in-house protocols. RESULTS: Genotyping by direct sequencing proved efficient in resolving results that were inconclusive with the commercial kit. The kit produced erroneous results regarding viral subtyping. Furthermore, genotyping by direct sequencing revealed an error of the kit in genotype determination, calling into question the efficiency of the method for viral genotype identification as well. CONCLUSIONS: Genotyping by direct sequencing allows greater accuracy in viral classification than reverse hybridization.

Relevance: 10.00%

Abstract:

The growing interest of children and adolescents in competitive sports leads to greater concern with prescribing training appropriate to this specific population. Knowledge of the impact of competitive physical training intensity on adolescent health is still incipient in the scientific literature. This study aimed to investigate the acute responses of blood lactate (Lac) and creatine phosphokinase (CPK) after a physical training session in young athletes trained in different sports. Forty-three male adolescents between 9 and 17 years of age participated, distributed into three groups: swimmers, tennis players, and futsal players. The protocols for each sport followed the normal planning of a specific session. The diet on the day before collection was standardized, and the 24 hours preceding the evaluation were dedicated to rest. Five milliliters of blood were collected from the antecubital vein immediately before the training session, and an identical collection was repeated immediately after the session. Bone age was obtained for the evaluation of skeletal maturation by the Greulich & Pyle method. Kruskal-Wallis analysis of variance and the Mann-Whitney U test were used for comparisons between groups, with p < 0.05 considered significant. In all sports studied, pre-session Lac and CPK values were lower than post-session values. Regarding the three age groups, for both Lac and CPK, pre-session values were lower than post-session values, and in the 9-to-11-year group both pre- and post-session values were lower than those observed in the older groups. The results indicated that the magnitude of the increase in CPK and Lac was similar to values found in the literature and increased with age group, indicating dependence on biological maturation.

Relevance: 10.00%

Abstract:

CONTEXT AND OBJECTIVE: The lack of consensus on screening and diagnostic protocols for gestational diabetes, together with the difficulties of performing the simplified oral gestational diabetes test (the 100 g glucose tolerance test, considered the gold standard), justifies comparison with alternatives. The objective of this study was to compare the gold-standard test with two screening tests: the combination of fasting glycemia and risk factors (FG + RF) and the simplified 50 g oral glucose tolerance test (50 g GTT), against the 100 g glucose tolerance test (100 g GTT). DESIGN AND SETTING: Prospective longitudinal cohort study carried out at the Gynecology and Obstetrics Service of the University Hospital of the Universidade Federal de Mato Grosso do Sul. METHODS: 341 pregnant women underwent the three tests. Sensitivity (S), specificity (Sp), predictive values (PPV and NPV), likelihood ratios (LR+ and LR-), and false positive and false negative rates (FP and FN) of the FG + RF combination and of the 50 g GTT were calculated in relation to the 100 g GTT. Mean one-hour post-load (1hPL) glycemia with 50 g and 100 g was compared. Student's t test was used for statistical analysis, with a 5% significance level. RESULTS: The FG + RF combination referred more pregnant women (53.9%) for diagnostic confirmation than the 50 g GTT (14.4%). The two tests were equivalent in S (86.4 and 76.9%), NPV (98.7 and 98.9%), LR- (0.3 and 0.27) and FN (15.4 and 23.1%). Mean 1hPL glycemia was similar: 106.8 mg/dl for the 50 g GTT and 107.5 mg/dl for the 100 g GTT. CONCLUSIONS: The diagnostic efficiency results, together with simplicity, practicability and cost, endorsed the FG + RF combination as the most suitable screening test. The equivalence of the 1hPL glycemia values allowed the proposal of a new screening and diagnostic protocol for gestational diabetes, with lower cost and discomfort.
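The screening indices reported above (sensitivity, specificity, predictive values, likelihood ratios) follow the standard definitions over a 2x2 confusion table; a minimal sketch with illustrative counts (not the study's raw data):

```python
def screening_metrics(tp, fp, fn, tn):
    """Standard screening-test indices from a 2x2 table
    (tp/fp/fn/tn = true/false positives and negatives vs. the gold standard)."""
    sens = tp / (tp + fn)   # sensitivity (S)
    spec = tn / (tn + fp)   # specificity (Sp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "ppv": tp / (tp + fp),        # positive predictive value
        "npv": tn / (tn + fn),        # negative predictive value
        "lr_pos": sens / (1 - spec),  # positive likelihood ratio (LR+)
        "lr_neg": (1 - sens) / spec,  # negative likelihood ratio (LR-)
    }

# Illustrative counts only, not taken from the study above:
m = screening_metrics(tp=45, fp=10, fn=5, tn=90)
print(round(m["sensitivity"], 3))   # -> 0.9
print(round(m["lr_pos"], 1))        # -> 9.0
```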