954 results for Interoperability Protocols


Relevance: 20.00%

Publisher:

Abstract:

Interoperability is a crucial issue for electronic government, given the need for agencies' information systems to be fully integrated and able to exchange data seamlessly. One way to achieve this is to establish a government interoperability framework (GIF). However, this is a difficult task, owing not only to technological issues but also to other aspects. This research is expected to contribute to the identification of barriers to the adoption of interoperability standards for electronic government. The article presents preliminary findings from a case study of the Brazilian government framework (e-PING), based on the analysis of documents and face-to-face interviews. It points out several aspects that may influence the establishment of these standards and become barriers to their adoption.

Relevance: 20.00%

Publisher:

Abstract:

Aim: The primary and secondary stability of immediately loaded mandibular implants restored with fixed prostheses (FP) using rigid or semi-rigid splinting systems was evaluated clinically and radiographically. Methods: Fifteen edentulous patients were rehabilitated using hybrid FP; each received 5 implants placed between the mental foramina. Patients were randomly divided into two groups: group 1, FP with a conventional rigid bar splinting the implants; and group 2, a semi-rigid cantilever extension system with titanium bars placed in the 2 distal abutment cylinders. Primary stability was evaluated using resonance frequency analysis after installation of the implant abutments. Measurements were made at 3 time points: T0, at baseline; T1, 4 months after implant placement; and T2, 8 months after implant placement. The presence of mobility and inflammation in the regions surrounding the implants was checked. Stability data were submitted to statistical analysis for comparison between groups (P < 0.05). Results: The implant survival rate was 100% in both groups. No significant differences in mean implant stability quotient values were found between the groups from baseline through the 8-month follow-up. Conclusion: Immediate loading of the implants was satisfactory, and both splinting conditions (rigid and semi-rigid) can be used successfully for the restoration of edentulous mandibles. (Implant Dent 2012;21:486-490)

Relevance: 20.00%

Publisher:

Abstract:

The aim of the present study was to evaluate the efficacy of QMiX, SmearClear, and 17% EDTA in removing debris and the smear layer from the root canal, and their effects on the push-out bond strength of an epoxy-based sealer, using scanning electron microscopy (SEM). Forty extracted human canines were assigned to the following final-rinse protocols (n = 10): G1, distilled water (control); G2, 17% EDTA; G3, SmearClear; and G4, QMiX. The specimens were submitted to SEM analysis to evaluate the presence of debris and smear layer in the apical and cervical segments. Next, forty extracted human maxillary canines with instrumented root canals were divided into four groups (n = 10) matching those of the SEM study. After filling with AH Plus, the roots were sectioned transversally to obtain dentinal slices. The specimens were submitted to a push-out bond strength test in an electromechanical testing machine. Statistical analysis for both the SEM and push-out bond strength studies was performed using the Kruskal-Wallis and Dunn tests (α = 5%). There was no difference among G2, G3, and G4 in the efficacy of debris and smear layer removal (P > 0.05), and all three were superior to the control group. The push-out bond strength values of G2, G3, and G4 were also superior to those of the control group. SmearClear and QMiX removed debris and the smear layer as effectively as 17% EDTA, and a final rinse with these solutions promoted similar push-out bond strength values.

Relevance: 20.00%

Publisher:

Abstract:

This study compared dentine demineralization induced by in vitro and in situ models, and correlated dentine surface hardness (SH), cross-sectional hardness (CSH) and mineral content measured by transverse microradiography (TMR). Bovine dentine specimens (n = 15/group) were demineralized in vitro with the following: MC gel (6% carboxymethylcellulose gel and 0.1 M lactic acid, pH 5.0, 14 days); buffer I (0.05 M acetic acid solution with calcium, phosphate and fluoride, pH 4.5, 7 days); buffer II (0.05 M acetic acid solution with calcium and phosphate, pH 5.0, 7 days); and TEMDP (0.05 M lactic acid with calcium, phosphate and tetraethyl methyl diphosphonate, pH 5.0, 7 days). In the in situ study, 11 volunteers wore palatal appliances containing 2 bovine dentine specimens, protected with a plastic mesh to allow biofilm development. The volunteers dripped a 20% sucrose solution onto each specimen 4 times a day for 14 days. In vitro and in situ lesions were analyzed using TMR and statistically compared by ANOVA. TMR and CSH/SH data were submitted to regression and correlation analysis (p < 0.05). The in situ model produced a deep lesion with a high R value but a thin surface layer. Among the in vitro models, MC gel produced only a shallow lesion, while buffers I and II as well as TEMDP induced a pronounced subsurface lesion with deep demineralization. The relationship between CSH and TMR was weak and not linear. The artificial dentine carious lesions induced by the different models differed significantly, which in turn might influence further de- and remineralization processes. Hardness analysis should therefore not be interpreted as a measure of dentine mineral loss.

Relevance: 20.00%

Publisher:

Abstract:

This study aimed to test different protocols for the extraction of microbial DNA from the coral Mussismilia harttii. Four commercial kits were tested: three based on methods for DNA extraction from soil (FastDNA SPIN Kit for Soil, MP Bio; PowerSoil DNA Isolation Kit, MoBio; and ZR Soil Microbe DNA Kit, Zymo Research) and one for DNA extraction from plants (UltraClean Plant DNA Isolation Kit, MoBio). Five polyps from the same colony of M. harttii were macerated, and aliquots were submitted to DNA extraction with each kit. After extraction, the DNA was quantified, and PCR-DGGE was used to study the molecular fingerprints of Bacteria and Eukarya. Among the four kits tested, the ZR Soil Microbe DNA Kit was the most efficient with respect to the amount of DNA extracted, yielding about three times more DNA than the other kits. We also observed a higher number and greater intensity of DGGE bands for both Bacteria and Eukarya with the same kit. Considering these results, we suggest that the ZR Soil Microbe DNA Kit is the best suited for the study of the microbial communities of corals.

Relevance: 20.00%

Publisher:

Abstract:

The aim of the present study was to evaluate the effects of PGF2α treatment given at the onset of a synchronization of ovulation protocol using a norgestomet (NORG) ear implant on ovarian follicular dynamics (Experiment 1) and pregnancy per AI (P/AI; Experiment 2) in cyclic (CL present) Bos indicus heifers. In Experiment 1, a total of 46 heifers were presynchronized using two consecutive doses of PGF2α 12 days apart. On the first day of the synchronization protocol the heifers received implants containing 3 mg of NORG and 2 mg of estradiol benzoate (EB). At the same time, heifers were randomly assigned to receive 150 μg of d-cloprostenol (n = 23; PGF2α) or no additional treatment (n = 23; Control). When the ear implants were removed 8 days later, all heifers received a PGF2α treatment, and 1 mg of EB was given 24 h later. Follicular diameter and the interval to ovulation were determined by transrectal ultrasonography. No effects of PGF2α treatment on the diameter of the largest follicle were observed at implant removal (PGF2α = 9.8 ± 0.4 vs. Control = 10.0 ± 0.3 mm; P = 0.73) or 24 h afterwards (PGF2α = 11.1 ± 0.4 vs. Control = 11.0 ± 0.4 mm; P = 0.83). No differences between treatments were observed in the time of ovulation after ear implant removal (PGF2α = 70.8 ± 1.2 vs. Control = 73.3 ± 0.9 h; P = 0.10) or in the ovulation rate (PGF2α = 87.0 vs. Control = 82.6%; P = 0.64). In Experiment 2, 280 cyclic heifers were synchronized using the same experimental design described above (PGF2α, n = 143; Control, n = 137), starting on a random day of the estrous cycle. All heifers received 300 IU of equine chorionic gonadotropin (eCG) and 0.5 mg of estradiol cypionate (as ovulatory stimulus) when the NORG ear implants were removed. Timed artificial insemination (TAI) was performed 48 h after implant removal, and pregnancy diagnosis was conducted 30 days later. No effect of PGF2α treatment on P/AI was observed (PGF2α = 51.7 vs. Control = 57.7%; P = 0.29). In conclusion, PGF2α treatment at the onset of NORG-based protocols for the synchronization of ovulation did not alter the ovarian follicular responses or the P/AI in cyclic Bos indicus beef heifers synchronized for TAI.

Relevance: 20.00%

Publisher:

Abstract:

With the increasing production of information from e-government initiatives, there is also a need to transform a large volume of unstructured data into useful information for society. All this information should be easily accessible and made available in a meaningful and effective way in order to achieve semantic interoperability in electronic government services, a challenge to be pursued by governments around the world. Our aim is to discuss the context of e-government Big Data and to present a framework to promote semantic interoperability through the automatic generation of ontologies from unstructured information found on the Internet. We propose the use of fuzzy mechanisms to deal with natural-language terms and present some related work in this area. The results achieved in this study comprise the architectural definition and the major components and requirements that compose the proposed framework. With this, it is possible to take advantage of the large volume of information generated by e-government initiatives and use it to benefit society.
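As an illustration of the kind of fuzzy mechanism mentioned above, the sketch below scores candidate ontology concepts against a natural-language term, using a normalized string similarity as a crude membership degree. The concept list, threshold, and function name are invented for illustration; the framework's actual matching method is not specified here.

```python
from difflib import SequenceMatcher

def fuzzy_match(term, concepts, threshold=0.75):
    """Return (concept, similarity) pairs whose similarity to `term`
    meets the membership threshold. The similarity in [0, 1] plays the
    role of a simple fuzzy membership degree."""
    matches = []
    for concept in concepts:
        score = SequenceMatcher(None, term.lower(), concept.lower()).ratio()
        if score >= threshold:
            matches.append((concept, round(score, 2)))
    return sorted(matches, key=lambda m: -m[1])

# Hypothetical e-government concept list harvested from the web.
concepts = ["Citizen", "Citizenship", "Public Service", "Tax Payment"]
print(fuzzy_match("citizen", concepts))
# → [('Citizen', 1.0), ('Citizenship', 0.78)]
```

A real system would replace the string metric with linguistically informed similarity (synonyms, stemming, context), but the thresholded-membership shape stays the same.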

Relevance: 20.00%

Publisher:

Abstract:

The process for obtaining polypyrrole-2-carboxylic acid (PPY-2-COOH) films in acetonitrile was investigated using cyclic voltammetry, electrochemical quartz crystal microgravimetry (EQCM), and infrared spectroscopy (FTIR). Different potential ranges were applied during the cyclic voltammetry experiments with the aim of obtaining films both without and with controlled amounts of water added to the acetonitrile. The FTIR spectra of the films showed that cations and anions from the electrolyte solution were incorporated into the PPY-2-COOH structure, with preferential adsorption of cations. After chemical immobilization of polyphenoloxidase (tyrosinase, PPO), PPY-2-COOH/PPO films were built for the amperometric detection of catechol, with a linear response over concentrations ranging from 5.0 × 10^-4 to 2.5 × 10^-2 mol L^-1.

Relevance: 20.00%

Publisher:

Abstract:

Interaction protocols establish how different computational entities can interact with each other. The interaction can be aimed at the exchange of data, as in 'communication protocols', or oriented towards achieving some result, as in 'application protocols'. Moreover, with the increasing complexity of modern distributed systems, protocols are also used to control that complexity and to ensure that the system as a whole evolves with certain features. However, the extensive use of protocols has raised several issues, from the language for specifying them to the many aspects of verification. Computational Logic provides models, languages and tools that can be effectively adopted to address these issues: its declarative nature can be exploited for a protocol specification language, while its operational counterpart can be used to reason upon such specifications. In this thesis we propose a proof-theoretic framework, called SCIFF, together with its extensions. SCIFF is based on Abductive Logic Programming and provides a formal specification language with a clear declarative semantics (based on abduction). The operational counterpart is given by a proof procedure that allows one to reason upon the specifications and to test the conformance of given interactions with respect to a defined protocol. Moreover, by suitably adapting the SCIFF framework, we propose solutions for addressing (1) the verification of protocol properties (the g-SCIFF framework) and (2) the a priori verification of the conformance of peers to a given protocol (the AlLoWS framework). We also introduce an agent-based architecture, the SCIFF Agent Platform, where the same protocol specification can be used both to program the interacting peers and to ease their implementation.
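The conformance testing mentioned above can be illustrated with a minimal sketch: a trace of happened events is checked against a single, invented expectation rule (every request must eventually be matched by a response, and responses must have been requested). This is only a toy rendition of the idea; SCIFF itself is an abductive logic programming proof procedure over a full specification language, not reproduced here.

```python
def conformant(trace):
    """Check a trace of (event, item) tuples, in temporal order, against
    the illustrative rule: request(X) raises the positive expectation
    that response(X) eventually happens."""
    pending = set()                      # open positive expectations
    for event, item in trace:
        if event == "request":
            pending.add(item)            # expectation E(response(item)) raised
        elif event == "response":
            if item not in pending:      # response that was never requested
                return False
            pending.discard(item)        # expectation fulfilled
    return not pending                   # any unfulfilled expectation -> violation

ok = [("request", "quote"), ("response", "quote")]
bad = [("request", "quote"), ("request", "order"), ("response", "quote")]
print(conformant(ok), conformant(bad))   # True False
```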

Relevance: 20.00%

Publisher:

Abstract:

The scaling down of transistor technology allows microelectronics manufacturers such as Intel and IBM to build ever more sophisticated systems on a single microchip. Classical interconnection solutions based on shared buses or direct connections between the modules of the chip are becoming obsolete, as they struggle to sustain the increasingly tight bandwidth and latency constraints that these systems demand. The most promising solution for future chip interconnects are Networks on Chip (NoC). NoCs are networks composed of routers and channels used to interconnect the different components installed on a single microchip. Examples of advanced processors based on NoC interconnects are the IBM Cell processor, composed of eight CPUs, which is installed in the Sony PlayStation 3, and the Intel Teraflops project, composed of 80 independent (simple) microprocessors. On-chip integration is becoming popular not only in the Chip Multi Processor (CMP) research area but also in the wider and more heterogeneous world of Systems on Chip (SoC). SoCs comprise all the electronic devices that surround us, such as cell phones, smartphones, home embedded systems, automotive systems, set-top boxes, etc. SoC manufacturers such as ST Microelectronics, Samsung and Philips, and also universities such as the University of Bologna, M.I.T. and Berkeley, are all proposing proprietary frameworks based on NoC interconnects. These frameworks help engineers in the switch of design methodology and speed up the development of new NoC-based systems on chip. In this thesis we first give an introduction to CMP and SoC interconnection networks. Then, focusing on SoC systems, we propose:
• a detailed simulation-based analysis of the Spidergon NoC, an ST Microelectronics solution for SoC interconnects, which differs from many classical solutions inherited from the parallel computing world; we analyze this NoC topology and its routing algorithms in detail, and furthermore propose aEqualized, a new routing algorithm designed to optimize the use of the network's resources while also increasing its performance;
• a methodology flow, based on modified publicly available tools, that can be used in combination to design, model and analyze any kind of System on Chip;
• a detailed analysis of an ST Microelectronics proprietary transport-level protocol that the author of this thesis helped to develop;
• a comprehensive simulation-based comparison of different network interface designs, proposed by the author and the researchers at the AST lab, aimed at integrating shared-memory and message-passing components on a single System on Chip;
• a powerful and flexible solution to the timing closure exception issue in the design of synchronous Networks on Chip, based on relay-station repeaters, which reduces the power and area demands of NoC interconnects while also reducing their buffer needs;
• a solution that simplifies the design of NoCs while increasing their performance and reducing their power and area consumption: we propose to replace complex and slow virtual-channel-based routers with multiple, flexible, small Multi Plane routers, which reduces the area and power dissipation of any NoC while also increasing its performance, especially when resources are scarce.
This thesis was written in collaboration with the Advanced System Technology laboratory in Grenoble, France, and the Computer Science Department of Columbia University in the City of New York.
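As a point of reference for the Spidergon routing discussion above, the sketch below implements the commonly described shortest-path routing rule for a Spidergon topology (a ring of n nodes plus diametral "across" links). This is the standard textbook rule, stated here as an assumption; the aEqualized algorithm proposed in the thesis is not reproduced.

```python
def spidergon_next_hop(src, dst, n):
    """One step of shortest-path routing on a Spidergon NoC with n nodes
    (n divisible by 4). Each node links to its two ring neighbours and to
    the node diametrically across the ring."""
    if src == dst:
        return src
    d = (dst - src) % n                 # clockwise distance to destination
    if d <= n // 4:
        return (src + 1) % n            # short clockwise path: follow the ring
    if d >= 3 * n // 4:
        return (src - 1) % n            # short counterclockwise path
    return (src + n // 2) % n           # far destination: take the across link

def spidergon_route(src, dst, n):
    """Full route obtained by iterating the next-hop rule."""
    path = [src]
    while path[-1] != dst:
        path.append(spidergon_next_hop(path[-1], dst, n))
    return path

print(spidergon_route(0, 7, 16))   # → [0, 8, 7]: across to 8, then one ring hop
```

After at most one across hop the residual clockwise distance falls within a quarter-ring, so the route completes along the ring and always terminates.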

Relevance: 20.00%

Publisher:

Abstract:

The research performed during the PhD candidature was intended to evaluate the quality of white wines as a function of the reduction in SO2 use during the first steps of the winemaking process. In order to investigate the mechanism and intensity of the interactions occurring between lysozyme and the principal macro-components of musts and wines, a series of experiments on model wine solutions was undertaken, focusing attention on the polyphenols, SO2, oenological tannins, pectins, ethanol and sugar components. In the second part of this research program, a series of conventional sulphite-added vinifications was compared with vinifications in which sulphur dioxide was replaced by lysozyme, in order to define potential winemaking protocols suitable for the production of SO2-free wines. To reach this goal, the technological performance of two selected yeast strains with a low aptitude to produce SO2 during fermentation was also evaluated. The data obtained suggest that the addition of lysozyme and oenological tannins during alcoholic fermentation could represent a promising alternative to the use of sulphur dioxide and a reliable starting point for the production of SO2-free wines. The different vinification protocols studied influenced the composition of the volatile profile of the wines at the end of alcoholic fermentation, especially with regard to alcohols and ethyl esters, partly as a consequence of the yeast's response to the presence or absence of sulphites during fermentation, contributing in different ways to the sensory profiles of the wines. Indeed, the amino acid analysis showed that lysozyme can affect the consumption of nitrogen as a function of the yeast strain used in fermentation.
During bottle storage, the evolution of volatile compounds is affected by the presence of SO2 and oenological tannins, confirming their positive role in scavenging oxygen and maintaining the amounts of esters above certain levels, thus avoiding a decline in the wine's quality. Even though a natural decrease in the phenolic profiles was found, due to oxidation caused by the oxygen dissolved in the medium during the storage period, the presence of SO2 together with tannins counteracted the decay of the phenolic content present at the end of fermentation. Tannins also played a central role in preserving the polyphenolic profile of the wines during storage, confirming their antioxidant properties as reductants. Our study of the fundamental chemistry underlying the oxidative phenolic spoilage of white wines demonstrated the ability of glutathione, at the concentration at which it typically exists in wine, to inhibit the production of the yellow xanthylium cation pigments generated from flavanols and glyoxylic acid. The ability of glutathione to bind glyoxylic acid rather than acetaldehyde may enable glutathione to be used as a 'switch' for glyoxylic acid-induced polymerisation mechanisms, as opposed to the equivalent acetaldehyde polymerisation, in processes such as micro-oxidation. Further research is required to assess the ability of glutathione to prevent xanthylium cation production during the in situ production of glyoxylic acid and in the presence of sulphur dioxide.

Relevance: 20.00%

Publisher:

Abstract:

The aim of this thesis was to describe the development of motion analysis protocols for applications on the upper and lower limbs, using inertial sensor-based systems. Inertial sensor-based systems are relatively recent, and knowledge of methods and algorithms for their clinical use is therefore limited compared with stereophotogrammetry. However, their advantages in terms of low cost, portability and small size are a valid reason to follow this direction. When developing motion analysis protocols based on inertial sensors, attention must be given to several aspects, such as the accuracy and reliability of inertial sensor-based systems. The need to develop specific algorithms, methods and software for using these systems in specific applications is as important as the development of the motion analysis protocols themselves. For this reason, the goal of the 3-year research project described in this thesis was pursued first of all by carefully designing the protocols, exploring and developing the features suitable for each specific application. The use of optoelectronic systems was necessary because they provide a gold-standard, accurate measurement, which was used as a reference for the validation of the protocols based on inertial sensors. The protocols described in this thesis can be particularly helpful for rehabilitation centers in which the high cost of instrumentation or the limited working areas do not allow the use of stereophotogrammetry. Moreover, many applications requiring upper and lower limb motion analysis to be performed outside the laboratory will benefit from these protocols, for example gait analysis performed along corridors. Outdoors, the condition of steady-state walking, or the behavior of prosthetic devices when encountering slopes or obstacles during walking, can also be assessed.
The application of inertial sensors to lower limb amputees presents conditions that are challenging for magnetometer-based systems, due to the ferromagnetic materials commonly adopted in the construction of hydraulic components and motors. The INAIL Prostheses Centre stimulated and, together with Xsens Technologies B.V., supported the development of additional methods for improving the accuracy of the MTx in measuring the 3D kinematics of lower limb prostheses, with the results provided in this thesis. In the author's opinion, this thesis and the motion analysis protocols based on inertial sensors described here demonstrate how close collaboration between industry, clinical centers and research laboratories can improve knowledge and exchange know-how, with the common goal of developing new application-oriented systems.
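As a minimal illustration of the sensor-fusion step that such inertial protocols build on, the sketch below shows a single-axis complementary filter that blends integrated gyroscope rate with the absolute tilt reference provided by the accelerometer. This is a generic textbook filter, shown only as an illustration; it is not the MTx's actual (Kalman-based, proprietary) fusion algorithm, and all parameter values are assumptions.

```python
import math

def complementary_filter(gyro_rates, accel_samples, dt, alpha=0.98):
    """Fuse gyroscope angular rate (rad/s) with accelerometer tilt for a
    single axis. accel_samples are (ax, az) pairs; gravity gives an
    absolute, drift-free tilt reference that corrects gyro integration."""
    angle = 0.0
    history = []
    for rate, (ax, az) in zip(gyro_rates, accel_samples):
        accel_angle = math.atan2(ax, az)                    # tilt from gravity
        angle = alpha * (angle + rate * dt) + (1 - alpha) * accel_angle
        history.append(angle)
    return history

# Stationary sensor tilted by 0.1 rad: the gyro reads ~0 while the
# accelerometer sees the tilt; the estimate converges toward 0.1 rad.
angles = complementary_filter([0.0] * 200,
                              [(math.sin(0.1), math.cos(0.1))] * 200,
                              dt=0.01)
print(round(angles[-1], 3))
```

The weight alpha trades gyro smoothness against accelerometer drift correction; real protocols tune it (or replace the whole filter) per joint and per activity.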

Relevance: 20.00%

Publisher:

Abstract:

Biomedical analyses are becoming increasingly complex, with respect to both the type of data to be produced and the procedures to be executed, and this trend is expected to continue in the future. The development of information and protocol management systems that can sustain this challenge is therefore becoming an essential enabling factor for all actors in the field. The use of custom-built solutions that require the biology domain expert to acquire or procure software engineering expertise in the development of the laboratory infrastructure is not fully satisfactory, because it incurs undesirable mutual knowledge dependencies between the two camps. We propose instead an infrastructure concept that enables domain experts to express laboratory protocols using proper domain knowledge, free from the incidence and mediation of software implementation artefacts. In the system that we propose, this is made possible by basing the modelling language on an authoritative domain-specific ontology and then using modern model-driven architecture technology to transform the user models into software artefacts ready for execution on a multi-agent execution platform specialized for biomedical laboratories.
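The model-to-artefact transformation described above can be caricatured in a few lines: a declarative protocol model, as a domain expert might express it against an ontology, is mechanically turned into an ordered task list that an execution platform (e.g. one agent per instrument) could dispatch. All field names and operations below are invented for illustration and are not taken from the proposed system.

```python
# Hypothetical declarative protocol model, expressed in domain terms.
protocol_model = {
    "name": "DNA_quantification",
    "steps": [
        {"op": "aliquot", "volume_ul": 10},
        {"op": "incubate", "minutes": 5, "temp_c": 37},
        {"op": "measure", "instrument": "spectrophotometer"},
    ],
}

def to_tasks(model):
    """Transform the declarative model into an ordered list of
    (step number, operation, parameters) tasks ready for dispatch."""
    return [(i, step["op"], {k: v for k, v in step.items() if k != "op"})
            for i, step in enumerate(model["steps"], start=1)]

for task in to_tasks(protocol_model):
    print(task)
```

The point of the real system is that the left-hand side of this transformation is grounded in an authoritative ontology, so the expert never touches the right-hand side.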

Relevance: 20.00%

Publisher:

Abstract:

Wireless sensor networks (WSNs) consist of a large number of sensor nodes characterized by tight power constraints, limited transmission range and limited computational capabilities [1][2]. The cost of these devices is constantly decreasing, making it possible to use large numbers of sensors in a wide array of commercial, environmental, military and healthcare applications. Some of these applications involve placing the sensors evenly spaced along a straight line, for example in roads, bridges, tunnels, water catchments and water pipelines, city drainages, and oil and gas pipelines, forming a special class of networks that we define as Linear Wireless Networks (LWNs). In LWNs, data transmission happens hop by hop from the source to the destination, through a route composed of multiple relays. The peculiar topology of LWNs motivates the design of specialized protocols that take advantage of the linearity of such networks in order to increase reliability, communication efficiency, energy savings and network lifetime, and to minimize the end-to-end delay [3]. In this thesis a novel contention-based Medium Access Control (MAC) protocol called L-CSMA, specifically devised for LWNs, is presented. The basic idea of L-CSMA is to assign different priorities to nodes based on their position along the line. The priority is assigned in terms of sensing duration, whereby nodes closer to the destination are assigned shorter sensing times than the rest of the nodes, and hence higher priority. This mechanism speeds up the transmission of packets that are already in the path, making the transmission flow more efficient. Using the NS-3 simulator, the performance of L-CSMA in terms of packet success rate, that is, the percentage of packets that reach the destination, and throughput is compared with that of the IEEE 802.15.4 MAC protocol, the de facto standard for wireless sensor networks. In general, L-CSMA outperforms the IEEE 802.15.4 MAC protocol.
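The position-based priority idea can be sketched as follows, under the assumption (ours, not the thesis') of a simple linear mapping from remaining hops to sensing time: nodes nearer the sink sense the channel for a shorter time and therefore seize it first when several nodes contend.

```python
def sensing_duration(hops_to_destination, base_slot, max_hops):
    """Position-dependent carrier-sensing time in the spirit of L-CSMA:
    the fewer hops remaining to the destination, the shorter the sensing
    window, hence the higher the channel-access priority. The linear
    mapping and parameter names are illustrative, not the thesis' formula."""
    return base_slot * (1 + hops_to_destination / max_hops)

# 10-hop line with a 1 ms base slot: the node one hop from the sink gets
# the shortest window, so packets already near the sink flow out first.
durations = [sensing_duration(h, 1.0, 10) for h in range(1, 11)]
print(durations[0], durations[-1])   # 1.1 2.0
```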

Relevance: 20.00%

Publisher:

Abstract:

This dissertation describes the development of a project, over a span of more than two years, carried out within the scope of the Arrowhead Framework and bearing my personal contribution in several sections. The final part of the project took place during a visiting period at the University of Luleå. The Arrowhead Project is a European project, belonging to the ARTEMIS association, which aims to foster new technologies and unify access to them within a single framework. Such technologies include the Internet of Things phenomenon, smart houses, electrical mobility and renewable energy production. An application is considered compliant with the framework when it respects the Service-Oriented Architecture paradigm and is able to interact with a set of defined components called the Arrowhead Core Services. My personal contribution to this project consists of the development of several user-friendly APIs, published in the project's main repository, and the integration of a legacy system within the Arrowhead Framework. The implementation of this legacy system was initiated by me in 2012 and, after many improvements carried out by several developers at UniBO, it was significantly modified again this year in order to achieve compatibility. The system consists of a simulation of an urban scenario in which a certain number of electrical vehicles travel along their specified routes. The vehicles consume their batteries and thus need to recharge at charging stations. Because the recharge process is long, the vehicles use a reservation mechanism to be able to recharge while avoiding waiting lines. The integration with the framework consists in the publication of the services that the system provides to end users, through the instantiation of several Arrowhead Service Producers, together with a demo Arrowhead-compliant client application able to consume such services.
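The reservation mechanism described above can be sketched as a toy model: a station with a fixed number of sockets grants or refuses reservation requests, so vehicles never queue on site. The class and method names are hypothetical and do not reflect the Arrowhead project's actual API.

```python
import itertools

class ChargingStation:
    """Toy reservation mechanism for an EV charging station (hypothetical
    interface, for illustration only)."""
    def __init__(self, sockets):
        self.sockets = sockets
        self.reservations = {}           # reservation id -> vehicle id
        self._ids = itertools.count(1)   # monotonically increasing ids

    def reserve(self, vehicle_id):
        """Grant a reservation if a socket is free, else refuse:
        refusing up front is what prevents on-site waiting lines."""
        if len(self.reservations) >= self.sockets:
            return None
        rid = next(self._ids)
        self.reservations[rid] = vehicle_id
        return rid

    def release(self, rid):
        """Free the socket when charging completes or is cancelled."""
        self.reservations.pop(rid, None)

station = ChargingStation(sockets=2)
a = station.reserve("ev-1")
b = station.reserve("ev-2")
c = station.reserve("ev-3")              # refused: both sockets reserved
print(a, b, c)                           # 1 2 None
station.release(a)
print(station.reserve("ev-4"))           # 3
```

In the integrated system this service would be published through an Arrowhead Service Producer so that the demo client can discover and consume it.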