990 results for Transport-protocol selection
Abstract:
Concerns have been raised in the past several years that introducing new transport protocols on the Internet has become increasingly difficult, not least because there is no agreed-upon way for a source end host to find out if a transport protocol is supported all the way to a destination peer. A solution to a similar problem—finding out support for IPv6—has been proposed and is currently being deployed: the Happy Eyeballs (HE) mechanism. HE has also been proposed as an efficient way for an application to select an appropriate transport protocol. Still, there are few, if any, performance evaluations of transport HE. This paper demonstrates that transport HE could indeed be a feasible solution to the transport support problem. The paper evaluates HE between TCP and SCTP using TLS encrypted and unencrypted traffic, and shows that although there is indeed a cost in terms of CPU load to introduce HE, the cost is relatively small, especially in comparison with the cost of using TLS encryption. Moreover, our results suggest that HE has a marginal impact on memory usage. Finally, by introducing caching of previous connection attempts, the additional cost of transport HE could be significantly reduced.
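A transport-HE race of the kind evaluated in this paper can be sketched as follows. This is a minimal, hypothetical Python illustration that races plain TCP connection attempts; the paper races TCP against SCTP, which would require opening an SCTP socket for that candidate.

```python
import socket
from concurrent.futures import ThreadPoolExecutor, as_completed

def happy_eyeballs(candidates, timeout=2.0):
    """Race connection attempts; return (label, socket) of the first winner.

    candidates: list of (label, (host, port)) pairs. A real transport-HE
    implementation would open an SCTP socket for an "sctp" candidate;
    this sketch uses plain TCP for every candidate.
    """
    def attempt(label, addr):
        return label, socket.create_connection(addr, timeout=timeout)

    with ThreadPoolExecutor(max_workers=len(candidates)) as pool:
        futures = [pool.submit(attempt, label, addr)
                   for label, addr in candidates]
        for fut in as_completed(futures):
            try:
                label, sock = fut.result()
            except OSError:
                continue  # this candidate lost; wait for the others
            return label, sock  # first successful candidate wins the race
    raise OSError("no transport candidate succeeded")
```

Caching the winning transport per destination, as the paper proposes, would amount to consulting a dictionary before starting the race and skipping it on a hit.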
Abstract:
Intermodal rail/road freight transport constitutes an alternative to long-haul road transport for the distribution of large volumes of goods. The paper introduces the intermodal transportation problem for the tactical planning of mode and service selection. In rail mode, shippers either book train capacity on a per-unit basis or charter block trains completely. Road mode is used for short-distance haulage to intermodal terminals and for direct shipments to customers. We analyze the competition of road and intermodal transportation with regard to freight consolidation and service cost on a model basis. The approach is applied to a distribution system of an industrial company serving customers in eastern Europe. The case study investigates the impact of transport cost and consolidation on the optimal modal split.
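The per-relation mode choice described above can be illustrated with a toy cost comparison (all parameter names and rates below are hypothetical; the paper's tactical model optimizes over many relations and consolidation effects jointly):

```python
import math

def cheapest_mode(units, road_rate, rail_rate, drayage_rate,
                  charter_cost, train_capacity):
    """Pick the cheapest of the three options sketched in the abstract:
    direct road haulage, per-unit booked rail capacity (plus short-distance
    drayage to the terminal), or fully chartered block trains."""
    costs = {
        "road": units * road_rate,
        "rail_booked": units * (rail_rate + drayage_rate),
        "block_train": math.ceil(units / train_capacity) * charter_cost
                       + units * drayage_rate,
    }
    return min(costs, key=costs.get), costs
```

For small volumes the fixed charter cost dominates and road or booked rail wins; as volume approaches multiples of the train capacity, chartering becomes attractive, which is precisely the consolidation effect the case study investigates.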
Abstract:
This work is dedicated to a comparison of open-source and proprietary transport protocols for high-speed data transmission over IP networks. The ubiquitous TCP needs significant improvement, since it was designed as a general-purpose transport protocol and first introduced four decades ago. In today's networks, TCP does not fit all communication needs, so other transport protocols have been developed and used successfully, e.g., for Big Data movement. In this research, the following protocols were investigated for their efficiency on 10 Gbps links: UDT, RBUDP, MTP and RWTP. The protocols were tested under different impairments, such as round-trip times of up to 400 ms and packet losses of up to 2%. The investigated parameters are the data rate under different network conditions, the CPU load of sender and receiver during the experiments, the size of the feedback data, the CPU usage per Gbps, and the amount of feedback data per GiByte of effectively transmitted data. The best performance and fairest resource consumption were observed with RWTP; among the open-source projects, RBUDP showed the best behavior.
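The last two metrics named above can be derived directly from raw transfer counters; a small helper sketch (the function and parameter names are hypothetical, chosen to mirror the metrics listed in the abstract):

```python
def efficiency_metrics(payload_bytes, feedback_bytes, cpu_seconds, wall_seconds):
    """Derive resource-efficiency metrics from raw transfer counters:
    achieved rate in Gbps, CPU usage (percent of one core) per Gbps,
    and feedback bytes per GiByte of effectively transmitted payload."""
    rate_gbps = payload_bytes * 8 / wall_seconds / 1e9
    return {
        "rate_gbps": rate_gbps,
        "cpu_pct_per_gbps": 100.0 * cpu_seconds / wall_seconds / rate_gbps,
        "feedback_bytes_per_gib": feedback_bytes / (payload_bytes / 2**30),
    }
```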
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
INTRODUCTION: Ultra-high-field whole-body systems (7.0 T) have a high potential for future human in vivo magnetic resonance imaging (MRI). In musculoskeletal MRI, biochemical imaging of articular cartilage may benefit, in particular. Delayed gadolinium-enhanced MRI of cartilage (dGEMRIC) and T2 mapping have shown potential at 3.0 T. Although dGEMRIC allows the determination of the glycosaminoglycan content of articular cartilage, T2 mapping is a promising tool for the evaluation of water and collagen content. In addition, the evaluation of zonal variation, based on tissue anisotropy, provides an indicator of the nature of cartilage, i.e., hyaline or hyaline-like articular cartilage. Thus, the aim of our study was to show the feasibility of in vivo dGEMRIC, and T2 and T2* relaxation measurements, at 7.0 T MRI; and to evaluate the potential of T2 and T2* measurements in an initial patient study after matrix-associated autologous chondrocyte transplantation (MACT) in the knee. MATERIALS AND METHODS: MRI was performed on a whole-body 7.0 T MR scanner using a dedicated circular polarization knee coil. The protocol consisted of an inversion recovery sequence for dGEMRIC, a multiecho spin-echo sequence for standard T2 mapping, a gradient-echo sequence for T2* mapping and a morphologic PD SPACE sequence. Twelve healthy volunteers (mean age, 26.7 +/- 3.4 years) and 4 patients (mean age, 38.0 +/- 14.0 years) were enrolled 29.5 +/- 15.1 months after MACT. For dGEMRIC, 5 healthy volunteers (mean age, 32.4 +/- 11.2 years) were included. T1 maps were calculated using a nonlinear, 2-parameter, least squares fit analysis. Using a region-of-interest analysis, mean cartilage relaxation time was determined as T1 (0) for precontrast measurements and T1 (Gd) for postcontrast gadopentetate dimeglumine [Gd-DTPA(2-)] measurements.
T2 and T2* maps were obtained using a pixelwise, monoexponential, non-negative least squares fit analysis; region-of-interest analysis was carried out for deep and superficial cartilage aspects. Statistical evaluation was performed by analyses of variance. RESULTS: Mean T1 (dGEMRIC) values for healthy volunteers showed slightly different results for femoral [T1 (0): 1259 +/- 277 ms; T1 (Gd): 683 +/- 141 ms] compared with tibial cartilage [T1 (0): 1093 +/- 281 ms; T1 (Gd): 769 +/- 150 ms]. Global mean T2 relaxation for healthy volunteers showed comparable results for femoral (T2: 56.3 +/- 15.2 ms; T2*: 19.7 +/- 6.4 ms) and patellar (T2: 54.6 +/- 13.0 ms; T2*: 19.6 +/- 5.2 ms) cartilage, but lower values for tibial cartilage (T2: 43.6 +/- 8.5 ms; T2*: 16.6 +/- 5.6 ms). All healthy cartilage sites showed a significant increase from deep to superficial cartilage (P < 0.001). Within healthy cartilage sites in MACT patients, adequate values could be found for T2 (56.6 +/- 13.2 ms) and T2* (18.6 +/- 5.3 ms), which also showed a significant stratification. Within cartilage repair tissue, global mean values showed no difference, with 55.9 +/- 4.9 ms for T2 and 16.2 +/- 6.3 ms for T2*. However, zonal assessment showed only a slight and not significant increase from deep to superficial cartilage (T2: P = 0.174; T2*: P = 0.150). CONCLUSION: In vivo T1 dGEMRIC assessment in healthy cartilage, and T2 and T2* mapping in healthy and reparative articular cartilage, seems to be possible at 7.0 T MRI. For T2 and T2*, zonal variation of articular cartilage could also be evaluated at 7.0 T. This zonal assessment of deep and superficial cartilage aspects shows promising results for the differentiation of healthy and affected articular cartilage. In future studies, optimized protocol selection, and sophisticated coil technology, together with increased signal at ultra-high-field MRI, may lead to advanced biochemical cartilage imaging.
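Per pixel, the monoexponential fit described above reduces to fitting S(TE) = S0 · exp(−TE/T2). The study used a pixelwise non-negative least squares fit; the following numpy sketch uses a simpler log-linearised fit purely as an illustration:

```python
import numpy as np

def fit_t2(echo_times_ms, signal):
    """Estimate T2 (ms) from a multi-echo signal via a log-linear fit:
    ln S = ln S0 - TE / T2, so T2 = -1 / slope of the fitted line."""
    te = np.asarray(echo_times_ms, dtype=float)
    s = np.asarray(signal, dtype=float)
    slope, _intercept = np.polyfit(te, np.log(s), 1)
    return -1.0 / slope
```

Applying such a fit pixel by pixel, then averaging within deep and superficial regions of interest, yields the zonal T2 values compared in the abstract.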
Abstract:
Various applications for the purposes of event detection, localization, and monitoring can benefit from the use of wireless sensor networks (WSNs). Wireless sensor networks are generally easy to deploy, have a flexible topology, and can support a diversity of tasks thanks to the large variety of sensors that can be attached to the wireless sensor nodes. To guarantee the efficient operation of such a heterogeneous wireless sensor network during its lifetime, appropriate management is necessary. Typically, there are three management tasks, namely monitoring, (re)configuration, and code updating. On the one hand, status information, such as battery state and node connectivity, of both the wireless sensor network and the sensor nodes has to be monitored. On the other hand, sensor nodes have to be (re)configured, e.g., by setting the sensing interval. Most importantly, new applications have to be deployed and bug fixes have to be applied during the network lifetime. All management tasks have to be performed in a reliable, time- and energy-efficient manner. The ability to disseminate data from one sender to multiple receivers in a reliable, time- and energy-efficient manner is critical for the execution of the management tasks, especially for code updating. Multicast communication is an efficient way to handle such a traffic pattern in wireless sensor networks. Due to the nature of code updates, a multicast protocol has to support bulky traffic and end-to-end reliability. Further, the limited resources of wireless sensor nodes demand an energy-efficient operation of the multicast protocol. Current data dissemination schemes do not fulfil all of the above requirements. In order to close the gap, we designed the Sensor Node Overlay Multicast (SNOMC) protocol to support reliable, time-efficient and energy-efficient dissemination of data from one sender node to multiple receivers.
In contrast to other multicast transport protocols, which do not support reliability mechanisms, SNOMC supports end-to-end reliability using a NACK-based reliability mechanism. The mechanism is simple and easy to implement and can significantly reduce the number of transmissions. It is complemented by a data acknowledgement after successful reception of all data fragments by the receiver nodes. Three different caching strategies are integrated into SNOMC for efficient handling of the necessary retransmissions: caching on each intermediate node, caching on branching nodes, or caching only on the sender node. Moreover, an option was included to pro-actively request missing fragments. SNOMC was evaluated both in the OMNeT++ simulator and in our in-house real-world testbed, and compared to a number of common data dissemination protocols, such as Flooding, MPR, TinyCubus, PSFQ, and both UDP and TCP. The results showed that SNOMC outperforms the selected protocols in terms of transmission time, number of transmitted packets, and energy consumption. Moreover, we showed that SNOMC performs well with different underlying MAC protocols, which support different levels of reliability and energy-efficiency. Thus, SNOMC can offer a robust, high-performing solution for the efficient distribution of code updates and management information in a wireless sensor network. To address the three management tasks, in this thesis we developed the Management Architecture for Wireless Sensor Networks (MARWIS). MARWIS is specifically designed for the management of heterogeneous wireless sensor networks. A distinguishing feature of its design is the use of wireless mesh nodes as a backbone, which enables diverse communication platforms and offloads functionality from the sensor nodes to the mesh nodes. This hierarchical architecture allows for efficient operation of the management tasks, due to the organisation of the sensor nodes into small sub-networks, each managed by a mesh node.
Furthermore, we developed an intuitive graphical user interface, which allows non-expert users to easily perform management tasks in the network. In contrast to other management frameworks, such as Mate, MANNA, and TinyCubus, or code dissemination protocols, such as Impala, Trickle, and Deluge, MARWIS offers an integrated solution for monitoring, configuration and code updating of sensor nodes. Integration of SNOMC into MARWIS further increases the efficiency of the management tasks. To our knowledge, our approach is the first to combine a management architecture with an efficient overlay multicast transport protocol. This combination of SNOMC and MARWIS supports the reliable, time- and energy-efficient operation of a heterogeneous wireless sensor network.
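The NACK-based reliability mechanism with a final data acknowledgement can be illustrated with a minimal round-based sketch (hypothetical, single-hop, sender-side caching only; SNOMC itself also supports caching on intermediate and branching nodes):

```python
def deliver(fragments, send_ok):
    """Disseminate numbered fragments with NACK-driven retransmission.

    send_ok(seq) -> bool models the lossy link: True if fragment `seq`
    arrives. Each round the receiver NACKs the missing sequence numbers
    and only those are retransmitted; an empty NACK list plays the role
    of the final data acknowledgement."""
    received = {}
    to_send = list(range(len(fragments)))      # first round: everything
    rounds = 0
    while to_send:
        rounds += 1
        for seq in to_send:
            if send_ok(seq):
                received[seq] = fragments[seq]
        # NACK: only the sequence numbers still missing get retransmitted
        to_send = [s for s in range(len(fragments)) if s not in received]
    return [received[s] for s in range(len(fragments))], rounds
```

Retransmitting only the NACKed fragments, rather than the whole bulk transfer, is what lets the mechanism "significantly reduce the number of transmissions" on lossy sensor links.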
Abstract:
Sarcoptic mange is a highly contagious skin disease that can have a devastating impact on affected wild mammal populations. There are notable variations in the clinical and pathologic picture of sarcoptic mange among species and among conspecifics. However, the origin of these variations is unclear. We propose a classification scheme for skin lesions associated with Sarcoptes scabiei infestation to provide a basis for a subsequent risk factor analysis. We conducted a case-control study focused on macroscopic and histologic examination of the skin, using 279 red foxes (Vulpes vulpes) found dead or shot in Switzerland between November 2004 and February 2006. All animals were submitted to gross necropsy following a detailed protocol. Selection criteria for cases (n=147) vs. controls (n=111) were the presence or absence of mange-like lesions, mite detection by isolation or histologic examination, and serologic testing for S. scabiei antibodies. Characteristic features of mange lesions were scored macroscopically in all foxes and histologically in 67 cases and 15 controls. We classified skin lesions and associated necropsy findings into three types of mange: A) early stage (n=45): focal-extensive skin lesions, thin crusts, mild to moderate alopecia, few mites, numerous eosinophils, and mild lymph node enlargement; B) hyperkeratotic, fatal form (n=86): generalized skin lesions, thick crusts with or without alopecia, foul odor, abundance of mites, numerous bacteria and yeasts, numerous lymphocytes and mast cells, severe lymph node enlargement, and emaciation; C) alopecic, healing form (n=16): focal lesions, no crusts, severe alopecia, hyperpigmentation and lichenification, absence of mites, mixed cell infiltration, and rare mild lymph node enlargement. We hypothesize that after stage A, the animal either enters stage B and dies, or stage C and survives, depending on largely unknown extrinsic or intrinsic factors affecting the host's ability to control mite infestation.
Abstract:
In recent years, thanks to the remarkable development of mobile devices, which have evolved from "simple" phones or media players into full computers, the number of services offering multimedia content over the internet has shot up. Furthermore, the divergent evolution of these terminals means that the market offers a very wide range of products of different sizes and processing capabilities, making it necessary to find a formula that satisfies the demand for such services regardless of the nature of the device. To provide a suitable solution, we chose to integrate a protocol such as RTP with a video standard such as SVC. RTP (Real-time Transport Protocol), unlike general-purpose protocols, was designed for real-time applications, making it ideal for streaming multimedia content. SVC, in turn, is a scalable video standard that can carry a base layer and multiple enhancement layers in a single stream, so the quality and size of the content can be adapted to the capacity and screen size of the device. The objective of this project is to integrate and modify both the MPlayer player and the live555 RTP library so that they support the SVC format over the RTP protocol, and to set up a client-server system to verify its operation. Although this process is intended to be carried out on a mobile device, for this project we chose the simplest possible scenario: sequences are streamed to a virtual machine hosted on the same computer as the server.
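At the packet level, this integration hinges on the fixed RTP header (RFC 3550) that precedes every payload unit. A minimal parser, as an illustration only (the actual live555/MPlayer code paths are in C++):

```python
import struct

def parse_rtp_header(packet: bytes) -> dict:
    """Decode the 12-byte fixed RTP header (RFC 3550, network byte order)."""
    if len(packet) < 12:
        raise ValueError("truncated RTP packet")
    b0, b1, seq, timestamp, ssrc = struct.unpack("!BBHII", packet[:12])
    return {
        "version": b0 >> 6,            # always 2 for RTP
        "padding": bool(b0 & 0x20),
        "extension": bool(b0 & 0x10),
        "csrc_count": b0 & 0x0F,
        "marker": bool(b1 & 0x80),
        "payload_type": b1 & 0x7F,     # dynamic PT (e.g. 96-127) for SVC
        "sequence": seq,
        "timestamp": timestamp,
        "ssrc": ssrc,
    }
```

With SVC, the payload following this header carries NAL units of the base or enhancement layers, so a client on a constrained device can discard enhancement-layer packets and still decode the base layer.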
Abstract:
Today, the development of domain-specific communication applications is both time-consuming and error-prone, because the low-level communication services provided by existing systems and networks are primitive and often heterogeneous. Multimedia communication applications are typically built on top of low-level network abstractions such as TCP/UDP sockets and the SIP (Session Initiation Protocol) and RTP (Real-time Transport Protocol) APIs. The User-centric Communication Middleware (UCM) is proposed to encapsulate the networking complexity and heterogeneity of basic multimedia and multi-party communication for upper-layer communication applications. UCM provides a unified user-centric communication service to diverse communication applications, ranging from simple phone calls and video conferencing to specialized applications such as disaster management and telemedicine, which eases the development of domain-specific communication applications. The UCM abstraction and API are proposed to achieve these goals. The dissertation also integrates formal methods into the UCM development process. A formal model of UCM is created using the SAM methodology. Some design errors were found during model creation, because the formal method forces a precise description of UCM. Using the SAM tool, the formal UCM model is translated into a Promela model. In the dissertation, some system properties are defined as temporal logic formulas. These formulas are manually translated into Promela, individually integrated with the Promela model of UCM, and verified using the SPIN tool. The formal analysis used here helps verify system properties (for example, of the multiparty multimedia protocol) and uncover system bugs.
Abstract:
Background: Choosing an adequate measurement instrument depends on the proposed use of the instrument, the concept to be measured, the measurement properties (e.g. internal consistency, reproducibility, content and construct validity, responsiveness, and interpretability), the requirements, the burden for subjects, and costs of the available instruments. As far as measurement properties are concerned, there are no sufficiently specific standards for the evaluation of measurement properties of instruments to measure health status, and also no explicit criteria for what constitutes good measurement properties. In this paper we describe the protocol for the COSMIN study, the objective of which is to develop a checklist that contains COnsensus-based Standards for the selection of health Measurement INstruments, including explicit criteria for satisfying these standards. We will focus on evaluative health related patient-reported outcomes (HR-PROs), i.e. patient-reported health measurement instruments used in a longitudinal design as an outcome measure, excluding health care related PROs, such as satisfaction with care or adherence. The COSMIN standards will be made available in the form of an easily applicable checklist.Method: An international Delphi study will be performed to reach consensus on which and how measurement properties should be assessed, and on criteria for good measurement properties. Two sources of input will be used for the Delphi study: (1) a systematic review of properties, standards and criteria of measurement properties found in systematic reviews of measurement instruments, and (2) an additional literature search of methodological articles presenting a comprehensive checklist of standards and criteria. The Delphi study will consist of four (written) Delphi rounds, with approximately 30 expert panel members with different backgrounds in clinical medicine, biostatistics, psychology, and epidemiology. 
The final checklist will subsequently be field-tested by assessing the inter-rater reproducibility of the checklist. Discussion: Since the study will mainly be anonymous, problems that are commonly encountered in face-to-face group meetings, such as the dominance of certain persons in the communication process, will be avoided. By performing a Delphi study and involving many experts, the likelihood that the checklist will have sufficient credibility to be accepted and implemented will increase.
Abstract:
The goal of this Master's thesis was to develop the third-generation physical protocol layer for a mobile phone software architecture. Third-generation mobile phone systems are more complex than earlier systems. Because of the size and complexity of the software, and because of tight schedules, it has become necessary to adopt formal methods for software development. Formal description languages make it possible to produce a precise, unambiguous and simulatable system description. The physical protocol layer provides data transfer to the upper protocol layers. Managing this data transfer requires message passing between the protocol layers. Using formal description languages, the implementation of this message passing can be automated and the logic it requires can be illustrated. In this work, the part of the physical protocol layer that communicates with the upper protocol layers was designed, implemented and tested. The result was an implementation, within the software architecture, of the communication and the state machine required by the cell-selection functionality. The early phases of software development were found to play a significant role in the performance of the physical layer, because message-passing optimization is easiest at that stage. Formal description languages, as such, are not fully suitable for developing parts of a strictly defined software architecture.
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Osteoarticular allograft is one possible treatment in wide surgical resections with large defects. Selecting the best osteoarticular allograft is of great relevance for optimal exploitation of the bone databank, a good surgical outcome, and the patient's recovery. Current approaches are, however, very time-consuming, hindering these goals in practice. We present a validation study of software able to perform automatic bone measurements, used to automatically assess distal femur sizes across a databank. 170 distal femur surfaces were reconstructed from CT data and measured manually using a size-measurement protocol taking into account the transepicondylar distance (A), the anterior-posterior distance in the medial condyle (B) and the anterior-posterior distance in the lateral condyle (C). Intra- and inter-observer studies were conducted and regarded as ground-truth measurements. Manual and automatic measures were compared. For the automatic measurements, the correlation coefficients between observer one and the automatic method were 0.99 for the A measure and 0.96 for the B and C measures. The average time needed to perform the measurements was 16 h for both manual measurements, and 3 min for the automatic method. The results demonstrate the high reliability and, most importantly, the high repeatability of the proposed approach, and a considerable speed-up in planning.
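The agreement figures quoted above are correlation coefficients between paired measurement series; computing them amounts to a Pearson correlation over the manual and automatic values (a minimal numpy sketch with hypothetical names):

```python
import numpy as np

def pearson_r(manual, automatic):
    """Pearson correlation between paired manual and automatic measures."""
    x = np.asarray(manual, dtype=float)
    y = np.asarray(automatic, dtype=float)
    return float(np.corrcoef(x, y)[0, 1])
```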