987 results for Distributed Network Protocol version 3 (DNP3)
Abstract:
Ocean acidification (OA), induced by the rapid anthropogenic rise in CO2 and its dissolution in seawater, is known to have consequences for marine organisms. However, the evolutionary responses of phytoplankton to OA remain poorly studied. Here we examined the coccolithophore Gephyrocapsa oceanica while growing it for 2000 generations under ambient and elevated CO2 levels. While OA stimulated growth during the earlier selection period (generations 700 to 1550), it reduced growth in the later selection period, up to 2000 generations. Similarly, the OA-stimulated production of particulate organic carbon and nitrogen declined with increasing selection time and had turned into a decrease under OA by 2000 generations. The specific adaptation of growth to OA had disappeared by generations 1700 to 2000 when compared with that at 1000 generations. Both phenotypic plasticity and fitness decreased over selection time, suggesting that the species' resilience to OA declined after 2000 generations of high-CO2 selection.
Abstract:
Standardization facilitates communication and makes it possible to exchange information with any national or international institution. This goal is achieved through communication formats for the exchange of automated information, such as CEPAL, MARC and FCC. The Escuela de Bibliotecología, Documentación e Información of the Universidad Nacional uses the MICROISIS software on a network for teaching. The databases designed there use the MARC format, and the RCAA2 rules for bibliographic description. The experience with the "I&D" database on rural development is presented, including its Field Definition Table, worksheet, display format and Field Selection Table.
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
Purpose: To evaluate the diagnostic image quality of the post-gadolinium water excitation magnetization-prepared rapid gradient-echo (WE-MPRAGE) sequence in abdominal examinations of noncooperative patients at 1.5 Tesla (T) and 3.0T MRI. Materials and Methods: Eighty-nine consecutive patients (48 males and 41 females; mean age +/- standard deviation, 54.6 +/- 16.6 years) who had MRI examinations including postgadolinium WE-MPRAGE were included in the study. Of the 89 patients, 33 underwent a noncooperative protocol at 1.5T, 10 underwent a noncooperative protocol at 3.0T, and 46 underwent a cooperative protocol at 3.0T. Postgadolinium WE-MPRAGE, MPRAGE, and three-dimensional gradient-echo sequences of these three groups were qualitatively evaluated for image quality, extent of artifacts, lesion conspicuity, and homogeneity of fat attenuation by two reviewers retrospectively, independently, and blindly. The results were compared using Wilcoxon signed rank and Mann-Whitney U tests. Kappa statistics were used to measure the extent of agreement between the reviewers. Results: The average scores indicated that the images were diagnostic for WE-MPRAGE at 1.5T and 3.0T in noncooperative patients. WE-MPRAGE achieved homogeneous fat attenuation in 31/33 (94%) of noncooperative patients at 1.5T and 10/10 (100%) of noncooperative patients at 3.0T. WE-MPRAGE at 3.0T had better results for image quality, extent of artifacts, lesion conspicuity and homogeneity of fat attenuation compared with WE-MPRAGE at 1.5T in noncooperative patients (P = 0.0008, 0.0006, 0.0024, and 0.0042, respectively). Kappa statistics varied between 0.76 and 1.00, representing good to excellent agreement. Conclusion: WE-MPRAGE may be used as a T1-weighted postgadolinium fat-attenuated sequence in noncooperative patients, particularly at 3.0T MRI.
Abstract:
Master's degree in Electrical and Computer Engineering
Abstract:
The widespread use of information technologies and the Internet for the most varied purposes, and in the most diverse areas, has created infrastructure management problems unlike any seen before. Network management has become a vital factor for a network to operate efficiently, productively and profitably. However, most systems are based on the Simple Network Management Protocol (SNMP), built on the client-server model and a centralized paradigm. There is therefore always a central server that collects and analyzes data coming from the different elements scattered across the network, with the management data stored in Management Information Bases (MIBs) located on the various network elements. The current SNMP-based management model has not been able to provide the response that is required, so there is a need to study and adopt new paradigms in order to find an approach capable of increasing the reliability and performance of network management. This work discusses the problems of the traditional network management approach and seeks to demonstrate the usefulness and advantages of an approach based on mobile agents. In parallel, a mobile-agent-based architecture is proposed for a management system to be used in a real case.
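The centralized polling model criticized above is easy to make concrete. The following is only an illustrative sketch of a manager fetching one value from an element's MIB using the net-snmp library; the agent address, community string and OID are placeholders, and this shows the traditional SNMP approach, not the mobile-agent proposal.

```c
/* Minimal SNMP GET poll sketching the centralized manager/agent model.
 * Assumes the net-snmp library; the host, community and OID are
 * illustrative placeholders. Compile with: gcc poll.c -lnetsnmp */
#include <net-snmp/net-snmp-config.h>
#include <net-snmp/net-snmp-includes.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    netsnmp_session session, *ss;
    netsnmp_pdu *pdu, *response = NULL;
    oid anOID[MAX_OID_LEN];
    size_t anOID_len = MAX_OID_LEN;

    init_snmp("poll-sketch");
    snmp_sess_init(&session);                 /* fill in defaults */
    session.peername = strdup("192.0.2.1");   /* placeholder agent address */
    session.version = SNMP_VERSION_2c;
    session.community = (u_char *)"public";   /* placeholder community */
    session.community_len = strlen((const char *)session.community);

    ss = snmp_open(&session);
    if (!ss) { snmp_sess_perror("snmp_open", &session); return 1; }

    /* Ask the remote agent for sysUpTime.0 from its MIB */
    pdu = snmp_pdu_create(SNMP_MSG_GET);
    read_objid(".1.3.6.1.2.1.1.3.0", anOID, &anOID_len);
    snmp_add_null_var(pdu, anOID, anOID_len);

    if (snmp_synch_response(ss, pdu, &response) == STAT_SUCCESS &&
        response && response->errstat == SNMP_ERR_NOERROR) {
        print_variable(response->variables->name,
                       response->variables->name_length,
                       response->variables);
    } else {
        fprintf(stderr, "poll failed\n");
    }

    if (response) snmp_free_pdu(response);
    snmp_close(ss);
    return 0;
}
```

Every value obtained this way costs a request/response round trip to the central server; that per-request overhead is what a mobile-agent architecture tries to avoid by moving processing to the managed elements.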
Abstract:
Remote engineering (also known as online engineering) may be defined as a combination of control engineering and telematics. In this area, specific activities require computational skills in order to develop projects where electrical devices are monitored and/or controlled, in an interactive way, through a distributed network (e.g. an Intranet or the Internet). In our specific case, we deal with an industrial plant. Within the last few years there has been an increase in the number of activities related to remote engineering, which may be connected to the large expansion of the Internet (e.g. bandwidth, number of users, development tools, etc.). This increase opens new possibilities for the implementation of advanced teleworking (or e-working) positions. In this paper we present the architecture for a remote application, accessible through the Internet, able to monitor and control a roller hearth kiln used in the ceramics industry for firing materials. The proposed architecture is based on a micro web server whose main function is to monitor and control the firing process, by reading data from a series of temperature sensors and by controlling a series of electronic valves and servo motors. This solution is also intended to be a low-cost alternative to other potential solutions. The temperature readings are obtained through K-type thermocouples and the gas flow is controlled through electrovalves. As the firing process should not be stopped before it is fully complete, the system is equipped with a safety device for that specific purpose. To better understand the system to be automated and its operation, we developed a scale model (100:1) on which to test the devised solution, based on a micro web server.
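As a rough, hypothetical illustration of the micro web server idea (not the authors' implementation), the sketch below answers HTTP requests with the latest kiln temperature readings over plain POSIX sockets. read_thermocouple(), the port and the number of zones are invented for the example, and valve/servo control is only hinted at in a comment.

```c
/* Sketch of a tiny HTTP endpoint exposing kiln temperature readings,
 * in the spirit of a micro web server. read_thermocouple() stands in
 * for real K-type sensor acquisition; port 8080 is arbitrary. */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <sys/socket.h>
#include <netinet/in.h>
#include <arpa/inet.h>

#define N_SENSORS 4

static double read_thermocouple(int channel)
{
    /* Placeholder: a real system would sample the K-type thermocouple here. */
    return 900.0 + 5.0 * channel;
}

int main(void)
{
    int srv = socket(AF_INET, SOCK_STREAM, 0);
    struct sockaddr_in addr = {0};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = INADDR_ANY;
    addr.sin_port = htons(8080);

    if (bind(srv, (struct sockaddr *)&addr, sizeof addr) < 0 || listen(srv, 4) < 0) {
        perror("bind/listen");
        return 1;
    }

    for (;;) {
        int cli = accept(srv, NULL, NULL);
        if (cli < 0) continue;

        /* The incoming request itself is ignored in this sketch; a POST
         * handler driving the electrovalves and servo motors would go here. */
        char body[256], resp[512];
        int off = 0;
        for (int i = 0; i < N_SENSORS; i++)
            off += snprintf(body + off, sizeof body - off,
                            "zone%d=%.1fC\n", i, read_thermocouple(i));

        snprintf(resp, sizeof resp,
                 "HTTP/1.1 200 OK\r\nContent-Type: text/plain\r\n"
                 "Content-Length: %d\r\nConnection: close\r\n\r\n%s", off, body);
        if (write(cli, resp, strlen(resp)) < 0)
            perror("write");
        close(cli);
    }
}
```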
Abstract:
Monitoring systems have traditionally been developed with rigid objectives and functionalities, and tied to specific languages, libraries and run-time environments. There is a need for more flexible monitoring systems which can be easily adapted to distinct requirements. On-line monitoring is increasingly important for the observation and control of distributed applications. In this paper we discuss monitoring interfaces and architectures which support more extensible monitoring and control services. We describe our work on the development of a distributed monitoring infrastructure, and illustrate how it eases the implementation of a complex distributed debugging architecture. We also discuss several issues concerning support for tool interoperability and illustrate how the cooperation among multiple concurrent tools can ease the task of distributed debugging.
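One concrete reading of an extensible monitoring interface is a callback-registration API that observation and control tools plug into at run time. The sketch below illustrates only that general idea; all names are invented and it does not reproduce the authors' infrastructure.

```c
/* Illustrative event-callback interface for an extensible monitor:
 * tools register observers for named events and the monitoring layer
 * dispatches them. Names and types are invented for the example. */
#include <stdio.h>
#include <string.h>

#define MAX_OBSERVERS 16

typedef void (*event_cb)(const char *event, const char *detail);

static struct { const char *event; event_cb cb; } observers[MAX_OBSERVERS];
static int n_observers;

/* A tool (e.g. a debugger front-end) subscribes to an event class. */
int monitor_subscribe(const char *event, event_cb cb)
{
    if (n_observers == MAX_OBSERVERS) return -1;
    observers[n_observers].event = event;
    observers[n_observers].cb = cb;
    n_observers++;
    return 0;
}

/* The monitoring infrastructure publishes observations to whoever asked. */
void monitor_publish(const char *event, const char *detail)
{
    for (int i = 0; i < n_observers; i++)
        if (strcmp(observers[i].event, event) == 0)
            observers[i].cb(event, detail);
}

static void debugger_on_breakpoint(const char *event, const char *detail)
{
    printf("[debugger] %s: %s\n", event, detail);
}

int main(void)
{
    monitor_subscribe("breakpoint", debugger_on_breakpoint);
    monitor_publish("breakpoint", "process 3 stopped at line 42");
    return 0;
}
```

Because subscription happens at run time rather than at compile time, several concurrent tools can observe the same events without being linked against each other, which is one way to support the tool interoperability discussed above.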
Abstract:
A new technological area is under rapid development. This area, known as the Internet of Things, arises from the need to interconnect objects in order to improve services or meet users' needs. This dissertation focuses on a specific part of the Internet of Things: sensing. The sensing network in question is deployed by the European project Future Cities [1], which creates an infrastructure for the research and validation of smart projects and services in the city of Porto. The work carried out in this dissertation concerns one of the platforms in that sensing network: the environmental sensor platform called UrbanSense. These environmental sensors, embedded in Data Collection Units (DCUs), also referred to as nodes, measure environmental variables such as temperature, humidity, ozone and carbon monoxide. The nodes, however, have limited resources in terms of energy, processing and memory. Despite major advances in storage and processing, energy storage, namely batteries, has not evolved as markedly, which limits the nodes' operation [2]. This thesis focuses essentially on improving the energy performance of the UrbanSense sensor network. The main contribution is an adaptation of the ad hoc routing protocol OLSR (Optimized Link State Routing Protocol) for nodes powered by renewable energy, so as to increase the useful lifetime of the sensing nodes. With this contribution it is possible to collect data over longer periods, approximately 10 hours compared with the previous 7, and to send them at a higher rate, around 500 KB/s, together with an analytical characterization of the network's parameters. However, extending the lifetime of the sensor nodes with renewable energy, namely solar power, increases their weight and size, which limits their mobility and therefore requires prior planning of their placement. In a first phase of the work, the power consumption of the DCUs was analyzed, since they form the basis of the infrastructure and communicate with each other over WiFi or 3G. After reviewing routing protocols with support for energy-related parameters, OLSR was chosen because of its maturity and compatibility with the current DCU software; although other protocols exist, their implementations are not available as open-source software. To validate the work in this dissertation, a preliminary trial was carried out without renewable energy in order to characterize the limitations of the system. This trial made it possible to verify the compatibility of the various components and to adjust the strategy. In a second validation step, a real deployment with 4 communicating nodes was carried out using the energy-efficient protocol. The protocol is evaluated in terms of the increase in node lifetime and in transfer rate. The analysis and adaptation of the ad hoc routing protocol provides greater longevity in node lifetime compared with the existing behavior during data transmission.
Although the lifetime is shorter when the energy parameter is left at its default value of 3, adapting the system to the available energy yields a higher transfer rate over a longer period. This is a favorable factor for opening up new services for sending data in real time or transferring larger files.
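The dissertation's exact adaptation is not reproduced here, but the default energy factor of 3 mentioned above matches OLSR's default "willingness" to act as a relay, so one plausible sketch of an energy-aware parameterization is to map the node's battery (or solar charge) level to an OLSR willingness value. The thresholds and the battery probe below are assumptions.

```c
/* Illustrative mapping from remaining battery to OLSR "willingness"
 * (0 = never relay, 3 = default, 7 = always relay). The thresholds and
 * the battery probe are assumptions, not the dissertation's scheme. */
#include <stdio.h>

static double read_battery_fraction(void)
{
    /* Placeholder: a real DCU would query its power-management hardware
     * or the solar charge controller here. */
    return 0.62;
}

static int willingness_from_battery(double level)
{
    if (level > 0.80) return 7;  /* plenty of (solar) energy: always relay */
    if (level > 0.50) return 5;  /* comfortable: relay more than default   */
    if (level > 0.25) return 3;  /* OLSR default willingness               */
    if (level > 0.10) return 1;  /* conserve: relay only if needed         */
    return 0;                    /* critical: never act as relay           */
}

int main(void)
{
    double level = read_battery_fraction();
    printf("battery=%.0f%% -> Willingness %d\n",
           100.0 * level, willingness_from_battery(level));
    /* The resulting value could be written into the routing daemon's
     * configuration so that energy-rich nodes carry more traffic. */
    return 0;
}
```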
Abstract:
A comparative benchmark study, on two multicore multithreading platforms, of the performance of different ways of parallelizing integer and floating-point matrix multiplications using the OpenMP shared-memory model, versions 2.5 and 3.0.
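For context, the kind of kernel such a benchmark exercises looks roughly like the OpenMP loop below (a sketch, not the project's code); the matrix size and schedule are arbitrary choices, and the same pattern applies to the floating-point case.

```c
/* Sketch of a shared-memory parallel integer matrix multiplication with
 * OpenMP. N and the schedule are arbitrary. Compile: gcc -O2 -fopenmp mm.c */
#include <stdio.h>
#include <stdlib.h>
#include <omp.h>

#define N 512

int main(void)
{
    int *a = malloc(sizeof(int) * N * N);
    int *b = malloc(sizeof(int) * N * N);
    int *c = calloc((size_t)N * N, sizeof(int));  /* result starts at zero */

    for (int i = 0; i < N * N; i++) { a[i] = i % 7; b[i] = i % 5; }

    double t0 = omp_get_wtime();

    /* Rows are distributed among threads; each thread writes disjoint rows
     * of c, so no synchronization is needed inside the loop nest. */
    #pragma omp parallel for schedule(static)
    for (int i = 0; i < N; i++)
        for (int k = 0; k < N; k++)
            for (int j = 0; j < N; j++)
                c[i * N + j] += a[i * N + k] * b[k * N + j];

    printf("threads=%d time=%.3fs c[0]=%d\n",
           omp_get_max_threads(), omp_get_wtime() - t0, c[0]);

    free(a); free(b); free(c);
    return 0;
}
```

OpenMP 3.0 adds, among other things, the collapse clause and explicit tasks, which is the sort of difference a 2.5-versus-3.0 comparison on multicore platforms would exercise.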
Abstract:
Final-year project report that shows a world map on which the route travelled by data packets to reach their destination is displayed.
Abstract:
Introduction: The Thalidomide-Dexamethasone (TD) regimen has provided encouraging results in relapsed MM. To improve results, bortezomib (Velcade) has been added to the combination in previous phase II studies, the so-called VTD regimen. In January 2006, the European Group for Blood and Marrow Transplantation (EBMT) and the Intergroupe Francophone du Myélome (IFM) initiated a prospective, randomized, parallel-group, open-label phase III multicenter study comparing VTD (arm A) with TD (arm B) for MM patients progressing or relapsing after autologous transplantation. Patients and Methods: Inclusion criteria: patients in first progression or relapse after at least one autologous transplantation, including those who had received bortezomib or thalidomide before transplant. Exclusion criteria: subjects with neuropathy above grade 1 or non-secretory MM. The primary study end point was time to progression (TTP). Secondary end points included safety, response rate, progression-free survival (PFS) and overall survival (OS). Treatment was scheduled as follows: bortezomib 1.3 mg/m2 was given as an i.v. bolus on days 1, 4, 8 and 11 followed by a 10-day rest period (days 12 to 21) for 8 cycles (6 months), and then on days 1, 8, 15 and 22 followed by a 20-day rest period (days 23 to 42) for 4 cycles (6 months). In both arms, thalidomide was scheduled at 200 mg/day orally for one year and dexamethasone at 40 mg/day orally four days every three weeks for one year. Patients reaching remission could proceed to a new stem cell harvest. However, transplantation, either autologous or allogeneic, could only be performed in patients who completed the planned one-year treatment period. Response was assessed by EBMT criteria, with the additional category of near complete remission (nCR). Adverse events were graded by the NCI-CTCAE, version 3.0. The trial was based on a group sequential design, with 4 planned interim analyses and one final analysis that allowed stopping for efficacy as well as futility. The overall alpha and power were set equal to 0.025 and 0.90, respectively. The decision rule was based on the comparison of the ratio of the cause-specific hazards of relapse/progression, estimated in a Cox model stratified on the number of previous autologous transplantations. Relapse/progression cumulative incidence was estimated using the proper nonparametric estimator, and the comparison was done with the Gray test. PFS and OS probabilities were estimated by Kaplan-Meier curves, and the comparison was performed with the log-rank test. An interim safety analysis was performed when the first hundred patients had been included. The safety committee recommended continuing the trial. Results: As of 1st July 2010, 269 patients had been enrolled in the study, 139 in France (IFM 2005-04 study), 21 in Italy, 38 in Germany, 19 in Switzerland (a SAKK study), 23 in Belgium, 8 in Austria, 8 in the Czech Republic, 11 in Hungary, 1 in the UK and 1 in Israel. One hundred and sixty-nine patients were male and 100 female; the median age was 61 yrs (range 29-76). One hundred and thirty-six patients were randomized to receive VTD and 133 to receive TD. The current analysis is based on 246 patients (124 in arm A, 122 in arm B) included in the second interim analysis, carried out when 134 events were observed. Following this analysis, the trial was stopped because of the significant superiority of VTD over TD. The remaining patients had been enrolled too recently to contribute to the analysis.
The number of previous autologous transplants was one in 63 vs 60 patients and two or more in 61 vs 62 patients in arm A vs arm B, respectively. The median follow-up was 25 months. The median TTP was 20 months vs 15 months in arm A and arm B respectively, with a cumulative incidence of relapse/progression at 2 years equal to 52% (95% CI: 42%-64%) vs 70% (95% CI: 61%-81%) (p=0.0004, Gray test). The same superiority of arm A was also observed when stratifying on the number of previous autologous transplantations. At 2 years, PFS was 39% (95% CI: 30%-51%) vs 23% (95% CI: 16%-34%) (A vs B, p=0.0006, log-rank test). OS in the first two years was comparable in the two groups. Conclusion: VTD resulted in significantly longer TTP and PFS in patients relapsing after ASCT. Analyses of response and safety data are ongoing and results will be presented at the meeting. Protocol EU-DRACT number: 2005-001628-35.
Abstract:
Background: Panitumumab (pmab), a fully human monoclonal antibody against the epidermal growth factor receptor (EGFR), is indicated as monotherapy for the treatment of metastatic colorectal cancer. This ongoing study is designed to assess the efficacy and safety of pmab in combination with radiotherapy (PRT) compared with chemoradiotherapy (CRT) as initial treatment of unresected, locally advanced SCCHN (ClinicalTrials.gov Identifier: NCT00547157). Methods: This is a phase 2, open-label, randomized, multicenter study. Eligible patients (pts) were randomized 2:3 to receive cisplatin 100 mg/m2 on days 1 and 22 of RT or pmab 9.0 mg/kg on days 1, 22, and 43. Accelerated RT (70 to 72 Gy delivered over 6 to 6.5 weeks) was planned for all pts and was delivered either by the intensity-modulated radiation therapy (IMRT) modality or by the three-dimensional conformal (3D-CRT) modality. The primary endpoint is the local-regional control (LRC) rate at 2 years. Key secondary endpoints include PFS, OS, and safety. An external, independent data monitoring committee conducts planned safety and efficacy reviews during the course of the trial. Results: Pooled data from this planned interim safety analysis include the first 52 of the 150 planned pts; 44 (84.6%) are male; the median (range) age is 57 (33-77) years; ECOG PS 0: 65%, PS 1: 35%; 20 (39%) pts received IMRT, and 32 (61%) pts received 3D-CRT. Fifty (96%) pts completed RT, and 50 pts received RT per protocol without a major deviation. The median (range) total RT dose administered was 72 (64-74) Gy. The most common grade ≥3 adverse events, graded using the CTCAE version 3.0, are shown (Table). Conclusions: After the interim safety analysis, CONCERT-2 continues per protocol. Study enrollment is estimated to be completed by October 2009.
Abstract:
Volumes of data used in science and industry are growing rapidly. When researchers face the challenge of analyzing them, their format is often the first obstacle. The lack of standardized ways of exploring different data layouts requires an effort to solve the problem from scratch each time. The possibility of accessing data in a rich, uniform manner, e.g. using the Structured Query Language (SQL), would offer expressiveness and user-friendliness. Comma-separated values (CSV) files are one of the most common data storage formats. Despite the format's simplicity, handling it becomes non-trivial as file size grows. Importing CSVs into existing databases is time-consuming and troublesome, or even impossible if the horizontal dimension reaches thousands of columns. Most databases are optimized for handling large numbers of rows rather than columns; therefore, performance for datasets with non-typical layouts is often unacceptable. Other challenges include schema creation, updates and repeated data imports. To address the above-mentioned problems, I present a system for accessing very large CSV-based datasets by means of SQL. It is characterized by: a "no copy" approach, where the data stay mostly in the CSV files; "zero configuration", with no need to specify a database schema; an implementation in C++ with boost [1], SQLite [2] and Qt [3] that requires no installation and has a very small size; query rewriting, dynamic creation of indices for appropriate columns and static data retrieval directly from the CSV files, which together ensure efficient plan execution; effortless support for millions of columns; per-value typing, which makes it easy to use mixed text/number data; and a very simple network protocol that provides an efficient interface for MATLAB and reduces the implementation time for other languages. The software is available as freeware, along with educational videos, on its website [4]. It does not need any prerequisites to run, as all of the libraries are included in the distribution package. I test it against existing database solutions using a battery of benchmarks and discuss the results.
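The system described is its own engine, but two of the ingredients it names, SQL access over tabular data and dynamic index creation for the columns a query actually filters on, can be sketched with SQLite's C API. The table, column names and data below are invented; the real tool reads values lazily from the CSV files rather than inserting them.

```c
/* Sketch (not the tool described above) of SQL access plus on-demand index
 * creation using the SQLite C API. Names and data are invented.
 * Compile with: gcc csvq.c -lsqlite3 */
#include <stdio.h>
#include <sqlite3.h>

int main(void)
{
    sqlite3 *db;
    sqlite3_stmt *stmt;

    if (sqlite3_open(":memory:", &db) != SQLITE_OK) return 1;

    /* Rows that, in the real system, would come lazily from a CSV file. */
    sqlite3_exec(db,
        "CREATE TABLE measurements(sample TEXT, value REAL);"
        "INSERT INTO measurements VALUES ('a',1.5),('b',2.5),('c',9.0);",
        NULL, NULL, NULL);

    /* "Dynamic index creation": build the index only once a query is known
     * to filter on this column. */
    sqlite3_exec(db,
        "CREATE INDEX IF NOT EXISTS idx_value ON measurements(value);",
        NULL, NULL, NULL);

    sqlite3_prepare_v2(db,
        "SELECT sample, value FROM measurements WHERE value > ?1;",
        -1, &stmt, NULL);
    sqlite3_bind_double(stmt, 1, 2.0);

    while (sqlite3_step(stmt) == SQLITE_ROW)
        printf("%s -> %.1f\n",
               (const char *)sqlite3_column_text(stmt, 0),
               sqlite3_column_double(stmt, 1));

    sqlite3_finalize(stmt);
    sqlite3_close(db);
    return 0;
}
```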
Abstract:
Provides instructions for using the computer program which was developed under the research project "The Economics of Reducing the County Road System: Three Case Studies In Iowa". The program operates on an IBM personal computer with 300K of memory. A fixed disk with at least 3 megabytes of storage is required. The computer must be equipped with DOS version 3.0; the programs are written in Fortran. The user's manual describes all data requirements, including network preparation, trip information, and costs for maintenance, reconstruction, etc. Program operation instructions are presented, as well as sample solution output and a listing of the computer programs.