Abstract:
The main objective of this degree project was to analyze the endpoint security solutions developed by Cisco, Microsoft, and a smaller third vendor, InfoExpress. The solutions examined are Cisco Network Admission Control, Microsoft Network Access Protection, and InfoExpress CyberGatekeeper. The functioning of each solution is explained, together with an analysis of the differences between them. This thesis also provides a tutorial for the installation of Cisco Network Admission Control to ease its implementation. The research was done by reading articles on the internet and by experimenting with the Cisco Network Admission Control solution; my background knowledge of Cisco routing and ACLs was also used. Based on the analysis done in this thesis, the conclusion is that none of the existing solutions is yet ready for large-scale use in corporate networks. Moreover, all of the solutions are proprietary and mutually incompatible. The future standard for endpoint security may be driven by Cisco and Microsoft, and fierce competition is beginning between those two giants.
Abstract:
This report discusses the development of a software logging tool for the analysis of industrial processes. The goal was to develop software that helps electrical engineers monitor and troubleshoot industrial processes. The tool is called PLS (Process Log Server) and was developed in Visual Studio 2005 on the .NET Framework. PLS works as a client of the Beijer Electronics OPC Server. The program is able to read data from a PLC (Programmable Logic Controller) through the OPC Server, and PLS can connect to any controller supported by the Beijer Electronics OPC Server. Signal data is stored in a database for later analysis, and data for selected signals can easily be exported to a text file formatted for import into MS Office Excel. A user manual [UM-07] is written as a separate document. The software remained stable throughout functional testing. The final product is a first-rate tool that is simple to use and, as an advantage, can be extended with more functions in the future.
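The core loop of such a tool, polling signal values through the OPC server and appending timestamped tab-separated rows that Excel can import, could look roughly like the following sketch. Note this is a Java illustration of the idea, not the .NET implementation; the readSignal stub stands in for a real OPC client call, and the tag names and sampling interval are assumptions.

```java
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.time.LocalDateTime;

/** Minimal sketch of a process-log client: poll signal values and
 *  append them as tab-separated rows, ready for import into Excel. */
public class ProcessLogSketch {

    // Hypothetical stand-in for a read through an OPC server;
    // a real client would call an OPC library here instead.
    static double readSignal(String tag) {
        return Math.random() * 100.0; // placeholder value
    }

    public static void main(String[] args) throws IOException, InterruptedException {
        String[] tags = {"Motor1.Speed", "Tank1.Level"}; // example tag names
        try (PrintWriter log = new PrintWriter(new FileWriter("process_log.txt", true))) {
            for (int i = 0; i < 10; i++) {                 // ten sample cycles
                StringBuilder row = new StringBuilder(LocalDateTime.now().toString());
                for (String tag : tags) {
                    row.append('\t').append(readSignal(tag)); // one column per signal
                }
                log.println(row);                          // one row per poll cycle
                log.flush();
                Thread.sleep(1000);                        // 1 s sampling interval
            }
        }
    }
}
```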
Abstract:
The main objective of this thesis work is to develop a communication link between Runrev Revolution (IDE) and JADE (Multi-Agent System) through socket programming over TCP/IP. These two independent platforms are connected using socket programming techniques; connecting them this way is a newly emerging approach, and the work done in this thesis is considered a prototype. A graphical simulation model was developed by salixphere (a company in Hedemora) to simulate logistics problems using Runrev Revolution (IDE). The simulation program is called "BIOSIM". The logistics problems are complex, and conventional optimization techniques are unlikely to be very successful. "BIOSIM" can demonstrate a graphical representation of logistics problems depending on the problem domain. Because this simulation model is developed in the Revolution programming language (Transcript), a dynamically typed and English-like language, it is quite slow compared to other high-level programming languages. The objective of this thesis work is therefore to add intelligent behaviour to the graphical objects by developing a communication link between Runrev Revolution (IDE) and JADE (Multi-Agent System) over TCP/IP. Tests show the intelligent behaviour of the graphical objects and successful communication between Runrev Revolution (IDE) and JADE (Multi-Agent System).
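On the JADE side (Java), the listener half of such a link can be boiled down to a plain TCP server that reads line-oriented commands from the Revolution client and acknowledges them. The sketch below uses only the standard java.net API; the port number and the ACK reply format are illustrative assumptions, and in the real system this loop would live inside a JADE agent behaviour.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

/** Minimal sketch of the agent-side TCP listener: accept a connection
 *  from the Revolution client and acknowledge each received command. */
public class AgentSocketSketch {
    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(5000)) {   // port is an assumption
            System.out.println("Waiting for Revolution client...");
            try (Socket client = server.accept();
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(client.getInputStream()));
                 PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
                String line;
                while ((line = in.readLine()) != null) {       // one command per line
                    out.println("ACK " + line);                // reply to the client
                }
            }
        }
    }
}
```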
Abstract:
The report analyses whether some common problems can be avoided by using modern technology. As a reference system, "Fartygsrapporteringssystemet" is used: an n-tier web application built with what was modern technology at the time, 2003-2004. The aim is to examine whether ASP.Net MVC, Windows Communication Foundation, Workflow Foundation, and SQL Server 2005 Service Broker can be used to create an n-tier web application that also communicates with other systems and facilitates automated testing. The report describes the construction of a prototype in which the presentation layer uses ASP.Net MVC to separate presentation from business logic. Communication with the business layer is done through Windows Communication Foundation. Hard-coded processes are broken out and handled by Workflow Foundation. Asynchronous communication with other systems is done using Microsoft SQL Server 2005 Service Broker. The result of the analysis is that these techniques can be used to create an n-tier web application, but that ASP.Net MVC, which at present is only available in a preview release, is not yet sufficiently mature.
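The architectural idea, with the presentation layer depending only on a service contract so that the business logic can live in another tier behind a communication layer such as WCF, is stack-agnostic. Below is a minimal sketch of the same layering written in Java rather than the thesis's .NET stack; all type and method names are hypothetical.

```java
/** Illustrative sketch of n-tier layering: the controller (presentation)
 *  depends only on a contract, so the business implementation behind it
 *  can change or move to another tier without touching the controller. */
interface ReportService {                            // contract between tiers
    String vesselStatus(String vesselId);
}

class ReportServiceImpl implements ReportService {   // business layer
    @Override
    public String vesselStatus(String vesselId) {
        return "Vessel " + vesselId + ": in port";   // placeholder business logic
    }
}

class ReportController {                             // presentation layer
    private final ReportService service;
    ReportController(ReportService service) { this.service = service; }
    String show(String vesselId) { return service.vesselStatus(vesselId); }
}

public class LayeringSketch {
    public static void main(String[] args) {
        ReportController controller = new ReportController(new ReportServiceImpl());
        System.out.println(controller.show("SE-1234"));
    }
}
```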
Abstract:
Background: The sensitivity to microenvironmental changes varies among animals and may be under genetic control. It is essential to take this element into account when aiming at breeding robust farm animals. Here, linear mixed models with genetic effects in the residual variance part of the model can be used. Such models have previously been fitted using EM and MCMC algorithms. Results: We propose the use of double hierarchical generalized linear models (DHGLM), where the squared residuals are assumed to be gamma distributed and the residual variance is fitted using a generalized linear model. The algorithm iterates between two sets of mixed model equations, one on the level of observations and one on the level of variances. The method was validated using simulations and also by re-analyzing a data set on pig litter size that was previously analyzed using a Bayesian approach. The pig litter size data contained 10,060 records from 4,149 sows. The DHGLM was implemented using the ASReml software and the algorithm converged within three minutes on a Linux server. The estimates were similar to those previously obtained using Bayesian methodology, especially the variance components in the residual variance part of the model. Conclusions: We have shown that variance components in the residual variance part of a linear mixed model can be estimated using a DHGLM approach. The method enables analyses of animal models with large numbers of observations. An important future development of the DHGLM methodology is to include the genetic correlation between the random effects in the mean and residual variance parts of the model as a parameter of the DHGLM.
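Written out, the two-level structure described above might be sketched as follows (the notation is assumed here, not taken from the paper): a mean model whose per-observation residual variances are themselves fitted by a gamma GLM with a log link, iterating between the two sets of mixed model equations.

```latex
% Sketch of the DHGLM structure (notation assumed): a mean model plus a
% log-linear model for the residual variances, with squared residuals
% treated as gamma distributed.
\begin{align*}
  \mathbf{y} &= \mathbf{X}\boldsymbol{\beta} + \mathbf{Z}\mathbf{u} + \mathbf{e},
  \qquad e_i \sim N(0,\,\phi_i), \\
  \log \boldsymbol{\phi} &= \mathbf{X}_d \boldsymbol{\beta}_d
  + \mathbf{Z}_d \mathbf{u}_d,
  \qquad \hat{e}_i^{\,2} \sim \text{gamma with mean } \phi_i .
\end{align*}
```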
Abstract:
This project involves the design and implementation of a global electronic tracking system intended for use by trans-oceanic vessels, using the technology of the U.S. Government's Global Positioning System (GPS) and a wireless connection to a networked computer. Traditional navigation skills are being replaced with highly accurate electronics. GPS receivers, computers, and mobile communication are becoming common among both recreational and commercial boaters. With computers and advanced communication available throughout the maritime world, information can be shared instantaneously around the globe. This ability to monitor one's whereabouts from afar can provide an increased level of safety and efficiency. Current navigation software seldom includes the capability of providing up-to-the-minute navigation information for remote display. Remote access to this data will allow boat owners to track the progress of their boats, land-based organizations to monitor weather patterns and suggest course changes, and school groups to track the progress of a vessel and learn about navigation and science. The software developed in this project allows navigation information from a vessel to be remotely transmitted to a land-based server for interpretation and deployment to remote users over the Internet. This differs from current software in that it allows the tracking of one vessel by multiple users and provides a means for two-way text messaging between users and the vessel. Beyond the coastal coverage provided by cellular telephones, mobile communication is advancing rapidly. Current tools such as satellite telephones and single-sideband radio enable worldwide communications, including the ability to connect to the Internet. If current trends continue, portable global communication will be available at a reasonable price and Internet connections on boats will become more common.
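The first step in such a system is turning the receiver's raw NMEA sentences into decimal coordinates that can be shipped to the land-based server. A minimal Java sketch follows; the RMC sentence and field layout follow the standard NMEA 0183 format, the example values are illustrative, and the transmission step is left as a comment.

```java
/** Minimal sketch of turning a GPS NMEA RMC sentence into decimal
 *  coordinates ready to be sent to a land-based server. */
public class NmeaSketch {

    // Convert NMEA "ddmm.mmmm" (or "dddmm.mmmm") to signed decimal degrees.
    static double toDegrees(String value, String hemisphere) {
        double raw = Double.parseDouble(value);
        double degrees = Math.floor(raw / 100.0) + (raw % 100.0) / 60.0;
        return (hemisphere.equals("S") || hemisphere.equals("W")) ? -degrees : degrees;
    }

    public static void main(String[] args) {
        // Example RMC sentence (illustrative values).
        String rmc = "$GPRMC,123519,A,4807.038,N,01131.000,E,022.4,084.4,230394,,,A*6A";
        String[] f = rmc.split(",");
        double lat = toDegrees(f[3], f[4]);   // field 3/4: latitude + hemisphere
        double lon = toDegrees(f[5], f[6]);   // field 5/6: longitude + hemisphere
        // In the real system this position would now be transmitted to the server.
        System.out.printf("lat=%.5f lon=%.5f speed=%s kn%n", lat, lon, f[7]);
    }
}
```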
Abstract:
An underwater gas pipeline is the portion of a pipeline that crosses a river beneath its bottom. Underwater gas pipelines are subject to increasing dangers as time goes by, and an accident at an underwater gas pipeline can lead to a technological and environmental disaster on the scale of an entire region. Timely troubleshooting of all underwater gas pipelines in order to prevent potential accidents therefore remains a pressing task for the industry, and the most important aspect of resolving this challenge is the quality of the automated system in question. At present the industry has no automated system that fully meets the needs of the experts who maintain underwater gas pipelines. Principal aim of this research: this work aims to develop a new automated monitoring system that simplifies the process of evaluating technical condition and making decisions on planning, preventive maintenance, and repair work on underwater gas pipelines. Objectives: creation of a shared model for the new automated system via IDEF3; development of a new database system to store all information about underwater gas pipelines; development of a new application that works with database servers and provides an explanation of the results obtained from the server; and calculation of MTBF values for specified pipelines based on quantitative data obtained from tests of the system. Conclusions: the new automated system, PodvodGazExpert, has been developed for the timely and accurate determination of the physical condition of underwater gas pipelines; the mathematical analysis of the new system is based on the principal component analysis method; and determining the physical condition of an underwater gas pipeline with the new system increases the MTBF by a factor of 8.18 over the existing system used in the industry today.
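For reference, MTBF can be read here in its standard sense (the abstract itself does not spell out the formula): total operating time divided by the number of failures observed in it.

```latex
% Standard MTBF definition assumed here; t_i are the operating intervals
% between successive failures and n is the number of failures.
\[
  \mathrm{MTBF} \;=\; \frac{\sum_{i=1}^{n} t_i}{n}
\]
```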
Abstract:
Driven by Web 2.0 technology and the almost ubiquitous presence of mobile devices, Volunteered Geographic Information (VGI) is experiencing unprecedented growth. These notable technological advancements have also opened fruitful perspectives in the field of water management and protection, raising the demand for a reconsideration of policies that takes the emerging trend of VGI into account. This research investigates the opportunity to leverage such technology to involve citizens equipped with common mobile devices (e.g. tablets and smartphones) in a campaign to report water-related phenomena. The work was carried out in collaboration with ADBPO - Autorità di bacino del fiume Po (Po river basin Authority), the entity responsible for the environmental planning and protection of the basin of the river Po. This is the longest Italian river, spanning eight of the twenty Italian regions and characterized by complex environmental issues. To enrich ADBPO's official database with user-generated content, a FOSS (Free and Open Source Software) architecture was designed that allows not only field data collection by users but also Web publication of the data through standard protocols. The Open Data Kit suite allows users to collect georeferenced multimedia information using mobile devices equipped with location sensors (e.g. GPS). Users can report a number of environmental emergencies, problems, or simple points of interest related to the Po river basin, taking pictures of them and providing other contextual information. Field-registered data is sent to a server and stored in a PostgreSQL database with the PostGIS spatial extension. GeoServer then provides data dissemination on the Web, while specific OpenLayers-based viewers were built to optimize data access on both desktop computers and mobile devices. Besides proving the suitability of FOSS in the frame of VGI, the system represents a successful prototype for the exploitation of local, real-time user information aimed at managing and protecting water resources.
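The storage step, a field report landing in PostGIS as a point geometry, reduces to a single parameterized insert. A minimal Java/JDBC sketch is shown below; the table name, columns, and connection details are assumptions rather than the system's actual schema, and the PostgreSQL JDBC driver is assumed to be on the classpath.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

/** Minimal sketch of storing a georeferenced field report in PostGIS. */
public class ReportStoreSketch {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:postgresql://localhost:5432/vgi"; // assumed connection
        try (Connection db = DriverManager.getConnection(url, "user", "secret");
             PreparedStatement insert = db.prepareStatement(
                 "INSERT INTO reports (description, geom) " +
                 "VALUES (?, ST_SetSRID(ST_MakePoint(?, ?), 4326))")) {
            insert.setString(1, "bank erosion near bridge");  // report text
            insert.setDouble(2, 9.70);                        // longitude (x)
            insert.setDouble(3, 45.05);                       // latitude (y)
            insert.executeUpdate();                           // one point stored
        }
    }
}
```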
Abstract:
This work presents an architecture for Software Development Environments (SDEs). The architecture is based on commercial off-the-shelf (COTS) products, chiefly a Workflow Management System - WMS (Microsoft Exchange 2000 Server - E2K) - and runs on the Internet, also integrating some of the tools from the large set of applications used in the software development process. The development of a prototype (WOSDIE - WOrkflow-based Software Development Integrated Environment) based on this architecture is described in detail, covering the construction stages, the functions implemented, and the mechanisms needed to integrate a WMS, development tools, a database (WSS - Web Storage System), and other components into an SDE. The software process applied in WOSDIE was extracted from RUP (Rational Unified Process). This process was modelled in the Workflow Designer tool, which allows workflow processes to be modelled inside E2K. Launching tools from a Web browser and storing the artefacts produced in a software project are also covered. E2K monitors the events that occur within the WOSDIE environment and, based on the conditions modelled in Workflow Designer, determines which activities should start after a previous activity finishes and who is responsible for executing these new activities (activity assignment). The proposed architecture and the WOSDIE prototype are evaluated against criteria drawn from several works. These evaluations detail the characteristics of the proposed architecture and describe the advantages and problems associated with WOSDIE.
Abstract:
This research addresses the importance of leadership for effective judicial service, which results from coordinated teamwork between the judge and the staff of a given judicial unit. The judge must contribute to building a high-performance team to confront the untimeliness of judicial service, the main problem of the Judiciary, in pursuit of the desired speed and effectiveness of justice. The quality of judicial service depends on the management the judge brings to his or her unit. The research points to the need for the judge to carry out administrative activities as a leader, mastering modern management techniques and influencing the team in pursuit of excellence. The type of leadership best suited to the judicature is analysed: servant leadership, in which the judge identifies the needs of the staff and works to meet them, receiving in return the greatest possible dedication and motivation. The leader judge manages the unit and forges a qualified and motivated team. Since the leader judge cannot be captured in a single definition, the characteristic traits that make him or her the new model of the Brazilian judge, the servant leader judge, are identified: characteristics in the organisational dimension and in the interpersonal dimension, as well as the personal skills that define this profile. Finally, the work addresses the need for the Judiciary to reflect on the current system for selecting judges, prioritising the choice of people with a true vocation and, after selection, investing in the continuous development of leaders within its ranks who, working in alignment, will deliver swift, humane, fair, and effective justice.
Abstract:
The personal autonomy of public servants in their work within public administration is one of the preconditions for the effective implementation of knowledge management initiatives. It is also an aspiration of workers, consistently defended in statements by the most diverse professional associations. It runs up, however, against political, legal, administrative, and cultural constraints. Drawing on secondary and theoretical sources, this work identifies the nature of personal autonomy, its modalities, its sources, its constraints, and the possibilities for its development. The work, theoretical in nature, was developed through a transdisciplinary interpretation of sources drawn mostly from the literature of sociology, management, law, and philosophy. The concept of autonomy is examined first, followed by its initial subdivision into two dimensions. Next, the discipline that Brazilian administrative-law doctrine imposes on the autonomy of public servants is explored and problematised. The question is then addressed from a sociological standpoint, starting from Max Weber's ideal bureaucratic model and the findings of Michel Crozier. The relationship between autonomy and professional bureaucracies is also reviewed. Finally, the human personality is presented as the source of autonomy, together with its justification against doctrines that deny and attack it. Three dimensions of autonomy were identified: substantive, technical, and objective; and paths were proposed so that, in public organisations, these dimensions can flourish within the legitimate political, legal, and administrative limits identified.
Abstract:
Reporting volatile results without proper accounting disclosure can convey a negative image to investors and raise doubts about future results, transparency, and the risk management capability of financial institutions' managers. In recent decades, the use of hedge accounting for risk and earnings management has been prominent in the large banks of Brazil and abroad, as financial statements converged, in Europe in 2005 and in Brazil in 2010, to the new international accounting standard (IFRS) issued by the IASB. This standard has demanded great effort from banks to comply with the new rules. In the same vein, while hedge accounting in banks takes on a prominent role in managing risks and results, precise and concise disclosure in the financial statements gives shareholders, investors, and other users important information about the performance and conduct of the business. This gives the market a better basis for assessing the risks involved and estimating future results for investment decision-making. Within this context, the quality and degree of disclosure in the financial statements of the main Brazilian and European banks were assessed against the requirements of IFRS 7, IFRS 9, and further requirements devised by the author. All of these requirements concern the disclosure of qualitative and quantitative information on hedge accounting and are therefore associated with risk and earnings management strategies. The degree of disclosure against IFRS 7 and IFRS 9 was assessed through an exploratory study analysing the IFRS notes of the ten largest banks in Brazil and in Europe by total assets. The results indicate that 59.6% of the institutions analysed comply with the requirements of IFRS 7. Another finding is that the compliance rate of Brazilian banks is higher than that of European banks: 68.3% vs. 50.8%. For IFRS 9 the percentage is only 23%, explained by the fact that the standard was not yet in force in either region and few institutions had voluntarily adopted it in advance. The quality of the notes on hedge accounting was assessed in a discretionary manner by examining the information provided to meet the requirements of IFRS 7 and 9 and the additional requirements added by the author. The results indicate that the notes need more detail on the hedging instruments used, as well as on the objectives of each hedge, to give users greater transparency about the risks protected on the respective balance sheets. The growth in the volume of information provided in the notes of the large Brazilian and European banks after the adoption of IFRS did not translate into a proportional increase in informational content; form still prevails over substance. This opens space for future discussion with market participants about the appropriate size and informational content of the notes, seeking a balance between the cost and the benefit of disclosure from the standpoint of relevance and materiality.
Abstract:
The performance-based results program of the municipality of Santos, presented as a program for civil-servant quality, seems to miss precisely the element most important to its success: the servants themselves. The lower-than-expected engagement and the anticipated difficulties in meeting the established targets both point to the absence of one element: motivation. Heroic in battles when it appears at crucial moments, it becomes a villain when absent. Understanding it a little better, and identifying ways to make it present, may be an alternative for invigorating Santos's PDR.
Abstract:
Although formal methods can dramatically increase the quality of software systems, they have not been widely adopted in the software industry. Many software companies perceive formal methods as not cost-effective because they involve many mathematical symbols that are difficult for non-experts to assimilate. The Java Modelling Language (JML), presented in Section 3.3, is an academic initiative towards the development of a common formal specification language for Java programs and the implementation of tools to check program correctness. This master thesis work shows how JML-based formal methods can be used to formally develop a privacy-sensitive Java application: a smart card application for managing medical appointments, named HealthCard. We follow the software development strategy introduced by João Pestana, presented in Section 3.4. Our work influenced the development of this strategy by providing hands-on insight into the challenges related to developing a privacy-sensitive application in Java. Pestana's strategy is based on a three-step evolution of software specifications, from informal ones, through semiformal ones, to JML formal specifications. We further show that this strategy can be automated by implementing a tool that generates JML formal specifications from a well-defined subset of informal software specifications. Hence, our work demonstrates that JML-based formal methods techniques are cost-effective and that they can be made popular in the software industry. Although formal methods are not popular in many software development companies, we endeavour to integrate formal methods into general software practice, and we hope our work can contribute to a better acceptance of mathematically based formalisms and tools used by software engineers. The structure of this document is as follows. Section 2 describes the preliminaries of this thesis work: it introduces the application for managing medical appointments that we have implemented and the technologies used in its development, and it further illustrates the Java Card Remote Method Invocation communication model used in the medical application for the client and server applications. Section 3 introduces software correctness, including design by contract and the concept of a contract in JML. Section 4 presents the design structure of the application. Section 5 shows the implementation of the HealthCard. Section 6 describes how the HealthCard is verified and validated using JML formal methods tools. Section 7 includes some metrics of the HealthCard implementation and specification. Section 8 presents a short example of how the client side of a smart card application can be implemented while respecting formal specifications. Section 9 describes a prototype tool that generates JML formal specifications from informal specifications automatically. Section 10 describes some challenges and main ideas that came across during the development of the HealthCard. The full formal specification and implementation of the HealthCard smart card application presented in this document can be reached at https://sourceforge.net/projects/healthcard/.
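To give a flavour of what such specifications look like, here is a minimal JML-annotated class in the spirit of the HealthCard's appointment management; the class, fields, and contracts are illustrative assumptions, not the thesis code.

```java
/** Minimal sketch of a JML-annotated class: the invariant, preconditions
 *  (requires) and postconditions (ensures) are machine-checkable contracts. */
public class AppointmentBook {

    private /*@ spec_public @*/ int booked;          // appointments currently booked
    private /*@ spec_public @*/ final int capacity;  // maximum number of slots

    //@ public invariant 0 <= booked && booked <= capacity;

    //@ requires max > 0;
    //@ ensures capacity == max && booked == 0;
    public AppointmentBook(int max) {
        this.capacity = max;
    }

    //@ requires booked < capacity;
    //@ ensures booked == \old(booked) + 1;
    public void book() {
        booked++;                                    // must preserve the invariant
    }
}
```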
Abstract:
Nowadays, the development of intelligent agents aims to be more refined, using improved architectures and reasoning mechanisms. Revising the beliefs of an agent is also an important subject, owing to the consistency that agents should maintain in their knowledge. In this work we propose deliberative and argumentative agents using Lego Mindstorms robots: Argumentative NXT BDI-like Agents. These agents are built using the notions of the BDI model and are capable of reasoning using the DeLP formalism. They update their knowledge base with their perceptions and revise it when necessary. Two variations are presented: the Single Argumentative NXT BDI-like Agent and the MAS Argumentative NXT BDI-like Agent.
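The control cycle described above (perceive, revise beliefs, deliberate, act) can be outlined in a few lines of Java. In this sketch the DeLP-based argumentation is reduced to a hard-coded contradiction check, so it illustrates only the shape of the loop, not the thesis implementation; all names are hypothetical.

```java
import java.util.HashSet;
import java.util.Set;

/** Minimal sketch of a BDI-style control loop with naive belief revision. */
public class BdiLoopSketch {

    private final Set<String> beliefs = new HashSet<>();

    // Revise the belief base with a new perception; a real agent would run
    // defeasible (DeLP) reasoning here to resolve contradictions.
    void revise(String perception) {
        if (perception.equals("obstacle-ahead")) {
            beliefs.remove("path-clear");            // drop the contradicted belief
        }
        beliefs.add(perception);
    }

    // Deliberate: choose an intention based on the current beliefs.
    String deliberate() {
        return beliefs.contains("obstacle-ahead") ? "turn" : "move-forward";
    }

    public static void main(String[] args) {
        BdiLoopSketch agent = new BdiLoopSketch();
        agent.beliefs.add("path-clear");             // initial belief
        for (String perception : new String[]{"path-clear", "obstacle-ahead"}) {
            agent.revise(perception);                // perceive and revise
            System.out.println("intention: " + agent.deliberate());
        }
    }
}
```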