900 results for Local computer network
Abstract:
The increasing demand for Internet data traffic in wireless broadband access networks requires both the development of efficient, novel wireless broadband access technologies and the allocation of new spectrum bands for that purpose. The introduction of a large number of small cells in cellular networks, combined with the complementary adoption of Wireless Local Area Network (WLAN) technologies in unlicensed spectrum, is one of the most promising concepts to meet this demand. One alternative is the aggregation of unlicensed Industrial, Scientific and Medical (ISM) spectrum to licensed bands, using wireless networks defined by the Institute of Electrical and Electronics Engineers (IEEE) and the Third Generation Partnership Project (3GPP). While IEEE 802.11 (Wi-Fi) networks are aggregated to Long Term Evolution (LTE) small cells via LTE/WLAN Aggregation (LWA), in proposals like Unlicensed LTE (LTE-U) and Licensed Assisted Access (LAA) the LTE air interface itself is used for transmission on the unlicensed band. Wi-Fi technology is widespread and operates in the same 5 GHz ISM bands as the LTE proposals, which may degrade performance due to the coexistence of both technologies in the same spectrum. In addition, Wi-Fi operation needs to be improved to support scenarios with a large number of neighboring Overlapping Basic Service Set (OBSS) networks and a large number of Wi-Fi nodes (i.e., dense deployments). It has long been known that overall Wi-Fi performance falls sharply as the number of Wi-Fi nodes sharing the channel grows, so mechanisms to increase its spectral efficiency are needed. This work is dedicated to the study of coexistence between different wireless broadband access systems operating in the same unlicensed spectrum bands, and to solving the coexistence problems via distributed coordination mechanisms. The problem of coexistence between different networks (i.e., LTE and Wi-Fi) and the problem of coexistence between networks of the same technology (i.e., multiple Wi-Fi OBSSs) are analyzed both qualitatively and quantitatively via system-level simulations, and the main issues to be faced are identified from these results. Distributed coordination mechanisms are then proposed and evaluated via system-level simulations, both for the inter-technology and the intra-technology coexistence problem. Results indicate that the proposed solutions provide significant gains when compared to the situation without distributed coordination.
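To illustrate the degradation that motivates this work, the toy slotted-contention model below (an assumption for illustration, not the thesis's system-level simulator) computes the probability that a slot carries a successful transmission when n stations each transmit with a fixed probability p; aggregate efficiency collapses as n grows.

```python
# Toy slotted-contention model: each of n stations transmits in a slot with
# probability p; a slot carries data only when exactly one station transmits.
# This is a simplification (no backoff adaptation, no capture effect), meant
# only to illustrate why aggregate efficiency collapses in dense deployments.

def slot_efficiency(n: int, p: float) -> float:
    """Probability that a slot is used successfully with n contending stations."""
    return n * p * (1.0 - p) ** (n - 1)

if __name__ == "__main__":
    p = 0.1  # fixed per-station transmission probability (assumed, not tuned)
    for n in (1, 2, 5, 10, 20, 50, 100):
        print(f"{n:4d} stations -> channel efficiency {slot_efficiency(n, p):.3f}")
```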
Abstract:
PURPOSE: Radiation therapy is used to treat cancer using carefully designed plans that maximize the radiation dose delivered to the target and minimize damage to healthy tissue, with the dose administered over multiple occasions. Creating treatment plans is a laborious process and presents an obstacle to more frequent replanning, which remains an unsolved problem. In between new plans being created, however, the patient's anatomy can change due to multiple factors, including reduction in tumor size and loss of weight, which results in poorer patient outcomes. Cloud computing is a newer technology that is slowly being adopted for medical applications with promising results. The objective of this work was to design and build a system that could analyze a database of previously created treatment plans, which are stored as studies together with their associated anatomical information, to find the one with the anatomy most similar to that of a new patient. The analyses would be performed in parallel on the cloud to decrease the time needed to find this plan. METHODS: The system used SlicerRT, a radiation therapy toolkit for the open-source platform 3D Slicer, for its tools to perform the similarity analysis algorithm. Amazon Web Services was used for the cloud instances on which the analyses were performed, as well as for storage of the radiation therapy studies and messaging between the instances and a local master computer. A module was built in SlicerRT to provide the user with an interface to direct the system on the cloud, as well as to perform other related tasks. RESULTS: The cloud-based system outperformed previous methods of conducting the similarity analyses in terms of time, analyzing 100 studies in approximately 13 minutes, and produced the same similarity values as those methods. It also scaled to larger numbers of studies in the database with a small increase in computation time of just over 2 minutes. CONCLUSION: This system successfully analyzes a large database of radiation therapy studies and finds the one that is most similar to a new patient, which represents a potential step forward in achieving feasible adaptive radiation therapy replanning.
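As a rough illustration of the master/worker pattern described above (the real system dispatches SlicerRT analyses to Amazon Web Services instances via messaging; the function compare_to_new_patient below is a hypothetical placeholder), a minimal local sketch might look like this:

```python
# Sketch of the fan-out/fan-in structure used for the similarity analysis,
# shown here with local processes instead of cloud instances.
from concurrent.futures import ProcessPoolExecutor

def compare_to_new_patient(study_id: str) -> float:
    # Placeholder: in the real system SlicerRT computes an anatomy-similarity
    # score between the stored study and the new patient's anatomy.
    return sum(study_id.encode()) % 1000 / 1000.0

def find_most_similar(study_ids: list[str], workers: int = 8) -> tuple[str, float]:
    with ProcessPoolExecutor(max_workers=workers) as pool:
        scores = dict(zip(study_ids, pool.map(compare_to_new_patient, study_ids)))
    best = max(scores, key=scores.get)  # highest similarity wins
    return best, scores[best]

if __name__ == "__main__":
    studies = [f"study_{i:03d}" for i in range(100)]
    print(find_most_similar(studies))
```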
Abstract:
This keynote presentation will report some of our research work and experience on the development and application of relevant methods, models, systems and simulation techniques in support of different types and various levels of decision making for business, management and engineering. In particular, the following topics will be covered: Modelling, multi-agent-based simulation and analysis of the allocation management of carbon dioxide emission permits in China (Nanfeng Liu & Shuliang Li); Agent-based simulation of the dynamic evolution of enterprise carbon assets (Yin Zeng & Shuliang Li); A framework & system for extracting and representing project knowledge contexts using topic models and dynamic knowledge maps: a big data perspective (Jin Xu, Zheng Li, Shuliang Li & Yanyan Zhang); Open innovation: intelligent model, social media & complex adaptive system simulation (Shuliang Li & Jim Zheng Li); A framework, model and software prototype for modelling and simulation of deshopping behaviour and how companies respond (Shawkat Rahman & Shuliang Li); Integrating multiple agents, simulation, knowledge bases and fuzzy logic for international marketing decision making (Shuliang Li & Jim Zheng Li); A Web-based hybrid intelligent system for combined conventional, digital, mobile, social media and mobile marketing strategy formulation (Shuliang Li & Jim Zheng Li); A hybrid intelligent model for Web & social media dynamics, and evolutionary and adaptive branding (Shuliang Li); A hybrid paradigm for modelling, simulation and analysis of brand virality in social media (Shuliang Li & Jim Zheng Li); and Network configuration management: attack paradigms and architectures for computer network survivability (Tero Karvinen & Shuliang Li).
Abstract:
Our idea is developed on the Internet, where people are searching for goods and services to meet their needs and solve their problems. We aim to identify which problems exist for specific market niches that can be served through web pages. For each market niche there may be problems that have not been served, or that are neglected by other web pages within each niche we want to serve. Based on our research into different niches, we will look for opportunities in markets where solutions have not yet been delivered over the Internet (there is no content on the subject, the existing content is poor, or we find an unattended micro-niche). Our strategy for success is as follows: 1. Find market niches on the Internet whose needs have not been met through web pages and where the existing content is not very good or can be improved. 2. Create a web page with the key content to meet those needs. 3. Find related products or services and give a purchase recommendation (through product comparisons, analysis of their benefits, and testimonials).
Abstract:
Stand-alone and networked virtual-reality surgical simulators have been proposed as a means to train surgical skills with or without a supervisor near the student or trainee. However, the teaching of surgical skills in medical schools and hospitals is changing, requiring the development of new tools that focus on: (i) the importance of the mentor's role, (ii) teamwork skills and (iii) remote training support. For these reasons, a surgical simulator should not only allow training involving a student and an instructor who are located remotely from each other, but also the collaborative training of users adopting different medical roles during the training session. Collaborative Networked Virtual Surgical Simulators (CNVSS) allow collaborative training of surgical procedures in which remotely located users with different surgical roles can take part in the training session. To provide successful training with good collaborative performance, a CNVSS should handle heterogeneity factors such as users' machine capabilities and network conditions, among others. Several systems for collaborative training of surgical procedures have been developed as research projects, but to the best of our knowledge none has focused on handling heterogeneity in CNVSS. Handling heterogeneity in this type of collaborative session is important because remotely located users do not all have homogeneous Internet connections, the same interaction devices and displays, or the same computational resources, among other factors. Additionally, if heterogeneity is not handled properly, it has an adverse impact on the performance of each user during the collaborative session. This document proposes the development of a context-aware architecture for collaborative networked virtual surgical simulators in order to handle the heterogeneity involved in the collaboration session. To achieve this, the following main contributions are made in this thesis: (i) which infrastructure heterogeneity factors affect the collaboration of two users performing a virtual surgical procedure, and how, was determined and analyzed through a set of experiments involving collaborating users; (ii) a context-aware software architecture for a CNVSS was proposed and implemented; the architecture handles the heterogeneity factors affecting collaboration by applying various adaptation mechanisms; and (iii) a mechanism for handling the heterogeneity factors involved in a CNVSS is described, implemented and validated in a set of testing scenarios.
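A minimal sketch of the kind of context-aware adaptation rule such an architecture could apply is shown below; the heterogeneity factors, thresholds and actions are illustrative assumptions, not those of the implemented architecture.

```python
# Illustrative context-aware adaptation rule: map the observed context of one
# participant (network and machine heterogeneity factors) to adaptation actions.
from dataclasses import dataclass

@dataclass
class SessionContext:
    rtt_ms: float          # measured network round-trip time
    bandwidth_mbps: float  # estimated available bandwidth
    gpu_score: float       # normalised rendering capability of the user's machine (0..1)

def choose_adaptations(ctx: SessionContext) -> dict:
    """Select adaptation actions for a participant in the collaborative session."""
    actions = {"state_update_hz": 30, "mesh_detail": "high", "haptic_rate_hz": 1000}
    if ctx.rtt_ms > 150:          # high latency: send state less often, rely on prediction
        actions["state_update_hz"] = 10
    if ctx.bandwidth_mbps < 2.0:  # constrained link: simplify the shared geometry
        actions["mesh_detail"] = "low"
    if ctx.gpu_score < 0.4:       # weak client machine: lower local rendering/haptic load
        actions["mesh_detail"] = "low"
        actions["haptic_rate_hz"] = 500
    return actions

print(choose_adaptations(SessionContext(rtt_ms=180, bandwidth_mbps=1.5, gpu_score=0.8)))
```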
Abstract:
Viticulture draws on a diverse set of characteristics of the soil, the territory and the community that form part of the cultural heritage of a given region. When tradition is expressed in a concept such as terroir, shaped by the territorial, social and cultural characteristics of a rural region, the wine carries a "signature" that is written "naturally" into a regionally identified taste. The wines of the Nemea region in Greece and of Basto (in the Vinhos Verdes region) in Portugal are both protected by Designation of Origin regulations. Yet, although both are governed by institutional systems of certification and quality control, it must be asked whether the cultural heritage and the specific territorial identity "imprinted" in both terroirs can be protected in a broader sense than origin and quality alone. In Nemea, the discussion among producers concerns the establishment of sub-zones, that is, including in the PDO regulation a different territorial categorisation based on terroir. Besides the PDO designation on the label, bottles would also carry certified information on the specific area (within the same terroir) where the wine was produced. If adopted, this would result in different quality statuses according to the Nemea villages where the vineyards are located, with possible impacts on property values and land use. In addition, the non-participation of the Nemea Cooperative in SON (the local association of wine producers), and therefore in the main discussion on the changes and challenges facing the Nemea terroir, is a problem for the Nemea wine sector. First, it creates a relationship of non-communication between the two most important agents in the sector, the wine companies and the Cooperative. Second, it creates a real possibility not only that grape growers are left out of that discussion but also (since they are not represented by the Cooperative) that no consensus on the proposed changes can be reached. This may create a 'climate' of distrust, shifting the discussion to delocalised 'arenas' and thus to 'deterritorialised' decisions. In Basto, several producers have started selling their production to distributors located outside the Basto sub-region, but within the Vinhos Verdes region, since those companies enjoy better national and international standing and a stronger export network. This is also linked to competition for better networks of contacts and stronger status, which makes discussions on common strategies for the rural and regional development of Basto harder to bring about (on this point the word "impossible" was used repeatedly in the interviews with wine producers). The predominant relationship between producers is individualistic. It was also observed that these positions are further marked by distrust within the local inter-professional network: conflicts over the same potential clients; buying grapes from growers with the best quality/price ratio; and individual strategies to achieve better political standing with the Vinho Verde Commission.
Moreover, the absence of active institutional intermediation (municipal authorities and the Vinho Verde Commission), and the absence among Basto producers of an association or even a local cooperative, has left the Basto region under-promoted in Vinho Verde promotion strategies compared with other sub-regions. The results also show that changes in the wine sector of the Basto region have been driven from outside the region (also in response to the needs of international markets) and rarely from within: once again, non-localised 'arenas' and thus deterritorialised decisions. In this sense, such discussion and strategic planning will play a vital role in preserving the localised identity of the terroir against the risks of de-characterisation and deterritorialisation. In short, for both cases one of the greatest challenges appears to be how to preserve the wine terroir, and hence its local character and identity, when the inter-professional network in both regions is characterised either by non-consensual relations, in Nemea, or by an isolated, non-communicative modus operandi, in Basto. There is therefore a need for engagement between the various agents and the local authorities towards a localised governance network. In both regions, the existence of such a network is essential to prevent negative effects on the identity of the product and its production. An integrated planning strategy for the sector will be vital to preserve that identity, preventing its deterritorialisation through a restructuring of traditional knowledge together with the democratisation of access to knowledge of modern winemaking techniques.
Abstract:
SQL Injection Attack (SQLIA) remains a technique used by computer network intruders to pilfer an organisation's confidential data. An intruder re-crafts web form inputs and query strings used in web requests with malicious intent to compromise the security of the confidential data stored in the back-end database. The database is the most valuable data source, and intruders are therefore unrelenting in evolving new techniques to bypass the signature-based solutions currently provided in Web Application Firewalls (WAF) to mitigate SQLIA. There is consequently a need for an automated, scalable methodology for pre-processing SQLIA features fit for a supervised learning model. However, obtaining a ready-made, scalable dataset whose items are feature-engineered into numerical attributes to train Artificial Neural Network (ANN) and Machine Learning (ML) models is a known obstacle to applying artificial intelligence to effectively address ever-evolving novel SQLIA signatures. The proposed approach applies a numerical-attribute encoding ontology to encode features (both legitimate web requests and SQLIA) as numerical data items, so as to extract a scalable dataset for input to a supervised learning model, moving towards an ML SQLIA detection and prevention model. For the numerical encoding of features, the proposed model explores a hybrid of static and dynamic pattern matching by implementing a Non-Deterministic Finite Automaton (NFA), combined with a proxy and a SQL parser Application Programming Interface (API) to intercept and parse web requests in transit to the back-end database. In developing a solution to address SQLIA, this model allows web requests processed at the proxy and deemed to contain an injected query string to be prevented from reaching the target back-end database. This paper evaluates the performance metrics of a dataset obtained by the numerical encoding of features ontology in Microsoft Azure Machine Learning (MAML) Studio using a Two-Class Support Vector Machine (TCSVM) binary classifier. This methodology then forms the subject of the empirical evaluation.
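A minimal sketch of the numerical-encoding idea is shown below; the token set, numeric codes and fixed vector width are illustrative assumptions, not the ontology defined in the paper.

```python
# Encode a raw query string as a fixed-width numerical feature vector that a
# supervised learner (e.g. a two-class SVM) could consume.
import re

TOKEN_CODES = {          # hypothetical numeric codes for injection-style tokens
    "'": 1, "--": 2, ";": 3, "union": 4, "select": 5,
    "or": 6, "=": 7, "drop": 8, "insert": 9, "exec": 10,
}

def encode_request(query_string: str, width: int = 12) -> list[int]:
    """Turn a raw query string into a fixed-width numeric feature vector."""
    tokens = re.findall(r"--|'|;|=|\w+", query_string.lower())
    codes = [TOKEN_CODES.get(tok, 0) for tok in tokens]   # 0 = benign/unknown token
    return (codes + [0] * width)[:width]                  # pad/truncate to fixed width

legit = encode_request("id=42&name=alice")
attack = encode_request("id=42' OR '1'='1' -- ")
print(legit)   # mostly zeros
print(attack)  # non-zero codes flag injection-style tokens
```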
Abstract:
This text offers some theoretical reflections and the author's views on the relationship between people today and the phenomenon of digital communication, also known as virtual communication, over the Internet. These reflections arise from research on the convergent model of the public television channel Telemedellín, and explore the landscape of people's relationships with others, with themselves and with things, through the mediation of the computer. The text also examines the composition of social networks and their product, the virtual community, the place where human exchanges and interrelations occur. Finally, it attempts to show the unease of people in the current era, portraying them as solitary beings struggling for a place in the world.
Abstract:
Dissertation presented to the Escola Superior de Educação de Paula Frassinetti to obtain the degree of Master in Community Intervention, specialisation in Risk Contexts.
Abstract:
In this research work, a new routing protocol for opportunistic networks is presented. The proposed protocol is called PSONET (PSO for Opportunistic Networks), since it uses a hybrid system built around a Particle Swarm Optimization (PSO) algorithm. The main motivation for using PSO is to take advantage of its individual-based search and learning adaptation. PSONET uses the Particle Swarm Optimization technique to drive the network traffic through a good subset of message forwarders. PSONET analyzes the network communication conditions, detecting whether each node has sparse or dense connections, and thus makes better decisions about routing messages. The PSONET protocol is compared with the Epidemic and PROPHET protocols in three different mobility scenarios: a mobility model based on activities, which simulates people's everyday life in their work, leisure and rest activities; a mobility model based on a community of people, which simulates a group of people in their communities who eventually contact other people, who may or may not be part of their community, to exchange information; and a random mobility pattern, which simulates a scenario divided into communities where people choose a destination at random and, based on the restriction map, move to this destination using the shortest path. The simulation results, obtained through the ONE simulator, show that in the scenarios with the community-based mobility model and with the random mobility model, the PSONET protocol achieves a higher message delivery rate and lower message replication compared with the Epidemic and PROPHET protocols.
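A minimal sketch of the particle swarm optimisation loop that PSONET builds on is shown below; the one-dimensional fitness (a toy trade-off between delivery probability and replication cost for a forwarding probability) is an illustrative assumption, not the objective actually used by the protocol.

```python
# Minimal PSO loop: particles search the interval [0, 1] for a forwarding
# probability that balances delivery chance against replication cost.
import random

def fitness(p: float, neighbours: int = 8) -> float:
    delivery = 1.0 - (1.0 - p) ** neighbours   # chance at least one copy gets through
    cost = 0.5 * p * neighbours                # replication penalty grows with copies sent
    return -(delivery - cost)                  # PSO below minimises, so negate the utility

def pso(num_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5):
    xs = [random.random() for _ in range(num_particles)]   # particle positions in [0, 1]
    vs = [0.0] * num_particles                             # particle velocities
    pbest = xs[:]                                          # personal bests
    gbest = min(pbest, key=fitness)                        # global best
    for _ in range(iters):
        for i in range(num_particles):
            r1, r2 = random.random(), random.random()
            vs[i] = w * vs[i] + c1 * r1 * (pbest[i] - xs[i]) + c2 * r2 * (gbest - xs[i])
            xs[i] = min(1.0, max(0.0, xs[i] + vs[i]))      # keep probability in range
            if fitness(xs[i]) < fitness(pbest[i]):
                pbest[i] = xs[i]
        gbest = min(pbest, key=fitness)
    return gbest

print("suggested forwarding probability:", round(pso(), 3))
```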
Abstract:
This final-year degree project describes the deployment of a monitoring system for computer networks. The main monitoring concepts are defined, and the choice of the tool finally selected to carry out the work is justified. We detail the installation, configuration and deployment into production. Finally, we show how the installed system operates on the computer network of the Empresa Pública de Turismo Andaluz, S.A., covering its system-monitoring needs from the headquarters in Málaga to the remaining Andalusian provinces, where it has several branch offices.
Abstract:
It has been less than thirty years since a group of graduate students and computer scientists working on a federal contract performed the first successful connection between two computers located at remote sites. This group, known as the Network Working Group (NWG), was made up of highly creative minds who, as soon as they began meeting, started talking about things like intellectual graphics, cooperating processes, automation questions, email, and many other interesting possibilities. In 1968 the group's task was to design the NWG's first computer network; in October 1969 the first data exchange occurred, and by the end of that year a network of four computers was in operation. Since the invention of the telephone in 1876, no other technology has revolutionized the field of communications as much as the computer network. Many people have made great contributions to the creation and development of the Internet; the computer network, far more complex than the telephone, is the result of people of many nationalities and cultures. Some years later, in 1973, the computer scientists Robert Kahn and Vinton Cerf created a more sophisticated communication program called the Transmission Control Protocol / Internet Protocol (TCP/IP), which is still in use on the Internet today.
Abstract:
This dissertation develops an interactive control platform for intelligent buildings using a SCADA (Supervisory Control And Data Acquisition) system. The SCADA system integrates different types of information coming from the several technologies present in modern buildings (control of ventilation, temperature, illumination, etc.). The developed control strategy implements a hierarchical cascade controller in which the inner loops are performed by local PLCs (Programmable Logic Controllers) and the outer loop is managed by the centralized SCADA system, which interacts with the entire local PLC network. In this dissertation a predictive controller is implemented at the centralized SCADA platform. Tests applied to the control of temperature and luminosity in large rooms are presented. The developed predictive controller tries to optimize the satisfaction of explicit user preferences, entered through several distributed user interfaces, subject to constraints that minimize energy waste. In order to run the predictive controller on the SCADA platform, a communication channel was developed to allow communication between the SCADA application and the MATLAB application where the predictive controller runs.
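A minimal sketch of the receding-horizon idea behind such a predictive controller is shown below; the first-order thermal model, weights and grid search are illustrative assumptions, whereas the dissertation's controller runs in MATLAB against the building's SCADA/PLC infrastructure.

```python
# Receding-horizon sketch: pick the heating level that minimises predicted
# discomfort plus energy use over a short horizon, then apply only the first move.
def predict_temperature(t_room, u_heat, t_out, steps, a=0.9, b=1.5, c=0.1):
    """Simple discrete thermal model: each step blends room, heater and outdoor effects."""
    traj = []
    for _ in range(steps):
        t_room = a * t_room + b * u_heat + c * t_out   # a + c = 1: no heating -> drift to t_out
        traj.append(t_room)
    return traj

def choose_heating(t_room, t_out, t_pref, horizon=6, w_energy=0.2):
    """Pick the heating level (0..1) minimising predicted discomfort plus energy use."""
    best_u, best_cost = 0.0, float("inf")
    for u in [i / 10 for i in range(11)]:              # coarse grid over control inputs
        traj = predict_temperature(t_room, u, t_out, horizon)
        discomfort = sum((t - t_pref) ** 2 for t in traj)
        cost = discomfort + w_energy * u * horizon
        if cost < best_cost:
            best_u, best_cost = u, cost
    return best_u   # only the first move is applied; the optimisation repeats next cycle

print("heating command:", choose_heating(t_room=18.0, t_out=8.0, t_pref=22.0))
```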
Abstract:
Advances in technology allow communication to take place over ever greater distances with ever greater ease. Computers talk to computers; people talk to computers and computers talk to people. This rapid change has pushed many of today's information media to their technological limits. New design ideas and revolutionary technological concepts are emerging everywhere, and with them the evident need for better communication systems that provide reliability, security and fast performance. Our objective is to provide an understanding of these systems.