804 results for Broadband Network Technology
Abstract:
Neural Networks as Cybernetic Systems is a textbook that combines classical systems theory with artificial neural network technology. This third edition essentially corresponds to the second, but has been improved by correcting errors and by rearranging and slightly expanding the sections on recurrent networks. These changes should make it easier to grasp the essential aspects of this important domain, which has received growing attention in recent years.
Abstract:
For a firm's choice between vertical integration and arm's-length outsourcing, the previous literature has emphasized the importance of the thickness of the market. Here we take into account communication networks such as telephone, telex, fax, and the Internet, which allows us to illustrate the relationship between communication networks and the make-or-buy decision. When communication network technology differs across firm types, vertically integrated firms and arm's-length outsourcing firms coexist, a result not indicated in the previous literature. However, when a common network technology is introduced, such coexistence generically does not occur.
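To make the trade-off concrete, the hypothetical Python sketch below compares the cost of making an input in-house with the cost of buying it at arm's length, where the buy-side search cost falls as the market gets thicker and as communication over the network gets cheaper. The functions and numbers are illustrative assumptions, not the paper's model.

```python
# Hypothetical illustration (not the paper's model): a firm compares the cost of
# vertical integration with the cost of arm's-length outsourcing, where the
# outsourcing cost depends on how cheap communication over the network is.

def integration_cost(internal_production_cost: float, governance_cost: float) -> float:
    """Total cost of making the input in-house."""
    return internal_production_cost + governance_cost

def outsourcing_cost(market_price: float, communication_cost: float,
                     market_thickness: float) -> float:
    """Total cost of buying the input at arm's length.

    A thicker market (more potential suppliers) lowers the effective search
    cost; cheaper network technology lowers the communication cost.
    """
    search_cost = communication_cost / market_thickness
    return market_price + search_cost

def make_or_buy(internal_production_cost, governance_cost,
                market_price, communication_cost, market_thickness) -> str:
    make = integration_cost(internal_production_cost, governance_cost)
    buy = outsourcing_cost(market_price, communication_cost, market_thickness)
    return "make (vertical integration)" if make <= buy else "buy (outsourcing)"

# Two firms that differ only in their (firm-specific) network technology can end
# up on opposite sides of the margin, so both organizational forms coexist.
print(make_or_buy(10.0, 1.0, 9.5, 4.0, 2.0))   # expensive communication -> make
print(make_or_buy(10.0, 1.0, 9.5, 1.0, 2.0))   # cheap communication -> buy
```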
Abstract:
The limitations of current networking technologies identified by the Defense Advanced Research Projects Agency (DARPA) during 1995 led to the recent proposal of a new network model called Active Networks. In this model, the nodes provide an execution environment in which the code used to process each packet is executed. The objective is a network technology that allows new network services to be designed and deployed quickly without modifying the network nodes. One network service that could benefit from this technology is the transmission of multicast data with different degrees of loss tolerance. Current proposals for reliable multicast services provide a specific solution for each application class, and existing end-to-end protocols suffer from technical drawbacks related to limited reliability and the lack of an effective congestion control mechanism. This thesis contains original proposals that aim to solve part of these drawbacks in the scope of Active Networks and reliable multicast with congestion control. First, a generic reliable multicast service will be specified; based on the requirements of a set of applications considered relevant, it will provide different session classes and degrees of reliability. Starting from this generic service definition, a communications protocol will be designed on top of Active Network technology to provide the specified service. The protocol will incorporate a congestion control mechanism so that the source dynamically adjusts the injected traffic to the load conditions of the network. The thesis also aims to deepen the study and analysis of Active Network technology by experimenting with it in order to provide feedback to its designers. This experimentation covers three areas: the services and protocols the technology can support, the Active Network model and architecture, and the currently available execution environments. As an additional contribution of this work, the above objectives will be validated through a prototype implementation of the protocol entities and their service interface on one of the available execution environments.
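The abstract does not spell out the congestion control rule, so the sketch below only illustrates the general idea of a source that adjusts its injected traffic from feedback aggregated along the multicast tree, here with a standard additive-increase/multiplicative-decrease (AIMD) rule. The class, rates and feedback model are assumptions for illustration.

```python
# A minimal sketch (not the protocol specified in the thesis) of source-side
# congestion control for reliable multicast: the sender adjusts its injection
# rate from feedback aggregated along the multicast tree. The AIMD rule and all
# parameters below are assumptions for illustration only.

class MulticastSource:
    def __init__(self, min_rate_kbps=64.0, max_rate_kbps=10_000.0):
        self.rate_kbps = min_rate_kbps
        self.min_rate_kbps = min_rate_kbps
        self.max_rate_kbps = max_rate_kbps

    def on_feedback(self, congestion_reported: bool) -> float:
        """Update the injected traffic rate from aggregated receiver feedback."""
        if congestion_reported:
            # Multiplicative decrease when any branch of the tree is congested.
            self.rate_kbps = max(self.min_rate_kbps, self.rate_kbps / 2.0)
        else:
            # Additive increase while the tree reports spare capacity.
            self.rate_kbps = min(self.max_rate_kbps, self.rate_kbps + 32.0)
        return self.rate_kbps

source = MulticastSource()
for congested in [False, False, False, True, False]:
    print(f"rate = {source.on_feedback(congested):.0f} kbps")
```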
Abstract:
Journalism is one of the main suppliers of topics for public discussion and for the formation of public opinion, but it depends on a technical system to be transmitted. For more than a hundred years, the information produced by the press was emitted, stored, transmitted and received by the so-called mass media, which use a centralized network characterized by material scarcity, serial production and massification. This system separates senders and receivers in time and space, creating an unequal power relation in which large companies controlled the information flow, defining which facts would be reported as news. In 1995 the Internet, over which information circulates on distributed network technology, was appropriated by society, changing the way information is produced, stored and transmitted. This technology raised the hope that it could provide more dialogical and democratic communication. Little by little, however, new companies can be seen appropriating the distributed network technology over which the Internet runs, generating a new form of control over the information flow. This research carried out a bibliographic survey to establish a critical reflection on the different intermediaries between fact and news, both in the centralized network and in the distributed network, aiming to spark a discussion that may offer new ideas for policy, as well as alternatives for more democratic and more libertarian communication.
Abstract:
Recently, wireless network technology has grown at such a pace that scientific research has become practical reality in a very short time span. Mobile wireless communications have gone through several generations, each complementing and improving the former. One mobile system that features high data rates and an open network architecture is 4G. Currently, the research community and industry in the field of wireless networks are working on possible solutions for the 4G system. 4G is a collection of technologies and standards that will allow a range of ubiquitous computing and wireless communication architectures. The researcher considers the ability to guarantee reliable communications from 100 Mbps, for high-mobility links, up to 1 Gbps for low-mobility users, together with high spectral efficiency, to be among the most important characteristics of future 4G mobile systems. In mobile wireless communication networks, one important factor is the coverage of large geographical areas. In 4G systems, a hybrid satellite/terrestrial network is crucial to providing users with coverage wherever needed. Subscribers thus require a reliable satellite link to access their services when they are in remote locations where a terrestrial infrastructure is unavailable, and must rely on satellite coverage. A good modulation and access technique is also required in order to transmit high data rates over satellite links to mobile users; it must adapt to the characteristics of the satellite channel and use the allocated bandwidth efficiently. Satellite links are fading channels when used by mobile users. Measures designed to deal with these fading environments make use of: (1) spatial diversity (a two-receive-antenna configuration); (2) time diversity (channel interleaving/spreading techniques); and (3) upper-layer FEC. The author proposes the use of OFDM (Orthogonal Frequency-Division Multiplexing) on the satellite link, increasing the time diversity. This technique will allow the data rate to be increased, as primarily required by multimedia applications, and will also make optimal use of the available bandwidth. In addition, this dissertation addresses the use of Cooperative Satellite Communications for hybrid satellite/terrestrial networks. With this technique, satellite coverage can be extended to areas where there is no direct link to the satellite. For this purpose, a good channel model is necessary.
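As a rough illustration of the proposed modulation, the sketch below builds and demodulates a single OFDM symbol with NumPy (IFFT plus cyclic prefix). The subcarrier count, cyclic-prefix length and QPSK mapping are assumptions for illustration, not the dissertation's actual air-interface parameters.

```python
# A minimal OFDM transmit/receive sketch (NumPy) illustrating the technique the
# dissertation proposes for the satellite link. Subcarrier count, cyclic-prefix
# length and QPSK mapping are illustrative assumptions, not the author's design.
import numpy as np

rng = np.random.default_rng(0)
n_subcarriers, cp_len = 64, 16

# Random QPSK symbols, one per subcarrier.
bits = rng.integers(0, 2, size=(n_subcarriers, 2))
qpsk = (2 * bits[:, 0] - 1 + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

# OFDM modulation: IFFT across subcarriers, then prepend a cyclic prefix so the
# symbol tolerates multipath delay spread on the fading satellite channel.
time_symbol = np.fft.ifft(qpsk) * np.sqrt(n_subcarriers)
tx = np.concatenate([time_symbol[-cp_len:], time_symbol])

# Receiver: strip the cyclic prefix and return to the frequency domain.
rx = tx[cp_len:]
recovered = np.fft.fft(rx) / np.sqrt(n_subcarriers)

print(np.allclose(recovered, qpsk))  # True over an ideal (noiseless) channel
```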
Abstract:
Recently, wireless network technology has grown at such a pace that scientific research has become practical reality in a very short time span. One mobile system that features high data rates and an open network architecture is 4G. Currently, the research community and industry in the field of wireless networks are working on possible solutions for the 4G system. The researcher considers the ability to guarantee reliable communications at high data rates, together with high spectral efficiency, to be among the most important characteristics of future 4G mobile systems. In mobile wireless communication networks, one important factor is the coverage of large geographical areas. In 4G systems, a hybrid satellite/terrestrial network is crucial to providing users with coverage wherever needed. Subscribers thus require a reliable satellite link to access their services when they are in remote locations where a terrestrial infrastructure is unavailable. The results show that a good modulation and access technique is also required in order to transmit high data rates over satellite links to mobile users. The dissertation proposes the use of OFDM (Orthogonal Frequency-Division Multiplexing) on the satellite link, increasing the time diversity. This technique will allow the data rate to be increased, as primarily required by multimedia applications, and will also make optimal use of the available bandwidth. In addition, this dissertation addresses the use of Cooperative Satellite Communications for hybrid satellite/terrestrial networks. With this technique, satellite coverage can be extended to areas where there is no direct link to the satellite. The issue of Cooperative Satellite Communications is addressed through a new algorithm that forwards the received data from the fixed node to the mobile node. This algorithm is very efficient because it avoids unnecessary transmissions and is based on signal-to-noise ratio (SNR) measurements.
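The forwarding algorithm itself is not detailed in the abstract; the sketch below only illustrates the stated idea of an SNR-based decision that avoids unnecessary transmissions. The threshold value and function names are hypothetical.

```python
# A hedged sketch of the SNR-driven forwarding decision described in the
# abstract: a fixed node relays satellite data to a mobile node only when the
# mobile cannot decode the direct satellite link itself. Threshold values and
# function names are illustrative assumptions, not the dissertation's algorithm.

DECODE_THRESHOLD_DB = 6.0   # assumed minimum SNR for successful decoding

def should_forward(mobile_direct_snr_db: float, fixed_node_snr_db: float) -> bool:
    """Forward only when the relay adds value, avoiding unnecessary transmissions."""
    mobile_needs_help = mobile_direct_snr_db < DECODE_THRESHOLD_DB
    relay_can_decode = fixed_node_snr_db >= DECODE_THRESHOLD_DB
    return mobile_needs_help and relay_can_decode

# Mobile user in a shadowed area (2 dB) with a fixed node in clear view (12 dB):
print(should_forward(2.0, 12.0))   # True  -> relay forwards
print(should_forward(9.0, 12.0))   # False -> direct link suffices, no relay traffic
```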
Abstract:
The TOMO-ETNA experiment was devised to image the crust underlying the volcanic edifice and, possibly, its plumbing system using passive and active refraction/reflection seismic methods. The experiment included both on-land and offshore activities, with the main objective of obtaining a new high-resolution seismic tomography to improve knowledge of the crustal structures beneath Etna volcano and northeast Sicily up to the Aeolian Islands. The TOMO-ETNA experiment was divided into two phases. The first phase started on June 15, 2014 and ended on July 24, 2014 with the withdrawal of two removable seismic networks (a short-period network and a broadband network, composed of 80 and 20 stations respectively) deployed at Etna volcano and surrounding areas. During this first phase, the oceanographic research vessel “Sarmiento de Gamboa” and the hydro-oceanographic vessel “Galatea” performed the offshore activities, which included the deployment of ocean-bottom seismometers (OBS), air-gun shooting for wide-angle seismic (WAS) refraction, multi-channel seismic (MCS) reflection surveys, magnetic surveys and ROV (Remotely Operated Vehicle) dives. This phase ended with the recovery of the short-period seismic network. In the second phase, the broadband seismic network remained operative until October 28, 2014, and the R/V “Aegaeo” performed additional MCS surveys during November 19-27, 2014. Overall, the information deriving from the TOMO-ETNA experiment could answer many of the questions that have arisen while exploiting the large amount of data provided by the cutting-edge monitoring systems of Etna volcano and the seismogenic area of eastern Sicily.
Abstract:
The petrochemical industry aims to obtain, from crude oil, products with higher commercial value and greater industrial utility for energy purposes. These industrial processes are complex, commonly operating with large production volumes under restricted operating conditions. Controlling operation under optimized and stable conditions is important to maintain the quality of the products obtained and the safety of the industrial plant. Currently, industrial networks have gained prominence where process control needs to be performed in a distributed way. The Foundation Fieldbus protocol for industrial networks, with its interoperability and a user interface organized into simple configuration blocks, is widely recognized within the industrial automation community. The present work brings the benefits of industrial network technology to bear on the inherent complexity of petrochemical industrial processes. To this end, a dynamic reconfiguration system for intelligent strategies (artificial neural networks, for example), based on the protocol's user application layer, is proposed; it allows different applications to be used in a particular process without operator intervention and with the guarantees necessary for proper plant operation.
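The following sketch illustrates the dynamic-reconfiguration idea in generic Python: a small neural "strategy" block is replaced at runtime while the control loop keeps executing. It does not use any real Foundation Fieldbus API; all classes and parameters are hypothetical.

```python
# A minimal sketch of the dynamic-reconfiguration idea: the control loop keeps
# running while the "intelligent strategy" block (here a tiny neural network) is
# swapped at runtime. This is a generic illustration only; it does not use any
# real Foundation Fieldbus API, and all names below are hypothetical.
import numpy as np

class NeuralStrategy:
    """A single-layer neural block mapping a process measurement to an output."""
    def __init__(self, weights, bias):
        self.weights = np.asarray(weights, dtype=float)
        self.bias = float(bias)

    def compute(self, measurement: np.ndarray) -> float:
        return float(np.tanh(self.weights @ measurement + self.bias))

class StrategyHolder:
    """Holds the active strategy and lets it be replaced without stopping the loop."""
    def __init__(self, strategy: NeuralStrategy):
        self._strategy = strategy

    def reconfigure(self, new_strategy: NeuralStrategy) -> None:
        self._strategy = new_strategy   # swap between loop iterations

    def step(self, measurement) -> float:
        return self._strategy.compute(np.asarray(measurement, dtype=float))

holder = StrategyHolder(NeuralStrategy([0.5, -0.2], 0.1))
print(holder.step([1.0, 2.0]))          # output of the initial strategy
holder.reconfigure(NeuralStrategy([0.1, 0.4], -0.3))
print(holder.step([1.0, 2.0]))          # same loop, new strategy, no restart
```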
Abstract:
“Hardware in the Loop” (HIL) testing is widely used in the automotive industry. The sophisticated electronic control units used for vehicle control are usually tested and evaluated using HIL simulations. HIL increases the degree of realism in testing any system, and it helps in designing the structure and control of the system under test so that it works effectively in the situations it will encounter. Due to the size and complexity of interactions within a power network, most research is based on pure simulation, and to validate the performance of a physical generator or protection system, most testing is constrained to very simple power networks. This research, however, examines a method to test power system hardware within a complex virtual environment using the HIL concept. HIL testing for electronic control units and power system protection devices can easily be performed at signal level, but the performance of power system equipment, such as distributed generation systems, cannot be evaluated at signal level using HIL testing. HIL testing for power system equipment is termed here “Power Network in the Loop” (PNIL). PNIL testing can only be performed at power level and requires a power amplifier that can amplify the simulation signal to the power level. A power network is divided into two parts: one part represents the Power Network Under Test (PNUT), and the other represents the rest of the complex network. The complex network is simulated in a real-time simulator (RTS) while the PNUT is connected to a Voltage Source Converter (VSC) based power amplifier. Two-way interaction between the simulator and the amplifier is performed using analog-to-digital (A/D) and digital-to-analog (D/A) converters. The power amplifier amplifies the current or voltage signal of the simulator to the power level and establishes the power-level interaction between the RTS and the PNUT. The first part of this thesis presents the design and control of a VSC-based power amplifier that can amplify a broadband voltage signal, and a new Hybrid Discontinuous Control method is proposed for the amplifier. This amplifier can be used in several power system applications; its use in DSTATCOM and UPS applications is also presented in the first part. The later part of the thesis reports the solution of network-in-the-loop testing with the help of this amplifier. The experimental setup for PNIL testing was built in the laboratory of Queensland University of Technology, and the feasibility of PNIL testing has been evaluated in experimental studies. In the last section of the thesis, a universal load with power-regenerative capability is designed and used to test a DG system using PNIL concepts. This thesis is composed of published/submitted papers that form the chapters of the dissertation; each paper was published or submitted during the period of candidature. Chapter 1 integrates all the papers to provide a coherent view of the wide-bandwidth switching amplifier and its use in different power system applications, especially for the solution of power system testing using PNIL.
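A much-simplified sketch of the PNIL exchange described above follows: at each simulation step, a signal-level voltage reference is scaled to power level by the amplifier, and the measured current of the hardware under test is fed back to the simulator. The gains and the resistive stand-in for the PNUT are assumptions for illustration only.

```python
# A simplified sketch of the Power Network in the Loop (PNIL) exchange the thesis
# describes: each simulation step, the real-time simulator outputs a voltage
# reference via D/A, the VSC-based amplifier scales it to power level, and the
# measured current of the Power Network Under Test (PNUT) is fed back via A/D.
# Gains, the resistive PNUT model and all names are illustrative assumptions.
import math

AMPLIFIER_GAIN = 100.0      # assumed signal-level to power-level voltage gain
PNUT_RESISTANCE_OHM = 50.0  # stand-in for the real hardware under test

def simulate_pnil(steps: int = 5, dt: float = 1e-3):
    for k in range(steps):
        # Real-time simulator side: voltage reference for this step (signal level).
        v_ref_signal = math.sin(2 * math.pi * 50 * k * dt)

        # D/A + power amplifier: scale the signal to the power level.
        v_power = AMPLIFIER_GAIN * v_ref_signal

        # Hardware side: the PNUT responds (modelled here as a simple resistor).
        i_measured = v_power / PNUT_RESISTANCE_OHM

        # A/D feedback: the measured current re-enters the simulated network.
        print(f"step {k}: v_power = {v_power:7.2f} V, i_feedback = {i_measured:5.2f} A")

simulate_pnil()
```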
Abstract:
The measurement of broadband ultrasonic attenuation (BUA) in cancellous bone at the calcaneus was first described in 1984. The assessment of osteoporosis by BUA has recently been recognized by Universities UK, in its EurekaUK book, as one of the “100 discoveries and developments in UK Universities that have changed the world” over the past 50 years, covering the whole academic spectrum from the arts and humanities to science and technology. Indeed, the BUA technique has been clinically validated and is used worldwide, with at least seven commercial systems providing calcaneal BUA measurement. However, a fundamental understanding of the dependence of BUA on the material and structural properties of cancellous bone is still lacking. This review aims to provide a science- and technology-oriented perspective on the application of BUA to osteoporosis.
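Although the review does not give an algorithm, BUA is conventionally obtained as the slope of a linear fit to the frequency-dependent attenuation (in dB) over roughly 0.2-0.6 MHz; the sketch below computes such a slope from a reference pulse and a through-bone pulse. The synthetic signals and band edges are illustrative assumptions.

```python
# A minimal sketch of how BUA is typically derived: attenuation is the
# log-spectral difference between a reference (water-only) pulse and the
# through-calcaneus pulse, and BUA is the slope of a straight-line fit to that
# attenuation over roughly 0.2-0.6 MHz, in dB/MHz. The synthetic signals and
# band edges below are illustrative assumptions, not the review's data.
import numpy as np

def broadband_ultrasonic_attenuation(ref_pulse, bone_pulse, fs_hz,
                                     f_lo_hz=0.2e6, f_hi_hz=0.6e6) -> float:
    freqs = np.fft.rfftfreq(len(ref_pulse), d=1.0 / fs_hz)
    ref_spec = np.abs(np.fft.rfft(ref_pulse))
    bone_spec = np.abs(np.fft.rfft(bone_pulse))

    # Attenuation in dB at each frequency (reference relative to bone).
    atten_db = 20.0 * np.log10((ref_spec + 1e-12) / (bone_spec + 1e-12))

    band = (freqs >= f_lo_hz) & (freqs <= f_hi_hz)
    slope_db_per_hz = np.polyfit(freqs[band], atten_db[band], 1)[0]
    return slope_db_per_hz * 1e6   # dB/MHz

# Synthetic example: the "bone" pulse is a low-pass filtered copy of the reference.
fs = 10e6
t = np.arange(2048) / fs
ref = np.exp(-((t - 2e-5) ** 2) / (2 * (1e-6) ** 2)) * np.sin(2 * np.pi * 0.5e6 * t)
bone = np.convolve(ref, np.ones(8) / 8, mode="same")  # crude frequency-dependent loss
print(f"BUA ~ {broadband_ultrasonic_attenuation(ref, bone, fs):.1f} dB/MHz")
```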
Australian Research to Encourage School Students’ Positive Use of Technology to Reduce Cyberbullying
Abstract:
Information and Communications Technology (ICT) has spread rapidly in Australia. Mobile phones, which increasingly have advanced capabilities including Internet access, mobile television and multimedia storage, are owned by 22% of Australian children aged 9-11 years and 73% of those aged 12-14 years (Australian Bureau of Statistics, 2012b), as well as by over 90% of Australians aged 15 years and over (Australian Communications and Media Authority (ACMA), 2010). Nearly 80% of Australian households have access to the Internet and 73% have a broadband Internet connection, ensuring that Internet access is typically reliable and high-speed (Australian Bureau of Statistics, 2012a). Ninety percent of Australian children aged 5-14 years (comprising 79% of 5-8 year olds, 96% of 9-11 year olds, and 98% of 12-14 year olds) reported having accessed the Internet during 2011-2012, a significant increase from 79% in 2008-2009 (Australian Bureau of Statistics, 2012b). Approximately 90% of 5-14 year olds have accessed the Internet both from home and from school, with close to 49% accessing the Internet from other places (Australian Bureau of Statistics, 2012b). Young people often make use of borrowed Internet access (e.g. in friends’ homes), commercial access (e.g. cybercafés), public access (e.g. libraries), and mobile device access in areas offering free Wi-Fi (Lim, 2009).