912 results for Digital communication models


Relevance:

30.00%

Publisher:

Abstract:

The principal feature in the evolution of the internet has been its ever-growing reach, taking in old and young, rich and poor. The internet's ever-encroaching presence has transported it from our desktop to our pocket and into our glasses. This is illustrated in the Internet Society Questionnaire on Multistakeholder Governance, which found that the main factors affecting change in the Internet governance landscape were more users online from more countries and the influence of the internet over daily life. The omnipresence of the internet is self-perpetuating; its usefulness grows with every new user and every new piece of data uploaded. The advent of social media and the creation of a virtual presence for each of us, even when we are not physically present or 'logged on', means we are fast approaching the point where we are all connected, to everyone else, all the time. We have moved far beyond the point where governments can claim to represent our views, which evolve constantly rather than being measured in electoral cycles.
The shift that has seen citizens become creators of content rather than consumers of it has undermined the centralist view of democracy and created an environment of wiki democracy or crowd-sourced democracy. This is at the heart of what is generally known as Web 2.0, widely considered to be a positive, democratising force. However, we argue, there are worrying elements here too. Government does not always deliver on the promise of the networked society as it involves citizens and others in the process of government. A number of key internet companies have also emerged as powerful intermediaries, harnessing the efforts of the many and re-using and re-selling the products and data of content providers in the Web 2.0 environment. A discourse about openness and transparency has been offered as a democratising rationale, but much of this masks an uneven relationship in which the value of online activity flows not to the creators of content but to those who own the channels of communication and the metadata that they produce.
In this context the state is just one stakeholder in the mix of influencers and opinion formers impacting on our behaviours, and indeed on our ideas of what is public. The questions of what it means to create or own something, and of how all these new relationships are to be ordered and governed, are subject to fundamental change. While government can often appear slow, unwieldy and even irrelevant in much of this context, there remains a need for some sort of political control to deal with the challenges that technology creates but cannot by itself control. For the internet to continue to evolve successfully, both technically and socially, it is critical that the multistakeholder nature of internet governance be understood, acknowledged and, perhaps to an extent, re-balanced. Stakeholders can no longer be classified under the broad headings of government, private sector and civil society, nor can their roles be seen as some sort of benign and open co-production. Each user of the internet has a stake in its efficacy, and each, by their presence and participation, contributes to the experience, positive or negative, of other users, as well as to the commercial success or otherwise of various online service providers. However, stakeholders have neither an equal role nor an equal share. The unequal relationship between the providers of content and those who simply package up and transmit that content, while harvesting the valuable data thus produced, needs to be addressed. Arguably this suggests a role for government that involves moving beyond simply celebrating and facilitating the ongoing technological revolution. This paper reviews the shifting landscape of stakeholders and their contribution to the efficacy of the internet. It critically evaluates the primacy of the individual as the key stakeholder, and their supposed developing empowerment within the ever-growing sea of data. It also looks at the role of individuals in wider governance roles.
Governments in a number of jurisdictions have sought to engage, consult or empower citizens through technology, but in general these attempts have had little appeal. Citizens have been too busy engaging, consulting and empowering each other to pay much attention to what their governments are up to. George Orwell's view of the future has not come to pass; in fact, the internet has ensured the opposite. There is no Big Brother, but we are all looking over each other's shoulders all the time, while a number of big corporations capture and sell all this collective endeavour back to us.

Relevance:

30.00%

Publisher:

Abstract:

Relative sea-level rise has been a major factor driving the evolution of reef systems during the Holocene. Most models of reef evolution suggest that reefs preferentially grow vertically during rising sea level and then laterally, from windward to leeward, once the reef flat reaches sea level. Continuous lagoonal sedimentation ("bucket fill") and sand apron progradation eventually lead to reef systems with totally filled lagoons. Lagoonal infilling of One Tree Reef (southern Great Barrier Reef) through sand apron accretion was examined in the context of late Holocene relative sea-level change. This analysis was conducted using sedimentological and digital terrain data supported by 50 radiocarbon ages from fossil microatolls, buried patch reefs, foraminifera and shells in sediment cores, and recalibrated previously published radiocarbon ages. This data set challenges the conceptual model of geologically continuous sediment infill during the Holocene through sand apron accretion. Rapid sand apron accretion occurred between 6000 and 3000 calibrated yr before present (cal. yr B.P.), followed by only small amounts of sedimentation between 3000 cal. yr B.P. and the present, with no significant sand apron accretion in the past 2 k.y. This hiatus in sediment infill coincides with a sea-level fall of ~1-1.3 m during the late Holocene (ca. 2000 cal. yr B.P.), which would have caused the turn-off of highly productive live coral growth on the reef flats, currently dominated by less productive rubble and algal flats, resulting in reduced sediment input to back-reef environments and the cessation of sand apron accretion. Given that relative sea-level variations of ~1 m were common throughout the Holocene, we suggest that this mode of sand apron development and carbonate production is applicable to most reef systems.
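A toy numerical sketch of the "bucket fill" mechanism described above may help: lagoonal infill accumulates only while the reef flat sits at or below sea level, so a modest sea-level fall shuts off sand apron accretion. The rates and the step-shaped sea-level curve are invented for illustration; they are not values from the study.

```python
def simulate_bucket_fill(steps, sea_level, flat_elev, accretion_rate=0.5):
    """Toy 'bucket fill' model: lagoon fill accumulates only while the
    reef flat is at or below sea level (live coral stays productive)."""
    fill = 0.0
    history = []
    for t in range(steps):
        if flat_elev <= sea_level(t):  # flat submerged -> carbonate supply
            fill += accretion_rate
        history.append(fill)           # sediment is never removed
    return history

# Sea level constant for 6 steps, then a ~1 m fall (illustrative only).
sl_curve = lambda t: 0.0 if t < 6 else -1.0
h = simulate_bucket_fill(12, sl_curve, flat_elev=0.0)
# Infill grows while the flat is submerged, then stalls after the fall.
```

The stall after step 6 mirrors the hiatus the study infers from the ca. 2000 cal. yr B.P. sea-level fall.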

Relevance:

30.00%

Publisher:

Abstract:

X-parameter-based nonlinear modelling tools have been adopted as the foundation for the advanced methodology of experimental characterisation and design of passive nonlinear devices. Based upon the formalism of the X-parameters, the methodology provides a unified framework for the co-design of antenna beamforming networks, filters, phase shifters and other passive and active devices of the RF front-end, taking into account the effect of their nonlinearities. The equivalent circuits of the canonical elements are readily incorporated in the models, thus enabling evaluation of the PIM effect on the performance of individual devices and their assemblies. An important advantage of the presented methodology is its compatibility with the established industry-standard commercial RF circuit simulator, Agilent ADS.
The major challenge in the practical implementation of the proposed approach concerns experimental retrieval of the X-parameters for canonical passive circuit elements. To the best of our knowledge, commercial PIM testers and practical laboratory test instruments are inherently narrowband and do not allow simultaneous vector measurements at the PIM and harmonic frequencies. Conversely, existing nonlinear vector network analysers (NVNAs) support X-parameter measurements over broad frequency bands with a range of stimuli, but their dynamic range is insufficient for PIM characterisation in practical circuits. Further opportunities for adapting the X-parameters methodology to the PIM characterisation of passive devices using existing test instruments are explored.
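The PIM products at issue can be illustrated with a minimal numerical sketch: passing two tones through a weak memoryless cubic nonlinearity produces third-order intermodulation lines at 2f1-f2 and 2f2-f1. This is only a toy stand-in for an X-parameter model (which also captures phase, memory and harmonic terms); the tone bins and the 0.01 coefficient are arbitrary.

```python
import cmath, math

def dft_mag(x):
    """Magnitude spectrum of a real sequence via a direct DFT, scaled by 1/N."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) / n for k in range(n)]

# Two-tone excitation placed exactly on bins f1 and f2 of an N-point record.
N, f1, f2 = 64, 10, 13
x = [math.cos(2 * math.pi * f1 * t / N) + math.cos(2 * math.pi * f2 * t / N)
     for t in range(N)]

# Weak memoryless cubic nonlinearity, an illustrative stand-in for a
# nonlinear passive junction.
y = [v + 0.01 * v ** 3 for v in x]

mag = dft_mag(y)
im3_low, im3_high = 2 * f1 - f2, 2 * f2 - f1   # bins 7 and 16
# The cubic term deposits energy at the third-order intermod frequencies.
```

With tone amplitudes of 1 and a cubic coefficient of 0.01, each IM3 line appears with amplitude 3/4 x 0.01, i.e. well above the numerical noise floor but far below the carriers, which is exactly the dynamic-range problem the paragraph above describes.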

Relevance:

30.00%

Publisher:

Abstract:

This paper presents initial results of evaluating the suitability of the conventional two-tone CW passive intermodulation (PIM) test for characterization of modulated-signal distortion by passive nonlinearities in base station antennas and the RF front-end. A comprehensive analysis of analog and digitally modulated waveforms in transmission lines with weak distributed nonlinearity has been performed using harmonic balance analysis and X-parameters in the Advanced Design System (ADS) simulator. The nonlinear distortion metrics used in the conventional two-tone CW PIM test have been compared with the respective spectral metrics applied to the modulated waveforms, such as adjacent channel power ratio (ACPR) and error vector magnitude (EVM). It is shown that the results of two-tone CW PIM tests are consistent with the metrics used for assessment of signal integrity of both analog and digitally modulated waveforms.
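Of the spectral metrics mentioned, EVM has a particularly compact definition: the RMS magnitude of the error vector between received and reference symbols, normalised to the RMS reference power. A minimal sketch with invented QPSK data (the constant-offset "distortion" is purely illustrative, not a PIM model):

```python
import math

def evm_rms(received, reference):
    """RMS error vector magnitude, normalised to the RMS reference power."""
    err = sum(abs(r - s) ** 2 for r, s in zip(received, reference))
    ref = sum(abs(s) ** 2 for s in reference)
    return math.sqrt(err / ref)

# Ideal QPSK constellation points and a copy with a small additive error.
ideal = [1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j]
rx = [s + 0.05 for s in ideal]   # constant offset, illustrative only
print(round(evm_rms(rx, ideal), 4))
```

EVM is often quoted as a percentage; multiplying the returned value by 100 gives %EVM.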

Relevance:

30.00%

Publisher:

Abstract:

This paper presents an approach to developing an intelligent digital mock-up (DMU) through the integration of the design and manufacturing disciplines, to enable a better understanding of assembly-related issues during design evolution. The intelligent DMU will contain tolerance information related to manufacturing capabilities, so it can be used as a source for assembly simulations of realistic models to support manufacturing decision making within the design domain with respect to tolerance build-ups. A literature review of the contributing research areas is presented, from which the need for an intelligent DMU is identified. The proposed methodology, including the application of cellular modelling and the potential features of the intelligent DMU, is presented and explained. Finally, a conclusion examines the work to date and the future work required to achieve an intelligent DMU.
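Tolerance build-ups of the kind the intelligent DMU is meant to expose are classically estimated with worst-case and root-sum-square (RSS) stacks. The paper does not prescribe a specific method, so the following is a generic sketch with invented dimensions:

```python
import math

def worst_case(tols):
    """Worst-case stack: every tolerance at its limit simultaneously."""
    return sum(tols)

def rss(tols):
    """Root-sum-square stack: statistically independent contributions."""
    return math.sqrt(sum(t ** 2 for t in tols))

# Hypothetical +/- tolerances (mm) of four mating parts in an assembly.
tols = [0.10, 0.05, 0.20, 0.05]
print(round(worst_case(tols), 2), round(rss(tols), 4))
```

The RSS stack is always tighter than the worst case, which is why manufacturing-capability data (process spreads) matters for realistic assembly simulation.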

Relevance:

30.00%

Publisher:

Abstract:

In an information-driven society, where the volume and value of produced and consumed data assume growing importance, digital libraries gain particular importance. This work analyses the limitations of current digital library management systems and the opportunities brought by recent distributed computing models. The result of this work is the implementation of the University of Aveiro integrated system for digital libraries and archives. It concludes by analysing the system in production and proposing a new service-oriented digital library architecture supported on a peer-to-peer infrastructure.

Relevance:

30.00%

Publisher:

Abstract:

Keywords: Internet Traffic, Internet Applications, Internet Attacks, Traffic Profiling, Multi-Scale Analysis.

Nowadays, the Internet can be seen as an ever-changing platform where new and different types of services and applications are constantly emerging. In fact, many of the existing dominant applications, such as social networks, have appeared recently, being rapidly adopted by the user community. All these new applications required the implementation of novel communication protocols that present different network requirements, according to the service they deploy. All this diversity and novelty has led to an increasing need to accurately profile Internet users, by mapping their traffic to the originating application, in order to improve many network management tasks such as resource optimization, network performance, service personalization and security. However, accurately mapping traffic to its originating application is a difficult task due to the inherent complexity of existing network protocols and to several restrictions that prevent the analysis of the contents of the generated traffic. In fact, many technologies, such as traffic encryption, are widely deployed to assure and protect the confidentiality and integrity of communications over the Internet. On the other hand, many legal constraints also forbid the analysis of clients' traffic in order to protect their confidentiality and privacy. Consequently, novel traffic discrimination methodologies are necessary for accurate traffic classification and user profiling. This thesis proposes several identification methodologies for accurate Internet traffic profiling while coping with the different restrictions mentioned and with the existing encryption techniques. By analyzing the several frequency components present in the captured traffic and inferring the presence of the different network and user related events, the proposed approaches are able to create a profile for each one of the analyzed Internet applications.
The use of several probabilistic models allows the accurate association of the analyzed traffic with the corresponding application. Several enhancements are also proposed in order to allow the identification of hidden illicit patterns and the real-time classification of captured traffic. In addition, a new network management paradigm for wired and wireless networks is proposed. The analysis of layer-2 traffic metrics and the different frequency components present in the captured traffic allows efficient user profiling in terms of the web application used. Finally, some usage scenarios for these methodologies are presented and discussed.
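The frequency-component idea can be sketched simply: sample an application's packet counts at fixed intervals, take a discrete Fourier transform, and read off the dominant periodicity as one profiling feature. The trace below is synthetic and the approach is a deliberate simplification of the thesis's methodologies:

```python
import cmath, math

def dominant_period(counts):
    """Return the period (in samples) of the strongest non-DC frequency
    component of a packet-count time series."""
    n = len(counts)
    mean = sum(counts) / n
    centred = [c - mean for c in counts]        # drop the DC component
    mags = [abs(sum(centred[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) for k in range(1, n // 2 + 1)]
    k_best = 1 + mags.index(max(mags))          # bin of the strongest line
    return n / k_best

# Synthetic trace: a hypothetical application polling every 8 samples.
trace = [10 + 5 * math.cos(2 * math.pi * t / 8) for t in range(64)]
print(dominant_period(trace))
```

A real profiler would compare such features (and others, e.g. layer-2 metrics) against per-application signatures rather than a single synthetic sinusoid.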

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents an exploratory study of visible-light communication systems and their application to intelligent transportation systems as a means of improving road safety. Conceptual and analytical models suitable for characterising this type of system were developed in this work. A low-cost prototype capable of disseminating information through traffic lights was built. Its realisation required a detailed study: i) models capable of describing the radiation patterns over a predefined service area had to be obtained; ii) the communication channel had to be characterised; iii) the behaviour of several modulation schemes had to be studied in order to select the most robust one; and finally, iv) the system had to be implemented using an FPGA and discrete components. The prototype was tested under real conditions. The results achieved demonstrate the merits of this solution and even encourage the use of this technology in other application scenarios.
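Visible-light links of this kind typically use intensity modulation; Manchester-coded on-off keying is a common choice because its constant average intensity avoids visible flicker. The sketch below is illustrative only; the abstract does not state which scheme the prototype finally adopted.

```python
def manchester_encode(bits):
    """One common (IEEE 802.3) convention: 0 -> high-low, 1 -> low-high.
    Every bit yields one transition, keeping the LED's mean intensity
    constant (no perceptible flicker) and aiding clock recovery."""
    table = {0: (1, 0), 1: (0, 1)}
    return [chip for b in bits for chip in table[b]]

def manchester_decode(chips):
    inv = {(1, 0): 0, (0, 1): 1}
    return [inv[(chips[i], chips[i + 1])] for i in range(0, len(chips), 2)]

msg = [1, 0, 1, 1, 0]
encoded = manchester_encode(msg)
# Exactly half of the chips are 'on', regardless of the message content.
```

The price of this DC balance is a doubled chip rate, one of the trade-offs a robustness comparison between modulation schemes would weigh.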

Relevance:

30.00%

Publisher:

Abstract:

Digitised and born-digital information, a product of the technological advances brought by Information and Communication Technologies (ICT) and of the participatory philosophy of Web 2.0, has prompted reflection on the capacity of current models for the organisation and representation of information to respond to info-communicational needs, as well as on users' access to electronic information in memory institutions. This research work aims at the design and evaluation of a generic, normative and harmonising model for the organisation and representation of electronic information in an information system, for use by users and information professionals, in the current collaborative and participatory context. The proposed objectives were defined on the basis of a qualitative study and analysis of the standards adopted by memory institutions for authority records, bibliographic records and representation formats. After conceptualisation, a prototype was developed and evaluated, essentially through qualitative analysis of data obtained from information-retrieval tests. The experiment took place in a laboratory environment where tests, interviews and questionnaire surveys were carried out. Cross-analysis of the results, obtained by triangulating the data collected from the various sources, showed that both users and information professionals found very interesting the integration of normative harmonisation reflected in the various modules, the integration of communication services/tools, and the use of the participatory/collaborative component of the platform, favouring the Wiki, followed by Comments, Tags, the discussion Forum and E-mail.

Relevance:

30.00%

Publisher:

Abstract:

This thesis investigates the characterisation (and modelling) of devices that interface between the digital and analogue domains, such as the output buffers of integrated circuits (ICs). Today's wireless terminals are being developed with the software-defined-radio concept introduced by Mitola in mind. Ideally, this architecture takes advantage of powerful processors and extends the operation of the digital blocks as close as possible to the antenna. It is therefore not surprising that there is growing concern within the scientific community regarding the characterisation of the blocks that interface between the analogue and digital domains, digital-to-analogue and analogue-to-digital converters being two good examples of such circuits. Within high-speed digital circuits, such as Flash memories, a similar role is played by the output buffers. These interface between the digital domain (the logic core) and the analogue domain (the IC package and the parasitics associated with the transmission lines), determining the integrity of the transmitted signal. To speed up signal-integrity analysis during IC design, it is essential to have models that are simultaneously computationally efficient and accurate. Typically, the extraction/validation of output-buffer models is carried out using data obtained from simulation of a detailed (transistor-level) model or from experimental results. The latter approach involves no intellectual-property issues; however, it is rarely mentioned in the literature on output-buffer characterisation. This doctoral thesis therefore focuses on the development of a new measurement setup for the characterisation and modelling of high-speed output buffers, with a natural extension to RF-CMOS switched amplifier devices.
Based on a well-defined experimental procedure, a state-of-the-art model is extracted and validated. The developed measurement setup addresses not only the integrity of the output signals but also that of the power-supply bus. In order to determine the sensitivity of the estimated quantities (voltage and current) to the errors present in the various variables of the experimental procedure, an uncertainty analysis is also presented.
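The closing uncertainty analysis rests on standard first-order error propagation. As a generic sketch (values invented, not from the thesis), for a current estimated from a voltage measured across a sense resistor, I = V/R:

```python
import math

def propagate_uncertainty(V, dV, R, dR):
    """First-order propagation for I = V / R with independent errors:
    (dI/I)^2 = (dV/V)^2 + (dR/R)^2."""
    I = V / R
    dI = I * math.sqrt((dV / V) ** 2 + (dR / R) ** 2)
    return I, dI

# Hypothetical values: 1.8 V measured across a 50-ohm sense resistor.
I, dI = propagate_uncertainty(1.8, 0.01, 50.0, 0.5)
```

The relative uncertainties add in quadrature, so the dominant error source (here the 1% resistor) sets the floor of the estimated current's uncertainty.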

Relevance:

30.00%

Publisher:

Abstract:

Over the years, the increasing search for and exchange of information has led to increased traffic intensity in today's optical communication networks. Coherent communications, using both the amplitude and phase of the signal, has reappeared as one of the transmission techniques for increasing the spectral efficiency and throughput of optical channels. In this context, this work presents a study of the format conversion of modulated signals using MZI-SOAs, based exclusively on all-optical techniques through wavelength conversion. This approach, when applied in interconnection nodes between optical networks with different bit rates and modulation formats, allows better efficiency and scalability of the network. We start with an experimental characterization of the static and dynamic properties of the MZI-SOA. We then propose a semi-analytical model to describe the evolution of phase and amplitude at the output of the MZI-SOA. The model's coefficients are obtained using a multi-objective genetic algorithm. We validate the model experimentally by exploring the dependence of the optical signal on the operational parameters of the MZI-SOA. We also propose an all-optical technique for the conversion of amplitude-modulated signals to a continuous-phase modulation format. Finally, we study the potential of MZI-SOAs for the conversion of amplitude signals to QPSK and QAM signals. We show the dependence of the conversion process on deviations of the operational parameters from their optimal values. The technique is experimentally validated for QPSK modulation.
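Coefficient extraction for a behavioural model can be sketched as a search that minimises the misfit between model and measurement. Below, a coarse grid search (a deterministic stand-in for the multi-objective genetic algorithm used in the work) recovers the coefficients of a hypothetical saturating phase response; the model form and all values are invented for illustration.

```python
def phase_model(p_in, phi_max, p_sat):
    """Hypothetical saturating phase response vs. input power."""
    return phi_max * p_in / (p_in + p_sat)

def fit_grid_search(data, steps=8):
    """Minimal stand-in for the GA: exhaustive coarse grid search over
    the two coefficients, minimising the sum of squared residuals."""
    best, best_err = None, float("inf")
    for i in range(1, steps + 1):
        for j in range(1, steps + 1):
            cand = (0.5 * i, 0.5 * j)
            err = sum((phase_model(p, *cand) - phi) ** 2 for p, phi in data)
            if err < best_err:
                best, best_err = cand, err
    return best, best_err

# Synthetic "measurements" generated from known coefficients (2.0, 1.0).
data = [(p / 4, phase_model(p / 4, 2.0, 1.0)) for p in range(1, 20)]
(phi_max, p_sat), err = fit_grid_search(data)
```

A genetic algorithm replaces the exhaustive grid with population-based sampling and can weigh several objectives (e.g. amplitude and phase misfit) at once, which is what the thesis's multi-objective fit does.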

Relevance:

30.00%

Publisher:

Abstract:

Communication and cooperation between billions of neurons underlie the power of the brain. How do complex functions of the brain arise from its cellular constituents? How do groups of neurons self-organize into patterns of activity? These are crucial questions in neuroscience. In order to answer them, it is necessary to have a solid theoretical understanding of how single neurons communicate at the microscopic level, and of how cooperative activity emerges. In this thesis we aim to understand how complex collective phenomena can arise in a simple model of neuronal networks. We use a model with balanced excitation and inhibition and complex network architecture, and we develop analytical and numerical methods for describing its neuronal dynamics. We study how interaction between neurons generates various collective phenomena, such as the spontaneous appearance of network oscillations and seizures, and early warnings of these transitions in neuronal networks. Within our model, we show that phase transitions separate various dynamical regimes, and we investigate the corresponding bifurcations and critical phenomena. This permits us to suggest a qualitative explanation of the Berger effect, and to investigate phenomena such as avalanches, band-pass filtering, and stochastic resonance. The role of modular structure in the detection of weak signals is also discussed. Moreover, we find nonlinear excitations that can describe paroxysmal spikes observed in electroencephalograms from epileptic brains. This allows us to propose a method to predict epileptic seizures. Memory and learning are key functions of the brain. There is evidence that these processes result from dynamical changes in the structure of the brain. At the microscopic level, synaptic connections are plastic and are modified according to the dynamics of neurons. Thus, we generalize our cortical model to take synaptic plasticity into account, and we show that the repertoire of dynamical regimes becomes richer.
In particular, we find mixed-mode oscillations and a chaotic regime in neuronal network dynamics.
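Balanced excitation-inhibition dynamics of the kind studied here are often illustrated with a two-population rate model of the Wilson-Cowan type. The weights below are invented solely to show bounded, self-sustained activity; they are not parameters from the thesis.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def simulate_ei(steps=500, dt=0.1, w_ee=12.0, w_ei=10.0,
                w_ie=10.0, w_ii=2.0, drive=1.0):
    """Euler integration of a two-population excitatory/inhibitory
    rate model: dE/dt = -E + S(w_ee*E - w_ei*I + drive),
                dI/dt = -I + S(w_ie*E - w_ii*I)."""
    E, I = 0.1, 0.1
    trace = []
    for _ in range(steps):
        dE = -E + sigmoid(w_ee * E - w_ei * I + drive)
        dI = -I + sigmoid(w_ie * E - w_ii * I)
        E, I = E + dt * dE, I + dt * dI
        trace.append((E, I))
    return trace

trace = simulate_ei()
# The sigmoid keeps both firing rates bounded in (0, 1); depending on the
# coupling weights, the E/I loop settles to a fixed point or oscillates.
```

Sweeping a coupling weight in such a model exposes the bifurcations between regimes, the kind of phase-transition analysis the thesis carries out on a much richer network.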

Relevance:

30.00%

Publisher:

Abstract:

This co-edited book focuses on core theories and research on technologies, from the first audio guides to contemporary and future mobile digital devices, which inform practical design considerations. It is framed in case studies and focuses generally on informal learning by museum and gallery visitors. The book fills a significant gap in the literature on museum practice with regard to uses of digital technologies, which are not generally grounded in rigorous research, and is intended to retain its relevance as technologies evolve and emerge. The book includes chapters by invited authors from the USA, UK and Europe who contribute expertise in a number of areas of museum research and practice. The research resulted in invited keynote speeches in France (‘Technologie de l’apprentissage humain dans les musées’ seminar at Laboratoire d’Informatique de Grenoble on 5 March 2009), Iceland (keynote at ‘NODEM Network of Design and Digital Heritage’ conference on 3 December 2008) and London (Keynote at ‘Mobile Learning Conference’ on 26 January 2009). The book was given the highest recommendation ('Essential') by the American Library Association, and was reviewed in MedieKultur (2011, 50, 185–92). Walker’s chapter includes some of the initial findings from his PhD research on visitor-constructed trails in museums, which shifts focus from the design of technologies to the design of activities intended to structure the use of technologies, and constitutes some of the first published research on visitor-generated trails using mobile technologies. Structures such as trails are shown to act as effective mental models for museum visitors, especially structures with a narrow subject focus and manageable amount of data capture; those created as a narrative or a conversation; and those that emphasise construction, rather than data capture. Walker also selected most of the other chapter authors, suggested their topics and led the editing of the publication.

Relevance:

30.00%

Publisher:

Abstract:

The virtual social network Facebook is celebrating its tenth birthday. With over a billion active users, it has grown since its creation into the world's largest Internet platform for communication. Nevertheless, there is a large number of people in Germany who use the Internet daily but forgo membership of Facebook. This work examines the reasons why some people do not use Facebook. The guiding question of the study is: "Why do selected German Internet users not use Facebook?". A distinction is drawn between two different groups, non-users and ex-users. Based on semi-structured interviews with 25 respondents, evaluated by means of qualitative content analysis, eleven different reasons for rejecting Facebook are identified. For the non-users, the nature of the communication is the central reason for not using Facebook. The ex-users, in turn, see the application's lack of usefulness as the most important argument against Facebook.

Relevance:

30.00%

Publisher:

Abstract:

Master's project report, Technologies and Methodologies in E-learning, Universidade de Lisboa, Instituto de Educação, Faculdade de Ciências, 2013