983 results for multiple channels
Abstract:
The development of next-generation microwave technology for backhauling systems is driven by an increasing capacity demand. In order to provide higher data rates and throughputs over a point-to-point link, a cost-effective performance improvement is enabled by enhanced energy efficiency of the transmit power amplification stage, whereas a combination of spectrally efficient modulation formats and wider bandwidths requires amplifiers that fulfil strict linearity constraints. An optimal trade-off between these conflicting requirements can be achieved by resorting to flexible digital signal processing techniques at baseband. In such a scenario, adaptive digital pre-distortion is a well-known linearization method and a potentially widespread solution, since it can be easily integrated into base stations. Its operation can effectively compensate for the inter-modulation distortion introduced by the power amplifier, keeping up with the frequency-dependent, time-varying behaviour of the amplifier's nonlinear characteristic. In particular, the impact of memory effects becomes more relevant, and their equalisation more challenging, as the discrete input signal features a wider bandwidth and a faster envelope to pre-distort. This thesis project involves the research, design and simulation of a pre-distorter implementation at RTL based on a novel polyphase architecture, which makes it capable of operating on very wideband signals at a sampling rate that complies with the clock speeds actually available in current digital devices. The motivation behind this structure is to enable feasible pre-distortion of the multi-band, spectrally efficient complex signals carrying multiple channels that will be transmitted over future high-capacity, high-reliability microwave backhaul links.
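A common baseband model for adaptive pre-distortion with memory is the memory polynomial; the sketch below illustrates that general technique, not the thesis's polyphase RTL design, and the coefficient values are invented for illustration:

```python
import numpy as np

def memory_polynomial(x, coeffs):
    """Apply a memory-polynomial pre-distorter to a complex baseband signal.

    coeffs[m][k] weights the term x[n-m] * |x[n-m]|**k, capturing both the
    static nonlinearity (order k) and memory effects (delay m).
    """
    M = len(coeffs)      # memory depth
    K = len(coeffs[0])   # polynomial order
    y = np.zeros(len(x), dtype=complex)
    for m in range(M):
        # delayed copy of the input, zero-padded at the front
        xm = np.concatenate([np.zeros(m, dtype=complex), x[:len(x) - m]])
        for k in range(K):
            y += coeffs[m][k] * xm * np.abs(xm) ** k
    return y

# Hypothetical coefficients: identity term plus a small cubic correction
coeffs = [[1.0, 0.0, -0.05], [0.02, 0.0, 0.0]]
x = np.exp(1j * np.linspace(0, 2 * np.pi, 8))
y = memory_polynomial(x, coeffs)
```

In a real pre-distorter the coefficients would be identified adaptively from feedback of the amplifier output rather than fixed as here.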
Abstract:
Students are now involved in a vastly different textual landscape than many English scholars, one that relies on the “reading” and interpretation of multiple channels of simultaneous information. As a response to these new kinds of literate practices, my dissertation adds to the growing body of research on multimodal literacies, narratology in new media, and rhetoric through an examination of the place of video games in English teaching and research. I describe in this dissertation a hybridized theoretical basis for incorporating video games in English classrooms. This framework for textual analysis includes elements from narrative theory in literary study, rhetorical theory, and literacy theory, and, when these are combined to account for the multiple modalities and complexities of gaming, they can provide new insights about those theories and practices across all kinds of media, whether in written texts, films, or video games. In creating this framework, I hope to encourage students to view texts from a meta-level perspective, encompassing textual construction, use, and interpretation. In order to foster meta-level learning in an English course, I use specific theoretical frameworks from the fields of literary studies, narratology, film theory, aural theory, reader-response criticism, game studies, and multiliteracies theory to analyze a particular video game: World of Goo. These theoretical frameworks inform pedagogical practices used in the classroom for textual analysis of multiple media. Examining a video game from these perspectives, I use analytical methods from each, including close reading, explication, textual analysis, and individual elements of multiliteracies theory and pedagogy. In undertaking an in-depth analysis of World of Goo, I demonstrate the possibilities for classroom instruction with a complex blend of theories and pedagogies in English courses.
This blend of theories and practices is meant to foster literacy learning across media, helping students develop metaknowledge of their own literate practices in multiple modes. Finally, I outline a design for a multiliteracies course that would allow English scholars to use video games along with other texts to interrogate texts as systems of information. In doing so, students can hopefully view and transform systems in their own lives as audiences, citizens, and workers.
Abstract:
Since the Sarbanes-Oxley Act was passed in 2002, it has become commonplace in the advertising industry to use creativity-award-show prizes instead of gross income figures to attract new customers. Therefore, achieving a top creativity ranking and winning creativity awards have become high priorities in the advertising industry. Agencies and marketers have always wondered what elements in the advertising creation process would lead to the winning of creativity awards. Although this debate has been dominated by pure speculation about the success of different routines, approaches and strategies in winning creativity awards, for the first time our study delivers an empirical insight into the key drivers of creativity award success. We investigate what strategies and which elements of an advertising campaign are truly likely to lead to winning the maximum number of creativity awards. Using a sample of 108 campaigns, we identify factors that influence campaign success at international advertising award shows. We identify innovativeness and the integration of multiple channels as the key drivers of creativity award success. In contrast to industry beliefs, meaningful or personally connecting approaches do not seem to generate a significant benefit in terms of winning creativity awards. Finally, our data suggest that the use of so-called “fake campaigns” to win more creativity awards does not prove to be effective.
Abstract:
Exergames are digital games with a physical exertion component. Exergames can help motivate fitness in people not inclined toward exercise. However, players of exergames sometimes over-exert, risking adverse health effects. These players must be told to slow down, but doing so may distract them from gameplay and diminish their desire to keep exercising. In this thesis we apply the concept of nudges—indirect suggestions that gently push people toward a desired behaviour—to keeping exergame players from over-exerting. We describe the effective use of nudges through a set of four design principles: natural integration, comprehension, progression, and multiple channels. We describe two exergames modified to use nudges to persuade players to slow down, and describe the studies evaluating the use of nudges in these games. PlaneGame shows that nudges can be as effective as an explicit textual display to control player over-exertion. Gekku Race demonstrates that nudges are not necessarily effective when players have a strong incentive to over-exert. However, Gekku Race also shows that, even in high-energy games, the power of nudges can be maintained by adding negative consequences to the nudges. We use the term "shove" to describe a nudge using negative consequences to increase its pressure. We were concerned that making players slow down would damage their immersion—the feeling of being engaged with a game. However, testing showed no loss of immersion through the use of nudges to reduce exertion. Players reported that the nudges and shoves motivated them to slow down when they were over-exerting, and fit naturally into the games.
Abstract:
Doctoral thesis, Geology (Hydrogeology), Universidade de Lisboa, Faculdade de Ciências, 2016
Abstract:
Globalization is both an integrative and deconstructive process. Globalization integrates states and non-state actors into transnational and global networks (Keohane & Nye, 2000, p. 105). These networks are based on multiple channels of interdependence that include trade, politics, security, environment, and socio-cultural ties (pp. 106-107). Due to advances in telecommunications technology, the expansion of globalization “shrinks” the distance between peoples (p. 105). On the other hand, globalization can also break up the existing political and social order (Mathews, 1997, p. 50). Globalization disperses power and information flows, thus enabling local and transnational identity movements to challenge states (pp. 51-52). This can be exemplified by separatist movements that seek to break away from central authorities.
Abstract:
A specialised reconfigurable architecture targeted at wireless base-band processing is presented. Built to cater for multiple wireless standards, it has lower power consumption than processor-based solutions and can be scaled to run in parallel for processing multiple channels. Test resources and testing strategies are embedded in the architecture. The architecture is functionally partitioned according to the common operations found in wireless standards, such as CRC error detection, convolution and interleaving. These modules are linked via Virtual Wire Hardware modules and route-through switch matrices, so data can be processed in any order through this interconnect structure. Virtual Wire ensures the same flexibility as normal interconnects, but with a smaller occupied area and fewer switches. The testing algorithm exhaustively scans all possible paths within the interconnection network and searches for faults in the processing modules. It starts by scanning the externally addressable memory space and testing the master controller. The controller then tests every switch in the route-through switch matrix by making loops from the shared memory to each switch; the local switch matrix is tested in the same way. Next, the local memory is scanned. Finally, pre-defined test vectors are loaded into local memory to check the processing modules. This paper compares various base-band processing solutions, describes the proposed platform and its implementation, outlines the test resources and algorithm, and concludes with the mapping of the Bluetooth and GSM base-bands onto the platform.
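The ordered test sequence described above can be sketched as a simple scan routine; the function and field names below are hypothetical stand-ins for the platform's actual RTL test resources:

```python
def run_selftest(platform):
    """Run the self-test sequence in the order described: external memory,
    master controller, route-through switches, local switches, local memory,
    then test vectors for each processing module."""
    log = []

    def step(name, ok):
        log.append((name, ok))  # record each step's outcome
        return ok

    ok = step("scan external memory space", platform["memory_ok"])
    ok &= step("test master controller", platform["controller_ok"])
    # Loop from shared memory through every route-through switch
    for i, sw in enumerate(platform["routethrough_switches"]):
        ok &= step(f"loop test route-through switch {i}", sw)
    for i, sw in enumerate(platform["local_switches"]):
        ok &= step(f"loop test local switch {i}", sw)
    ok &= step("scan local memory", platform["local_memory_ok"])
    # Finally, apply pre-defined test vectors to the processing modules
    for name, passed in platform["module_vectors"].items():
        ok &= step(f"test vectors: {name}", passed)
    return ok, log

# Hypothetical platform state with all units healthy
platform = {
    "memory_ok": True, "controller_ok": True,
    "routethrough_switches": [True, True],
    "local_switches": [True],
    "local_memory_ok": True,
    "module_vectors": {"CRC": True, "convolution": True, "interleaver": True},
}
ok, log = run_selftest(platform)
```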
Abstract:
Purpose: The purpose of this paper is to investigate the use of the 802.11e MAC to resolve transmission control protocol (TCP) unfairness. Design/methodology/approach: The paper shows how a TCP sender may adapt its transmission rate, using the number of hops and the standard deviation of recently measured round-trip times, to address TCP unfairness. Findings: Simulation results show that the proposed techniques provide even throughput, maintaining TCP fairness as the number of hops increases over a wireless mesh network (WMN). Research limitations/implications: Future work will examine the performance of TCP over routing protocols that use different routing metrics. Another open issue is scalability over WMNs: since scalability is a problem in multi-hop communication, carrier sense multiple access (CSMA) will be compared with time division multiple access (TDMA), and a hybrid of TDMA and code division multiple access (CDMA) will be designed that works with TCP and other traffic. Finally, to further improve network performance and increase the network capacity of TCP over WMNs, the use of multiple channels instead of a single fixed channel will be exploited. Practical implications: By allowing the tuning of 802.11e MAC parameters that were constant in the 802.11 MAC, the paper proposes using the 802.11e MAC on a per-class basis, collecting the TCP ACKs into a single class, together with a novel congestion control method for TCP over a WMN. The key feature of the proposed TCP algorithm is the detection of congestion by measuring, via the standard deviation, the fluctuation of the RTTs of the TCP ACK samples; combined with the 802.11e AIFS and CWmin parameters, this allows TCP ACKs to be prioritised so that they keep pace with the volume of TCP data packets. While the 802.11e MAC provides flexibility and flow/congestion control mechanisms, the challenge is to take advantage of these features.
Originality/value: Because the 802.11 MAC lacks flexibility and flow/congestion control mechanisms to complement TCP, competing flows suffer TCP unfairness. © Emerald Group Publishing Limited.
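The congestion detector described above keys on the standard deviation of recent RTT samples; a minimal sketch, with a window size and threshold that are assumed for illustration rather than taken from the paper:

```python
import statistics

def congestion_detected(rtt_samples, window=8, threshold_ratio=0.3):
    """Flag congestion when the standard deviation of the most recent RTT
    samples exceeds a fraction of their mean (an assumed heuristic)."""
    recent = rtt_samples[-window:]
    if len(recent) < 2:
        return False  # not enough samples to estimate fluctuation
    mean = statistics.mean(recent)
    return statistics.stdev(recent) > threshold_ratio * mean

# Stable RTTs suggest a clear path; widely fluctuating RTTs suggest congestion
stable = congestion_detected([10.0, 10.2, 9.9, 10.1])
fluctuating = congestion_detected([10.0, 25.0, 8.0, 30.0])
```

In the paper's scheme such a signal would drive the sender's rate adaptation together with the hop count.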
Abstract:
The WDM properties of dispersion-managed (DM) solitons and the reduction in Gordon-Haus jitter mean that it is possible to contemplate multiple channels, each at 10 Gbit/s, over transoceanic distances without the need for elaborate soliton control. This paper will concentrate on the fundamental principles of DM solitons, but will use these principles to indicate optimum maps for future high-speed soliton systems.
Abstract:
Adapting to blurred or sharpened images alters the perceived blur of a focused image (M. A. Webster, M. A. Georgeson, & S. M. Webster, 2002). We asked whether blur adaptation results in (a) renormalization of perceived focus or (b) a repulsion aftereffect. Images were checkerboards or 2-D Gaussian noise, whose amplitude spectra had (log-log) slopes from -2 (strongly blurred) to 0 (strongly sharpened). Observers adjusted the spectral slope of a comparison image to match different test slopes after adaptation to blurred or sharpened images. Results did not show repulsion effects but were consistent with some renormalization. Test blur levels at and near a blurred or sharpened adaptation level were matched by more focused slopes (closer to 1/f), but with little or no change in appearance after adaptation to focused (1/f) images. A model of contrast adaptation and blur coding by multiple-scale spatial filters predicts these blur aftereffects and those of Webster et al. (2002). A key proposal is that observers are pre-adapted to natural spectra, and blurred or sharpened spectra induce changes in the state of adaptation. The model illustrates how norms might be encoded and recalibrated in the visual system even when they are represented only implicitly by the distribution of responses across multiple channels.
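Noise stimuli with a prescribed (log-log) spectral slope can be synthesised by shaping white noise in the frequency domain; the sketch below illustrates this standard construction, with parameter choices that are illustrative rather than the study's exact stimuli:

```python
import numpy as np

def sloped_noise(n, slope, seed=0):
    """Generate n x n Gaussian noise whose amplitude spectrum falls off as
    f**slope (slope = -1 gives natural, 'focused' 1/f statistics;
    more negative slopes blur, slopes closer to 0 sharpen)."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal((n, n))
    f = np.fft.fftfreq(n)
    fx, fy = np.meshgrid(f, f)
    radius = np.hypot(fx, fy)        # radial spatial frequency
    radius[0, 0] = 1.0               # avoid division by zero at DC
    envelope = radius ** slope
    envelope[0, 0] = 0.0             # zero the DC term -> zero-mean image
    shaped = np.fft.ifft2(np.fft.fft2(noise) * envelope).real
    return shaped / shaped.std()     # normalise contrast

blurred = sloped_noise(64, -2.0)     # strongly blurred
focused = sloped_noise(64, -1.0)     # natural 1/f statistics
```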
Abstract:
Biofilms are the primary cause of clinical bacterial infections and are impervious to typical amounts of antibiotics, necessitating very high doses for treatment. Therefore, it is highly desirable to develop new alternate methods of treatment that can complement or replace existing approaches using significantly lower doses of antibiotics. Current standards for studying biofilms are based on end-point studies that are invasive and destroy the biofilm during characterization. This dissertation presents the development of a novel real-time sensing and treatment technology to aid in the non-invasive characterization, monitoring and treatment of bacterial biofilms. The technology is demonstrated through the use of a high-throughput bifurcation based microfluidic reactor that enables simulation of flow conditions similar to indwelling medical devices. The integrated microsystem developed in this work incorporates the advantages of previous in vitro platforms while attempting to overcome some of their limitations. Biofilm formation is extremely sensitive to various growth parameters that cause large variability in biofilms between repeated experiments. In this work we investigate the use of microfluidic bifurcations for the reduction in biofilm growth variance. The microfluidic flow cell designed here spatially sections a single biofilm into multiple channels using microfluidic flow bifurcation. Biofilms grown in the bifurcated device were evaluated and verified for reduced biofilm growth variance using standard techniques like confocal microscopy. This uniformity in biofilm growth allows for reliable comparison and evaluation of new treatments with integrated controls on a single device. Biofilm partitioning was demonstrated using the bifurcation device by exposing three of the four channels to various treatments. 
We studied a novel bacterial biofilm treatment independent of traditional antibiotics using only small molecule inhibitors of bacterial quorum sensing (analogs) in combination with low electric fields. Studies using the bifurcation-based microfluidic flow cell integrated with real-time transduction methods and macro-scale end-point testing of the combination treatment showed a significant decrease in biomass compared to the untreated controls and well-known treatments such as antibiotics. To understand the possible mechanism of action of electric field-based treatments, fundamental treatment efficacy studies focusing on the effect of the energy of the applied electrical signal were performed. It was shown that the total energy and not the type of the applied electrical signal affects the effectiveness of the treatment. The linear dependence of the treatment efficacy on the applied electrical energy was also demonstrated. The integrated bifurcation-based microfluidic platform is the first microsystem that enables biofilm growth with reduced variance, as well as continuous real-time threshold-activated feedback monitoring and treatment using low electric fields. The sensors detect biofilm growth by monitoring the change in impedance across the interdigitated electrodes. Using the measured impedance change and user inputs provided through a convenient and simple graphical interface, a custom-built MATLAB control module intelligently switches the system into and out of treatment mode. Using this self-governing microsystem, in situ biofilm treatment based on the principles of the bioelectric effect was demonstrated by exposing two of the channels of the integrated bifurcation device to low doses of antibiotics.
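The threshold-activated feedback described above (implemented as a MATLAB control module in the original work) can be sketched as a simple control loop; the thresholds, hysteresis scheme, and names below are illustrative assumptions, not the microsystem's actual parameters:

```python
def feedback_controller(impedance_readings, growth_threshold, clear_threshold):
    """Switch into treatment mode when the relative impedance change crosses
    the user-set growth threshold, and back out once it recovers below the
    clear threshold (assumed hysteresis scheme)."""
    baseline = impedance_readings[0]
    treating = False
    modes = []
    for z in impedance_readings:
        delta = abs(z - baseline) / baseline     # relative impedance change
        if not treating and delta >= growth_threshold:
            treating = True                      # biofilm detected: apply field
        elif treating and delta <= clear_threshold:
            treating = False                     # biofilm cleared: stop treatment
        modes.append(treating)
    return modes

# Impedance shifts as the biofilm grows, then recovers under treatment
readings = [1000, 990, 950, 900, 870, 920, 970, 995]
modes = feedback_controller(readings, growth_threshold=0.08, clear_threshold=0.02)
```

The hysteresis (separate entry and exit thresholds) prevents the system from rapidly toggling in and out of treatment mode when the signal sits near a single threshold.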
Abstract:
Calcium-activated potassium channels are a large family of potassium channels that are found throughout the central nervous system and in many other cell types. These channels are activated by rises in cytosolic calcium largely in response to calcium influx via voltage-gated calcium channels that open during action potentials. Activation of these potassium channels is involved in the control of a number of physiological processes from the firing properties of neurons to the control of transmitter release. These channels form the target for modulation for a range of neurotransmitters and have been implicated in the pathogenesis of neurological and psychiatric disorders. Here the authors summarize the varieties of calcium-activated potassium channels present in central neurons and their defining molecular and biophysical properties.