8 results for Digital communication models
in DRUM (Digital Repository at the University of Maryland)
Abstract:
The paradigm shift from traditional print literacy to the postmodern fragmentation, nonlinearity, and multimodality of writing for the Internet is realized in Gregory L. Ulmer's electracy theory. Ulmer's open invitation to continually invent the theory has resulted in the proliferation of relays, or weak models, by electracy advocates for understanding and applying the theory. Most relays, however, remain theoretical rather than practical for the writing classroom, and electracy instruction remains rare, potentially hindering the theory's development. In this dissertation, I address the gap in electracy praxis by adapting, developing, and remixing relays into a functional electracy curriculum, with first-year writing students in the Virginia Community College System as the target audience. I review existing electracy relays, pedagogical applications, and assessment practices (Ulmer's and those of electracy advocates) before introducing my own relays, which take the form of modules. My proposed relay modules are designed for adaptability, with the goals of introducing digital natives to the logic of new media and guiding instructors toward possible implementations of electracy. Each module contains a justification, core competencies and learning outcomes, optional readings, an assignment with supplemental exercises, and assessment criteria. My Playlist, Transduction, and (Sim)ulation relays follow sound backward curricular design principles and emphasize the core hallmarks of electracy juxtaposed with those of literacy. This dissertation encourages the instruction of new media within Ulmer's postmodern apparatus, in which student invention via the articulation of fragments from various semiotic modes both stems from and produces new methodologies for, and understandings of, digital communication.
Abstract:
In this work we introduce a new mathematical tool for optimization of routes, topology design, and energy efficiency in wireless sensor networks. We introduce a vector field formulation that models communication in the network, and routing is performed in the direction of this vector field at every location of the network. The magnitude of the vector field at every location represents the density of the data traffic passing through that location. We define the total communication cost in the network as the integral of a quadratic form of the vector field over the network area. With this formulation, we introduce mathematical machinery based on partial differential equations closely analogous to Maxwell's equations in electrostatics. We show that in order to minimize the cost, routes should be found based on the solution of these partial differential equations. In our formulation, the sensors are sources of information, similar to positive charges in electrostatics; the destinations are sinks of information, similar to negative charges; and the network is similar to a non-homogeneous dielectric medium with a variable dielectric constant (or permittivity coefficient). As one application of this vector field model, we offer a scheme for energy-efficient routing. Our routing scheme raises the permittivity coefficient in regions of the network where nodes have high residual energy and lowers it where nodes have little energy left. Our simulations show that this method yields a significant increase in network lifetime compared to the shortest path and weighted shortest path schemes. Our initial focus is on the case where there is only one destination in the network; later we extend our approach to the case of multiple destinations.
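In symbols, the formulation described above might be sketched as follows. The notation here is chosen for illustration only, since the abstract gives no equations: D for the information-flow vector field, rho for net source density, epsilon for the permittivity-like coefficient, A for the network area.

```latex
% Conservation: sensors are sources (+) and destinations sinks (-) of information
\nabla \cdot \mathbf{D}(\mathbf{r}) = \rho(\mathbf{r})

% Total communication cost: a quadratic form of the field over the network area A
C = \int_{A} \frac{\lVert \mathbf{D}(\mathbf{r}) \rVert^{2}}{\epsilon(\mathbf{r})} \, dA

% Energy-aware routing: let the permittivity-like coefficient track residual
% node energy, so that minimizing C steers traffic through energy-rich regions
\epsilon(\mathbf{r}) \propto E_{\mathrm{res}}(\mathbf{r})
```

This mirrors the energy functional of electrostatics, where the field minimizing the quadratic energy subject to the same divergence constraint is exactly the electrostatic solution.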
In the case of multiple destinations, we need to partition the network into several areas known as the regions of attraction of the destinations. Each destination is responsible for collecting all messages generated in its region of attraction. The difficulty of the optimization problem in this case is how to define the regions of attraction and how much communication load to assign to each destination to optimize the performance of the network. We use our vector field model to solve this problem. We define a conservative vector field, which can therefore be written as the gradient of a scalar field (also known as a potential field). We then show that in the optimal assignment of the network's communication load to the destinations, the potential field takes equal values at the locations of all the destinations. Another application of our vector field model is finding the optimal locations of the destinations in the network. We show that the vector field gives the gradient of the cost function with respect to the locations of the destinations. Based on this fact, we suggest an algorithm, applied during the design phase of a network, that relocates the destinations to reduce the communication cost. The performance of our proposed schemes is confirmed by several examples and simulation experiments.

In another part of this work we focus on the notions of responsiveness and conformance of TCP traffic in communication networks. We introduce the notion of responsiveness for TCP aggregates, defined as the degree to which a TCP aggregate reduces its sending rate in response to packet drops. We define metrics that describe the responsiveness of TCP aggregates and suggest two methods for determining their values.
The first method is based on a test in which we intentionally drop a few packets from the aggregate and measure the resulting rate decrease of that aggregate. In this form, the test is not robust to multiple simultaneous tests performed at different routers. We make it robust by using ideas from the CDMA approach to multiple-access channels in communication theory. Based on this approach, we introduce a responsiveness test for aggregates, which we call the CDMA-based Aggregate Perturbation Method (CAPM). We use CAPM to perform congestion control. A distinguishing feature of our congestion control scheme is that it maintains a degree of fairness among different aggregates. In the next step we modify CAPM to estimate the proportion of an aggregate of TCP traffic that does not conform to protocol specifications, and hence may belong to a DDoS attack. Our methods work by intentionally perturbing the aggregate, dropping a very small number of packets from it, and observing the aggregate's response. We offer two methods for conformance testing. In the first, we apply the perturbation tests to SYN packets sent at the start of the TCP 3-way handshake, using the fact that the rate of ACK packets exchanged in the handshake should follow the rate of perturbations. In the second, we apply the perturbation tests to TCP data packets, using the fact that the rate of retransmitted data packets should follow the rate of perturbations. In both methods we use signature-based perturbations, meaning packet drops are performed at a rate given by a function of time. We exploit the analogy between our problem and multiple-access communication to design these signatures. Specifically, we assign orthogonal CDMA-based signatures to different routers in a distributed implementation of our methods.
Because of this orthogonality, performance does not degrade due to cross-interference among simultaneously testing routers. We have shown the efficacy of our methods through mathematical analysis and extensive simulation experiments.
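The orthogonal-signature idea can be illustrated with a toy sketch. Everything below (the signature construction, the numbers, the function names) is invented for illustration and is not the dissertation's actual CAPM implementation: each router perturbs its aggregate with a +/-1 Walsh-Hadamard signature, the rate responses superpose, and each router recovers its own aggregate's responsiveness by correlating the observed time series against its own signature.

```python
# Toy illustration of CDMA-style orthogonal perturbation signatures
# (invented for illustration; not the dissertation's actual CAPM code).

def hadamard(n):
    """Sylvester-style Walsh-Hadamard construction: n (a power of two)
    mutually orthogonal +/-1 signatures, each of length n."""
    H = [[1]]
    while len(H) < n:
        H = [row + row for row in H] + [row + [-x for x in row] for row in H]
    return H

def observe(signatures, responsiveness):
    """Each router perturbs its aggregate with its signature scaled by the
    (unknown) responsiveness; the rate changes superpose into one series."""
    T = len(signatures[0])
    return [sum(r * s[t] for r, s in zip(responsiveness, signatures))
            for t in range(T)]

def recover(signature, observation):
    """Correlating against one signature cancels every other router's
    simultaneous test, thanks to orthogonality."""
    return sum(s * o for s, o in zip(signature, observation)) / len(signature)

H = hadamard(4)                        # four routers testing simultaneously
true_resp = [0.9, 0.1, 0.5, 0.0]       # hypothetical per-aggregate responsiveness
obs = observe(H, true_resp)
est = [recover(sig, obs) for sig in H]  # recovers true_resp (up to rounding)
```

The cancellation is exact in this noise-free toy because the inner product of two distinct Hadamard rows is zero; with measurement noise, the correlation instead averages the interference down.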
Abstract:
In 1937 Lisa Sergio, "The Golden Voice" of fascist broadcasting from Rome, fled Italy for the United States. Though her mother was American, Sergio was classified as an enemy alien once the United States entered World War II. Yet Sergio became a U.S. citizen in 1944 and built a successful career in radio, working first at NBC and then WQXR in New York City in the days when women's voices were not thought to be appropriate for news or "serious" programming. When she was blacklisted as a communist in the early 1950s, Sergio compensated for the loss of radio employment by becoming principally an author and lecturer in Washington, D.C., until her death in 1989. This dissertation, based on her personal papers, is the first study of Sergio's American mass communication career. It points out the personal, political and social obstacles she faced as a woman in her 52-year career as a commentator on varied aspects of world affairs, religion and feminism. This study includes an examination of the FBI investigations of Sergio and the anti-communist campaigns conducted against her. It concludes that Sergio's success as a public communicator was predicated on both her unusual talents and her ability to transform her public image to reflect ideal American values of womanhood in shifting political climates.
Abstract:
Gemstone Team AUDIO (Assessing and Understanding Deaf Individuals' Occupations)
Abstract:
Gemstone Team FLIP (File Lending in Proximity)
Abstract:
Our research aimed to improve timeliness, coordination, and communication during the detection, investigation, and decision-making phases of the response to an aerosolized anthrax attack in the metropolitan Washington, DC, area, with the goal of reducing casualties. We gathered information on current response protocols through an extensive literature review and interviews with relevant officials and experts in order to identify potential problems in the various steps of detection, investigation, and response. Interviews with officials from private- and government-sector agencies allowed us to develop a set of interaction models and a communication network to identify discrepancies and redundancies that would lengthen the delay in initiating a public health response. In addition, we created a computer simulation that models aerosol spread using weather patterns and population density to estimate the infected population within a target region, depending on the virulence and dimensions of the weaponized spores. We developed conceptual models in order to design recommendations for our collaborating contacts and agencies, which could use such policy and analysis interventions to improve the overall response, primarily through changes to emergency protocol functions and suggestions for technological detection and monitoring.
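The kind of dispersion-plus-population estimate described above can be sketched with a textbook Gaussian plume model. The abstract does not specify the team's actual simulation, weather inputs, or dose-response parameters, so everything below is an illustrative stand-in, not their method: a concentration field from a continuous release, and a count of people in grid cells where the estimated concentration exceeds an infection-relevant threshold.

```python
import math

def gaussian_plume_concentration(Q, u, y, sigma_y, sigma_z, z=0.0, H=0.0):
    """Standard Gaussian plume concentration (with ground reflection) for a
    continuous release of rate Q in wind speed u, at crosswind offset y and
    height z, for an effective release height H. In practice sigma_y and
    sigma_z grow with downwind distance and atmospheric stability class."""
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2))
                + math.exp(-(z + H)**2 / (2 * sigma_z**2)))
    return (Q / (2 * math.pi * u * sigma_y * sigma_z)) * lateral * vertical

def estimate_exposed(cells, Q, u, threshold):
    """Sum the population of cells whose estimated concentration exceeds a
    threshold. Each cell: (y_offset, sigma_y, sigma_z, population)."""
    exposed = 0
    for y, sigma_y, sigma_z, pop in cells:
        if gaussian_plume_concentration(Q, u, y, sigma_y, sigma_z) >= threshold:
            exposed += pop
    return exposed
```

A real assessment would couple the dose at each cell to a spore-virulence model rather than a single threshold; the sketch only shows the structure of the calculation.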
Abstract:
Using scientific methods in the humanities is at the forefront of objective literary analysis. However, processing big data is particularly complex when the subject matter is qualitative rather than numerical. Large volumes of text require specialized tools to produce quantifiable data from ideas and sentiments. Our team researched the extent to which tools such as Weka and MALLET can test hypotheses about qualitative information. We examined the claim that literary commentary exists within political environments, using US periodical articles concerning Russian literature in the early twentieth century as a case study. These tools generated useful quantitative data that allowed us to run stepwise binary logistic regressions. These statistical tests allowed for time series experiments using sea change and emergency models of history, as well as classification experiments with regard to author characteristics, social issues, and sentiment expressed. Both types of experiments supported our claim to varying degrees, but more importantly served as a definitive demonstration that digitally enhanced quantitative forms of analysis can apply to qualitative data. Our findings set the foundation for further experiments in the emerging field of digital humanities.
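The pipeline described above (text to quantitative features to binary logistic regression) can be illustrated with a self-contained toy. Weka and MALLET are Java toolkits; this Python sketch is a language-neutral stand-in, and the vocabulary, documents, and labels are invented for illustration, not drawn from the team's corpus.

```python
import math
import re
from collections import Counter

def featurize(texts, vocab):
    """Bag-of-words counts: one row per document, one column per vocab term."""
    rows = []
    for t in texts:
        counts = Counter(re.findall(r"[a-z']+", t.lower()))
        rows.append([counts[w] for w in vocab])
    return rows

def train_logistic(X, y, lr=0.5, epochs=200):
    """Binary logistic regression fit by plain stochastic gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = 1.0 / (1.0 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b)))
            g = p - yi                          # gradient of the log-loss
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def predict(w, b, xi):
    """Probability that a feature row belongs to the positive class."""
    return 1.0 / (1.0 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b)))

# Invented toy data: 1 = politically framed commentary, 0 = aesthetic commentary
vocab = ["revolution", "soviet", "prose", "style"]
texts = ["the soviet revolution dominates this review",
         "lyrical prose and style define the work",
         "revolution and soviet politics frame the critique",
         "the prose style is elegant and restrained"]
labels = [1, 0, 1, 0]
w, b = train_logistic(featurize(texts, vocab), labels)
```

The point of the toy is only that qualitative text, once reduced to counts, supports the same regression machinery as any numeric dataset.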
The tithe: Public research university STEM faculty perspectives on sponsored research indirect costs
Abstract:
This study sought to understand the phenomenon of faculty involvement in indirect cost under-recovery. The focus of the study was on public research university STEM (science, technology, engineering and mathematics) faculty, and their perspectives on, and behavior towards, a higher education fiscal policy. The explanatory scheme was derived from anthropological theory, and incorporated organizational culture, faculty socialization, and political bargaining models in the conceptual framework. This study drew on two key assumptions. The first assumption was that faculty understanding of, and behavior toward, indirect cost recovery represents values, beliefs, and choices drawn from the distinct professional socialization and distinct culture of faculty. The second assumption was that when faculty and institutional administrators are in conflict over indirect cost recovery, the resultant formal administrative decision comes about through political bargaining over critical resources. The research design was a single site, qualitative case study with a focus on learning the meaning of the phenomenon as understood by the informants. In this study the informants were tenured and tenure track research university faculty in the STEM fields who were highly successful at obtaining Federal sponsored research funds, with individual sponsored research portfolios of at least one million dollars. The data consisted of 11 informant interviews, bolstered by documentary evidence. The findings indicated that faculty socialization and organizational culture were the most dominant themes, while political bargaining emerged as significantly less prominent. Public research university STEM faculty are most concerned about the survival of their research programs and the discovery facilitated by their research programs. They resort to conjecture when confronted by the issue of indirect cost recovery. 
The findings suggest that institutional administrators should place less emphasis on compliance and hierarchy when working with expert professionals such as science faculty. A more effective focus might be on communication and clarity in budget processes and organizational decision-making, and on critical administrative support that can relieve faculty administrative burdens. For higher education researchers, the findings suggest the need for more sophisticated models of organizations dependent on expert professionals.