22 results for Peer-to-peer architecture (Computer networks)
in Digital Commons at Florida International University
Abstract:
The purpose of this thesis was to develop an efficient routing protocol that provides mobility support to mobile devices roaming within a network. The routing protocol needs to be compatible with the existing Internet architecture. The protocol proposed here is based on the Mobile IP routing protocol and can solve some of the problems in current Mobile IP implementations, e.g., the ingress filtering problem. By implementing an efficient timeout mechanism and introducing a paging mechanism to the wireless network, the protocol minimizes the number of control messages sent over the network. The implementation of the system primarily involves three components: 1) the mobile devices that need to gain access to the network, 2) the router that provides roaming support to the mobile devices, and 3) the database server that provides basic authentication services on the system. The result is an efficient IP routing protocol that provides seamless mobility to the mobile devices on the network.
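A toy sketch may help illustrate the timeout-and-paging idea described above: active mobiles are tracked per cell, idle mobiles only per paging area, so idle devices generate no control messages on cell changes and are paged on demand. All class names, fields, and timeout values below are illustrative assumptions, not the thesis's implementation.

```python
import time

class MobilityManager:
    """Toy sketch of a paging/timeout scheme for idle mobiles.

    Active mobiles are tracked per cell; idle mobiles only per
    paging area, so they need not signal on every cell change.
    Names and timeout values are illustrative assumptions.
    """
    IDLE_TIMEOUT = 30.0  # seconds without traffic before a mobile goes idle

    def __init__(self):
        self.cell_of = {}    # mobile id -> last known cell (active mobiles only)
        self.area_of = {}    # mobile id -> paging area (active and idle)
        self.last_seen = {}  # mobile id -> timestamp of last activity

    def on_activity(self, mobile, cell, area):
        """Any packet from the mobile refreshes its precise location."""
        self.cell_of[mobile] = cell
        self.area_of[mobile] = area
        self.last_seen[mobile] = time.monotonic()

    def expire_idle(self):
        """Timeout: drop per-cell state for mobiles idle too long."""
        now = time.monotonic()
        for mobile, seen in list(self.last_seen.items()):
            if now - seen > self.IDLE_TIMEOUT:
                self.cell_of.pop(mobile, None)  # keep only the paging area

    def route_to(self, mobile):
        """Return the cells a packet for `mobile` must be delivered to."""
        if mobile in self.cell_of:
            return [self.cell_of[mobile]]        # exact cell known
        return page_cells(self.area_of[mobile])  # page the whole area

def page_cells(area):
    """Hypothetical lookup of all cells in a paging area."""
    return [f"{area}-cell{i}" for i in range(4)]
```

Because an idle mobile is located only to the granularity of a paging area, movement between cells inside that area costs no signaling; the network pays a one-time paging broadcast only when traffic actually arrives.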
Abstract:
Developing analytical models that can accurately describe the behavior of Internet-scale networks is difficult. This is due, in part, to the heterogeneous structure, immense size, and rapidly changing properties of today's networks. The lack of analytical models makes large-scale network simulation an indispensable tool for studying immense networks. However, large-scale network simulation has not been commonly used to study networks of Internet scale. This can be attributed to three factors: 1) current large-scale network simulators are geared toward simulation research and not network research, 2) the memory required to execute an Internet-scale model is exorbitant, and 3) large-scale network models are difficult to validate. This dissertation tackles each of these problems.

First, this work presents a method for automatically enabling real-time interaction, monitoring, and control of large-scale network models. Network researchers need tools that allow them to focus on creating realistic models and conducting experiments, but this should not increase the complexity of developing a large-scale network simulator. This work presents a systematic approach to separating the concerns of running large-scale network models on parallel computers from the user-facing concerns of configuring and interacting with those models.

Second, this work deals with reducing the memory consumption of network models. As network models become larger, so does the amount of memory needed to simulate them. This work presents a comprehensive approach to exploiting structural duplication in network models to dramatically reduce the memory required to execute large-scale network experiments.

Lastly, this work addresses the issue of validating large-scale simulations by integrating real protocols and applications into the simulation. With an emulation extension, a network simulator operating in real time can run together with real-world distributed applications and services. Real-time network simulation thus not only alleviates the burden of developing separate models for applications in simulation but, because real systems are included in the network model, also increases the confidence level of the simulation. This work presents a scalable and flexible framework to integrate real-world applications with real-time simulation.
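The structural-duplication idea in the second part is in the spirit of the classic flyweight pattern: replicated subnetworks share one immutable description, and only per-replica state is allocated separately. The following is a minimal sketch under that assumption, not the dissertation's actual mechanism.

```python
class SubnetTemplate:
    """One shared, immutable description of a replicated subnetwork."""
    def __init__(self, name, router_count, link_bandwidth_bps):
        self.name = name
        self.router_count = router_count
        self.link_bandwidth_bps = link_bandwidth_bps

class SubnetInstance:
    """Per-replica state only; the topology is shared via the template."""
    __slots__ = ("template", "base_address", "queues")

    def __init__(self, template, base_address):
        self.template = template                    # shared, not copied
        self.base_address = base_address            # per-instance addressing
        self.queues = [0] * template.router_count   # mutable simulation state

# One template, a thousand instances: memory grows with per-instance
# state, not with a thousand copies of the full topology description.
campus = SubnetTemplate("campus", router_count=64, link_bandwidth_bps=10**9)
replicas = [SubnetInstance(campus, base_address=i << 16) for i in range(1000)]
```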
Abstract:
The lack of analytical models that can accurately describe large-scale networked systems makes empirical experimentation indispensable for understanding complex behaviors. Research on network testbeds for testing network protocols and distributed services, including physical, emulated, and federated testbeds, has made steady progress. Although the success of these testbeds is undeniable, they fail to provide: 1) scalability, for handling large-scale networks with hundreds or thousands of hosts and routers organized in different scenarios, 2) flexibility, for testing new protocols or applications in diverse settings, and 3) interoperability, for combining simulated and real network entities in experiments. This dissertation tackles these issues along three different dimensions.

First, we present SVEET, a system that enables interoperability between real and simulated hosts. To increase the scalability of the networks under study, SVEET enables time-dilated synchronization between real hosts and the discrete-event simulator. Realistic TCP congestion control algorithms are implemented in the simulator to allow seamless interactions between real and simulated hosts. SVEET is validated via extensive experiments, and its capabilities are assessed through case studies involving real applications.

Second, we present PrimoGENI, a system that allows a distributed discrete-event simulator, running in real time, to interact with real network entities in a federated environment. PrimoGENI greatly enhances the flexibility of network experiments, through which a great variety of network conditions can be reproduced to examine what-if questions. Furthermore, PrimoGENI performs resource management functions, on behalf of the user, for instantiating network experiments on shared infrastructures.

Finally, to further increase the scalability of network testbeds to handle large-scale, high-capacity networks, we present a novel symbiotic simulation approach. We present SymbioSim, a testbed for large-scale network experimentation in which a high-performance simulation system closely cooperates with an emulation system in a mutually beneficial way. On the one hand, the simulation system benefits from incorporating traffic metadata from real applications in the emulation system to reproduce realistic traffic conditions. On the other hand, the emulation system benefits from receiving continuous updates from the simulation system to calibrate the traffic between real applications. Specific techniques that support the symbiotic approach include: 1) a model downscaling scheme that can significantly reduce the complexity of the large-scale simulation model, resulting in an efficient emulation system for modulating the high-capacity network traffic between real applications; 2) a queuing network model for the downscaled emulation system to accurately represent the network effects of the simulated traffic; and 3) techniques for reducing the synchronization overhead between the simulation and emulation systems.
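Time-dilated synchronization, as described for SVEET, can be pictured as a fixed mapping between wall-clock time and the experiment's virtual time: slowing the hosts' perceived clocks by a dilation factor lets a simulator that cannot keep up with real time still appear real-time to them. Below is a small illustrative sketch; the class, method names, and factor are assumptions, not SVEET's actual interface.

```python
class DilatedClock:
    """Map real (wall-clock) time to virtual time slowed by `tdf`.

    With a time-dilation factor tdf = 10, ten wall-clock seconds
    appear to the experiment as one virtual second, so a simulator
    running 10x slower than real time still looks 'real-time' to
    the dilated hosts. Purely illustrative of the general idea.
    """
    def __init__(self, tdf, real_epoch):
        self.tdf = tdf              # time-dilation factor (assumed value)
        self.real_epoch = real_epoch

    def virtual_now(self, real_now):
        """Virtual time corresponding to wall-clock time `real_now`."""
        return (real_now - self.real_epoch) / self.tdf

    def real_deadline(self, virtual_t):
        """Wall-clock time at which virtual time `virtual_t` is reached."""
        return self.real_epoch + virtual_t * self.tdf

clock = DilatedClock(tdf=10.0, real_epoch=0.0)
assert clock.virtual_now(20.0) == 2.0  # 20 real seconds -> 2 virtual seconds
```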
Abstract:
Today, databases have become an integral part of information systems. Over the past two decades, different database systems have been developed independently and used in different application domains. Today's interconnected networks and advanced applications, such as data warehousing, data mining and knowledge discovery, and intelligent data access to information on the Web, have created a need for integrated access to such heterogeneous, autonomous, distributed database systems. Heterogeneous/multidatabase research has focused on this issue, resulting in many different approaches. However, no single, generally accepted methodology has emerged in academia or industry that provides ubiquitous intelligent data access to heterogeneous, autonomous, distributed information sources.

This thesis describes a heterogeneous database system being developed at the High-performance Database Research Center (HPDRC). A major impediment to ubiquitous deployment of multidatabase technology is the difficulty of resolving semantic heterogeneity, that is, identifying related information sources for integration and querying purposes. Our approach considers the semantics of the meta-data constructs in resolving this issue. The major contributions of the thesis include: (i) a scalable, easy-to-implement architecture for developing a heterogeneous multidatabase system, utilizing the Semantic Binary Object-oriented Data Model (Sem-ODM) and the Semantic SQL query language to capture the semantics of the data sources being integrated and to provide an easy-to-use query facility; (ii) a methodology for semantic heterogeneity resolution that investigates the extents of the meta-data constructs of component schemas; this methodology is shown to be correct, complete, and unambiguous; (iii) a semi-automated technique for identifying semantic relations, the basis of the semantic knowledge used for integration and querying, using shared ontologies for context mediation; (iv) resolutions for schematic conflicts and a language for defining global views from a set of component Sem-ODM schemas; (v) the design of a knowledge base for storing and manipulating meta-data and knowledge acquired during the integration process, which acts as the interface between the integration and query processing modules; (vi) techniques for Semantic SQL query processing and optimization based on semantic knowledge in a heterogeneous database environment; and (vii) a framework for intelligent computing and communication on the Internet applying the concepts of this work.
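The context-mediation step in contribution (iii) can be pictured as mapping the meta-data constructs of each component schema onto concepts in a shared ontology and declaring two constructs semantically related when their concepts coincide. The sketch below is a purely hypothetical illustration of that general idea; the attribute names and ontology are invented, and this is not Sem-ODM, Semantic SQL, or the thesis's actual technique.

```python
# Hypothetical shared ontology: attribute name -> concept it denotes.
SHARED_ONTOLOGY = {
    "emp_salary": "Compensation",
    "annual_pay": "Compensation",
    "dept_name":  "Department",
    "division":   "Department",
}

def semantic_relations(schema_a, schema_b):
    """Pair up attributes of two component schemas that denote
    the same shared-ontology concept (illustrative only)."""
    concept_a = {attr: SHARED_ONTOLOGY.get(attr) for attr in schema_a}
    return [
        (a, b)
        for a, ca in concept_a.items() if ca is not None
        for b in schema_b
        if SHARED_ONTOLOGY.get(b) == ca
    ]

hr = ["emp_salary", "dept_name"]
payroll = ["annual_pay", "division", "tax_code"]
print(semantic_relations(hr, payroll))
# [('emp_salary', 'annual_pay'), ('dept_name', 'division')]
```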
Abstract:
Minimum Student Performance Standards in Computer Literacy and Science were passed by the Florida Legislature through the Educational Reform Act of 1983. This act mandated that all Florida high school graduates receive training in computer literacy. Schools and school systems were charged with determining the best methods to deliver this instruction to their students. The scope of this study is to evaluate one school's response to the state of Florida's computer literacy mandate. The study was conducted at Miami Palmetto Senior High School, located in Dade County, Florida. The administration of Miami Palmetto Senior High School chose to develop and implement a new program to comply with the state mandate: integrating computer literacy into the existing biology curriculum. The study evaluated the curriculum to determine whether computer literacy could be integrated successfully while meeting both the biology and the computer literacy objectives. The findings of this study showed no significant differences between the biology scores of students taking the integrated curriculum and those taking a traditional biology curriculum. Students in the integrated curriculum not only met the biology objectives as well as those in the traditional curriculum did, they also successfully completed the intended objectives for computer literacy. Two sets of objectives were completed in the integrated classes in the same amount of time used to complete one set of objectives in the traditional biology classes. The integrated curriculum was therefore the more efficient means of meeting the intended objectives of both biology and computer literacy.
Abstract:
The primary purpose of this investigation is to study the motives of community college faculty who decide not to use computers in teaching. Although many of the environmental blocks that would otherwise inhibit the use of computers have been eliminated at many institutions, many faculty do not use a computer beyond its word-processing function. For the purposes of the study, non-adoption of computers in teaching is defined as not using computers for more than word processing.

The issues in the literature focus on resistance and assume a pro-innovation and pro-adoption bias. Previous research on the question consists primarily of surveys with narrowly focused assumptions. This qualitative research directly asks the participants about their feelings, beliefs, attitudes, experiences, and behaviors with regard to computers in teaching. Through the interview process, a number of other correlated issues emerged.

The investigation was conducted at Miami-Dade Community College, a large urban multicampus institution in Miami-Dade, Florida, through a series of in-depth phenomenological interviews. There were nine interviews: eight within the profile (two of these were pilots) and one with an extreme opposite of the profile. Each participant was interviewed three times for about 45 minutes.

The results indicate that the computer conflicts with the participants' values regarding their teaching and with their beliefs about the nature of knowledge, learning, and the relationship they wish to maintain with students. Computers require significant changes in those values, beliefs, and consequent behaviors, changes the participants are not willing to make without overwhelming evidence that they are worth the sacrifice. For the participants, this worth is definable only as it positively improves learning, and even for the experts that evidence is not there. Unlike the innovator, the high-end computer user, these participants are not willing to adopt the computer on faith.
Abstract:
Many culturally and linguistically diverse (CLD) students with specific learning disabilities (SLD) struggle with the writing process. In particular, they have difficulty developing and expanding ideas, organizing and elaborating sentences, and revising and editing their compositions (Graham, Harris, & Larsen, 2001; Myles, 2002). Computer graphic organizers offer a possible solution to assist them in their writing. This study investigated the effects of a computer graphic organizer on the persuasive writing compositions of Hispanic middle school students with SLD. A multiple baseline design across subjects was used to examine its effects on six dependent variables: number of arguments and supporting details, number and percentage of transferred arguments and supporting details, planning time, writing fluency, syntactical maturity (measured in T-units, the shortest grammatical sentences without fragments), and overall organization. Data were collected and analyzed throughout baseline and intervention. Participants were taught persuasive writing and the writing process prior to baseline. During baseline, participants were given a prompt and asked to use paper and pencil to plan their compositions; a computer was used for typing and editing. The intervention required participants to use a computer graphic organizer for planning and then a computer for typing and editing. The planning sheets and written compositions were printed and analyzed daily, along with the time each participant spent on planning. The use of computer graphic organizers had a positive effect on planning and on the persuasive writing compositions. Increases were noted in the number of supporting details planned, the percentage of supporting details transferred, planning time, writing fluency, syntactical maturity in number of T-units, and the overall organization of the compositions. Minimal to negligible increases were noted in the mean number of arguments planned and written, varying effects were noted in the percentage of transferred arguments, and there was a decrease in mean T-unit length. This study extends the limited literature on the effects of computer graphic organizers as a prewriting strategy for Hispanic students with SLD. To fully gauge the potential of this intervention, future research should investigate the use of different features of computer graphic organizer programs, their effects with other writing genres, and their effects with different populations.
Abstract:
Writing is an academic skill critical to students in today's schools, as it serves as a predominant means of demonstrating knowledge during the school years (Graham, 2008). However, for many students with Specific Learning Disabilities (SLD), learning to write is a challenging, complex process (Lane, Graham, Harris, & Weisenbach, 2006). Students with SLD have substantial writing challenges related to the nature of their disability (Mayes & Calhoun, 2005).

This study investigated the effects of computer graphic organizer software on the narrative writing compositions of four fourth- and fifth-grade elementary-level boys with SLD. A multiple baseline design across subjects was used to explore the effects of the software on four dependent variables: total number of words, total planning time, number of common story elements, and overall organization.

Prior to baseline, participants were taught the fundamentals of narrative writing. Throughout baseline and intervention, participants were read a narrative writing prompt and were allowed up to 10 minutes to plan their writing, followed by 15 minutes for writing and 5 minutes for editing. During baseline, all planning was done using paper and pencil. During intervention, planning was done on the computer using a graphic organizer developed with the software program Kidspiration 3.0 (2011). In both baseline and intervention, all compositions were written, and all editing was done, using paper and pencil.

The results of this study indicated that, to varying degrees, computer graphic organizers had a positive effect on the narrative writing abilities of elementary-aged students with SLD. Participants wrote more words (from 54.74 to 96.60 more), planned for longer periods of time (from 4.50 to 9.50 more minutes), and included more story elements in their compositions (from 2.00 to 5.10 more, out of a possible 6). There were nominal to no improvements in overall organization across the four participants.

The results suggest that teachers of students with SLD should consider using computer graphic organizers in their narrative writing instruction, perhaps in conjunction with remedial writing strategies. Future investigations could include other types of writing genres, other stages of writing, participants with varied demographics, and use combined with remedial writing instruction.
Abstract:
Compact thermal-fluid systems are found in many industries, from aerospace to microelectronics, where a combination of small size, light weight, and a high ratio of heat transfer surface area to volume is necessary. These devices are typically designed with fluid networks consisting of many small parallel channels that pack a large amount of heat transfer surface area into a very small volume, but do so at the cost of increased pumping power requirements.

To offset this cost, the use of a branching fluid network for the distribution of coolant within a heat sink is investigated. The goal of the branch design technique is to minimize the entropy generation associated with the combination of viscous dissipation and convective heat transfer experienced by the coolant in the heat sink, while maintaining a compact, high ratio of heat transfer surface area to volume.

The derivation of Murray's Law, originally developed to predict the geometry of physiological transport systems, is extended to heat sink designs that minimize entropy generation. Two heat sinks at different scales were built and tested experimentally and analytically: the first uses this new derivation of Murray's Law; the second uses a combination of Murray's Law and Constructal Theory. The results of the experiments were used to verify the analytical and numerical models, which were then used to compare the performance of the heat sinks with other compact high-performance heat sink designs. The results showed that the techniques used to design branching fluid networks significantly improve the performance of active heat sinks. The design experience gained was then used to develop a set of geometric relations that optimize the heat-transfer-to-pumping-power ratio of a single cooling channel element. Elements can be connected using a set of derived geometric guidelines governing branch diameters and angles, and the methodology can be used to design branching fluid networks that fit any geometry.
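For reference, Murray's Law in its classical form relates the channel radii at a branch point of a transport network that minimizes the combined cost of viscous dissipation and volume maintenance; the thesis extends this style of derivation to entropy generation in heat sinks.

```latex
% Murray's law at a branch point with parent radius r_0
% and daughter radii r_1, ..., r_n:
r_0^{3} = \sum_{i=1}^{n} r_i^{3}
% For a symmetric bifurcation (n = 2, r_1 = r_2) this gives
r_1 = 2^{-1/3}\, r_0 \approx 0.794\, r_0
```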
Abstract:
Computer networks produce tremendous amounts of event-based data that can be collected and managed to support a growing number of new classes of pervasive applications, such as network monitoring and crisis management. Although the problem of distributed event-based management has been addressed in non-pervasive settings such as the Internet, the domain of pervasive networks has its own characteristics that make those results inapplicable. Many of these applications are based on time-series data in the form of time-ordered series of events. Such applications must also handle large volumes of unexpected events, often modified on the fly and containing conflicting information, and must deal with rapidly changing contexts while producing results with low latency. Correlating events across contextual dimensions holds the key to expanding the capabilities and improving the performance of these applications, and this dissertation addresses that critical challenge. It establishes an effective scheme for complex-event semantic correlation. The scheme examines epistemic uncertainty in computer networks by fusing event synchronization concepts with belief theory. Because of the distributed nature of event detection, time delays are considered: events are no longer instantaneous but have a duration associated with them. Existing algorithms for synchronizing time are split into two classes, one of which is asserted to provide a faster means of converging time and hence to be better suited for pervasive network management. Besides the temporal dimension, the scheme considers imprecision and uncertainty when an event is detected. A belief value is therefore associated with the semantics and the detection of composite events; this belief value is generated by a consensus among participating entities in the network. The scheme taps into the in-network processing capabilities of pervasive computer networks and can withstand missing or conflicting information gathered from multiple participating entities. This dissertation thus advances knowledge in the field of network management by facilitating the full utilization of the characteristics offered by pervasive, distributed, and wireless technologies in contemporary and future computer networks.
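The consensus-based belief fusion described above is in the spirit of Dempster-Shafer evidence theory, in which mass functions from multiple sources are combined into a joint belief. As a general illustration of that technique (not the dissertation's specific algorithm), the sketch below combines two entities' reports with Dempster's rule of combination, renormalizing away conflict:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    m1, m2 map focal sets (frozensets of hypotheses) to masses
    summing to 1. Mass landing on empty intersections is conflict
    and is renormalized away.
    """
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are irreconcilable")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Two network entities report belief about an event E versus not-E;
# mass on the whole frame represents ignorance.
E, NOT_E = frozenset({"E"}), frozenset({"~E"})
EITHER = E | NOT_E
m1 = {E: 0.7, EITHER: 0.3}
m2 = {E: 0.6, NOT_E: 0.2, EITHER: 0.2}
print(dempster_combine(m1, m2))  # consensus belief, conflict renormalized
```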
Abstract:
Career Academy instructors' technical literacy is vital to the academic success of students. This nonexperimental ex post facto study examined the relationships between the level of technical literacy of instructors in career academies and student academic performance. It was also undertaken to explore the relationship between the pedagogical training of instructors and the academic performance of students.

Out of a heterogeneous population of 564 teachers in six targeted schools, 136 teachers (26.0%) responded to an online survey designed to gather demographic and teaching experience data. Each demographic item was linked by the researchers to teachers' technology use in the classroom. Student achievement was measured by student learning gains as assessed by the reading section of the FCAT from the previous to the present school year.

Linear and hierarchical regressions were conducted to examine the research questions. To check for teacher gender and teacher race/ethnicity differences on each research variable, a series of one-way ANOVAs was conducted; the results showed no statistically significant group differences in any of the research variables by teacher gender or teacher race/ethnicity. Greater student learning gains were associated with greater teacher technical expertise in integrating computers and technology into the classroom, even after controlling for teacher attitude toward computers. Neither teacher attitude toward technology integration nor years of experience integrating computers into the curriculum significantly predicted student learning gains in the regression models.

Implications for HRD theory, research, and practice suggest that identifying teachers' levels of technical literacy may help improve student academic performance by informing professional development strategies and new parameters for defining highly qualified instructors with 21st-century skills. District professional development programs can benefit by increasing their offerings to include more computer and information communication technology courses. Teacher preparation programs can benefit by including technical literacy in their curricula. State certification requirements could be expanded to include formal surveys assessing teacher use of technology.
Abstract:
This presentation will show how a grassroots initiative has grown into the Florida International University (FIU) Libraries becoming an instrumental part of online learning. It will describe some of the marketing and outreach efforts that have been successful and share ideas on how to build alliances and networks with online faculty and students. Along with outreach efforts, the presentation will demonstrate some of the tools successfully used to meet the needs of online students, including becoming embedded in courses, building course- and program-specific LibGuides, using Adobe Connect to reach students, creating simple YouTube videos, and creating more professional videos with FIU Online. The presentation will conclude with tips on how to keep the workload manageable when distance-learning programs are growing while library budgets and resources are shrinking.
Abstract:
This work consists of the design and implementation of a complete monitored security system. Two computers make up the basic system: one is the transmitter and the other is the receiver, and the two are interconnected by modems. Depending on the status of the input sensors (magnetic contacts, motion detectors, and others), the transmitter detects an alarm condition and sends a detailed report of the event via modem to the receiver computer.
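The transmitter's core behavior can be sketched as a polling loop: read each input sensor, and on a transition into the alarm state, send a time-stamped report over the modem link. The sensor names, report format, and hardware stubs below are assumptions for illustration, not the system's actual design.

```python
import time

SENSORS = {"front_door": "magnetic", "hallway": "motion"}  # assumed inputs

def read_sensor(name):
    """Stub for the real hardware read; returns True when tripped."""
    return False  # replace with actual digital-input polling

def send_report(line):
    """Stub for the modem link to the receiver computer."""
    print(line)  # e.g. write to a serial port such as /dev/ttyS0

def monitor(poll_interval=0.5):
    """Poll all sensors and report each new alarm condition once."""
    previous = {name: False for name in SENSORS}
    while True:
        for name, kind in SENSORS.items():
            tripped = read_sensor(name)
            if tripped and not previous[name]:  # rising edge = new alarm
                stamp = time.strftime("%Y-%m-%d %H:%M:%S")
                send_report(f"ALARM {stamp} sensor={name} type={kind}")
            previous[name] = tripped
        time.sleep(poll_interval)
```

Edge detection (reporting only on the transition into the alarm state) keeps the modem link from being flooded with duplicate reports while a sensor remains tripped.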
Abstract:
Effective interaction with personal computers is a basic requirement for many of the functions performed in our daily lives. With the rapid emergence of the Internet and the World Wide Web, computers have become one of the premier means of communication in our society. Unfortunately, these advances have not become equally accessible to physically handicapped individuals. In reality, a significant number of individuals with severe motor disabilities, due to a variety of causes such as Spinal Cord Injury (SCI) and Amyotrophic Lateral Sclerosis (ALS), may not be able to use the computer mouse as a vital input device for computer interaction. The purpose of this research was to further develop and improve an existing alternative input device for computer cursor control for use by individuals with severe motor disabilities. This thesis describes the development of, and the underlying principle for, a practical hands-off human-computer interface based on Electromyogram (EMG) signals and Eye Gaze Tracking (EGT) technology, compatible with the Microsoft Windows operating system (OS). Results of the software developed in this thesis show a significant improvement in the performance and usability of the EMG/EGT cursor control HCI.
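One common way an EMG channel is mapped to a discrete mouse click is by thresholding the smoothed signal amplitude with hysteresis. The sketch below illustrates that general idea only; the threshold and window values are assumptions, not the thesis's parameters or its actual EMG/EGT pipeline.

```python
from collections import deque

class EmgClickDetector:
    """Emit a 'click' when the moving-average EMG amplitude crosses
    a threshold; hysteresis prevents chatter. Illustrative values."""
    def __init__(self, threshold=0.4, window=32):
        self.threshold = threshold
        self.samples = deque(maxlen=window)  # sliding amplitude window
        self.active = False

    def feed(self, sample):
        """Feed one rectified EMG sample in [0, 1]; True on a new click."""
        self.samples.append(abs(sample))
        level = sum(self.samples) / len(self.samples)
        if level > self.threshold and not self.active:
            self.active = True
            return True  # rising edge: issue one mouse click
        if level < self.threshold * 0.8:  # hysteresis band
            self.active = False
        return False

detector = EmgClickDetector()
clicks = [detector.feed(s) for s in [0.1] * 10 + [0.9] * 10]
assert any(clicks)  # the sustained burst registers exactly one click
```

In a combined EMG/EGT interface of this kind, the eye tracker typically supplies the cursor position while an EMG event such as this one supplies the selection action.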