24 results for Effective teaching -- Computer network resources
in Digital Commons at Florida International University
Abstract:
Computer networks produce tremendous amounts of event-based data that can be collected and managed to support an increasing number of new classes of pervasive applications. Examples of such applications are network monitoring and crisis management. Although the problem of distributed event-based management has been addressed in non-pervasive settings such as the Internet, the domain of pervasive networks has its own characteristics that make these results inapplicable. Many of these applications are based on time-series data that take the form of time-ordered series of events. Such applications must also handle large volumes of unexpected events, which are often modified on the fly and may contain conflicting information, while coping with rapidly changing contexts and producing results with low latency. Correlating events across contextual dimensions holds the key to expanding the capabilities and improving the performance of these applications. This dissertation addresses this critical challenge. It establishes an effective scheme for complex-event semantic correlation. The scheme examines epistemic uncertainty in computer networks by fusing event synchronization concepts with belief theory. Because of the distributed nature of event detection, time delays are considered: events are no longer instantaneous but have durations associated with them. Existing algorithms for synchronizing time are split into two classes, one of which is asserted to provide a faster means of converging time and is hence better suited for pervasive network management. Besides the temporal dimension, the scheme considers imprecision and uncertainty when an event is detected. A belief value is therefore associated with the semantics and the detection of composite events. This belief value is generated by a consensus among participating entities in a computer network. The scheme taps into the in-network processing capabilities of pervasive computer networks and can withstand missing or conflicting information gathered from multiple participating entities. Thus, this dissertation advances knowledge in the field of network management by facilitating the full utilization of the characteristics offered by pervasive, distributed, and wireless technologies in contemporary and future computer networks.
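The abstract does not spell out the dissertation's correlation scheme, but its belief-theoretic ingredient can be illustrated with a minimal sketch of Dempster's rule of combination, which fuses the (possibly conflicting) belief masses that two network entities assign to a composite event. The event names and mass values below are hypothetical and serve only to show the mechanics.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset hypotheses to mass)
    using Dempster's rule of combination."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass assigned to contradictory evidence
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    # Normalize by the non-conflicting mass.
    return {h: w / (1.0 - conflict) for h, w in combined.items()}

# Hypothetical example: two nodes report beliefs about whether a detected
# composite event is a link failure or benign congestion.
FAIL, CONG = frozenset({"failure"}), frozenset({"congestion"})
EITHER = FAIL | CONG  # ignorance: could be either
node_a = {FAIL: 0.6, EITHER: 0.4}
node_b = {FAIL: 0.5, CONG: 0.2, EITHER: 0.3}
print(dempster_combine(node_a, node_b))
```

The normalized output gives the consensus belief in each hypothesis after discounting the conflicting evidence, which is the general flavor of consensus-based belief fusion described in the abstract.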
Abstract:
Network simulation is an indispensable tool for studying Internet-scale networks because of their heterogeneous structure, immense size, and changing properties. It is crucial for network simulators to generate representative traffic, which is necessary for effectively evaluating next-generation network protocols and applications. In network simulation, we can distinguish between foreground traffic, which is generated by the target applications the researchers intend to study and therefore must be simulated with high fidelity, and background traffic, which represents the network traffic generated by other applications and does not require the same accuracy. Background traffic nevertheless has a significant impact on foreground traffic, since it competes with the foreground traffic for network resources and can therefore drastically affect the behavior of the applications that produce the foreground traffic. This dissertation aims to provide a solution for meaningfully generating background traffic in three respects. First is realism. Realistic traffic characterization plays an important role in determining the correct outcome of simulation studies. This work starts by enhancing an existing fluid background traffic model, removing its two unrealistic assumptions. The improved model correctly reflects the network conditions in the reverse direction of the data traffic and reproduces the traffic burstiness observed in measurements. Second is scalability. The trade-off between accuracy and scalability is a constant theme in background traffic modeling. This work presents a fast rate-based TCP (RTCP) traffic model, which uses analytical models to represent TCP congestion control behavior. This model outperforms other existing traffic models in that it correctly captures the overall TCP behavior while achieving a speedup of more than two orders of magnitude over the corresponding packet-oriented simulation. Third is network-wide traffic generation. Regardless of how detailed or scalable the models are, existing approaches mainly focus on generating traffic on a single link and cannot easily be extended to studies of more complicated network scenarios. This work presents a cluster-based spatio-temporal background traffic generation model that considers spatial and temporal traffic characteristics as well as their correlations. The resulting model can be used effectively in the evaluation work of network studies.
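The RTCP model's equations are not reproduced in the abstract. As a rough illustration of what "rate-based" means here, the sketch below derives a TCP source's send rate analytically from loss probability and round-trip time using the well-known simplified square-root throughput relation, instead of simulating individual packets; the parameter values are hypothetical and this is not the dissertation's actual model.

```python
import math

def tcp_rate_bps(mss_bytes, rtt_s, loss_prob):
    """Approximate steady-state TCP throughput (bits/s) from the simplified
    square-root loss formula: rate ~ MSS / (RTT * sqrt(2p/3))."""
    if loss_prob <= 0:
        raise ValueError("loss probability must be positive for this formula")
    return 8 * mss_bytes / (rtt_s * math.sqrt(2 * loss_prob / 3))

# Hypothetical background flow: 1460-byte segments, 80 ms RTT, 1% loss.
print(f"{tcp_rate_bps(1460, 0.080, 0.01):.0f} bps")
```

Driving a fluid traffic source from such a closed-form rate is what makes rate-based models orders of magnitude cheaper than packet-oriented simulation.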
Abstract:
With the growing commercial importance of the Internet and the development of new real-time, connection-oriented services like IP telephony and electronic commerce, resilience is becoming a key issue in the design of IP-based networks. Two emerging technologies which can accomplish the task of efficient information transfer are Multiprotocol Label Switching (MPLS) and Differentiated Services. A main benefit of MPLS is the ability to introduce traffic-engineering concepts due to its connection-oriented characteristic. With MPLS it is possible to assign different paths for packets through the network. Differentiated Services divides traffic into different classes and treats them differently, especially when there is a shortage of network resources. In this thesis, a framework was proposed to integrate these two technologies, and its performance in providing load balancing and improving QoS was evaluated. Simulation and analysis of this framework demonstrated that the combination of MPLS and Differentiated Services is a powerful tool for QoS provisioning in IP networks.
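The thesis framework itself is not detailed in the abstract. Purely as an illustration of how the two technologies can be combined, the sketch below maps DiffServ classes onto separate MPLS label-switched paths so that high-priority traffic is steered onto a lightly loaded path; the class names, labels, paths, and loads are all hypothetical.

```python
# Hypothetical mapping of DiffServ classes to MPLS label-switched paths (LSPs).
LSPS = {
    "LSP-1": {"label": 101, "load": 0.35},  # lightly loaded path
    "LSP-2": {"label": 102, "load": 0.80},  # congested path
}

CLASS_TO_LSP = {
    "EF":   "LSP-1",  # expedited forwarding (e.g., IP telephony)
    "AF21": "LSP-1",  # assured forwarding, higher-priority data
    "BE":   "LSP-2",  # best effort takes the busier path
}

def assign_label(dscp_class: str) -> int:
    """Return the MPLS label onto which a packet of the given class is pushed."""
    lsp = CLASS_TO_LSP.get(dscp_class, "LSP-2")  # default to the best-effort path
    return LSPS[lsp]["label"]

print(assign_label("EF"), assign_label("BE"))
```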
Abstract:
The purpose of this study was to compare the characteristics of effective clinical and theory instructors as perceived by LPN/RN versus generic students in an associate degree nursing program. Data were collected from 508 students during the 1996-97 academic year at three NLN-accredited associate degree nursing programs. The researcher-developed instrument consisted of three parts: (a) Whitehead Characteristics of Effective Clinical Instructor Rating Scale, (b) Whitehead Characteristics of Effective Theory Instructor Rating Scale, and (c) Demographic Data Sheet. The items were listed under five major categories identified in the review of the literature: (a) interpersonal relationships, (b) personality traits, (c) teaching practices, (d) knowledge and experience, and (e) evaluation procedures. The instrument was administered to LPN/RN students in their first semester and to generic students in the third semester of an associate degree nursing program. Data were analyzed using a one-factor multivariate analysis of variance (MANOVA). Further t tests were carried out to explore possible differences by type of student and by group. Crosstabulations of the demographic data were analyzed. No significant differences were found between the LPN/RN and generic students in their perceptions of either effective theory or effective clinical instructor characteristics. There were significant differences between groups on several of the individual items. There was no significant interaction between group and ethnicity or group and age on the five major categories for either of the two instruments. There was a significant main effect of ethnicity on several of the individual items. The differences between the means and standard deviations on both instruments were small, suggesting that all of the characteristics listed for effective theory and clinical instructors were important to both groups of students. Effective teaching behaviors, as indicated on the survey instruments, should be taught to students in graduate teacher education programs. These behaviors should also be discussed by faculty coordinators supervising adjunct faculty. Nursing educators in associate degree nursing programs should understand theories of adult learning and implement instructional strategies to enhance minority student success.
Abstract:
The effect of teaching method in physical education is an important issue and has been a concern of expert teachers. Teachers are expected to create a model of teaching in their field; it is therefore reasonable to ask what effect an alternative teaching method has on student performance in physical education. This study explored whether a teaching method built on advance planning, enthusiastic teaching behaviors and beliefs, deliberate instructional strategies, and evaluation, together termed a systematic teaching approach, would provide in a physical education activity an effective environment for learning that supports student achievement in the psychomotor, cognitive, and affective domains. This study also investigated whether there was a difference in performance between students who were taught with the systematic teaching approach and students who were taught with the traditional teaching model. Information was collected using two performance skill tests, a written test, and a questionnaire. The 68 participants were randomly assigned to either an experimental group or a control group, and two teachers were each assigned to one of the groups. The teaching experiment took place at Tamsui Oxford University College in Taiwan and lasted eight weeks. Research questions were analyzed using the t test. Results indicated a significant difference in students' performance between the experimental group and the control group on both the skill tests and the paper test. Analysis of student attitudes toward their teacher and their course on the questionnaire also indicated a significant difference between the experimental group and the control group. The findings of this study imply that students who were taught with the systematic teaching style performed significantly better on these measures than students who were taught with the traditional model. This finding supports the contention that effective teaching in physical education is related to advance planning, high enthusiasm, instructional strategy, and evaluation, and that all physical education teachers should incorporate these planning elements in the development of their teaching strategies.
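The abstract states that the research questions were analyzed with t tests comparing the experimental and control groups. A comparison of that form could be run as in the minimal sketch below; the scores are hypothetical placeholders, not the study's data.

```python
from scipy import stats

# Hypothetical skill-test scores for the two groups.
experimental = [78, 85, 90, 82, 88, 91, 84, 79]
control      = [70, 75, 72, 68, 74, 77, 71, 69]

# Independent-samples t test (two-sided) comparing group means.
t_stat, p_value = stats.ttest_ind(experimental, control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```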
Abstract:
Diet and physical activity patterns have been implicated as major factors in the increasing prevalence of childhood and adolescent obesity. It is estimated that between 16 and 33 percent of children and adolescents in the United States are overweight (CDC, 2000). Moreover, the CDC estimates that less than 50% of adolescents are physically active on a regular basis (CDC, 2003). Interventions must be focused on modifying these behaviors. Facilitating the understanding of proper nutrition and the need for physical activity among adolescents is the first step in preventing overweight and obesity and delaying the development of chronic diseases later in life (Dwyer, 2000). The purpose of this study was to compare the outcomes of students receiving one of two forms of education (both emphasizing diet and physical activity), to determine whether a computer-based intervention (CBI) program using an interactive, animated CD-ROM would elicit a greater behavior change than a traditional didactic intervention (TDI) program. A convenience sample of 254 high school students aged 14-19 participated in the 6-month program. A pre-test post-test design was used, with follow-up measures taken at three months post-intervention. No change was noted in total fat, saturated fat, fruit/vegetable, or fiber intake for any of the groups. There was also no change in perceived self-efficacy or perceived social support. Results did, however, indicate an increase in nutrition knowledge for both intervention groups (p < 0.001). In addition, the CBI group demonstrated more positive and sustained behavior changes throughout the course of the study. These changes included a decrease in BMI (p(pre/post) < 0.001, p(post/follow-up) < 0.001), number of meals skipped (p(pre/post) < 0.001), and soda consumption (p(pre/post) = 0.003, p(post/follow-up) = 0.03), and an increase in nutrition knowledge (p(pre/post) < 0.001, p(pre/follow-up) < 0.001), physical activity (p(pre/post) < 0.05, p(pre/follow-up) < 0.01), frequency of label reading (p(pre/follow-up) < 0.01), and dairy consumption (p(pre/post) = 0.03). The TDI group did show positive gains in some areas post-intervention; however, a return to baseline behavior was shown at follow-up. The findings of this study suggest that, compared to traditional didactic teaching, computer-based nutrition and health education has greater potential to elicit changes in knowledge and behavior as well as to promote maintenance of the behavior change over time.
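The abstract reports only the pre/post and post/follow-up p-values, not the exact tests used. One plausible form for such within-group comparisons over time is a paired t test, sketched below with hypothetical BMI values; this is an illustration, not the study's analysis.

```python
from scipy import stats

# Hypothetical BMI measurements for the same participants at two time points.
bmi_pre  = [27.1, 25.4, 30.2, 28.8, 26.0, 29.5, 24.9, 27.7]
bmi_post = [26.5, 25.1, 29.4, 28.0, 25.8, 28.7, 24.6, 27.0]

# Paired (repeated-measures) t test: pre-intervention versus post-intervention.
t_stat, p_value = stats.ttest_rel(bmi_pre, bmi_post)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```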
Abstract:
The focus of this study was to explain the extent to which theoretically effective teaching strategies taught in a course on generic instructional strategies are implemented by teachers in their actual teaching practice. A multivariate causal-comparative (ex post facto) design was used to answer the research question. A teacher observation protocol, the General Instructional Strategies Analysis (GISA), was constructed and used to assess the utilization of instructional strategies in the classroom. The data for this study also included open-ended field notes taken during observations. A multivariate analysis of variance (MANOVA) was used to compare the teaching strategies (set, effective explanation, hands-on activity, cooperative learning activity, higher-order questioning, closure) of the group who had taken a general instructional strategies course (N=36) and the group who had not (N=36). Results showed a statistically significant difference between the two groups. The group who had taken the course implemented these strategies more effectively in almost all categories of effective teaching. Follow-up univariate tests of the dependent variables showed significant differences between the two groups in five of the six areas (hands-on activity being the exception). A second MANOVA compared the two groups on the effective use of attending behaviors (teacher movement/eye contact/body language/physical space, brief verbal acknowledgements/voice inflection/modulation/pitch, use of visuals, prompting/probing, praise/feedback/rewards, wait-time I and II). Results also showed a multivariate difference between the two groups, and follow-up univariate tests on the related dependent variables showed that five of the six were significantly different between the two groups. The group who had taken the course implemented the strategies more effectively. An analysis of the field notes provided further evidence regarding the pervasiveness of these differences between the teaching practices of the two groups. It was concluded that taking a course in general instructional strategies increases teachers' utilization of effective strategies in the classroom.
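The GISA data themselves are not available here, but a one-factor MANOVA across several dependent strategy scores, as described in the abstract, could be run along the lines of the sketch below. The data frame, variable names, and values are hypothetical stand-ins for three of the six GISA categories.

```python
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Hypothetical ratings on three teaching-strategy categories for two groups.
data = pd.DataFrame({
    "group":         ["course"] * 4 + ["no_course"] * 4,
    "set_induction": [4.2, 4.5, 4.1, 4.4, 3.1, 3.3, 2.9, 3.2],
    "explanation":   [4.0, 4.3, 4.2, 4.1, 3.0, 3.4, 3.1, 2.8],
    "questioning":   [4.4, 4.2, 4.6, 4.3, 3.2, 3.0, 3.3, 3.1],
})

# One-factor MANOVA: do the dependent variables differ jointly by group?
fit = MANOVA.from_formula("set_induction + explanation + questioning ~ group", data=data)
print(fit.mv_test())
```

Follow-up univariate tests on each dependent variable, as the study reports, would then identify which individual strategies drive any multivariate difference.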
Abstract:
What qualities, skills, and knowledge produce quality teachers? Many stakeholders in education argue that teacher quality should be measured by student achievement. This qualitative study shows that good teachers are multi-dimensional; their effectiveness cannot be represented by students' test scores alone. The purpose of this phenomenological study was to gain a deeper understanding of quality in teaching by examining the lived experiences of 10 winners or finalists of the Teacher of the Year (ToY) Award. Phenomenology describes individuals' daily experiences of phenomena, examines how these experiences are structured, and focuses analysis on the perspectives of the persons having the experience (Moustakas, 1994). This inquiry asked two questions: (a) How is teaching experienced by teachers recognized as outstanding Teachers of the Year? and (b) How do ToYs' feelings and perceptions about being good teachers provide insight, if any, into concepts such as pedagogical tact, teacher selfhood, and professional dispositions? Ten participants formed the purposive sample; the major data collection tool was semi-structured interviews (Patton, 1990; Seidman, 2006). Sixty- to 90-minute interviews were conducted with each participant. Data also included the participants' ToY application essays. Data analysis followed a three-phase process: description, reduction, and interpretation. Findings revealed that the ToYs are dedicated, hard-working individuals. They exhibit behaviors such as working beyond the school day, engaging in lifelong learning, and assisting colleagues to improve their practice. Working as teachers is their life's compass, guiding and wrapping them into meaningful and purposeful lives. Pedagogical tact, teacher selfhood, and professional dispositions were shown to be relevant, offering important insights into good teaching. Results indicate that for these ToYs, good teaching is experienced by getting through to students using effective and moral means; they are emotionally open, have a sense of the sacred, and operate from a sense of intentionality. The essence of the ToYs' teaching experience was their being properly engaged in their craft, embodying logical, psychological, and moral realms. Findings challenge the current process-product orthodoxy of teacher effectiveness, which draws a causal connection between effective teaching and student test scores and assumes that effective teaching arises solely from and because of the actions of the teacher.
Abstract:
Today, most conventional surveillance networks are based on analog systems, which impose many constraints, such as heavy manpower and high-bandwidth requirements, and these constraints have become a barrier to the development of today's surveillance networks. This dissertation describes a digital surveillance network architecture based on the H.264 coding/decoding (CODEC) System-on-a-Chip (SoC) platform. The proposed digital surveillance network architecture includes three major layers: the software layer, the hardware layer, and the network layer. The following outlines the contributions to the proposed digital surveillance network architecture. (1) We implement an object recognition system and an object categorization system on the software layer by applying several Digital Image Processing (DIP) algorithms. (2) For a better compression ratio and higher-quality video transfer, we implement two new modules on the hardware layer of the H.264 CODEC core, i.e., the background elimination module and the Directional Discrete Cosine Transform (DDCT) module. (3) Furthermore, we introduce a Digital Signal Processor (DSP) sub-system on the main bus of the H.264 SoC platform as the major hardware support for our software architecture. We thus combine the software and hardware platforms into an intelligent surveillance node. Lab results show that the proposed surveillance node can dramatically save network resources such as bandwidth and storage capacity.
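The dissertation's background elimination module is implemented in hardware next to the H.264 core, and its design is not given in the abstract. As a software-only illustration of the underlying idea, the sketch below separates moving foreground pixels from a running-average background estimate, so that only the changing regions need high-fidelity encoding; the frames here are synthetic.

```python
import numpy as np

def foreground_mask(frame, background, threshold=25):
    """Pixels differing from the background estimate by more than
    `threshold` gray levels are treated as foreground (moving objects)."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > threshold

def update_background(background, frame, alpha=0.05):
    """Exponential running average keeps the background model current."""
    return ((1 - alpha) * background + alpha * frame).astype(np.uint8)

# Synthetic 8x8 grayscale frames: a static scene plus one bright moving block.
background = np.full((8, 8), 100, dtype=np.uint8)
frame = background.copy()
frame[2:4, 2:4] = 200  # an "object" enters the scene
mask = foreground_mask(frame, background)
background = update_background(background, frame)
print(mask.sum(), "foreground pixels detected")
```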
Abstract:
The present paper investigates post-Soviet non-state and state higher educational institutions in terms of students' perceptions of the school curriculum, the quality of teaching, the available educational resources, and the overall organization of their institutions.
Abstract:
An important issue in resource distribution is the fairness of the distribution. For example, computer network management wishes to distribute network resources fairly among its users. To describe the fairness of a resource distribution, a quantitative fairness score function was proposed in 1984 by Jain et al. The purpose of this paper is to propose a modified network sharing fairness function so that users can be treated differently according to their priority levels. Its mathematical properties are discussed. The proposed fairness score function keeps all the desirable properties of the original Jain function and provides better performance when the network users have different priority levels.
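Jain's 1984 index is f(x) = (sum of x_i)^2 / (n * sum of x_i^2), which equals 1 for a perfectly equal allocation. The paper's exact priority-aware modification is not reproduced in the abstract, so the weighted variant below, which normalizes each allocation by its priority weight before applying Jain's formula, is only one plausible illustration of how priority levels could be folded in.

```python
def jain_index(allocations):
    """Classic Jain fairness index: 1.0 means perfectly equal allocations."""
    n = len(allocations)
    total = sum(allocations)
    return total * total / (n * sum(x * x for x in allocations))

def weighted_jain_index(allocations, priorities):
    """Hypothetical priority-aware variant: judge fairness on the ratio of
    each user's allocation to its priority weight."""
    normalized = [x / w for x, w in zip(allocations, priorities)]
    return jain_index(normalized)

# Two high-priority users (weight 2) and two regular users (weight 1).
alloc = [20, 20, 10, 10]
prio = [2, 2, 1, 1]
print(jain_index(alloc))                 # 0.9: raw shares are unequal
print(weighted_jain_index(alloc, prio))  # 1.0: shares proportional to priority
```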
Abstract:
This research investigated the effectiveness and efficiency of structured writing, as compared to traditional nonstructured writing, as a teaching and learning strategy in a training session for teachers. Structured writing is a method of identifying, interrelating, sequencing, and graphically displaying information on fields of a page or computer screen. It is an alternative for improving training and educational outcomes by providing an effective and efficient documentation methodology. The problem focuses on the contradiction between (a) the research and theory supporting modification of traditional methods of written documents and information presentation and (b) the existing paradigm of continuing with traditional communication methods. A MANOVA was used to determine significant differences between a control group and an experimental group in a posttest-only experimental design. The experimental group received the treatment of structured writing materials during a training session. Two variables were analyzed: (a) effectiveness, measured as the number of correct items on a posttest, and (b) efficiency, measured as time spent on the test. The quantitative data showed a difference favoring the experimental group on the two dependent variables: the experimental group completed the posttest in 2 minutes less time while scoring 1.5 more items correct. An interview with the training facilitators revealed that the structured writing materials were "user friendly."