929 results for Lattice theory - Computer programs


Relevance:

30.00%

Publisher:

Abstract:

Many culturally and linguistically diverse (CLD) students with specific learning disabilities (SLD) struggle with the writing process. In particular, they have difficulty developing and expanding ideas, organizing and elaborating sentences, and revising and editing their compositions (Graham, Harris, & Larsen, 2001; Myles, 2002). Computer graphic organizers offer a possible solution to assist them in their writing. This study investigated the effects of a computer graphic organizer on the persuasive writing compositions of Hispanic middle school students with SLD. A multiple baseline design across subjects was used to examine its effects on six dependent variables: number of arguments and supporting details, number and percentage of transferred arguments and supporting details, planning time, writing fluency, syntactical maturity (measured by T-units, the shortest grammatical sentence without fragments), and overall organization. Data were collected and analyzed throughout baseline and intervention. Participants were taught persuasive writing and the writing process prior to baseline. During baseline, participants were given a prompt and asked to use paper and pencil to plan their compositions. A computer was used for typing and editing. Intervention required participants to use a computer graphic organizer for planning and then a computer for typing and editing. The planning sheets and written compositions were printed and analyzed daily, along with the time each participant spent on planning. The use of computer graphic organizers had a positive effect on the planning and persuasive writing compositions. Increases were noted in the number of supporting details planned, percentage of supporting details transferred, planning time, writing fluency, syntactical maturity in number of T-units, and overall organization of the composition. Minimal to negligible increases were noted in the mean number of arguments planned and written. Varying effects were noted in the percentage of transferred arguments, and there was a decrease in mean T-unit length. This study extends the limited literature on the effects of computer graphic organizers as a prewriting strategy for Hispanic students with SLD. In order to fully gauge the potential of this intervention, future research should investigate different features of computer graphic organizer programs, their effects on other writing genres, and their use with different populations.

Relevance:

30.00%

Publisher:

Abstract:

Type systems for secure information flow aim to prevent a program from leaking information from H (high) to L (low) variables. Traditionally, bisimulation has been the prevalent technique for proving the soundness of such systems. This work introduces a new proof technique based on stripping and fast simulation, and shows that it can be applied in a number of cases where bisimulation fails. We present a progressive development of this technique over a representative sample of languages including a simple imperative language (core theory), a multiprocessing nondeterministic language, a probabilistic language, and a language with cryptographic primitives. In the core theory we illustrate the key concepts of this technique in a basic setting. A fast low simulation in the context of transition systems is a binary relation where simulating states can match the moves of simulated states while maintaining the equivalence of low variables; stripping is a function that removes high commands from programs. We show that we can prove secure information flow by arguing that the stripping relation is a fast low simulation. We then extend the core theory to an abstract distributed language under a nondeterministic scheduler. Next, we extend it to a probabilistic language with a random assignment command; we generalize fast simulation to the setting of discrete-time Markov chains, and prove approximate probabilistic noninterference. Finally, we introduce cryptographic primitives into the probabilistic language and prove computational noninterference, provided that the underlying encryption scheme is secure.
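
As a rough illustration of the stripping idea (not the dissertation's formal development), the Python sketch below removes assignments to high variables from a tiny straight-line program and checks that the original and stripped programs agree on the low variables. The mini-language, the variable names, and the strip function are assumptions made for this example only.

```python
# A rough sketch of "stripping": remove high assignments, then compare the
# low-visible behaviour. The mini-language and names are illustrative only.

HIGH = {"h"}          # high (secret) variables
LOW = {"l", "k"}      # low (public) variables

# A program is a list of assignments: (target variable, function of the state).
program = [
    ("h", lambda s: s["h"] * 2),      # high computation
    ("l", lambda s: s["k"] + 1),      # low computation, independent of h
]

def run(prog, state):
    s = dict(state)
    for var, expr in prog:
        s[var] = expr(s)
    return s

def strip(prog):
    """Remove commands that assign to high variables."""
    return [(var, expr) for var, expr in prog if var not in HIGH]

def low_view(state):
    return {v: state[v] for v in LOW}

s0 = {"h": 41, "l": 0, "k": 7}
# For this (secure) program the stripped version is low-equivalent to the original.
assert low_view(run(program, s0)) == low_view(run(strip(program), s0))
print("low views agree:", low_view(run(program, s0)))
```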

Relevance:

30.00%

Publisher:

Abstract:

This research involves the design, development, and theoretical demonstration of models resulting in integrated misbehavior resolution protocols for ad hoc networked devices. Game theory was used to analyze strategic interaction among independent devices with conflicting interests. Packet forwarding at the routing layer of autonomous ad hoc networks was investigated. Unlike existing reputation-based or payment schemes, this model is based on repeated interactions. To enforce cooperation, a community enforcement mechanism was used, whereby selfish nodes that drop packets are punished not only by the victim, but also by all nodes in the network. Then, a stochastic packet forwarding game strategy was introduced. Our solution relaxed the uniform traffic demand assumption that is pervasive in other works. To address the concerns of imperfect private monitoring in resource-aware ad hoc networks, a belief-free equilibrium scheme was developed that reduces the impact of noise on cooperation. This scheme also eliminated the need to infer the private history of other nodes and simplified the computation of an optimal strategy. The belief-free approach reduced node overhead and was easily tractable, making system operation feasible. Motivated by the versatile nature of evolutionary game theory, the assumption of a rational node is relaxed, leading to the development of a framework for mitigating routing selfishness and misbehavior in multi-hop networks. This is accomplished by setting nodes to play a fixed strategy rather than independently choosing a rational strategy. A range of simulations showed improved cooperation between selfish nodes compared with earlier results. Cooperation among ad hoc nodes can also protect a network from malicious attacks: in the absence of a central trusted entity, many security mechanisms and privacy protections depend on such cooperation. Therefore, using game theory and evolutionary game theory, a mathematical framework has been developed that explores trust mechanisms to achieve security in the network. This framework is one of the first steps towards the synthesis of an integrated solution demonstrating that security depends solely on the initial trust level that nodes have for each other.
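
As a toy-scale, hedged illustration of repeated-game community enforcement (not the dissertation's model), the sketch below simulates a small packet-forwarding game in which one observed drop triggers grim-trigger punishment by every node. The payoff values, horizon, and deviation rule are assumptions chosen for the example.

```python
# Toy repeated packet-forwarding game with community enforcement (grim trigger).
# Payoffs, horizon, and the deviation rule are illustrative assumptions only.
B, C, ROUNDS = 1.0, 0.3, 50   # benefit of a forwarded packet, cost of forwarding

def simulate(deviator=None, deviate_from=5):
    """Each round every node asks every other node to relay one packet.
    Once any drop is observed, all nodes punish by dropping forever."""
    nodes = range(4)
    payoff = {n: 0.0 for n in nodes}
    punishing = False
    for t in range(ROUNDS):
        for sender in nodes:
            for relay in nodes:
                if relay == sender:
                    continue
                drops = punishing or (relay == deviator and t >= deviate_from)
                if not drops:
                    payoff[relay] -= C           # relay pays the forwarding cost
                    payoff[sender] += B          # sender's packet gets through
        if deviator is not None and t >= deviate_from:
            punishing = True                     # community-wide punishment kicks in
    return payoff

cooperative = simulate()
with_deviation = simulate(deviator=0)
print("per-node payoff, full cooperation:", round(cooperative[0], 2))
print("deviator's payoff under community punishment:", round(with_deviation[0], 2))
```

With these numbers the deviator's total payoff collapses once punishment starts, which is the intuition behind using community enforcement to sustain cooperation in a long repeated game.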

Relevance:

30.00%

Publisher:

Abstract:

The challenge that continues to face HRD is how to integrate real concerns for diversity into programs, practices, and research. Critical race theory was used as a lens to examine work on diversity published in Human Resource Development Quarterly (HRDQ). Eight publications were selected and analyzed.

Relevance:

30.00%

Publisher:

Abstract:

Teacher education programs do not sufficiently prepare White teachers to work with Black and Latino students in disciplinary alternative schools. From a critical race theory in education perspective, prepared White teachers are aware of their own ethnocentrism and, subsequently, develop anti-racist pedagogy and curricula.

Relevance:

30.00%

Publisher:

Abstract:

In his dialogue - Near Term Computer Management Strategy For Hospitality Managers and Computer System Vendors - William O’Brien, Associate Professor in the School of Hospitality Management at Florida International University, initially states: “The computer revolution has only just begun. Rapid improvement in hardware will continue into the foreseeable future; over the last five years it has set the stage for more significant improvements in software technology still to come. John Naisbitt's information electronics economy¹ based on the creation and distribution of information has already arrived and as computer devices improve, hospitality managers will increasingly do at least a portion of their work with software tools.” At the time of this writing, O’Brien would have you know that, contrary to what some people might think, the computer revolution is not over; it is just beginning, still in its embryonic stage. Computer technology will only continue to develop and expand, O’Brien says, citing supporting literature. “A complacent few of us who feel ‘we have survived the computer revolution’ will miss opportunities as a new wave of technology moves through the hospitality industry,” says O’Brien. “Both managers who buy technology and vendors who sell it can profit from strategy based on understanding the wave of technological innovation,” is his informed opinion. Property managers who embrace rather than eschew innovation, in this case computer technology, will benefit greatly from this new science in hospitality management, O’Brien says. “The manager who is not alert to or misunderstands the nature of this wave of innovation will be the constant victim of technology,” he advises. On the vendor side of the equation, O’Brien observes, “Computer-wise hospitality managers want systems which are easier and more profitable to operate. Some view their own industry as being somewhat behind the times… They plan to pay significantly less for better computer devices. Their high expectations are fed by vendor marketing efforts…” O’Brien warns against gambling on a risky computer system by falling victim to unsubstantiated claims and pie-in-the-sky promises. He recommends affiliating with turn-key vendors who provide hardware, software, and training, or soliciting the help of large mainstream vendors such as IBM, NCR, or Apple. Many experts agree that the computer revolution has effectively become a software revolution, O’Brien notes, “…recognizing that a computer is nothing but a box in which programs run.” Yes, some of the empirical data in this article is dated by now, but the core message, that technology keeps advancing and that properties must continually tap current knowledge, remains sound.

Relevance:

30.00%

Publisher:

Abstract:

In his study - File Control: The Heart Of Business Computer Management - William G. O’Brien, Assistant Professor, The School of Hospitality Management at Florida International University, initially informs you: “Even though computers are an everyday part of the hospitality industry, many managers lack the knowledge and experience to control and protect the files in these systems. The author offers guidelines which can minimize or prevent damage to the business as a whole.” The author opens this study with some anecdotal instances illustrating the failure of hospitality managers to exercise due caution with regard to computer-supported information systems inside their restaurants and hotels. “Of the three components that make up any business computer system (data files, programs, and hardware), it is files that are most important, perhaps irreplaceable, to the business,” O’Brien informs you. O’Brien breaks files down into two distinct categories: files of extrinsic value and files of intrinsic value. An example of an extrinsic-value file would be a restaurant’s wine inventory. “As sales are made and new shipments are received, the computer updates the file,” says O’Brien. “This information might come directly from a point-of-sale terminal or might be entered manually by an employee,” he further explains. On the intrinsic side of the equation, O’Brien wants you to know that the information itself is the valuable part of this type of file. Its value goes beyond the file’s informational purpose as a pragmatic business tool, as it is in inventory control. “The information is money in the legal sense. For instance, figures moved about in banking system computers do not represent dollars; they are dollars,” O’Brien explains. “If the record of a dollar amount is erased from all computer files, then that money ceases to exist,” he warns. This type of information can also be bought and sold, as customer lists are to advertisers. Files must be protected, O’Brien stresses. “File security requires a systematic approach,” he discloses. O’Brien goes on to explain important elements to consider when evaluating file information. File back-up is also an important factor to think about, along with file storage and safety concerns. “Sooner or later, every property will have its fire, flood, careless mistake, or disgruntled employee,” O’Brien closes. “…good file control can minimize or prevent damage to the business as a whole.”

Relevance:

30.00%

Publisher:

Abstract:

Career Academy instructors' technical literacy is vital to the academic success of students. This nonexperimental ex post facto study examined the relationships between the level of technical literacy of instructors in career academies and student academic performance. It was also undertaken to explore the relationship between the pedagogical training of instructors and the academic performance of students. Out of a heterogeneous population of 564 teachers in six targeted schools, 136 teachers (26.0%) responded to an online survey. The survey was designed to gather demographic and teaching experience data. Each demographic item was linked by researchers to teachers' technology use in the classroom. Student achievement was measured by student learning gains as assessed by the reading section of the FCAT from the previous to the present school year. Linear and hierarchical regressions were conducted to examine the research questions. To clarify the possibility of teacher gender and teacher race/ethnic group differences by research variable, a series of one-way ANOVAs was conducted. As revealed by the ANOVA results, there were no statistically significant group differences in any of the research variables by teacher gender or teacher race/ethnicity. Greater student learning gains were associated with greater teacher technical expertise in integrating computers and technology into the classroom, even after controlling for teacher attitude toward computers. Neither teacher attitude toward technology integration nor years of experience in integrating computers into the curriculum significantly predicted student learning gains in the regression models. Implications for HRD theory, research, and practice suggest that identifying teacher levels of technical literacy may help improve student academic performance by facilitating professional development strategies and new parameters for defining highly qualified instructors with 21st century skills. District professional development programs can benefit by increasing their offerings to include more computer and information communication technology courses. Teacher preparation programs can benefit by including technical literacy as part of their curriculum. State certification requirements could be expanded to include formal surveys to assess teacher use of technology.
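
For readers unfamiliar with the analysis, the sketch below shows the general shape of a hierarchical (block-entry) regression in Python with statsmodels: a control variable is entered first, technical expertise is added second, and the change in R² is examined. The variable names and synthetic data are assumptions made for illustration, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data; names and effect sizes are illustrative assumptions.
rng = np.random.default_rng(0)
n = 136
df = pd.DataFrame({
    "attitude": rng.normal(size=n),     # teacher attitude toward computers (control)
    "expertise": rng.normal(size=n),    # technical expertise integrating technology
})
df["learning_gain"] = 0.1 * df["attitude"] + 0.4 * df["expertise"] + rng.normal(size=n)

# Block 1 enters the control variable; block 2 adds technical expertise.
m1 = smf.ols("learning_gain ~ attitude", data=df).fit()
m2 = smf.ols("learning_gain ~ attitude + expertise", data=df).fit()
print(f"R^2 block 1: {m1.rsquared:.3f}   block 2: {m2.rsquared:.3f}")
print(f"R^2 change attributable to expertise: {m2.rsquared - m1.rsquared:.3f}")
```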

Relevance:

30.00%

Publisher:

Abstract:

Institutions have implemented many campus interventions to address student persistence/retention, one of which is Early Warning Systems (EWS). However, few research studies show evidence of interventions that incorporate noncognitive factors/skills and psychotherapy/psycho-educational processes in the EWS. A qualitative study (phenomenological interview and document analysis) of EWS at a public and a private 4-year Florida university was conducted to explore EWS through the eyes of administrators: how they make sense of students' experiences and of the services they do and do not provide to assist students. Administrators' understanding of noncognitive factors and the executive skills subset, and their contribution to retention and the executive skills development of at-risk students, were also explored. Hossler and Bean's multiple retention lenses theory/paradigms and Perez's retention strategies were used to guide the study. Six administrators from each institution who oversee and/or assist with EWS for first-time-in-college undergraduate students considered academically at risk for attrition were interviewed. Among numerous findings at Institution X: EWS was infrequently identified as a service, EWS training was not conducted, numerous cognitive and noncognitive issues/deficits were identified for students, and services/critical departments such as EWS did not work together to share students' information to benefit students. Assessment measures were used to identify students' issues/deficits; however, they were not used to assess, track, and monitor those issues/deficits. Additionally, the institution's EWS did address students' executive skills function beyond time management and organizational skills, but did not address students' psychotherapy/psycho-educational processes. Among numerous findings at Institution Y: EWS was frequently identified as a service, EWS training was not conducted, numerous cognitive and noncognitive issues/deficits were identified for students, and services/critical departments such as EWS worked together to share students' information to benefit students. Assessment measures were used to identify, track, and monitor students' issues/deficits; however, they were not used to assess those issues/deficits. Additionally, the institution's EWS addressed students' executive skills function beyond time management and organizational skills, as well as psychotherapy/psycho-educational processes. Based on the findings, Perez's retention strategies were not utilized in EWS at Institution X, yet were collectively utilized in EWS at Institution Y, to achieve Hossler and Bean's retention paradigms. Future research could be designed to test the link between engaging in the specific promising activities identified in this research (one-to-one coaching, participation in student success workshops, academic contracts, and tutoring) and student success (e.g., higher GPA, retention). Further, because this research uncovered some concern with how best to handle students with physical and psychological disabilities, future research could link these same promising strategies to improved performance among, for example, students with ADHD or clinical depression.

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this study was to determine the knowledge and use of critical thinking teaching strategies by full-time and part-time faculty in Associate Degree Nursing (ADN) programs. Sander's CTI (1992) instrument was adapted for this study and pilot-tested prior to the general administration to ADN faculty in Southeast Florida. This modified instrument, now termed the Burroughs Teaching Strategy Inventory (BTSI), returned reliability estimates (Cronbach alphas of .71, .74, and .82 for the three constructs) comparable to the original instrument. The BTSI was administered to 113 full-time and part-time nursing faculty in three community college nursing programs. The response rate was 92% for full-time faculty (n = 58) and 61% for part-time faculty (n = 55). The majority of participants supported a combined definition of critical thinking in nursing which represented a composite of thinking skills that included reflective thinking, assessing alternative viewpoints, and the use of problem-solving. Full-time and part-time faculty used different teaching strategies. Full-time faculty most often used multiple-choice exams and lecture, while part-time faculty most frequently used discussion within their classes. One possible explanation for the specific strategy choices and differences might be that full-time faculty taught predominantly theory classes, where certain strategies would be more appropriate, and part-time faculty taught predominantly clinical classes. Both faculty types selected written nursing care plans as the second most effective critical thinking strategy. Faculty identified several strategies as being effective in teaching critical thinking. These strategies included discussion, case studies, higher-order questioning, and concept analysis. These, however, were not always the strategies that were used in either the classroom or clinical setting. Based on this study, the author recommends that if the profession continues to stress critical thinking as a vital component of practice, nursing faculty should receive education in appropriate critical thinking teaching strategies. Both in-service seminars and workshops could be used to further the knowledge and use of critical thinking strategies by faculty. Qualitative research should be done to determine why nursing faculty use self-selected teaching strategies.

Relevance:

30.00%

Publisher:

Abstract:

Simulations suggest that photomixing in resonant laser-assisted field emission could be used to generate and detect signals from DC to 100 THz. The objective of this research is to develop a system to efficiently couple the microwave signals generated on an emitting tip by optical mixing. Four different methods for coupling are studied, and tapered Goubau line is found to be the most suitable. Goubau line theory is reviewed, and programs are written to determine loss on the line. From this, Goubau tapers with a 1:100 bandwidth are designed. These tapers are then simulated using the finite-difference time-domain method to find the optimum design parameters. Tapered Goubau line is an effective method for coupling power from the field-emitting tip: it has large bandwidth and acceptable loss. Another consideration is that it is the easiest of the four approaches studied to manufacture, an important quality for any prototype.
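
As a hedged aside, the sketch below shows the basic update loop of the finite-difference time-domain method mentioned above, reduced to one dimension with a normalized Yee scheme. It only illustrates the numerical technique and does not model the Goubau taper geometry; the grid size and source are arbitrary assumptions.

```python
import numpy as np

# Minimal 1D FDTD (Yee scheme) in normalized units. Illustrates the method only;
# it does not model a Goubau line or taper.
nz, nsteps = 400, 1000
ez = np.zeros(nz)          # electric field on integer grid points
hy = np.zeros(nz - 1)      # magnetic field, staggered half a cell
src = 50                   # source location

for t in range(nsteps):
    hy += np.diff(ez)                             # update H from the curl of E
    ez[1:-1] += np.diff(hy)                       # update E from the curl of H
    ez[src] += np.exp(-((t - 100) / 25.0) ** 2)   # soft Gaussian pulse source

print("peak |Ez| after propagation:", np.abs(ez).max())
```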

Relevance:

30.00%

Publisher:

Abstract:

Proofs by induction are central to many computer science areas such as data structures, theory of computation, programming languages, program efficiency (time complexity), and program correctness. Proofs by induction can also improve students’ understanding of and performance with computer science concepts such as programming languages, algorithm design, and recursion, as well as serve as a medium for teaching them. Even though students are exposed to proofs by induction in many courses of their curricula, they still have difficulties understanding and performing them. This impacts the whole course of their studies, since proofs by induction are omnipresent in computer science. Specifically, students do not gain conceptual understanding of induction early in the curriculum and, as a result, they have difficulties applying it to more advanced areas later on in their studies. The goal of my dissertation is twofold: 1. identifying sources of computer science students’ difficulties with proofs by induction, and 2. developing a new approach to teaching proofs by induction by way of an interactive and multimodal electronic book (e-book). For the first goal, I undertook a study to identify possible sources of computer science students’ difficulties with proofs by induction. Its results suggest that there is a close correlation between students’ understanding of inductive definitions and their understanding and performance of proofs by induction. For designing and developing my e-book, I took into consideration the results of my study, as well as the drawbacks of the current methodologies of teaching proofs by induction for computer science. I designed my e-book to be used as a standalone and complete educational environment. I also conducted a study on the effectiveness of my e-book in the classroom. The results of my study suggest that, unlike the current methodologies of teaching proofs by induction for computer science, my e-book helped students overcome many of their difficulties and gain conceptual understanding of proofs by induction.
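
To make the link between inductive definitions and proofs by induction concrete, here is a standard textbook example (not taken from the e-book) written as a LaTeX fragment: lists are defined inductively, and a property of append is proved by structural induction over that definition.

```latex
% Lists over a set A are defined inductively:
%   nil is a list;  cons(x, xs) is a list for every x in A and every list xs.
% Two functions defined by recursion on that definition:
%   length(nil) = 0,        length(cons(x, xs)) = 1 + length(xs)
%   append(nil, ys) = ys,   append(cons(x, xs), ys) = cons(x, append(xs, ys))
%
% Claim: length(append(xs, ys)) = length(xs) + length(ys) for all lists xs, ys.
\paragraph{Base case ($xs = \mathit{nil}$).}
$\mathrm{length}(\mathrm{append}(\mathit{nil}, ys)) = \mathrm{length}(ys)
 = 0 + \mathrm{length}(ys) = \mathrm{length}(\mathit{nil}) + \mathrm{length}(ys).$
\paragraph{Inductive step ($xs = \mathrm{cons}(x, xs')$).}
Assume $\mathrm{length}(\mathrm{append}(xs', ys)) = \mathrm{length}(xs') + \mathrm{length}(ys)$. Then
\[
\mathrm{length}(\mathrm{append}(\mathrm{cons}(x, xs'), ys))
  = 1 + \mathrm{length}(\mathrm{append}(xs', ys))
  = 1 + \mathrm{length}(xs') + \mathrm{length}(ys)
  = \mathrm{length}(\mathrm{cons}(x, xs')) + \mathrm{length}(ys),
\]
so the claim holds for every list $xs$ by structural induction.
```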

Relevance:

30.00%

Publisher:

Abstract:

Since the 1950s, the theory of deterministic and nondeterministic finite automata (DFAs and NFAs, respectively) has been a cornerstone of theoretical computer science. In this dissertation, our main object of study is minimal NFAs. In contrast with minimal DFAs, minimal NFAs are computationally challenging: first, there can be more than one minimal NFA recognizing a given language; second, the problem of converting an NFA to a minimal equivalent NFA is NP-hard, even for NFAs over a unary alphabet. Our study is based on the development of two main theories, inductive bases and partials, which in combination form the foundation for an incremental algorithm, ibas, to find minimal NFAs. An inductive basis is a collection of languages with the property that it can generate (through union) each of the left quotients of its elements. We prove a fundamental characterization theorem which says that a language can be recognized by an n-state NFA if and only if it can be generated by an n-element inductive basis. A partial is an incompletely-specified language. We say that an NFA recognizes a partial if its language extends the partial, meaning that the NFA's behavior is unconstrained on unspecified strings; it follows that a minimal NFA for a partial is also minimal for its language. We therefore direct our attention to minimal NFAs recognizing a given partial. Combining inductive bases and partials, we generalize our characterization theorem, showing that a partial can be recognized by an n-state NFA if and only if it can be generated by an n-element partial inductive basis. We apply our theory to develop and implement ibas, an incremental algorithm that finds minimal partial inductive bases generating a given partial. In the case of unary languages, ibas can often find minimal NFAs of up to 10 states in about an hour of computing time; with brute-force search this would require many trillions of years.
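
As a small, hedged illustration of the basic objects discussed above (NFA acceptance and left quotients), and not of the ibas algorithm itself, here is a Python sketch; the class layout and the unary example are assumptions made for this example.

```python
# Minimal NFA representation and acceptance check, plus left quotients of a
# finite language. Illustrative only; this is not the ibas algorithm.

class NFA:
    def __init__(self, states, delta, start, accept):
        self.states, self.delta = states, delta      # delta: (state, symbol) -> set of states
        self.start, self.accept = start, accept      # start and accept are sets of states

    def accepts(self, word):
        current = set(self.start)
        for symbol in word:
            current = set().union(*(self.delta.get((q, symbol), set()) for q in current))
        return bool(current & self.accept)

def left_quotient(language, prefix):
    """u^{-1} L = { w : u w in L }, for a finite language given as a set of strings."""
    return {w[len(prefix):] for w in language if w.startswith(prefix)}

# A 2-state NFA over a unary alphabet recognizing the strings of odd length.
odd = NFA({0, 1}, {(0, "a"): {1}, (1, "a"): {0}}, start={0}, accept={1})
print(odd.accepts("aaa"), odd.accepts("aaaa"))       # True False

print(left_quotient({"ab", "abb", "ba"}, "a"))       # {'b', 'bb'}
```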

Relevance:

30.00%

Publisher:

Abstract:

Recent technological developments have made it possible to design various microdevices where fluid flow and heat transfer are involved. For the proper design of such systems, the governing physics needs to be investigated. Due to the difficulty of studying complex geometries at micro scales using experimental techniques, computational tools are developed to analyze and simulate flow and heat transfer in microgeometries. However, conventional numerical methods using the Navier-Stokes equations fail to predict some aspects of microflows, such as nonlinear pressure distribution, increased mass flow rate, slip flow, and temperature jump at the solid boundaries. This necessitates the development of new computational methods, based on kinetic theory, that are both accurate and computationally efficient. In this study, the lattice Boltzmann method (LBM) was used to investigate flow and heat transfer in micro-sized geometries. The LBM depends on the Boltzmann equation, which is valid in the whole rarefaction regime that can be observed in microflows. Results were obtained for isothermal channel flows at Knudsen numbers higher than 0.01 at different pressure ratios. LBM solutions for micro-Couette and micro-Poiseuille flow were found to be in good agreement with the analytical solutions valid in the slip flow regime (0.01 < Kn < 0.1) and with direct simulation Monte Carlo solutions valid in the transition regime (0.1 < Kn < 10) for pressure distribution and velocity field. The isothermal LBM was further extended to simulate flows including heat transfer. The method was first validated for continuum channel flows with and without constrictions by comparing the thermal LBM results against accurate solutions obtained from analytical equations and the finite element method. Finally, the capability of thermal LBM was improved by adding the effect of rarefaction, and the method was used to analyze the behavior of gas flow in microchannels. The major finding of this research is that the newly developed particle-based method described here can be used as an alternative numerical tool to study non-continuum effects observed in micro-electro-mechanical systems (MEMS).
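
As a hedged illustration of the method's collide-and-stream structure only, here is a minimal isothermal D2Q9 BGK sketch in Python for a body-force-driven channel flow. It omits the slip-flow boundary treatment and thermal extensions developed in the dissertation, and the grid size, relaxation time, and forcing approach are assumptions chosen for the example.

```python
import numpy as np

# Isothermal D2Q9 BGK lattice Boltzmann sketch: body-force-driven channel flow
# with bounce-back walls. All quantities are in illustrative lattice units.
c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
w = np.array([4/9, 1/9, 1/9, 1/9, 1/9, 1/36, 1/36, 1/36, 1/36])
opp = [0, 3, 4, 1, 2, 7, 8, 5, 6]                  # opposite directions

nx, ny, tau, g = 64, 33, 0.8, 1e-5                 # grid, relaxation time, body force
solid = np.zeros((nx, ny), bool)
solid[:, 0] = solid[:, -1] = True                  # channel walls (periodic in x)

def equilibrium(rho, ux, uy):
    feq = np.empty((9, nx, ny))
    for i in range(9):
        cu = c[i, 0] * ux + c[i, 1] * uy
        feq[i] = w[i] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*(ux**2 + uy**2))
    return feq

f = equilibrium(np.ones((nx, ny)), np.zeros((nx, ny)), np.zeros((nx, ny)))

for step in range(10000):
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho + tau * g   # simple velocity-shift forcing
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    feq = equilibrium(rho, ux, uy)
    f[:, ~solid] -= (f[:, ~solid] - feq[:, ~solid]) / tau        # BGK collision (fluid nodes)
    for i in range(9):                                           # streaming
        f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)
    f[:, solid] = f[opp][:, solid]                               # bounce-back at the walls

print("centreline velocity:", ux[:, ny // 2].mean())             # approaches the Poiseuille value
```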
