36 results for Lattice theory - Computer programs
in Digital Commons at Florida International University
Abstract:
One of the major problems in the analysis of beams whose Moment of Inertia varies along their length is finding the Fixed End Moments, Stiffness, and Carry-Over Factors. To determine Fixed End Moments, the non-prismatic member must be treated as an assembly of a large number of small sections, each with constant Moment of Inertia, and the M/EI values found for each individual section. This process is very time-consuming for designers and structural engineers. The object of this thesis is to design a computer program that simplifies this repetitive process, obtaining the Final Moments and Shears in continuous non-prismatic beams rapidly and effectively. For this purpose, the Column Analogy and Moment Distribution methods of Professor Hardy Cross have been used as the principles behind the methodical computer solutions. The program has been specifically designed to analyze continuous beams of up to four spans of any length, composed of symmetrical members with rectangular cross sections and rectilinear variation of the Moment of Inertia. Any load or combination of uniform and concentrated loads can be considered. Finally, sample problems are solved both with the new computer program and with traditional methods, to determine the accuracy and applicability of the program.
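To make the segment procedure above concrete, here is a minimal sketch (with illustrative names and numbers that are not from the thesis) that divides a member into short pieces of constant EI, numerically integrates the flexibility coefficients, and recovers the rotational stiffness and carry-over factor; the prismatic case reproduces the textbook values 4EI/L and 1/2.

    import numpy as np

    def member_properties(L, EI_func, n=1000):
        """Stiffness and carry-over factor of a non-prismatic member,
        approximated by n short segments of constant EI."""
        x = (np.arange(n) + 0.5) * L / n       # segment midpoints
        dx = L / n
        m1 = 1.0 - x / L                       # moment diagram for a unit end moment at A
        m2 = x / L                             # moment diagram for a unit end moment at B
        EI = EI_func(x)
        f11 = np.sum(m1 * m1 / EI) * dx        # flexibility coefficients
        f22 = np.sum(m2 * m2 / EI) * dx
        f12 = -np.sum(m1 * m2 / EI) * dx
        det = f11 * f22 - f12 ** 2
        return f22 / det, -f12 / f22           # stiffness at A, carry-over A -> B

    # Prismatic check: expect K = 4*EI/L = 8000 and C = 0.5.
    K, C = member_properties(10.0, lambda x: np.full_like(x, 2.0e4))
    print(K, C)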
Abstract:
The rapid growth of the Internet and the advancement of Web technologies have given users access to large amounts of on-line music data, including music acoustic signals, lyrics, style/mood labels, and user-assigned tags. This progress has made music listening more fun, but it has raised the issue of how to organize this data and, more generally, how computer programs can assist users in their music experience. An important subject in computer-aided music listening is music retrieval, i.e., efficiently helping users locate the music they are looking for. Traditionally, songs were organized in a hierarchical structure such as genre->artist->album->track to facilitate navigation. However, the intentions of users are often hard to capture in such a simple structure: users may want to listen to music of a particular mood, style, or topic, and/or to songs similar to given samples. This motivated us to work on user-centric music retrieval systems that improve users' satisfaction. Traditional music information retrieval research was mainly concerned with classification, clustering, identification, and similarity search over acoustic data, using feature extraction algorithms and machine learning techniques. More recently, the field has turned to other types of data, such as lyrics, user-access patterns, and user-defined tags, and to non-genre categories for classification, such as mood labels and styles. This dissertation focused on investigating and developing effective data mining techniques for (1) organizing and annotating music data with styles, moods, and user-assigned tags; (2) performing effective analysis of music data with features from diverse information sources; and (3) recommending songs to users utilizing both content features and user access patterns.
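As a small, generic illustration of the content-based side of such a system (the names, shapes, and data below are hypothetical, not the dissertation's), one can rank songs by cosine similarity over vectors that concatenate acoustic features with tag weights:

    import numpy as np

    def recommend(query_vec, library_vecs, k=5):
        """Return the indices of the k songs most similar to the query,
        by cosine similarity of the combined feature vectors."""
        q = query_vec / np.linalg.norm(query_vec)
        m = library_vecs / np.linalg.norm(library_vecs, axis=1, keepdims=True)
        return np.argsort(m @ q)[::-1][:k]

    # Each row: audio features (e.g., timbre statistics) plus tag weights.
    rng = np.random.default_rng(0)
    library = rng.random((100, 32))
    query = rng.random(32)
    print(recommend(query, library))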
Abstract:
Underwater sound is very important in oceanography, where it is used for remote sensing in much the same way that radar is used in atmospheric studies. One way to mathematically model sound propagation in the ocean is the parabolic-equation method, a technique that allows range-dependent environmental parameters. More importantly, this method can model sound transmission where the source emits either a pure tone or a short pulse of sound. Based on the parabolic approximation and the split-step Fourier algorithm, a computer model for underwater sound propagation was designed and implemented. This computer model differs from previous models in its use of the interactive mode, structured programming, modular design, and state-of-the-art graphics displays. In addition, the model maximizes the efficiency of computer time through synchronization of loosely coupled dual processors and the design of a restart capability. Since the model is designed for adaptability and for users with limited computer skills, it is anticipated that it will have many applications in the scientific community.
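For readers unfamiliar with the split-step Fourier algorithm mentioned above, the sketch below marches a narrow-angle parabolic equation in range, applying the diffraction operator in the vertical-wavenumber domain and the refraction operator in the depth domain; the grid, starter field, and index profile are placeholder assumptions, not the thesis model.

    import numpy as np

    def split_step_pe(psi, n_index, k0, dz, dr, n_steps):
        """March the narrow-angle parabolic equation in range.
        psi: complex field on a uniform depth grid; n_index: refractive
        index on the same grid; k0: reference acoustic wavenumber."""
        kz = 2 * np.pi * np.fft.fftfreq(psi.size, d=dz)
        diffract = np.exp(-1j * kz**2 * dr / (2 * k0))  # free-space step, wavenumber domain
        refract = np.exp(1j * k0 * (n_index - 1) * dr)  # environment step, depth domain
        for _ in range(n_steps):
            psi = refract * np.fft.ifft(diffract * np.fft.fft(psi))
        return psi

    # Placeholder run: Gaussian starter field, homogeneous 50 Hz ocean (c = 1500 m/s).
    z = np.linspace(0.0, 1000.0, 1024)
    psi0 = np.exp(-((z - 100.0) / 20.0) ** 2).astype(complex)
    field = split_step_pe(psi0, np.ones_like(z), k0=2 * np.pi * 50 / 1500,
                          dz=z[1] - z[0], dr=10.0, n_steps=100)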
Abstract:
The Mini-Numerical Electromagnetic Code (MININEC) program, a PC-compatible version of the powerful NEC program, is used to design a new type of reduced-size antenna. The validity of the program for modeling simple, well-known antennas, such as dipoles and monopoles, is first shown. More complex geometries, such as folded dipoles and meander dipole antennas, are also analyzed using the program. The final design geometry of a meander folded dipole is characterized with MININEC, yielding results that serve as the basis for the practical construction of the antenna. Finally, the laboratory work with a prototype antenna is described, and practical results are presented.
Abstract:
The field of chemical kinetics is an exciting and active one. The prevailing theories make a number of simplifying assumptions that do not always hold in actual cases. Another current problem concerns the development of efficient numerical algorithms for solving the master equations that arise in the description of complex reactions. The objective of the present work is to furnish a completely general and exact theory of reaction rates, in a form reminiscent of transition state theory, valid for all fluid phases, and also to develop a computer program that can solve complex reactions by finding the concentrations of all participating substances as a function of time. To this end, full quantum scattering theory is used to derive the exact rate law, and the resulting cumulative reaction probability is put into several equivalent forms that take into account all relativistic effects where applicable, including one that is strongly reminiscent of transition state theory but includes corrections from scattering theory. Two programs, one for solving complex reactions and the other for solving first-order linear kinetic master equations, have then been developed and tested on simple applications.
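For the second program's problem class, a first-order linear kinetic master equation dc/dt = Kc has the closed-form solution c(t) = exp(Kt)c(0). The toy network below (A → B → C, with invented rate constants) shows the idea; it is an illustration, not the program described in the abstract.

    import numpy as np
    from scipy.linalg import expm

    k1, k2 = 1.0, 0.5                 # invented rate constants for A -> B -> C
    K = np.array([[-k1, 0.0, 0.0],
                  [ k1, -k2, 0.0],
                  [ 0.0,  k2, 0.0]])  # rate matrix: dc/dt = K c
    c0 = np.array([1.0, 0.0, 0.0])    # start with pure A

    for t in (0.0, 1.0, 5.0):
        print(t, expm(K * t) @ c0)    # exact concentrations at time t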
Abstract:
The authors’ review of the literature on Bandura’s (1977) social learning theory and self-efficacy leads to implications for how this theory can positively affect prison work release programs and inmate post-release outcomes. Additionally, several causes of deviant behavior are explained through social learning theory concepts.
Abstract:
The purpose of this study was to compare the characteristics of effective clinical and theory instructors as perceived by LPN/RN versus generic students in an associate degree nursing program. Data were collected from 508 students during the 1996-97 academic year from three NLN-accredited associate degree nursing programs. The researcher-developed instrument consisted of three parts: (a) Whitehead Characteristics of Effective Clinical Instructor Rating Scale, (b) Whitehead Characteristics of Effective Theory Instructor Rating Scale, and (c) Demographic Data Sheet. The items were listed under five major categories identified in the review of the literature: (a) interpersonal relationships, (b) personality traits, (c) teaching practices, (d) knowledge and experience, and (e) evaluation procedures. The instrument was administered to LPN/RN students in their first semester and to generic students in the third semester of an associate degree nursing program. Data were analyzed using a one-factor multivariate analysis of variance (MANOVA). Further t tests were carried out to explore possible differences between type of student and by group. Crosstabulations of the demographic data were analyzed. There were no significant differences between the LPN/RN and generic students in their perceptions of either effective theory or effective clinical instructor characteristics. There were significant differences between groups on several of the individual items. There was no significant interaction between group and ethnicity or group and age on the five major categories for either of the two instruments. There was a significant main effect of ethnicity on several of the individual items. The differences between the means and standard deviations on both instruments were small, suggesting that all of the characteristics listed for effective theory and clinical instructors were important to both groups of students. Effective teaching behaviors, as indicated on the survey instruments, should be taught to students in graduate teacher education programs. These behaviors should also be discussed by faculty coordinators supervising adjunct faculty. Nursing educators in associate degree nursing programs should understand theories of adult learning and implement instructional strategies to enhance minority student success.
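For reference, a one-factor MANOVA of the kind reported above can be run with statsmodels; the data frame below uses invented ratings and hypothetical column names for three of the five categories, purely to make the sketch runnable.

    import numpy as np
    import pandas as pd
    from statsmodels.multivariate.manova import MANOVA

    rng = np.random.default_rng(0)
    n = 60
    df = pd.DataFrame({
        "group": np.repeat(["LPN_RN", "generic"], n // 2),
        "interpersonal": rng.normal(4.0, 0.5, n),
        "teaching": rng.normal(4.2, 0.5, n),
        "evaluation": rng.normal(3.9, 0.5, n),
    })
    fit = MANOVA.from_formula("interpersonal + teaching + evaluation ~ group", data=df)
    print(fit.mv_test())   # Wilks' lambda, Pillai's trace, etc., for the group factor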
Abstract:
This study assesses and describes full-time nursing faculty's perceptions of clinical competency and its relationship to clinical practice in the associate degree nursing programs in the state of Florida. The study was developed around one major hypothesis and four research questions. The Hygiene-Motivators Theory proposed by Herzberg, Mausner, and Snyderman (1959) provided the conceptual framework to explain factors that would motivate a person to expand workload and maintain job satisfaction. Data were collected from the 244 faculty members teaching full-time at the 15 associate degree schools of nursing accredited by the National League for Nursing in the state of Florida. A total of 186 faculty (76%) responded, and 175 (72%) cases were used for data analysis. Two instruments were modified and combined for the investigation: the Faculty Perception of Practice Questionnaire (Parascenzo, 1983) and a three-part Attributes Deemed Necessary for Faculty to Proclaim Clinical Competency (Smith, 1991) scale. Computer analyses employing descriptive and inferential statistics were performed. The findings revealed that faculty were closely divided as to practice activities, with more faculty not practicing than practicing. Teaching load and personal/family responsibilities, which lead to a lack of time and opportunity, were identified as impediments to increased clinical practice. Those faculty who practiced did so as moonlighters, in positions that did not require advanced training. Both the practicing and nonpracticing faculty reported a high level of satisfaction with their activities as a means of maintaining clinical practice. While both groups reported a high level of expertise, practicing faculty perceived themselves to be more clinically competent on the attributes of knowledge and skills and on the total attribute scale. It was further revealed that perception of competency declined with the length of time spent out of practice. There was no difference between the two groups on the attributes of values/attitude.
Abstract:
The purpose of this study was to determine the knowledge and use of critical thinking teaching strategies by full-time and part-time faculty in Associate Degree Nursing (ADN) programs. Sanders' CTI (1992) instrument was adapted for this study and pilot-tested prior to general administration to ADN faculty in Southeast Florida. The modified instrument, termed the Burroughs Teaching Strategy Inventory (BTSI), returned reliability estimates (Cronbach alphas of .71, .74, and .82 for the three constructs) comparable to the original instrument. The BTSI was administered to 113 full-time and part-time nursing faculty in three community college nursing programs. The response rate was 92% for full-time faculty (n = 58) and 61% for part-time faculty (n = 55). The majority of participants supported a combined definition of critical thinking in nursing, representing a composite of thinking skills that included reflective thinking, assessing alternative viewpoints, and the use of problem-solving. Full-time and part-time faculty used different teaching strategies. Full-time faculty most often used multiple-choice exams and lecture, while part-time faculty most frequently used discussion within their classes. One possible explanation for these choices is that full-time faculty taught predominantly theory classes, where certain strategies are more appropriate, while part-time faculty taught predominantly clinical classes. Both faculty types selected written nursing care plans as the second most effective critical thinking strategy. Faculty identified several strategies as effective in teaching critical thinking, including discussion, case studies, higher-order questioning, and concept analysis. These, however, were not always the strategies used in either the classroom or the clinical setting. Based on this study, the author recommends that if the profession continues to stress critical thinking as a vital component of practice, nursing faculty should receive education in appropriate critical thinking teaching strategies. Both in-service seminars and workshops could be used to further the knowledge and use of critical thinking strategies by faculty. Qualitative research should be done to determine why nursing faculty use self-selected teaching strategies.
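The reliability estimates quoted above are Cronbach alphas. As a reference, the small function below computes alpha from a respondents-by-items score matrix in the standard way; the simulated data is only there to make the sketch runnable.

    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()   # sum of item variances
        total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
        return k / (k - 1) * (1 - item_var / total_var)

    rng = np.random.default_rng(0)
    latent = rng.normal(size=(200, 1))
    items = latent + 0.5 * rng.normal(size=(200, 10))  # ten correlated items
    print(cronbach_alpha(items))                       # high alpha, near .97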
Abstract:
The National Council Licensure Examination for Registered Nurses (NCLEX-RN) is the examination that all graduates of nursing education programs must pass to attain the title of registered nurse. Currently the NCLEX-RN passing rate is at an all-time low (81%) for first-time test takers (NCSBN, 2004), amidst a nationwide shortage of registered nurses (Glabman, 2001). Because of the critical need to supply greater numbers of professional nurses, and the potential accreditation ramifications that low NCLEX-RN passing rates can have for schools of nursing and graduates, this study tests the effectiveness of a predictor model based on the theoretical framework of McClusky's (1959) theory of margin (ToM), with the hope that students at risk for NCLEX-RN failure can be identified and remediated before taking the actual licensure examination. To date, no theory-based predictor model of success on the NCLEX-RN has been identified. The model was tested using prerequisite course grades, nursing course grades, and scores on standardized examinations for the 2003 associate degree nursing graduates of an urban community college (N = 235). Success was determined through the reporting of a pass on the NCLEX-RN examination by the Florida Board of Nursing. Point-biserial correlations tested model assumptions regarding variable relationships, while logistic regression was used to test the model's predictive power. Correlations among variables were significant, and the model accounted for 66% of the variance in graduates' success on the NCLEX-RN, with 98% prediction accuracy. Although certain prerequisite course grades and nursing course grades were significant to NCLEX-RN success, the overall model was most predictive at the conclusion of the academic program of study. The RN Assessment Examination, taken during the final semester of course work, was the most significant predictor of NCLEX-RN success. Success on the NCLEX-RN allows graduates to work as registered nurses, reflects positively on a school's academic performance record, and supports the appropriateness of the educational program's goals and objectives. The study's findings support other potential uses of McClusky's theory of margin as a predictor of program outcomes in other venues of adult education.
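A minimal sketch of the analysis pipeline named above (point-biserial screening followed by logistic regression) is shown below with simulated grades and scores; the variables and effect sizes are invented, not the study's data.

    import numpy as np
    from scipy.stats import pointbiserialr
    from sklearn.linear_model import LogisticRegression

    # Invented predictors: prerequisite GPA, nursing GPA, RN Assessment score.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(235, 3))
    passed = ((X @ np.array([0.5, 0.8, 1.5]) + rng.normal(size=235)) > 0).astype(int)

    r, p = pointbiserialr(passed, X[:, 2])       # screen one predictor against pass/fail
    model = LogisticRegression().fit(X, passed)  # model predictive power
    print(r, p, model.score(X, passed))          # correlation and prediction accuracy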
Abstract:
This study explored the strategies that community-based, consumer-focused advocacy, alternative service organizations (ASOs) implemented to adapt to changes in the nonprofit funding environment (Oliver & McShane, 1979; Perlmutter, 1988a, 1994). The extent to which current funding trends have influenced ASOs is unclear, as little empirical research has been conducted in this area (Magnus, 2001; Marquez, 2003; Powell, 1986). This study used a qualitative research design to investigate strategies implemented by these organizations to adapt to changes such as decreasing government, foundation, and corporate funding and an increasing number of nonprofit organizations. More than 20 community informants helped to identify, locate, and provide information about ASOs. Semi-structured interviews were conducted with a sample of 30 ASO executive directors from diverse organizations in Miami-Dade and Broward Counties in South Florida. Data analysis was facilitated by ATLAS.ti, version 5, a qualitative data analysis computer software program designed for grounded theory research. This process generated five major themes: Funding Environment; Internal Structure; Strategies for Survival; Sustainability; and Committing to the Cause, Mission, and Vision. The results indicate that ASOs are struggling to survive financially by cutting programs, decreasing staff, and limiting service to consumers. They are also exploring ways to develop fundraising strategies, for example, increasing the number of grant proposals written, focusing on fund development, and establishing for-profit ventures. Even organizations that describe themselves as currently financially stable are concerned about their financial vulnerability; there is little flexibility or cushioning to adjust to "funding jolts." The fear of losing current funding levels and being placed in a tenuous financial situation is a constant concern for these ASOs. Further data collected from the self-administered Funding Checklist and demographic forms were coded and analyzed using the Statistical Package for the Social Sciences (SPSS). Descriptive information and frequencies generated findings regarding revenue, staff complement, use of volunteers and fundraising consultants, and fundraising practices. The study proposes a model of funding relationships and presents implications for social work practice and policy, along with recommendations for future research.
Abstract:
Most experiments in particle physics are scattering experiments, the analysis of which yields masses, scattering phases, decay widths, and other properties of one- or multi-particle systems. Until the advent of Lattice Quantum Chromodynamics (LQCD), it was difficult to compare experimental results on low energy hadron-hadron scattering processes to the predictions of QCD, the current theory of strong interactions. The reason is that at low energies the QCD coupling constant becomes large and the perturbation expansion for scattering amplitudes does not converge. To overcome this, one puts the theory onto a lattice, imposes a momentum cutoff, and computes the path integral numerically. For particle masses, the predictions of LQCD agree with experiment, but the area of decay widths is largely unexplored. LQCD provides ab initio access to unusual hadrons such as exotic mesons, which are predicted to contain real gluonic structure. To study the decays of such resonances, the energy spectra of a two-particle decay state in a finite volume of dimension L can be related to the associated scattering phase shift δ(k) at momentum k through exact formulae derived by Lüscher. Because the spectra can be computed using numerical Monte Carlo techniques, the scattering phases can be determined using Lüscher's formulae, and the corresponding decay widths found by fitting Breit-Wigner functions. Results of such a decay width calculation for an exotic hybrid (h) meson (J^{PC} = 1^{-+}) are presented for the decay channel h → πa1. The calculation employed Lüscher's formulae and an approximation of LQCD called the quenched approximation. Energy spectra for the h and πa1 systems were extracted using eigenvalues of a correlation matrix, and the corresponding scattering phase shifts were determined for a discrete set of πa1 momenta. Although the number of phase shift data points was sparse, fits to a Breit-Wigner model were made, resulting in a decay width of about 60 MeV.
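The last step above, extracting a width from sparse phase-shift points, amounts to a nonlinear fit of the Breit-Wigner form tan δ(E) = (Γ/2)/(E_R − E). A generic sketch follows; the data points are invented for illustration and are not the thesis results.

    import numpy as np
    from scipy.optimize import curve_fit

    def bw_phase(E, E_R, Gamma):
        """Breit-Wigner phase shift; delta passes through pi/2 at E = E_R."""
        return np.arctan2(Gamma / 2.0, E_R - E)

    E = np.array([1.8, 1.9, 2.0, 2.1, 2.2])        # invented energies (GeV)
    delta = np.array([0.4, 0.8, 1.6, 2.3, 2.6])    # invented phase shifts (rad)
    (E_R, Gamma), _ = curve_fit(bw_phase, E, delta, p0=[2.0, 0.1])
    print(E_R, Gamma * 1000)                       # fitted resonance energy, width in MeV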
Abstract:
Since the 1950s, the theory of deterministic and nondeterministic finite automata (DFAs and NFAs, respectively) has been a cornerstone of theoretical computer science. In this dissertation, our main object of study is minimal NFAs. In contrast with minimal DFAs, minimal NFAs are computationally challenging: first, there can be more than one minimal NFA recognizing a given language; second, the problem of converting an NFA to a minimal equivalent NFA is NP-hard, even for NFAs over a unary alphabet. Our study is based on the development of two main theories, inductive bases and partials, which in combination form the foundation for an incremental algorithm, ibas, to find minimal NFAs. An inductive basis is a collection of languages with the property that it can generate (through union) each of the left quotients of its elements. We prove a fundamental characterization theorem which says that a language can be recognized by an n-state NFA if and only if it can be generated by an n-element inductive basis. A partial is an incompletely-specified language. We say that an NFA recognizes a partial if its language extends the partial, meaning that the NFA’s behavior is unconstrained on unspecified strings; it follows that a minimal NFA for a partial is also minimal for its language. We therefore direct our attention to minimal NFAs recognizing a given partial. Combining inductive bases and partials, we generalize our characterization theorem, showing that a partial can be recognized by an n-state NFA if and only if it can be generated by an n-element partial inductive basis. We apply our theory to develop and implement ibas, an incremental algorithm that finds minimal partial inductive bases generating a given partial. In the case of unary languages, ibas can often find minimal NFAs of up to 10 states in about an hour of computing time; with brute-force search this would require many trillions of years.
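As background, an NFA is simulated by tracking the set of states reachable after each symbol; the toy machine below (strings over {a, b} ending in "ab") is illustrative only and is unrelated to the ibas algorithm itself.

    def nfa_accepts(delta, start, accept, word):
        """delta maps (state, symbol) -> set of successor states."""
        current = {start}
        for sym in word:
            current = set().union(*(delta.get((q, sym), set()) for q in current))
        return bool(current & accept)

    # Three-state NFA accepting strings that end in "ab".
    delta = {(0, "a"): {0, 1}, (0, "b"): {0}, (1, "b"): {2}}
    print(nfa_accepts(delta, 0, {2}, "aab"))   # True
    print(nfa_accepts(delta, 0, {2}, "aba"))   # False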
Abstract:
Proofs by induction are central to many computer science areas such as data structures, theory of computation, programming languages, program efficiency (time complexity), and program correctness. Proofs by induction can also improve students’ understanding and performance of computer science concepts such as programming languages, algorithm design, and recursion, as well as serve as a medium for teaching them. Even though students are exposed to proofs by induction in many courses of their curricula, they still have difficulties understanding and performing them. This impacts the whole course of their studies, since proofs by induction are omnipresent in computer science. Specifically, students do not gain a conceptual understanding of induction early in the curriculum and, as a result, have difficulties applying it to more advanced areas later in their studies. The goal of my dissertation is twofold: (1) identifying sources of computer science students’ difficulties with proofs by induction, and (2) developing a new approach to teaching proofs by induction by way of an interactive and multimodal electronic book (e-book). For the first goal, I undertook a study to identify possible sources of computer science students’ difficulties with proofs by induction. Its results suggest a close correlation between students’ understanding of inductive definitions and their understanding and performance of proofs by induction. In designing and developing my e-book, I took into consideration the results of my study, as well as the drawbacks of current methodologies for teaching proofs by induction in computer science. I designed my e-book to be used as a standalone and complete educational environment. I also conducted a study on the effectiveness of my e-book in the classroom. Its results suggest that, unlike current methodologies, my e-book helped students overcome many of their difficulties and gain a conceptual understanding of proofs by induction.
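For concreteness, this is the textbook shape of such a proof (a standard example, not taken from the e-book), written in LaTeX:

    \textbf{Claim.} For all $n \ge 1$, $\sum_{i=1}^{n} i = \frac{n(n+1)}{2}$.

    \textbf{Base case} ($n = 1$): $\sum_{i=1}^{1} i = 1 = \frac{1 \cdot 2}{2}$.

    \textbf{Inductive step.} Assume the claim for $n = k$. Then
    \[
      \sum_{i=1}^{k+1} i = \frac{k(k+1)}{2} + (k+1) = \frac{(k+1)(k+2)}{2},
    \]
    which is the claim for $n = k + 1$. Hence the claim holds for all $n \ge 1$. \qed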
Abstract:
Recent technological developments have made it possible to design various microdevices in which fluid flow and heat transfer are involved. For the proper design of such systems, the governing physics needs to be investigated. Because of the difficulty of studying complex geometries at micro scales with experimental techniques, computational tools are developed to analyze and simulate flow and heat transfer in microgeometries. However, conventional numerical methods based on the Navier-Stokes equations fail to predict some aspects of microflows, such as nonlinear pressure distribution, increased mass flow rate, slip flow, and temperature jump at the solid boundaries. This necessitates the development of new computational methods, based on kinetic theory, that are both accurate and computationally efficient. In this study, the lattice Boltzmann method (LBM) was used to investigate flow and heat transfer in micro-sized geometries. The LBM is based on the Boltzmann equation, which is valid in the whole rarefaction regime observed in microflows. Results were obtained for isothermal channel flows at Knudsen numbers higher than 0.01 at different pressure ratios. LBM solutions for micro-Couette and micro-Poiseuille flow were found to be in good agreement with the analytical solutions valid in the slip flow regime (0.01 < Kn < 0.1) and with direct simulation Monte Carlo solutions valid in the transition regime (0.1 < Kn < 10) for pressure distribution and velocity field. The isothermal LBM was further extended to simulate flows including heat transfer. The method was first validated for continuum channel flows with and without constrictions by comparing the thermal LBM results against accurate solutions obtained from analytical equations and the finite element method. Finally, the capability of the thermal LBM was improved by adding the effect of rarefaction, and the method was used to analyze the behavior of gas flow in microchannels. The major finding of this research is that the newly developed particle-based method described here can be used as an alternative numerical tool for studying the non-continuum effects observed in micro-electro-mechanical systems (MEMS).
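To give a flavor of the method, the sketch below is a minimal D2Q9 BGK lattice Boltzmann loop: distributions relax toward a local equilibrium and then stream on a periodic grid, here simulating the viscous decay of a shear wave. It illustrates the general technique only, with invented parameters, and omits the slip boundary conditions and thermal extensions developed in the dissertation.

    import numpy as np

    nx, ny, tau = 64, 64, 0.8                            # grid and relaxation time
    c = np.array([(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
                  (1, 1), (-1, 1), (-1, -1), (1, -1)])   # D2Q9 velocities
    w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)         # D2Q9 weights

    def equilibrium(rho, ux, uy):
        cu = 3.0 * (c[:, 0, None, None] * ux + c[:, 1, None, None] * uy)
        usq = 1.5 * (ux**2 + uy**2)
        return w[:, None, None] * rho * (1.0 + cu + 0.5 * cu**2 - usq)

    x = np.arange(nx)
    rho = np.ones((nx, ny))
    ux = np.zeros((nx, ny))
    uy = 0.01 * np.sin(2 * np.pi * x / nx)[:, None] * np.ones((nx, ny))
    f = equilibrium(rho, ux, uy)                    # start from equilibrium

    for _ in range(1000):
        rho = f.sum(axis=0)                         # macroscopic density
        ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
        uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
        f -= (f - equilibrium(rho, ux, uy)) / tau   # BGK collision
        for i in range(9):                          # streaming along each velocity
            f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)

    # Amplitude decays as exp(-nu k^2 t) with nu = (tau - 0.5) / 3.
    print(np.abs(uy).max())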