14 results for Application specific instruction-set processor

in Digital Commons at Florida International University


Relevance:

100.00%

Publisher:

Abstract:

Pronunciation training has traditionally been viewed as of limited importance in a communicatively oriented foreign language curriculum (Pennington & Richards, 1986). Many language instructors seemingly deny the usefulness of phonetic training and rely on a listen-and-repeat method with the use of audiotapes (Bate, 1989; Callamand & Pedoya, 1984; Jones, 1997). Beginners in French classes face the challenge of mastering a complex sound and grapheme-phoneme correspondence system without the benefit of specific instruction. Their pronunciation errors develop mostly from bad habits while decoding from print to sound (Dansereau, 1995). The purpose of this study was to investigate the effectiveness of basic phonetic/phonics instruction on reading pronunciation accuracy in a French I language course. The sample consisted of two groups of French I students from Florida International University, who received the same instruction in French language and culture during the fall semester of 1999. Only the experimental group received additional phonetic/phonics training. The instrument consisted of three recorded reading tasks: isolated familiar words, isolated unfamiliar words, and dialogue. Research questions were analyzed using a one-way multivariate analysis of variance. Significant differences were found between the two groups on scores for each of the three sections of the instrument, and on the total scores. These findings support the hypothesis of the study and reveal the effectiveness of phonetic/phonics training for beginners of French. The findings imply that beginning language students should receive the minimum knowledge they need to master the French phoneme-grapheme (sound-spelling) system.
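
The analysis described above (a one-way multivariate analysis of variance comparing the two groups on the three section scores) can be illustrated with a minimal sketch. The data and column names (group, familiar, unfamiliar, dialogue) are invented for illustration and are not taken from the study.

```python
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Invented 0-100 scores for illustration: three reading-task sections per student.
scores = pd.DataFrame({
    "group":      ["control"] * 4 + ["experimental"] * 4,
    "familiar":   [62, 58, 65, 60, 78, 82, 75, 80],
    "unfamiliar": [40, 45, 38, 42, 60, 66, 58, 63],
    "dialogue":   [55, 50, 57, 53, 72, 70, 74, 69],
})

# One-way MANOVA: does the vector of section scores differ between groups?
fit = MANOVA.from_formula("familiar + unfamiliar + dialogue ~ group", data=scores)
print(fit.mv_test())  # Wilks' lambda, Pillai's trace, etc.
```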

Relevance:

100.00%

Publisher:

Abstract:

To carry out their specific roles in the cell, genes and gene products often work together in groups, forming many relationships among themselves and with other molecules. Such relationships include physical protein-protein interaction relationships, regulatory relationships, metabolic relationships, genetic relationships, and much more. With advances in science and technology, high throughput technologies have been developed to simultaneously detect tens of thousands of pairwise protein-protein interactions and protein-DNA interactions. However, the data generated by high throughput methods are prone to noise. Furthermore, the technology itself has its limitations and cannot detect all kinds of relationships between genes and their products. Thus there is a pressing need to investigate all kinds of relationships and their roles in a living system using bioinformatic approaches; this is a central challenge in Computational Biology and Systems Biology. This dissertation focuses on exploring relationships between genes and gene products using bioinformatic approaches. Specifically, we consider problems related to regulatory relationships, protein-protein interactions, and semantic relationships between genes. A regulatory element is an important pattern or "signal", often located in the promoter of a gene, which is used in the process of turning a gene "on" or "off". Predicting regulatory elements is a key step in exploring the regulatory relationships between genes and gene products. In this dissertation, we consider the problem of improving the prediction of regulatory elements by using comparative genomics data. With regard to protein-protein interactions, we have developed bioinformatics techniques to estimate support for the data on these interactions. While protein-protein interactions and regulatory relationships can be detected by high throughput biological techniques, there is another type of relationship, the semantic relationship, that cannot be detected by a single technique but can be inferred using multiple sources of biological data. The contributions of this thesis involved the development and application of a set of bioinformatic approaches that address the challenges mentioned above. These included (i) an EM-based algorithm that improves the prediction of regulatory elements using comparative genomics data, (ii) an approach for estimating the support of protein-protein interaction data, with application to functional annotation of genes, (iii) a novel method for inferring functional networks of genes, and (iv) techniques for clustering genes using multi-source data.
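
As a concrete illustration of the kind of EM-based motif discovery behind contribution (i), the following is a minimal, MEME/OOPS-style sketch for finding a shared fixed-width regulatory element in a set of promoter sequences. It is an assumed simplification for illustration only, not the dissertation's algorithm (which additionally exploits comparative genomics data); the sequences are invented.

```python
import numpy as np

BASES = "ACGT"
IDX = {b: i for i, b in enumerate(BASES)}

def em_motif(seqs, W, n_iter=50, seed=0):
    """EM for a motif of width W, assuming one occurrence per sequence (OOPS)."""
    rng = np.random.default_rng(seed)
    pwm = rng.dirichlet(np.ones(4), size=W)   # position weight matrix, W x 4
    bg = np.full(4, 0.25)                     # uniform background model
    for _ in range(n_iter):
        counts = np.full((W, 4), 0.1)         # pseudocounts
        for s in seqs:
            x = np.array([IDX[c] for c in s])
            n_pos = len(s) - W + 1
            # E-step: relative likelihood of the motif starting at each position
            like = np.array([
                np.prod(pwm[np.arange(W), x[p:p + W]] / bg[x[p:p + W]])
                for p in range(n_pos)
            ])
            post = like / like.sum()
            # M-step contribution: expected base counts at each motif column
            for p, w in enumerate(post):
                for j in range(W):
                    counts[j, x[p + j]] += w
        pwm = counts / counts.sum(axis=1, keepdims=True)
    return pwm

seqs = ["ACGTGACGTTTT", "TTTACGTGACCC", "GGACGTGATTTT"]
print(em_motif(seqs, W=6).round(2))
```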

Relevance:

100.00%

Publisher:

Abstract:

Access control (AC) is a necessary defense against a large variety of security attacks on the resources of distributed enterprise applications. However, to be effective, AC in some application domains has to be fine-grained, support the use of application-specific factors in authorization decisions, and consistently and reliably enforce organization-wide authorization policies across enterprise applications. Because existing middleware technologies do not provide a complete solution, application developers resort to embedding AC functionality in application systems. This coupling of AC functionality with application logic causes significant problems, including tremendously difficult, costly, and error-prone development, integration, and overall ownership of application software. The way AC for application systems is engineered needs to be changed. In this dissertation, we propose an architectural approach for engineering AC mechanisms to address the above problems. First, we develop a framework for implementing the role-based access control (RBAC) model using AC mechanisms provided by CORBA Security. For those application domains where the granularity of CORBA controls and the expressiveness of the RBAC model suffice, our framework addresses the stated problem. In the second and main part of our approach, we propose an architecture for an authorization service, RAD, to address the problem of controlling access to distributed application resources when the granularity and support for complex policies by middleware AC mechanisms are inadequate. Applying this architecture, we developed a CORBA-based application authorization service (CAAS). Using CAAS, we studied the main properties of the architecture and showed how they can be substantiated by employing CORBA and Java technologies. Our approach enables a wide-ranging solution for controlling the resources of distributed enterprise applications.
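
For orientation only, here is a toy sketch (in Python rather than the CORBA and Java technologies the dissertation employs) of the kind of user-to-role-to-permission check that such a framework factors out of application code; the roles, permissions, and users are invented, and a real deployment would evaluate far richer, application-specific policies.

```python
# Toy RBAC check: user -> roles -> (resource, operation) permissions.
# All names below are hypothetical.
ROLE_PERMISSIONS = {
    "teller":  {("account", "read"), ("account", "credit")},
    "auditor": {("account", "read"), ("audit_log", "read")},
}
USER_ROLES = {"alice": {"teller"}, "bob": {"auditor"}}

def access_allowed(user, resource, operation):
    """Return True if any of the user's roles grants (resource, operation)."""
    return any(
        (resource, operation) in ROLE_PERMISSIONS.get(role, set())
        for role in USER_ROLES.get(user, set())
    )

assert access_allowed("alice", "account", "credit")
assert not access_allowed("bob", "account", "credit")
```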

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this investigation was to develop and implement a general-purpose VLSI (Very Large Scale Integration) test module based on an FPGA (Field Programmable Gate Array) system to verify the mechanical behavior and performance of MEMS sensors, with associated corrective capabilities, and to make use of the evolving System-C, a new open-source HDL (Hardware Description Language), for the design of the FPGA functional units. System-C is becoming widely accepted as a platform for modeling, simulating, and implementing systems consisting of both hardware and software components. In this investigation, a Dual-Axis Accelerometer (ADXL202E) and a Temperature Sensor (TMP03) were used for the test module verification. Results of the test module measurements were analyzed for repeatability and reliability, and then compared to the sensor datasheets. Further study ideas were identified based on the study and results analysis. ASIC (Application Specific Integrated Circuit) design concepts were also pursued.
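
As a hedged illustration of the kind of post-processing such a test module performs, the sketch below converts the ADXL202E's duty-cycle-modulated output to acceleration using the part's nominal scaling (roughly 50% duty cycle at 0 g and 12.5% duty-cycle change per g, per the datasheet) and summarizes repeatability; the pulse-width samples are invented, and the actual module does this in FPGA logic rather than Python.

```python
from statistics import mean, stdev

def duty_cycle_to_g(t1_us, t2_us):
    """t1_us: high time of the PWM pulse, t2_us: full period, in microseconds."""
    return (t1_us / t2_us - 0.5) / 0.125   # 50% duty at 0 g, ~12.5% per g

# Hypothetical repeated readings of a sensor held flat (expect ~0 g).
samples = [duty_cycle_to_g(t1, 1000.0) for t1 in (501, 499, 502, 500, 498)]
print(f"mean = {mean(samples):+.3f} g, stdev = {stdev(samples):.3f} g")
```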

Relevance:

40.00%

Publisher:

Abstract:

The purpose of this descriptive study was to evaluate the banking and insurance technology curriculum at ten junior colleges in Taiwan. The study focused on curriculum, curriculum materials, instruction, support services, student achievement, and job performance. Data were collected from a diverse sample of faculty, students, alumni, and employers. Questionnaires on the evaluation of curriculum at technical junior colleges were developed for use in this specific case. Data were collected from the sample described above and analyzed utilizing ANOVA, t-tests, and crosstabulations. Findings are presented which indicate that there is room for improvement in terms of meeting individual students' needs. Using Stufflebeam's CIPP model for curriculum evaluation, it was determined that the curriculum was adequate in terms of the knowledge and skills imparted to students. However, students were dissatisfied with the rigidity of the curriculum and the lack of opportunity to satisfy their individual needs. Employers were satisfied with both the academic preparation of students and their on-the-job performance. In sum, the curriculum of the two-year banking and insurance technology programs of junior colleges in Taiwan was shown to have adequately prepared a workforce to enter the industry. It is now time to look toward the future and adapt the curriculum and instruction to the future needs of an ever-evolving high-tech society.
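
Purely as an illustration of the group comparisons mentioned above (ANOVA across the surveyed stakeholder groups), the sketch below uses invented 1-5 satisfaction ratings; it is not the study's data or analysis code.

```python
import pandas as pd
from scipy import stats

# Invented satisfaction ratings by stakeholder group, for illustration only.
ratings = pd.DataFrame({
    "group": ["faculty"] * 3 + ["students"] * 3 + ["alumni"] * 3 + ["employers"] * 3,
    "satisfaction": [4, 4, 5, 3, 2, 3, 4, 3, 4, 5, 4, 4],
})
groups = [g["satisfaction"].values for _, g in ratings.groupby("group")]
f_stat, p_value = stats.f_oneway(*groups)   # one-way ANOVA across groups
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```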

Relevance:

40.00%

Publisher:

Abstract:

The purpose of this research was to demonstrate the applicability of reduced-size STR (Miniplex) primer sets to challenging samples and to provide the forensic community with new information regarding the analysis of degraded and inhibited DNA. The Miniplex primer sets were validated in accordance with guidelines set forth by the Scientific Working Group on DNA Analysis Methods (SWGDAM) in order to demonstrate the scientific validity of the kits. The Miniplex sets were also used in the analysis of DNA extracted from human skeletal remains and telogen hair. In addition, a method for evaluating the mechanism of PCR inhibition was developed using qPCR. The Miniplexes were demonstrated to be a robust and sensitive tool for the analysis of DNA with as low as 100 pg of template DNA. They also proved to be better than commercial kits in the analysis of DNA from human skeletal remains, with 64% of samples tested producing full profiles, compared to 16% for a commercial kit. The Miniplexes also produced amplification of nuclear DNA from human telogen hairs, with partial profiles obtained from as low as 60 pg of template DNA. These data suggest smaller PCR amplicons may provide a useful alternative to mitochondrial DNA for forensic analysis of degraded DNA from human skeletal remains, telogen hairs, and other challenging samples. In the evaluation of inhibition by qPCR, the effect of amplicon length and primer melting temperature was evaluated in order to determine the binding mechanisms of different PCR inhibitors. Several mechanisms were indicated by the inhibitors tested, including binding of the polymerase, binding to the DNA, and effects on the processivity of the polymerase during primer extension. The data obtained from qPCR illustrated a method by which the type of inhibitor could be inferred in forensic samples, and some methods of reducing inhibition for specific inhibitors were demonstrated. An understanding of the mechanism of the inhibitors found in forensic samples will allow analysts to select the proper methods for inhibition removal or the type of analysis that can be performed, and will increase the information that can be obtained from inhibited samples.
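
The inhibition analysis can be illustrated with a hedged sketch of one way such qPCR results might be summarized: the shift in cycle threshold (delta-Ct, inhibited minus uninhibited) for assays of different amplicon lengths, where a shift that grows with length points toward a DNA-binding inhibitor and a uniform shift is more consistent with an effect on the polymerase. All values below are invented.

```python
# name: (amplicon length in bp, Ct without inhibitor, Ct with inhibitor)
amplicons = {
    "short":  ( 70, 24.1, 25.0),
    "medium": (150, 24.3, 26.4),
    "long":   (300, 24.6, 29.8),
}
for name, (bp, ct_ctrl, ct_inh) in amplicons.items():
    # delta-Ct increasing with amplicon length would suggest DNA binding.
    print(f"{name:6s} {bp:4d} bp  delta-Ct = {ct_inh - ct_ctrl:+.1f}")
```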

Relevance:

40.00%

Publisher:

Abstract:

In his essay - Toward a Better Understanding of the Evolution of Hotel Development: A Discussion of Product-Specific Lodging Demand - by John A. Carnella, Consultant, Laventhol & Horwath, CPAs, New York, Carnella initially describes his piece by stating: “The diversified hotel product in the United States lodging market has resulted in latent room-night demand, or supply-driven demand resulting from the introduction of a lodging product which caters to a specific set of hotel patrons. The subject has become significant as the lodging market has moved toward segmentation with regard to guest room offerings. The author proposes that latent demand is a tangible, measurable phenomenon best understood in light of the history of the guest room product from its infancy to its present state.” The article opens with a brief depiction of hotel development in the United States, both pre- and post-World War II. To put it succinctly, the author wants you to know that the advent of the interstate highway system changed the complexion of the hotel industry in the U.S. “Two essential ingredients were necessary for the next phase of hotel development in this country. First was the establishment of the magnificently intricate infrastructure which facilitated motor vehicle transportation in and around the then 48 states of the nation,” says Carnella. “The second event…was the introduction of affordable highway travel.” Carnella goes on to say that the next “big thing” in hotel evolution was the introduction of affordable air travel. “With the airways filled with potential lodging guests, developers moved next to erect a new genre of hotel, the airport hotel,” Carnella advances his picture. Growth progressed with the arrival of the suburban hotel concept, which wasn’t fueled by developments in transportation but by changes in people’s living habits, i.e., suburban affiliations as opposed to urban and city population aggregates. The author explores the distinctions between full-service and limited-service lodging operations. “The market of interest with consideration to the extended-stay facility is one dominated by corporate office parks,” Carnella proceeds. These evolutionary stages speak to latent demand, and even further to segmentation of the market. “Latent demand… is a product-generated phenomenon in which the number of potential hotel guests increases as the direct result of the introduction of a new lodging facility,” Carnella brings his unique insight to the table with regard to the specialization process. The demand is already there, just waiting to be tapped. In closing, “…there must be a consideration of the unique attributes of a lodging facility relative to its ability to attract guests to a subject market, just as there must be an examination of the property's ability to draw guests from within the subject market,” Carnella proposes.

Relevance:

40.00%

Publisher:

Abstract:

The Everglades is a sub-tropical coastal wetland characterized, among other things, by its hydrological features and deposits of peat. Formation and preservation of organic matter in soils and sediments in this wetland ecosystem are critical for its sustainability, and hydrological processes are important drivers in the origin, transport, and fate of organic matter. With this in mind, organic matter dynamics in the greater Florida Everglades were studied through various organic geochemistry techniques, especially biomarkers and bulk and compound-specific δ13C and δD isotope analysis. The main objectives were focused on how different hydrological regimes in this ecosystem control organic matter dynamics, such as the mobilization of particulate organic matter (POM) in freshwater marshes and estuaries, and how organic geochemistry techniques can be applied to reconstruct Everglades paleo-hydrology. For this purpose, organic matter in typical vegetation, floc, surface soils, soil cores, and estuarine suspended particulates was characterized in samples selected along hydrological gradients in Water Conservation Area 3, Shark River Slough, and Taylor Slough. This research focused on three general themes: (1) assessment of the environmental dynamics and source-specific particulate organic carbon export in a mangrove-dominated estuary; (2) assessment of the origin, transport, and fate of organic matter in the freshwater marsh; and (3) assessment of historical changes in hydrological conditions in the Everglades (paleo-hydrology) through biomarkers and compound-specific isotope analyses. This study reports the first estimate of particulate organic carbon loss from mangrove ecosystems in the Everglades, provides evidence for particulate organic matter transport with regard to the formation of ridge and slough landscapes in the Everglades, and demonstrates the applicability of the combined biomarker and compound-specific stable isotope approach as a means to generate paleo-hydrological data in wetlands. The data suggest that: (1) carbon loss from mangrove estuaries is roughly split 50/50 between dissolved and particulate carbon; (2) hydrological remobilization of particulate organic matter from slough to ridge environments may play an important role in the maintenance of the Everglades freshwater landscape; and (3) historical changes in hydrology have resulted in significant vegetation shifts from historical slough-type vegetation to present ridge-type vegetation.
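
For readers unfamiliar with the notation, the δ13C and δD values referred to above follow the standard delta convention, δ = (R_sample / R_standard - 1) × 1000 per mil, where R is the heavy-to-light isotope ratio (13C/12C against VPDB for carbon, D/H against VSMOW for hydrogen). The sketch below is a generic illustration of that formula; the VPDB ratio shown is a commonly cited value, not a number from this study.

```python
def delta_per_mil(r_sample, r_standard):
    """Delta notation: per-mil deviation of a sample ratio from a standard ratio."""
    return (r_sample / r_standard - 1.0) * 1000.0

R_VPDB = 0.0111802  # commonly cited 13C/12C ratio of the VPDB standard
print(f"{delta_per_mil(0.0108448, R_VPDB):.1f} per mil")  # about -30, typical of C3 plant material
```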

Relevance:

40.00%

Publisher:

Abstract:

Students with specific learning disabilities (SLD) typically learn less history content than their peers without disabilities and show fewer learning gains. Even when they are provided with the same instructional strategies, many students with SLD struggle to grasp complex historical concepts and content area vocabulary. Many strategies involving technology have been used in the past to enhance learning for students with SLD in history classrooms. However, very few studies have explored the effectiveness of emerging mobile technology in K-12 history classrooms. This study investigated the effects of mobile devices (iPads) as an active student response (ASR) system on the acquisition of U.S. history content of middle school students with SLD. An alternating treatments single subject design was used to compare the effects of two interventions. There were two conditions and a series of pretest probesin this study. The conditions were: (a) direct instruction and studying from handwritten notes using the interactive notebook strategy and (b) direct instruction and studying using the Quizlet App on the iPad. There were three dependent variables in this study: (a) percent correct on tests, (b) rate of correct responses per minute, and (c) rate of errors per minute. A comparative analysis suggested that both interventions (studying from interactive notes and studying using Quizlet on the iPad) had varying degrees of effectiveness in increasing the learning gains of students with SLD. In most cases, both interventions were equally effective. During both interventions, all of the participants increased their percentage correct and increased their rate of correct responses. Most of the participants decreased their rate of errors. The results of this study suggest that teachers of students with SLD should consider a post lesson review in the form of mobile devices as an ASR system or studying from handwritten notes paired with existing evidence-based practices to facilitate students’ knowledge in U.S. history. Future research should focus on the use of other interactive applications on various mobile operating platforms, on other social studies subjects, and should explore various testing formats such as oral question-answer and multiple choice.
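
The three dependent measures listed above are simple session-level calculations; the sketch below shows one way they could be computed and compared across the two conditions, using invented session counts and durations rather than the study's data.

```python
def session_measures(n_correct, n_errors, minutes):
    """Percent correct, correct responses per minute, and errors per minute."""
    total = n_correct + n_errors
    return {
        "percent_correct": 100.0 * n_correct / total,
        "correct_per_min": n_correct / minutes,
        "errors_per_min": n_errors / minutes,
    }

# Hypothetical single sessions under each condition.
sessions = [
    ("interactive notebook", session_measures(n_correct=14, n_errors=6, minutes=10)),
    ("Quizlet on iPad",      session_measures(n_correct=16, n_errors=4, minutes=10)),
]
for condition, m in sessions:
    print(condition, {k: round(v, 1) for k, v in m.items()})
```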

Relevance:

40.00%

Publisher:

Abstract:

This study investigated the effects of repeated readings on the reading abilities of four third-, fourth-, and fifth-grade English language learners (ELLs) with specific learning disabilities (SLD). A multiple baseline probe design across subjects was used to explore the effects of repeated readings on four dependent variables: reading fluency (words read correctly per minute; wpm), number of errors per minute (epm), types of errors per minute, and answers to literal comprehension questions. Data were collected and analyzed during baseline, intervention, generalization probes, and maintenance probes. Throughout the baseline and intervention phases, participants read a passage aloud and received error correction feedback. During baseline, this was followed by fluency and literal comprehension question assessments. During intervention, this was followed by two oral repeated readings of the passage, after which the fluency and literal comprehension question assessments were administered. Generalization probes followed approximately 25% of all sessions and consisted of a single reading of a new passage at the same readability level. Maintenance sessions occurred 2, 4, and 6 weeks after the intervention ended. The results of this study indicated that repeated readings had a positive effect on the reading abilities of ELLs with SLD. Participants read more wpm, made fewer epm, and answered more literal comprehension questions correctly. Additionally, on average, generalization scores were higher in intervention than in baseline. Maintenance scores varied when compared to the last day of intervention; however, with the exception of the number of hesitations committed per minute, maintenance scores were higher than baseline means. This study demonstrated that repeated readings improved the reading abilities of ELLs with SLD and that gains generalized to untaught passages. Maintenance probes 2, 4, and 6 weeks following intervention indicated that mean reading fluency, errors per minute, and correct answers to literal comprehension questions remained above baseline levels. Future research should investigate the use of repeated readings with ELLs with SLD at various stages of reading acquisition. Further, future investigations may examine how repeated readings can be integrated into classroom instruction and assessments.

Relevance:

40.00%

Publisher:

Abstract:

Exclusionary school discipline results in students being removed from classrooms as a consequence of their disruptive behavior and may lead to subsequent suspension and/or expulsion. Literature documents that nondominant students, particularly Black males, are disproportionately impacted by exclusionary discipline, to the point that researchers from a variety of critical perspectives consider exclusionary school discipline an oppressive educational practice and condition. Little or no research examines specific teacher-student social interactions within classrooms that influence teachers’ decisions to use or not use exclusionary discipline. Therefore, this study set forth the central research question: In relation to classroom interactions in alternative education settings, what accounts for teachers’ use or non-use of exclusionary discipline with students? A critical social practice theory of learning served as the framework for exploring this question, and a critical microethnographic methodology informed the data collection and analysis. Criterion sampling was used to select four classrooms in the same alternative education school with two teachers who frequently and two who rarely used exclusionary discipline. Nine stages of data collection and reconstructive data analysis were conducted. Data collection involved video recorded classroom observations, digitally recorded interviews of teachers and students discussing selected video segments, and individual teacher interviews. Reconstructive data analysis procedures involved hermeneutic inferencing of possible underlying meanings, critical discourse analysis, interactive power analysis and role analysis, thematic analysis of the interactions in each classroom, and a final comparative analysis of the four classrooms. Four predominant themes of social interaction (resistance, conformism, accommodation, and negotiation) emerged with terminology adapted from Giroux’s (2001) theory of resistance in education and Third Space theory (Gutiérrez, 2008). Four types of power (normative, coercive, interactively established contracts, and charm), based on Carspecken’s (1996) typology, were found in the interactions between teacher and students in varying degrees for different purposes. This research contributes to the knowledge base on teacher-student classroom interactions, specifically in relation to exclusionary discipline. Understanding how the themes and varying power relations influence their decisions and actions may enable teachers to reduce use of exclusionary discipline and remain focused on positive teacher-student academic interactions.

Relevance:

40.00%

Publisher:

Abstract:

Many culturally and linguistically diverse (CLD) students with specific learning disabilities (SLD) struggle with the writing process. In particular, they have difficulty developing and expanding ideas, organizing and elaborating sentences, and revising and editing their compositions (Graham, Harris, & Larsen, 2001; Myles, 2002). Computer graphic organizers offer a possible solution to assist them in their writing. This study investigated the effects of a computer graphic organizer on the persuasive writing compositions of Hispanic middle school students with SLD. A multiple baseline design across subjects was used to examine its effects on six dependent variables: number of arguments and supporting details, number and percentage of transferred arguments and supporting details, planning time, writing fluency, syntactical maturity (measured by T-units, the shortest grammatical sentences without fragments), and overall organization. Data were collected and analyzed throughout baseline and intervention. Participants were taught persuasive writing and the writing process prior to baseline. During baseline, participants were given a prompt and asked to use paper and pencil to plan their compositions. A computer was used for typing and editing. Intervention required participants to use a computer graphic organizer for planning and then a computer for typing and editing. The planning sheets and written compositions were printed and analyzed daily, along with the time each participant spent on planning. The use of computer graphic organizers had a positive effect on planning and on the persuasive writing compositions. Increases were noted in the number of supporting details planned, percentage of supporting details transferred, planning time, writing fluency, syntactical maturity in number of T-units, and overall organization of the composition. Minimal to negligible increases were noted in the mean number of arguments planned and written. Varying effects were noted in the percentage of transferred arguments, and there was a decrease in mean T-unit length. This study extends the limited literature on the effects of computer graphic organizers as a prewriting strategy for Hispanic students with SLD. In order to fully gauge the potential of this intervention, future research should investigate the use of different features of computer graphic organizer programs, their effects on other writing genres, and their use with different populations.

Relevance:

40.00%

Publisher:

Abstract:

The design of interfaces to facilitate user search has become critical for search engines, e-commerce sites, and intranets. This study investigated the use of targeted instructional hints to improve search by measuring their quantitative effects on users' performance and satisfaction. The effects of syntactic, semantic, and exemplar search hints on user behavior were evaluated in an empirical investigation using naturalistic scenarios. Combining the three search hint components, each with two levels of intensity, in a factorial design generated eight search engine interfaces. Eighty participants took part in the study, each completing six realistic search tasks. Results revealed that the inclusion of search hints improved user effectiveness, efficiency, and confidence when using the search interfaces, but with complex interactions that require specific guidelines for search interface designers. These design guidelines will allow search designers to create more effective interfaces for a variety of search applications.
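
The eight interfaces follow directly from fully crossing the three hint components at two intensity levels each (a 2 x 2 x 2 factorial design). The sketch below enumerates those conditions; the "low"/"high" level labels are illustrative assumptions, not the study's terminology.

```python
from itertools import product

components = ("syntactic", "semantic", "exemplar")
levels = ("low", "high")   # assumed labels for the two intensity levels

# Full factorial crossing of the three hint components.
interfaces = [dict(zip(components, combo)) for combo in product(levels, repeat=3)]
assert len(interfaces) == 8   # 2 x 2 x 2 design -> eight interface conditions
for i, cfg in enumerate(interfaces, start=1):
    print(f"interface {i}: {cfg}")
```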