971 results for Computer Reading Program
Abstract:
This study sought to determine whether participation in a home education learning program would affect the perceived parental self-efficacy of parents/caregivers who assist with home-learning assignments and increase their home-learning involvement practices. The study also examined the relationship between the parental involvement practice of completing interactive home-learning assignments and the reading comprehension achievement of first-grade students. A total of 146 students and their parents/caregivers, a convenience sample drawn from eight first-grade classes, participated in the study. Four classes (n=74) were selected as the experimental group and four classes (n=72) served as the control group. There were 72 girls and 74 boys in the sample, and the median age was 6 years 6 months. The study employed a quasi-experimental research design utilizing the eight existing first-grade classes. It examined the effects of a home-learning support intervention program on the perceived efficacy levels of the participating parents/caregivers, as measured by the Parent Perceptions of Parent Efficacy Scale (Hoover-Dempsey, Bassler, & Brissie, 1992) administered on a pre/post basis. The amount and type of parent involvement in the completion of home assignments was determined by means of a locally developed instrument, the H.E.L.P. Parent Involvement Home-learning Scale, administered on a pre/post basis. Student achievement in reading comprehension was measured via the reading subtest of the Brigance CIBS, administered pre and post. The elementary students and their parents/caregivers participated in a 12-week interactive home-learning intervention program that required parent/caregiver assistance. Results revealed that the experimental group of parents/caregivers showed a significant increase in perceived self-efficacy from pre to post, p<.001, and significantly higher levels of parental involvement in seven home-learning activities, p<.001, than the control group parents/caregivers. The experimental group students demonstrated significantly higher reading levels than the control group students, p<.001. This study provided evidence that interactive home-learning activities improved parental self-efficacy and parental involvement in home-learning activities, and improved the reading comprehension of the experimental group in comparison to the control group.
Abstract:
Safeguarding organizations against opportunism and severe deception in computer-mediated communication (CMC) presents a major challenge to CIOs and IT managers. New insights into linguistic cues of deception derive from the speech acts innate to CMC. Applying automated text analysis to archival email exchanges in a CMC system that was part of a reward program, we assess the ability of word-use (micro-level), message-development (macro-level), and intertextual-exchange (meta-level) cues to detect severe deception by business partners. We empirically assess the predictive ability of our framework using an ordinal multilevel regression model. Results indicate that deceivers minimize the use of referencing and self-deprecation but include more superfluous descriptions and flattery. Deceitful channel partners also over-structure their arguments and rapidly mimic the linguistic style of the account manager across dyadic email exchanges. Thanks to its diagnostic value, the proposed framework can support firms' decision-making and guide the development of compliance monitoring systems.
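The abstract describes its cue measures only at the category level; as a purely hypothetical illustration of what a micro-level word-use cue might look like in automated text analysis, the sketch below counts self-references and flattery terms per message (the word lists and all names are invented for the example and are not the study's lexicons):

# Hypothetical micro-level cue extraction: count self-references and
# flattery terms per message, normalized by message length.

SELF_REFERENCES = {"i", "me", "my", "mine", "myself"}
FLATTERY_TERMS = {"great", "wonderful", "brilliant", "excellent", "amazing"}

def micro_level_cues(message):
    tokens = [t.strip(".,!?;:").lower() for t in message.split()]
    n = max(len(tokens), 1)
    return {
        "self_reference_rate": sum(t in SELF_REFERENCES for t in tokens) / n,
        "flattery_rate": sum(t in FLATTERY_TERMS for t in tokens) / n,
    }

print(micro_level_cues("Your program is brilliant, and I value my partners."))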
Abstract:
In this study, three chronicles from Spanish national newspapers (one general-interest paper and two sports papers) were analyzed. The chronicles covered the final of Spain's King's Cup soccer tournament in 2014. The aim of the study was to determine whether the chronicles influenced readers' perception of justice and, consequently, whether this influence could create a particular predisposition to participate in acts of protest. A total of 462 university students participated. The results showed that the perception of justice differed depending on which chronicle was read. However, no clear influence on the willingness to participate in acts of protest was found. These results should prompt reflection on the impact and influence of the sports press, and raise awareness of the indirect responsibility of every sector for the antisocial behaviors generated by soccer in Spain.
Abstract:
This article presents a study analyzing the difficulties teachers face in planning, coordinating, and assessing key competencies in a sample of 23 schools. The topic has far-reaching implications, since poor educational practice around key competencies can infringe one of students' fundamental rights, the right to be assessed objectively (LODE: Art. 6b and RD 732/1995: Art. 13.1) and to pass the assessments required to obtain the minimum academic qualification awarded by the Spanish state. The research was developed from a twofold methodological perspective: first, a descriptive study presenting the fundamental characteristics of key competencies and the basic regulations for their development and assessment; second, a qualitative analysis procedure with two strands, using the Atlas-Ti program and the reticular-categorial approach of social network analysis with UCINET and the yED Graph Editor viewer, to examine the main difficulties and obstacles detected. The results show serious difficulties in the three dimensions analyzed, "planning," "coordination," and "assessment" of key competencies, especially regarding the need for teacher training, the assessment of the competencies, the methodology for their development, and the internal coordination processes required to achieve them in schools.
Abstract:
This report is the product of a first-year research project in the University Transportation Centers Program. This project was carried out by an interdisciplinary research team at The University of Iowa's Public Policy Center. The project developed a computerized system to support decisions on locating facilities that serve rural areas while minimizing transportation costs. The system integrates transportation databases with algorithms that specify efficient locations and allocate demand efficiently to service regions; the results of these algorithms are used interactively by decision makers. The authors developed documentation for the system so that others could apply it to estimate the transportation and route requirements of alternative locations and identify locations that meet certain criteria with the least cost. The system was developed and tested on two transportation-related problems in Iowa, and this report uses these applications to illustrate how the system can be used.
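The report does not reproduce its location algorithms here; as an illustration only, a minimal greedy heuristic for the p-median-style problem such location-allocation systems typically solve might look like the following (the distance matrix, demands, and all names are hypothetical stand-ins for the report's transportation databases):

# Illustrative greedy heuristic for locating p facilities so that total
# demand-weighted travel cost is minimized (a p-median-style problem).

def greedy_p_median(dist, demand, p):
    """dist[i][j]: travel cost from candidate site i to demand point j;
    demand[j]: demand at point j; p: number of facilities to open."""
    n_sites = len(dist)
    chosen = []
    for _ in range(p):
        best_site, best_cost = None, float("inf")
        for s in range(n_sites):
            if s in chosen:
                continue
            trial = chosen + [s]
            # Each demand point is served by its nearest open facility.
            cost = sum(
                demand[j] * min(dist[f][j] for f in trial)
                for j in range(len(demand))
            )
            if cost < best_cost:
                best_site, best_cost = s, cost
        chosen.append(best_site)
    return chosen, best_cost

# Toy example: 4 candidate sites, 3 demand points, open 2 facilities.
dist = [[2, 9, 7], [8, 3, 6], [5, 5, 1], [9, 8, 4]]
demand = [10, 20, 15]
sites, cost = greedy_p_median(dist, demand, p=2)
print(sites, cost)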
Abstract:
The Property and Equipment Department has a central supply of automotive parts, tools, and maintenance supplies. This central supply is used to stock the repair shop and also to supply parts to the various field garages and all departments of the Commission. The old procedure involved keeping track manually of all of the parts, some 22,000 items. All records, billings, and re-order points were kept manually. Many times a re-order point was discovered only by reaching into a bin and finding nothing there. To improve this situation, a computerized inventory control system was established. A complete record of the supplies stored in the central warehouse was prepared, and this information was used to produce a catalog. Each time an item is issued or received, it is processed through the inventory program. When the re-order point is reached, a notice to reorder is issued. The procedure for taking inventory has been improved, and a voucher invoice is now prepared by the computer for all issues to departments. These are some of the many benefits that have been derived from this system.
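As a minimal sketch of the re-order logic the abstract describes (the item records, field names, and threshold values below are hypothetical):

# Minimal sketch of the re-order check: every issue or receipt updates
# the on-hand quantity, and a reorder notice is produced when stock
# falls to the re-order point.

inventory = {
    "oil-filter-A12": {"on_hand": 40, "reorder_point": 25, "reorder_qty": 100},
    "wiper-blade-B7": {"on_hand": 12, "reorder_point": 15, "reorder_qty": 50},
}

def post_transaction(item_id, qty_change):
    """qty_change is negative for an issue, positive for a receipt."""
    rec = inventory[item_id]
    rec["on_hand"] += qty_change
    if rec["on_hand"] <= rec["reorder_point"]:
        print(f"REORDER NOTICE: {item_id} at {rec['on_hand']}, "
              f"order {rec['reorder_qty']}")

post_transaction("oil-filter-A12", -20)   # issue 20 units -> triggers notice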
Abstract:
Since at least the 1980s, a growing number of companies have set up an ethics or compliance program within their organization. However, in the field of business management, there is a paucity of research concerning these management systems. This observation warranted the present investigation of one company's compliance program. Compliance programs are set up so that individuals working within an organization observe the laws and regulations that pertain to their work. This study used a constructivist grounded theory methodology to examine the process by which a specific compliance program, that of Siemens Canada Limited, was implemented throughout its organization. In keeping with this methodology, instead of proceeding in accordance with a particular theoretical framework, the study established a number of theoretical constructs used strictly as reference points. The study's research question was: what are the characteristics of the process by which Siemens' compliance program integrated itself into the existing organizational structure and gained employee acceptance? Data consisted of documents produced by the company and of interviews with twenty-four managers working for Siemens Canada Limited. The researcher used QSR NVivo computer-assisted software to code transcripts and to help analyze the interviews and documents. Triangulation was done by using a number of analysis techniques and by constantly comparing findings with extant theory. A descriptive model of the implementation process, grounded in the experience of participants and in the contents of the documents, emerged from the data. The process was called "Remolding," remolding being the core category that emerged. This main process consisted of two sub-processes identified as "embedding" and "appraising." The investigation provided a detailed account of the appraising process. It identified that employees appraised the compliance program according to three facets: the impact of the program on the employee's daily activities, the relationship employees have with the local compliance organization, and the relationship employees have with the corporate ethics identity. The study suggests that a company considering implementing a compliance program should attend to all three facets. In particular, any company interested in designing and implementing a compliance program should pay particular attention to its corporate ethics identity, because employees' acceptance of the program is influenced by their comparison of the company's ethics identity with their local ethics identity. The implications of the study suggest that personnel responsible for the development and organizational support of a compliance program should understand the appraisal process by which employees build their relationship with the program. The originality of this study is that it emphasizes that companies must pay special attention to developing a corporate ethics identity that is coherent, well documented, and well explained.
Abstract:
We explore the relationships between the construction of a work of art and the crafting of a computer program in Java and suggest that the structure of paintings and drawings may be used to teach the fundamental concepts of computer programming. This movement "from Art to Science", using art to drive computing, complements the common use of computing to inform art. We report on initial experiences using this approach with undergraduate and postgraduate students. An embryonic theory of the correspondence between art and computing is presented and a methodology proposed to develop this project further.
Abstract:
Today's data loggers have many functions, which is reflected in the software used to communicate with them. They offer more functions than individual companies and private users need, which makes the software unnecessarily complicated. By reducing the number of configuration options, the software can be made smaller, faster, and easier to learn. The work was carried out at Inventech Europe AB, which supplied data loggers for temperature and humidity measurement. The company wanted to investigate the possibility of developing a program that people with limited computer experience could quickly learn to use, so the purpose of this work was to examine what such a program could look like. The focus of the work was on the design process, and the various stages of the process were visualized with UML diagrams. Since the project was relatively small, a development process following the waterfall model was chosen, in which the stages (specification, design, implementation, test) are performed in sequence; each stage is assumed to be complete before the next begins. The model works best when the project is small and well defined. Unfortunately, the company's requirements for how the program should work changed several times during the project, so a more flexible development process should have been chosen to accommodate changes arising along the way. The end result was a functional prototype that was easy to use and had no more configuration options than necessary. The prototype can be used as a base for adding custom functionality, and to demonstrate this, two additional functions were included. One was the ability to save collected data to an external database, which could then serve as a source for other programs that might, for example, visualize the data in various graphs. To make it easy to identify the connected data loggers, the ability to name the individual devices was also included.
Abstract:
Many-core systems are emerging from the need for more computational power and power efficiency. However, many issues still surround many-core systems. These systems need specialized software before they can be fully utilized, and the hardware itself may differ from conventional computational systems. To gain efficiency from a many-core system, programs need to be parallelized. In many-core systems the cores are small and less powerful than cores used in traditional computing, so running a conventional program is not an efficient option. Also, in Network-on-Chip based processors the network might get congested and the cores might work at different speeds. In this thesis, a dynamic load balancing method is proposed and tested on the Intel 48-core Single-Chip Cloud Computer by parallelizing a fault simulator. The maximum speedup is difficult to obtain due to severe bottlenecks in the system. In order to exploit all the available parallelism of the Single-Chip Cloud Computer, a runtime approach capable of dynamically balancing the load during the fault simulation process is used. The proposed dynamic fault simulation approach on the Single-Chip Cloud Computer shows up to 45X speedup compared to a serial fault simulation approach. Many-core systems can draw enormous amounts of power, and if this power is not controlled properly, the system might get damaged. One way to manage power is to set a power budget for the system. But if this power is drawn by just a few of the many cores, those few cores get extremely hot and might get damaged. Due to the increase in power density, multiple thermal sensors are deployed on the chip area to provide real-time temperature feedback for thermal management techniques. Thermal sensor accuracy is extremely prone to intra-die process variation and aging phenomena. These factors lead to a situation where thermal sensor values drift from the nominal values, which necessitates efficient calibration techniques before the sensor values are used. In addition, cores in modern many-core systems support dynamic voltage and frequency scaling. Thermal sensors located on cores are sensitive to the core's current voltage level, meaning that dedicated calibration is needed for each voltage level. In this thesis, a general-purpose software-based auto-calibration approach is also proposed to calibrate thermal sensors across a range of voltages.
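The thesis targets the Intel SCC specifically; as a purely illustrative sketch of the dynamic load balancing pattern it describes, a shared work queue lets idle workers pull the next chunk of work, so faster cores naturally take more chunks (the "fault simulation" task below is a hypothetical stand-in, not the thesis's simulator):

# Illustrative dynamic load balancing via a shared work queue.

import multiprocessing as mp

def simulate_fault(fault_id):
    # Stand-in for simulating one fault; cost varies per fault.
    return fault_id, sum(i * i for i in range(1000 + 100 * (fault_id % 7)))

def worker(task_q, result_q):
    while True:
        fault_id = task_q.get()
        if fault_id is None:          # poison pill: no more work
            break
        result_q.put(simulate_fault(fault_id))

if __name__ == "__main__":
    n_workers = 4
    task_q, result_q = mp.Queue(), mp.Queue()
    for f in range(100):              # enqueue the fault list
        task_q.put(f)
    for _ in range(n_workers):        # one poison pill per worker
        task_q.put(None)
    procs = [mp.Process(target=worker, args=(task_q, result_q))
             for _ in range(n_workers)]
    for p in procs:
        p.start()
    results = [result_q.get() for _ in range(100)]
    for p in procs:
        p.join()
    print(len(results), "faults simulated")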
Abstract:
The protein folding problem has been one of the most challenging subjects in biological physics due to its complexity. Energy landscape theory, based on statistical mechanics, provides a thermodynamic interpretation of the protein folding process. We have been working to answer fundamental questions about protein-protein and protein-water interactions, which are very important for describing the energy landscape surface of proteins correctly. First, we present a new method for computing protein-protein interaction potentials of solvated proteins directly from SAXS data. An ensemble of proteins was modeled by Metropolis Monte Carlo and Molecular Dynamics simulations, and the global X-ray scattering of the whole model ensemble was computed at each snapshot of the simulation. The interaction potential model was optimized iteratively with a Levenberg-Marquardt algorithm. Second, we report that terahertz spectroscopy directly probes hydration dynamics around proteins and determines the size of the dynamical hydration shell. We also present the sequence and pH dependence of the hydration shell and the effect of hydrophobicity. In addition, kinetic terahertz absorption (KITA) spectroscopy is introduced to study the refolding kinetics of ubiquitin and its mutants. KITA results are compared to small-angle X-ray scattering, tryptophan fluorescence, and circular dichroism results. We propose that KITA monitors the rearrangement of hydrogen bonding during secondary structure formation. Finally, we present the development of the automated single molecule operating system (ASMOS) for a high-throughput single molecule detector, which levitates a single protein molecule in a 10 µm diameter droplet by laser guidance. I have also performed supporting calculations and simulations with my own program codes.
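The optimization step is described only at a high level; as an illustrative sketch under stated assumptions, a Levenberg-Marquardt fit of model parameters to a measured SAXS curve can be set up with SciPy (the model intensity function and the synthetic "data" below are hypothetical placeholders, not the study's form):

# Illustrative Levenberg-Marquardt fit of interaction-potential parameters
# to a SAXS intensity curve.

import numpy as np
from scipy.optimize import least_squares

def model_intensity(q, amplitude, radius):
    # Placeholder Guinier-like decay, standing in for the full
    # ensemble-averaged scattering computation.
    return amplitude * np.exp(-(q * radius) ** 2 / 3.0)

q = np.linspace(0.01, 0.5, 50)                     # scattering vector
measured = model_intensity(q, 2.0, 15.0)
measured += np.random.default_rng(0).normal(0, 0.01, q.size)  # noisy "data"

def residuals(params):
    amplitude, radius = params
    return model_intensity(q, amplitude, radius) - measured

fit = least_squares(residuals, x0=[1.0, 10.0], method="lm")
print("fitted amplitude, radius:", fit.x)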
Abstract:
This article introduces a theoretical framework for the analysis of the player character (PC) in offline computer role-playing games (cRPGs). It derives from the assumption that the character constitutes the focal point of the game, around which all the other elements revolve. This underlying observation became the foundation of the Player Character Grid and its constituent Pivot Player Character Model, a conceptual framework illustrating the experience of gameplay as perceived through the PC's eyes. Although video game characters have been scrutinised from many different perspectives, a systematic framework has not yet been introduced. This study aims to fill that void by proposing a model replicable across the cRPG genre. It was largely inspired by Anne Ubersfeld's semiological research on the dramatic character in Reading Theatre I (1999) and is demonstrated with reference to The Witcher (CD Projekt RED 2007).
Abstract:
In this article, the change in examinee effort during an assessment, which we will refer to as persistence, is modeled as an effect of item position. A multilevel extension is proposed to analyze hierarchically structured data and decompose the individual differences in persistence. Data from the 2009 Programme for International Student Assessment (PISA) reading assessment from N = 467,819 students in 65 countries are analyzed with the proposed model, and the results are compared across countries. A decrease in examinee effort during the PISA reading assessment was found consistently across countries, with individual differences within and between schools. Both the decrease and the individual differences are more pronounced in lower-performing countries. Within schools, persistence is slightly negatively correlated with reading ability; at the school level, however, this correlation is positive in most countries. The results of our analyses indicate that it is important to model and control examinee effort in low-stakes assessments.
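The abstract does not reproduce the model equations; one hedged way to formalize "persistence as an effect of item position" in an item response framework is a Rasch-type model with a person-specific position slope (the notation below is illustrative, not necessarily the paper's):

\[
  \operatorname{logit} P(X_{pi} = 1) = \theta_p - \beta_i + \delta_p \cdot \mathrm{pos}(i),
\]

where \(\theta_p\) is person \(p\)'s ability, \(\beta_i\) is item \(i\)'s difficulty, \(\mathrm{pos}(i)\) is the item's position in the booklet, and \(\delta_p\) is the person's persistence (a negative \(\delta_p\) means declining effort). A multilevel decomposition of the individual differences could then take the form

\[
  \theta_p \sim \mathcal{N}(\mu_{\theta,s(p)}, \sigma^2_{\theta}), \qquad
  \delta_p \sim \mathcal{N}(\mu_{\delta,s(p)}, \sigma^2_{\delta}),
\]

with \(s(p)\) denoting the school of person \(p\), so that variance in persistence splits into within-school and between-school components.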
Abstract:
The big data era has dramatically transformed our lives; however, security incidents such as data breaches can put sensitive data (e.g., photos, identities, genomes) at risk. To protect users' data privacy, there is a growing interest in building secure cloud computing systems, which keep sensitive data inputs hidden, even from computation providers. Conceptually, secure cloud computing systems leverage cryptographic techniques (e.g., secure multiparty computation) and trusted hardware (e.g., secure processors) to instantiate a "secure" abstract machine consisting of a CPU and encrypted memory, so that an adversary cannot learn information through either the computation within the CPU or the data in the memory. Unfortunately, evidence has shown that side channels (e.g., memory accesses, timing, and termination) in such a "secure" abstract machine may potentially leak highly sensitive information, including cryptographic keys that form the root of trust for the secure systems. This thesis broadly expands the investigation of a research direction called trace oblivious computation, where programming language techniques are employed to prevent side-channel information leakage. We demonstrate the feasibility of trace oblivious computation by formalizing and building several systems, including GhostRider, a hardware-software co-design that provides a hardware-based trace oblivious computing solution; SCVM, an automatic RAM-model secure computation system; and ObliVM, a programming framework that helps programmers develop oblivious applications. All of these systems enjoy formal security guarantees while demonstrating better performance than prior systems, by one to several orders of magnitude.
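As a minimal illustration of the trace-oblivious style such systems enforce (a sketch of the general idea, not GhostRider's, SCVM's, or ObliVM's actual API), a branch on secret data can be replaced by straight-line code whose instruction and memory-access trace is independent of the secret:

# Illustrative data-oblivious conditional: both sides are always computed
# and combined arithmetically, so the trace does not depend on the secret.

def oblivious_select(secret_bit, a, b):
    """Return a if secret_bit == 1 else b, without branching on the secret."""
    return secret_bit * a + (1 - secret_bit) * b

def oblivious_max(x, y):
    gt = int(x > y)   # in a real system this comparison is itself made secure
    return oblivious_select(gt, x, y)

print(oblivious_max(3, 7))   # 7, produced via the same trace for any inputs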
Abstract:
Many children in the United States begin kindergarten unprepared to converse in the academic language surrounding instruction, putting them at greater risk for later language and reading difficulties. Importantly, correlational research has shown there are certain experiences prior to kindergarten that foster the oral language skills needed to understand and produce academic language. The focus of this dissertation was on increasing one of these experiences: parent-child conversations about abstract and non-present concepts, known as decontextualized language (DL). Decontextualized language involves talking about non-present concepts, such as events that happened in the past or future, or abstract discussions such as providing explanations or defining unknown words. As caregivers' decontextualized language input to children aged three to five is consistently correlated with kindergarten oral language skills and later reading achievement, it is surprising that no experimental research has been done to establish this relation causally. The study described in this dissertation filled this gap in the literature by designing, implementing, and evaluating a decontextualized language training program for parents of 4-year-old children (N=30). After an initial measure of decontextualized language was obtained, parents were randomly assigned to a control condition or a training condition, the latter of which educated parents about decontextualized language and why it is important. All parents then audio-recorded four mealtime conversations over the next month, which were transcribed and reliably coded for decontextualized language. Results indicated that trained parents boosted their DL from roughly 17 percent of their total utterances at baseline to approximately 50 percent by the mid-point of the study, and remained at these boosted levels throughout the duration of the study. Children's DL was boosted by similar margins, but no improvement was observed in children's oral language skills, which were measured prior to and one month following training. Further, exploratory analyses pointed to parents' initial use of DL and their theories of the malleability of intelligence (i.e., growth mindsets) as moderators of training gains. Altogether, these findings are a first step in establishing DL as a viable strategy for giving children the oral language skills needed to begin kindergarten ready to succeed in the classroom.