971 results for Computer Reading Program
Abstract:
Program slicing is a well-known family of techniques used to identify code fragments that depend on, or are depended upon by, specific program entities. They are particularly useful in the areas of reverse engineering, program understanding, testing and software maintenance. Most slicing methods, usually oriented towards the imperative or object paradigms, are based on some sort of graph structure representing program dependencies. Slicing techniques amount, therefore, to (sophisticated) graph traversal algorithms. This paper proposes a completely different approach to the slicing problem for functional programs. Instead of extracting program information to build an underlying dependency structure, we resort to standard program calculation strategies, based on the so-called Bird-Meertens formalism. The slicing criterion is specified either as a projection or a hiding function which, once composed with the original program, leads to the identification of the intended slice. Going through a number of examples, the paper suggests this approach may be an interesting, even if not completely general, alternative for slicing functional programs.
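The paper's own development is carried out in the Bird-Meertens calculus; purely as an informal illustration of the composition idea, the following Python sketch (all names invented here, not the paper's notation) shows a program returning a pair, a projection acting as the slicing criterion, and the slice one would calculate from their composition.

```python
# Illustrative sketch only: slicing a functional program by composing it
# with a projection and then simplifying the composition.

def program(xs):
    """Original program: returns both the sum and the length of xs."""
    total, count = 0, 0
    for x in xs:
        total += x
        count += 1
    return total, count

def fst(pair):
    """Slicing criterion: project the first component of the result."""
    return pair[0]

def slice_of_program(xs):
    """What calculating fst . program yields: the count logic drops out."""
    return sum(xs)

assert fst(program([1, 2, 3])) == slice_of_program([1, 2, 3])
```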
Abstract:
This paper reports on the development of specific slicing techniques for functional programs and on their use for the identification of possible coherent components in monolithic code. An associated tool is also introduced. This piece of research is part of a broader project on program understanding and re-engineering of legacy code supported by formal methods.
Abstract:
This paper proposes an analysis of "Inanimate Alice" by Kate Pullinger, Chris Joseph and Ian Harper as an example of a transmedia narrative that triggers a new reading experience while proposing a literary alterity between reading and performance. Narrative experiences that elect visual plasticity, interchanging games and tactility as drivers of the creative process are not new. Yet narrative experiences created in the gap between reality and fiction have found in the digital realm the perfect environment for multiple hybrid experiences. Bearing in mind Walter Benjamin's concepts of Erlebnis and Erfahrung, a critical analysis of this digital fiction tries to illustrate how literary art finds its space and time in a metamorphosed continuum activated only by the "patient reader". All the multimedia hybrids that this digital literary work may contain challenge readers to interpret different signals and poetic structures that most readers might not be accustomed to; yet even amid cognitive dissonance, meaning is found, and reading happens only if time, space and attention are available. All possible transmedia literacies can only respond to this experience of online reading if they are able to focus and draw attention not to a simple new behaviour or a single new practice, but to a demanding state of affairs that assembles different objective and subjective value forms.
Abstract:
Existing computer programming training environments help users to learn programming by solving problems from scratch. Nevertheless, starting the resolution of a program can be frustrating and demotivating if the student does not know where and how to begin. Skeleton programming facilitates a top-down design approach, where a partially functional system with complete high-level structures is available, so the student needs only to progressively complete or update the code to meet the requirements of the problem. This paper presents CodeSkelGen - a program skeleton generator. CodeSkelGen generates skeleton or buggy Java programs from a complete annotated program solution provided by the teacher. The annotations are formally described within an annotation type and processed by an annotation processor. This processor is responsible for a set of actions ranging from the creation of dummy methods to the exchange of operator types included in the source code. The generator tool will be included in a learning environment that aims to assist teachers in the creation of programming exercises and to help students in solving them.
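CodeSkelGen itself works on Java annotations and an annotation processor; as a loose, hypothetical analogue of the idea in Python (decorators standing in for annotations, every name below invented for illustration), one can mark the methods of a teacher's solution that should become dummy stubs in the generated skeleton.

```python
# Hypothetical analogue of the CodeSkelGen idea, not the real tool:
# methods marked @to_complete are replaced by dummy bodies in the
# emitted skeleton; unmarked methods are copied through unchanged.
import inspect

def to_complete(func):
    """Mark a method whose body must be removed in the skeleton."""
    func._to_complete = True
    return func

class Solution:
    @to_complete
    def average(self, xs):
        return sum(xs) / len(xs)

    def report(self, xs):
        return f"avg = {self.average(xs)}"

def generate_skeleton(cls):
    """Emit source with marked methods replaced by dummy bodies."""
    lines = [f"class {cls.__name__}:"]
    for name, fn in vars(cls).items():
        if not callable(fn):
            continue
        if getattr(fn, "_to_complete", False):
            lines.append(f"    def {name}{inspect.signature(fn)}:")
            lines.append("        ...  # TODO: complete this method")
        else:
            lines.append(inspect.getsource(fn).rstrip())
    return "\n".join(lines)

print(generate_skeleton(Solution))
```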
Abstract:
Work presented in the context of the Doctoral Programme in Informatics, as a partial requirement for obtaining the degree of Doctor in Informatics.
Abstract:
Objective: The aim of this study was to compare the factors of adherence to physical activity between subjects attending a cardiac rehabilitation program and subjects who have withdrawn from the same program, using the Transtheoretical Model of behavior change. Methods: We conducted an observational, cross-sectional study with a sample of 33 individuals (15 currently participating in the Cardiac Rehabilitation Program and 18 who no longer attended it), with the questionnaires delivered in person or sent by mail. For data analysis we used the computer program SPSS® version 16.0, with the significance level set at 0.05. Results: There were no significant differences in Stages of Change, Self-efficacy, Decisional Balance or Processes of Change between the two groups. We obtained high Spearman correlations between Stages of Change and Self-efficacy (rs = 0.778) and the Pros (rs = 0.764) and Cons (rs = -0.744) of the Decisional Balance. However, there was no significant evidence of a correlation between Stages of Change and the experiential (p = 0.465) or behavioral (p = 0.300) processes of change. A relationship was found, in terms of proportions, between physical activity undertaken within or outside a Cardiac Rehabilitation Program and age (p = 0.003), occupation (p = 0.010) and the entity paying the costs of the program (p = 0.027). Conclusion: It was concluded that perceived self-efficacy and the Pros and Cons of the Decisional Balance are related to adherence to physical activity. The results also indicate that age, occupation and the entity paying the costs of the program influence dropout from Cardiac Rehabilitation Programs.
Abstract:
Teaching and learning computer programming is both challenging and difficult. Assessing students' work and providing individualised feedback to all of them is time-consuming and error-prone for teachers, and frequently involves a time delay. The existing tools and specifications prove to be insufficient in complex evaluation domains where there is a greater need for practice. At the same time, Massive Open Online Courses (MOOCs) are appearing, revealing a new way of learning that is more dynamic and more accessible. However, this new paradigm raises serious questions regarding the monitoring of student progress and the timeliness of feedback. This paper provides a conceptual design model for a computer programming learning environment. This environment uses the portal interface design model, gathering information from a network of services such as repositories and program evaluators. The design model also includes integration with learning management systems, a central piece in the MOOC realm, endowing the model with characteristics such as scalability, collaboration and interoperability. This model is not limited to the domain of computer programming and can be adapted to any complex area that requires systematic evaluation with immediate feedback.
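As a rough sketch of the service network the model describes (all interfaces and names below are hypothetical, not taken from the paper), the portal can be pictured as orchestrating a repository, a program evaluator and an LMS:

```python
# Hypothetical sketch of the portal-and-services arrangement; the paper
# defines the conceptual model, not these specific interfaces.
from abc import ABC, abstractmethod

class ExerciseRepository(ABC):
    @abstractmethod
    def fetch(self, exercise_id: str) -> str: ...

class ProgramEvaluator(ABC):
    @abstractmethod
    def evaluate(self, exercise_id: str, submission: str) -> dict: ...

class LearningManagementSystem(ABC):
    @abstractmethod
    def record_grade(self, student: str, exercise_id: str, grade: float) -> None: ...

class Portal:
    """Gathers information from the service network and returns
    immediate, individualised feedback to the student."""
    def __init__(self, repo, evaluator, lms):
        self.repo, self.evaluator, self.lms = repo, evaluator, lms

    def submit(self, student: str, exercise_id: str, code: str) -> dict:
        feedback = self.evaluator.evaluate(exercise_id, code)
        self.lms.record_grade(student, exercise_id, feedback["grade"])
        return feedback
```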
Abstract:
In the trend towards tolerating hardware unreliability, accuracy is exchanged for cost savings. Running on less reliable machines, functionally correct code becomes risky, and one needs to know how risk propagates so as to mitigate it. Risk estimation, however, seems to live outside the average programmer's technical competence and core practice. In this paper we propose that program design by source-to-source transformation be risk-aware, in the sense of making probabilistic faults visible and supporting equational reasoning about the probabilistic behaviour of programs caused by faults. This reasoning is carried out in a linear algebra extension to the standard, à la Bird and de Moor, algebra of programming. This paper studies, in particular, the propagation of faults across the standard program transformation techniques known as tupling and fusion, enabling the fault of the whole to be expressed in terms of the faults of its parts.
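The paper's reasoning lives in that linear algebra extension; stripped of the probabilistic fault model, the tupling transformation it studies can be illustrated with a plain Python sketch (names invented here):

```python
# Illustration of tupling only, without the paper's fault algebra: two
# folds over the same list are fused into a single pass returning a pair.
from functools import reduce

def two_passes(xs):
    """Separate folds: two traversals, each exposed to faults on its own."""
    total = reduce(lambda a, x: a + x, xs, 0)
    count = reduce(lambda a, _: a + 1, xs, 0)
    return total, count

def tupled(xs):
    """Tupling: one traversal computing both results at once."""
    return reduce(lambda acc, x: (acc[0] + x, acc[1] + 1), xs, (0, 0))

assert two_passes([3, 1, 4]) == tupled([3, 1, 4])
```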
Abstract:
(Excerpt) In times past, learning to read, write and do arithmetic was to get on course to earn the "writ of emancipation" in society. These skills are still essential today, but they are not enough to live in society. Reading and critically understanding the world we live in, with all its complexity, difficulties and challenges, require not only other skills (learning to search for and validate information, reading with new codes and grammars, etc.) but also, to a certain extent, metaskills, matrices and mechanisms that cut across the different and new literacies. They are needed not just to interpret but equally to communicate and participate in the little worlds that make up our everyday activities as well as, in a broader sense, in the world of the polis, which today is a global world.
Abstract:
Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation currently represents the gold-standard TDM approach, but it requires computation assistance. In recent decades, computer programs have been developed to assist clinicians in this assignment. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled by the software varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user-friendly. Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Computer-assisted TDM is gaining growing interest and should further improve, especially in terms of information system interfacing, user-friendliness, data storage capability and report generation.
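As a toy illustration of the a posteriori (Bayesian MAP) adjustment these tools automate (the one-compartment model and every number below are assumptions chosen for the example, not taken from any of the reviewed programs):

```python
# Toy Bayesian MAP step: estimate an individual clearance from one
# measured concentration and a population prior, then adjust the dose.
import numpy as np
from scipy.optimize import minimize_scalar

dose, V, t_obs, c_obs = 500.0, 40.0, 12.0, 6.0   # mg, L, h, mg/L (made up)
cl_pop, omega = 4.0, 0.3                          # prior: CL ~ LogNormal
sigma = 0.5                                       # residual SD (mg/L)

def neg_log_posterior(log_cl):
    cl = np.exp(log_cl)
    c_pred = dose / V * np.exp(-cl / V * t_obs)   # IV bolus prediction
    log_lik = -0.5 * ((c_obs - c_pred) / sigma) ** 2
    log_prior = -0.5 * ((log_cl - np.log(cl_pop)) / omega) ** 2
    return -(log_lik + log_prior)

res = minimize_scalar(neg_log_posterior, bounds=(-3, 3), method="bounded")
cl_map = np.exp(res.x)
target = 8.0  # desired concentration at t_obs (mg/L)
dose_adj = target * V / np.exp(-cl_map / V * t_obs)
print(f"MAP clearance: {cl_map:.2f} L/h, adjusted dose: {dose_adj:.0f} mg")
```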
Abstract:
The Retirement Investors' Club (RIC) (also referred to as 457/401(a) deferred compensation) is a voluntary retirement savings program designed to help you meet your need for income at retirement and lower your current income taxes. Your contributions to RIC are automatically withdrawn from your paycheck and you are credited with an employer match. You may enroll* and make changes at any time. Other advantages are explained below… keep reading about this excellent employee benefit!
Abstract:
Despite the advancement of phylogenetic methods to estimate speciation and extinction rates, their power can be limited under variable rates, in particular for clades with high extinction rates and a small number of extant species. Fossil data can provide a powerful alternative source of information to investigate diversification processes. Here, we present PyRate, a computer program to estimate speciation and extinction rates and their temporal dynamics from fossil occurrence data. The rates are inferred in a Bayesian framework and are comparable to those estimated from phylogenetic trees. We describe how PyRate can be used to explore different models of diversification. In addition to the diversification rates, it provides estimates of the parameters of the preservation process (fossilization and sampling) and of the times of speciation and extinction of each species in the data set. Moreover, we develop a new birth-death model to correlate the variation of speciation/extinction rates with changes in a continuous trait. Finally, we demonstrate the use of Bayes factors for model selection and show how the posterior estimates of a PyRate analysis can be used to generate calibration densities for Bayesian molecular clock analysis. PyRate is an open-source command-line Python program available at http://sourceforge.net/projects/pyrate/.
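PyRate's actual inference couples a preservation model with MCMC sampling; as a deliberately minimal sketch of the underlying idea, if speciation and extinction times were known exactly and rates constant, the maximum-likelihood rates reduce to event counts divided by total standing time (the data below are made up):

```python
# Minimal constant-rate birth-death sketch, not PyRate itself: rates as
# events per total lineage-time, given known lifespans.
import numpy as np

ts = np.array([50.0, 44.0, 39.0, 30.0, 22.0])  # speciation times (Ma)
te = np.array([35.0, 20.0, 12.0,  0.0,  0.0])  # extinction times (0 = extant)

total_time = np.sum(ts - te)        # summed species lifespans (lineage-Myr)
n_speciation = len(ts)              # one speciation event per species
n_extinction = np.sum(te > 0)       # extant species contribute no event

lam = n_speciation / total_time     # speciation rate per lineage-Myr
mu = n_extinction / total_time      # extinction rate per lineage-Myr
print(f"lambda = {lam:.3f}, mu = {mu:.3f} per lineage-Myr")
```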
Abstract:
We have devised a program that allows computation of the power of the F-test, and hence determination of appropriate sample and subsample sizes, in the context of the one-way hierarchical analysis of variance with fixed effects. The power at a fixed alternative is an increasing function of the sample size and of the subsample size. The program makes it easy to obtain the power of the F-test for a range of values of sample and subsample sizes, and therefore the appropriate sizes based on a desired power. The program can be used for the 'ordinary' case of the one-way analysis of variance, as well as for hierarchical analysis of variance with two stages of sampling. Examples are given of the practical use of the program.
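The paper's own program is not reproduced here; for the 'ordinary' one-way case, the computation it describes can be sketched via the noncentral F distribution (the two-stage hierarchical design requires the corresponding design-specific noncentrality and degrees of freedom):

```python
# Power of the fixed-effects one-way ANOVA F-test via the noncentral F
# distribution, and the smallest per-group n reaching a desired power.
from scipy.stats import f, ncf

def anova_power(group_means, sigma, n, alpha=0.05):
    """Power of the F-test with k groups and n observations per group."""
    k = len(group_means)
    grand = sum(group_means) / k
    # noncentrality: n * sum((mu_i - mu_bar)^2) / sigma^2
    nc = n * sum((m - grand) ** 2 for m in group_means) / sigma ** 2
    df1, df2 = k - 1, k * (n - 1)
    f_crit = f.ppf(1 - alpha, df1, df2)
    return 1 - ncf.cdf(f_crit, df1, df2, nc)

# illustrative use: smallest n per group achieving 80% power
for n in range(2, 50):
    if anova_power([10, 12, 14], sigma=3.0, n=n, alpha=0.05) >= 0.8:
        print(f"n = {n} per group suffices")
        break
```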
Abstract:
For several years now, substantial efforts have been devoted to the development and implementation of a screening program for breast cancer in the Canton of Vaud. A four-year pilot phase is now starting, involving two regional hospitals with their catchment areas; women over 50 and under 70 years old will be invited to participate in the program. A two-view mammography will be performed, with a double reading by the hospital radiologists; a third reading will be made in case of discrepancy between the first two radiologists. Patients classified as positive on screening (e.g., with a suspicious radiological image) will be referred to their practitioner for further diagnosis and treatment. The medical and public health background of this program is discussed, more specifically the reasons for developing a screening program, the choice of mammography rather than other tools, and the need to implement screening as an organized program.
Abstract:
Underbody plows can be very useful tools in winter maintenance, especially when compacted snow or hard ice must be removed from the roadway. By the application of significant down-force and the use of an appropriate cutting-edge angle, compacted snow and ice can be removed very effectively by such plows, with much greater efficiency than any other tool under those circumstances. However, the successful operation of an underbody plow requires considerable skill. If too little down pressure is applied to the plow, it will not cut the ice or compacted snow. If too much force is applied, either the cutting edge may gouge the road surface, causing significant damage often to both the road surface and the plow, or the plow may ride up on the cutting edge so that it is no longer controllable by the operator; spinning the truck in such situations is easily done. Further, excessive down-force will result in rapid wear of the cutting edge. Given this need for a high level of operator skill, the operation of an underbody plow is a candidate for automation. In order to successfully automate the operation of an underbody plow, a control system must be developed that follows a set of rules representing appropriate operation of such a plow. These rules have been developed based upon earlier work in which operational underbody plows were instrumented to determine the loading upon them (both vertical and horizontal) and the angle at which the blade was operating. These rules have been successfully coded into two different computer programs, both using the MatLab® software. In the first program, various load and angle inputs are analyzed to determine when, whether, and how they violate the rules of operation. This program is essentially deterministic in nature. In the second program, the Simulink® package in the MatLab® software system was used to implement these rules using fuzzy logic. Fuzzy logic essentially replaces a fixed and constant rule with one that varies in such a way as to improve operational control. The fuzzy logic in this simulation was developed simply by using appropriate routines in the computer software, rather than directly. The results of the computer testing and simulation indicate that a fully automated, computer-controlled underbody plow is indeed possible. The issue of whether the next steps toward full automation should be taken (and by whom) has also been considered, and the possibility of some sort of joint venture between a Department of Transportation and a vendor has been suggested.
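The report's rules were implemented in MatLab® and Simulink®; purely as a minimal illustration of the fuzzy-logic idea (all membership ranges and output values below are invented, not the report's calibrated rules), a down-force controller can fuzzify the measured vertical load and blend rule outputs:

```python
# Illustrative fuzzy controller sketch, not the report's implementation:
# low load -> push harder (plow not cutting); high load -> ease off
# (risk of gouging the road or riding up on the cutting edge).

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def downforce_correction(vertical_load_kN):
    """Map the fuzzified vertical load to a down-force adjustment (kN)."""
    low = tri(vertical_load_kN, -1, 0, 25)      # not cutting
    ok = tri(vertical_load_kN, 15, 30, 45)      # cutting well
    high = tri(vertical_load_kN, 35, 60, 100)   # gouging / ride-up risk
    weights, outputs = [low, ok, high], [+5.0, 0.0, -8.0]
    # weighted-average (Sugeno-style) defuzzification of the fired rules
    return sum(w * o for w, o in zip(weights, outputs)) / (sum(weights) or 1)

print(downforce_correction(50))  # eases off as the load nears the limit
```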