948 results for Modula-2 (Computer program language)


Relevance:

40.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

40.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

40.00%

Publisher:

Abstract:

Observability measures how well a computer system supports accurately capturing, analyzing, and presenting (collectively, observing) information about its internal state. Observability frameworks play important roles in program understanding, troubleshooting, performance diagnosis, and optimization. However, traditional solutions are either expensive or coarse-grained, which compromises their utility for today's increasingly complex software systems. New solutions are emerging for VM-based languages, thanks to the full control language VMs have over program execution. Nonetheless, existing solutions of this kind still lack flexibility, incur high overhead, or provide limited context information for developing powerful dynamic analyses. In this thesis, we present a VM-based infrastructure, called the marker tracing framework (MTF), that addresses these deficiencies and provides better observability for VM-based languages. MTF serves as a solid foundation for implementing fine-grained, low-overhead program instrumentation. Specifically, MTF allows analysis clients to: 1) define custom events with rich semantics; 2) specify precisely the program locations where the events should trigger; and 3) adaptively enable and disable the instrumentation at runtime. In addition, MTF-based analysis clients are more powerful by having access to all information available to the VM. To demonstrate the utility and effectiveness of MTF, we present two analysis clients: 1) dynamic typestate analysis with adaptive online program analysis (AOPA); and 2) selective probabilistic calling context analysis (SPCC). In addition, we evaluate the runtime performance of MTF and the typestate client with the DaCapo benchmarks.
The results show that: 1) MTF has acceptable runtime overhead when tracing moderate numbers of marker events; 2) AOPA is highly effective in reducing the event frequency for the dynamic typestate analysis; and 3) language VMs can be exploited to offer greater observability.
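As a sketch of how an analysis client might interact with such a framework, the toy Python dispatcher below mirrors MTF's three client-facing capabilities. The names (`MarkerTracer`, `define_event`, `set_enabled`, `fire`) are hypothetical illustrations, not the thesis's actual API:

```python
class MarkerTracer:
    """Toy event dispatcher mirroring MTF's three client-facing capabilities."""

    def __init__(self):
        self.handlers = {}    # event name -> (location predicate, callback)
        self.enabled = set()  # names of events currently enabled

    def define_event(self, name, location_pred, callback):
        # 1) a custom event, 2) with a precise trigger-location predicate
        self.handlers[name] = (location_pred, callback)
        self.enabled.add(name)

    def set_enabled(self, name, flag):
        # 3) adaptively enable/disable the instrumentation at runtime
        (self.enabled.add if flag else self.enabled.discard)(name)

    def fire(self, name, location, payload):
        # would be called by the (here imaginary) VM at instrumented points
        if name in self.enabled:
            pred, callback = self.handlers[name]
            if pred(location):
                callback(location, payload)

calls = []
tracer = MarkerTracer()
tracer.define_event("call", lambda loc: loc.startswith("Foo."),
                    lambda loc, payload: calls.append((loc, payload)))
tracer.fire("call", "Foo.bar", 1)   # matches -> recorded
tracer.fire("call", "Baz.qux", 2)   # wrong location -> ignored
tracer.set_enabled("call", False)
tracer.fire("call", "Foo.bar", 3)   # disabled -> ignored
print(calls)                        # -> [('Foo.bar', 1)]
```

In a real VM-based framework the `fire` calls would be injected at the instrumented program points rather than written by hand, which is what keeps the overhead low when an event is disabled.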

Relevance:

40.00%

Publisher:

Abstract:

Time correlation functions of current fluctuations were calculated by molecular dynamics (MD) simulations in order to investigate sound waves of high wavevectors in the glass-forming liquid Ca(NO₃)₂·4H₂O. Dispersion curves, ω(k), were obtained for longitudinal (LA) and transverse acoustic (TA) modes, and also for longitudinal optic (LO) modes. Spectra of LA modes calculated by MD simulations were modeled by a viscoelastic model within the memory function framework. The viscoelastic model is used to rationalize the change of slope taking place at k ≈ 0.3 Å⁻¹ in the ω(k) curve of acoustic modes. For still larger wavevectors, mixing of acoustic and optic modes is observed. Partial time correlation functions of longitudinal mass currents were calculated separately for the ions and the water molecules. The wavevector dependence of excitation energies of the corresponding partial LA modes indicates the coexistence of a relatively stiff subsystem made of cations and anions, and a softer subsystem made of water molecules. © 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4751548]
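As a generic illustration of the method (with synthetic data, not the paper's simulations), the sketch below estimates one dispersion point ω(k) from a current time series: the autocorrelation of the current is computed via the Wiener-Khinchin relation, and the peak of its spectrum gives the mode frequency. The signal parameters here are arbitrary stand-ins:

```python
import numpy as np

rng = np.random.default_rng(0)
dt, n = 0.01, 4096
t = np.arange(n) * dt
omega0 = 3.0                        # assumed mode frequency (arbitrary units)
# synthetic damped-oscillator signal standing in for the current j(k, t)
j = np.cos(omega0 * t) * np.exp(-0.2 * t) + 0.05 * rng.standard_normal(n)

# autocorrelation via Wiener-Khinchin: inverse FFT of the power spectrum
f = np.fft.rfft(j - j.mean(), 2 * n)          # zero-pad to avoid wrap-around
acf = np.fft.irfft(np.abs(f) ** 2)[:n] / n

# spectrum of the correlation function; its peak estimates omega(k)
spec = np.abs(np.fft.rfft(acf))
freqs = 2 * np.pi * np.fft.rfftfreq(n, d=dt)  # angular frequencies
omega_peak = freqs[spec.argmax()]             # close to omega0 = 3.0
```

Repeating this for currents at several wavevectors k traces out the dispersion curve ω(k) discussed in the abstract.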

Relevance:

40.00%

Publisher:

Abstract:

Background: Breast cancer is the most frequently diagnosed cancer and the leading cause of cancer deaths among women worldwide. The use of mobile mammography units to offer screening to women living in remote areas is a rational strategy to increase the number of women examined. This study aimed to evaluate results from the first two years of a government-organized mammography screening program implemented with a mobile unit (MU) and a fixed unit (FU) in a rural county in Brazil. The program offered breast cancer screening to women living in Barretos and the surrounding area. Methods: Based on epidemiologic data, 54 238 women, aged 40 to 69 years, were eligible for breast cancer screening. The study included women examined from April 1, 2003 to March 31, 2005. The chi-square test and Bonferroni correction were used to evaluate the frequencies of tumors and the importance of clinical parameters and tumor characteristics. Significance was set at p < 0.05. Results: Overall, 17 964 women underwent mammography, representing 33.1% of the eligible women in the area. A mean of 18.6 and 26.3 women per day were examined in the FU and MU, respectively. Seventy-six patients were diagnosed with breast cancer, 41 (54%) of them in the MU. This represented 4.2 cases of breast cancer per 1000 examinations. The number of cancers detected was significantly higher in women aged 60 to 69 years than in those aged 50 to 59 years (p < 0.001) or 40 to 49 years (p < 0.001). No difference was observed between women aged 40 to 49 years and those aged 50 to 59 years (p = 0.164). The proportions of tumors in the early (clinical stages 0 and I) and advanced (clinical stages III and IV) stages of development were 43.4% and 15.8%, respectively. Conclusions: Preliminary results indicate that this mammography screening program is feasible for implementation in a rural Brazilian territory and favor program continuation.

Relevance:

40.00%

Publisher:

Abstract:

This qualitative, exploratory, descriptive study aimed to understand how nurses working in the medical-surgical units of a university hospital perceived the strategies developed to pilot-test the PROCEnf-USP electronic system, whose purpose is to computerize clinical nursing documentation. Eleven nurses who took part in a theoretical-practical training program were interviewed, and the data obtained were analyzed using the Content Analysis Technique. The following categories were discussed in light of the frameworks of participative management and planned change: favorable aspects of the implementation; unfavorable aspects of the implementation; and expectations regarding the implementation. In the nurses' perception, preliminary use of the electronic system allowed them to show their potential and to propose improvements, encouraging them to become partners of the managing group in disseminating the system to other nurses of the institution.

Relevance:

40.00%

Publisher:

Abstract:

This study evaluated the impact of a participatory program to reduce noise in the neonatal intermediate care unit of a university hospital. A quasi-experimental time-series design was used, in which sound pressure levels were measured with a Quest-400 dosimeter before and after the intervention was implemented. Non-parametric statistical tests were used to compare noise levels, with significance fixed at 5%. Results showed a significant reduction of sound pressure levels in the neonatal unit after the intervention program was implemented (p < 0.0001). The average Leq fell from 62.5 dBA before the intervention to 58.8 dBA afterwards. Reductions of 17.1 dBA in the average Lmax (from 104.8 to 87.7 dBA) and of 30.6 dBA in the average Lpeak (from 138.1 to 107.5 dBA) were also observed. The program proved effective in significantly reducing noise levels in the neonatal unit, although the levels still exceeded recommended limits.
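The dosimeter quantities compared above are energy averages rather than arithmetic ones: the equivalent continuous level is Leq = 10·log₁₀(mean(10^(Lᵢ/10))) over the sampled levels Lᵢ. A minimal sketch, with made-up sample values for illustration:

```python
import math

def leq(levels_dba):
    """Energy-equivalent continuous sound level (Leq) of sampled dBA levels."""
    return 10 * math.log10(
        sum(10 ** (l / 10) for l in levels_dba) / len(levels_dba)
    )

# e.g. a period split evenly between 55 dBA and 65 dBA: the louder half
# dominates, so Leq sits well above the 60 dBA arithmetic mean
print(round(leq([55] * 30 + [65] * 30), 1))  # -> 62.4
```

This is why a few loud events (high Lmax/Lpeak) can keep the Leq elevated even when the unit is quiet most of the time.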

Relevance:

40.00%

Publisher:

Abstract:

Background: Educational computer games are examples of computer-assisted learning objects, representing an educational strategy of growing interest. Given the changes in the digital world over the last decades, students of the current generation expect technology to be used in advancing their learning, which requires changing traditional passive learning methodologies into an active, multisensory, experimental learning methodology. The objective of this study was to compare a computer game-based learning method with a traditional learning method, regarding learning gains and knowledge retention, as a means of teaching head and neck Anatomy and Physiology to Speech-Language and Hearing Pathology undergraduate students. Methods: Students were randomized to one of the two learning methods, and the data analyst was blinded to which method each student had received. Students' prior knowledge (i.e., before undergoing the learning method), short-term knowledge retention, and long-term knowledge retention (i.e., six months after undergoing the learning method) were assessed with a multiple-choice questionnaire. Students' performance was compared across the three assessment moments, both for the mean total score and for separate mean scores on the Anatomy questions and the Physiology questions. Results: Students who received the game-based method performed better in the post-test assessment only on the Anatomy questions. Students who received the traditional lecture performed better in both the post-test and the long-term post-test on both the Anatomy and the Physiology questions. Conclusions: The game-based learning method is comparable to the traditional learning method in overall and short-term gains, while the traditional lecture still seems more effective at improving students' short- and long-term knowledge retention.

Relevance:

40.00%

Publisher:

Abstract:

This thesis investigates two aspects of Constraint Handling Rules (CHR): it proposes a compositional semantics and a technique for program transformation. CHR is a concurrent, committed-choice constraint logic programming language consisting of guarded rules that transform multisets of atomic formulas (constraints) into simpler ones until exhaustion [Frü06]; it belongs to the family of declarative languages. It was initially designed for writing constraint solvers, but it has recently also proven to be a general-purpose language, being Turing-equivalent [SSD05a]. Compositionality is the first CHR aspect to be considered. A trace-based compositional semantics for CHR was previously defined in [DGM05]. The reference operational semantics for that compositional model was the original operational semantics for CHR, which, due to the propagation rule, admits trivial non-termination. In this thesis we extend the work of [DGM05] by introducing a more refined trace-based compositional semantics which also includes the history. The use of a history is a well-known technique in CHR which permits us to trace the application of propagation rules and consequently to avoid trivial non-termination [Abd97, DSGdlBH04]. Naturally, the reference operational semantics of our new compositional semantics uses a history to avoid trivial non-termination too. Program transformation is the second CHR aspect to be considered, with particular regard to the unfolding technique. Unfolding is an appealing approach for optimizing a given program, in particular for improving its run-time efficiency or space consumption. Essentially, it consists of a sequence of syntactic program manipulations which preserve a kind of semantic equivalence, called qualified answers [Frü98], between the original program and the transformed ones. The unfolding technique is one of the basic operations used by most program transformation systems.
It consists of replacing a procedure call by its definition. In CHR, every conjunction of constraints can be considered a procedure call, every CHR rule can be considered a procedure, and the body of that rule represents the definition of the call. While there is a large body of literature on the transformation and unfolding of sequential programs, very few papers have addressed this issue for concurrent languages. We define an unfolding rule, show its correctness, and discuss some conditions under which it can be used to delete an unfolded rule while preserving the meaning of the original program. Finally, confluence and termination maintenance between the original and transformed programs are shown.

This thesis is organized as follows. Chapter 1 gives some general notions about CHR. Section 1.1 outlines the history of programming languages, with particular attention to CHR and related languages. Section 1.2 then introduces CHR through examples. Section 1.3 gives some preliminaries which will be used throughout the thesis. Subsequently, Section 1.4 introduces the syntax and the operational and declarative semantics of the first CHR language proposed. Finally, the methodologies for solving the problem of trivial non-termination related to propagation rules are discussed in Section 1.5. Chapter 2 introduces a compositional semantics for CHR in which the propagation rules are considered. In particular, Section 2.1 contains the definition of the semantics, Section 2.2 presents the compositionality results, and Section 2.3 expounds the correctness results. Chapter 3 presents a particular program transformation known as unfolding. This transformation needs a particular annotated syntax, which is introduced in Section 3.1, and its related modified operational semantics ωt, presented in Section 3.2. Subsequently, Section 3.3 defines the unfolding rule and proves its correctness.
Then, Section 3.4 discusses the problems related to replacing a rule by its unfolded version; this in turn yields a correctness condition which holds for a specific class of rules. Section 3.5 proves that confluence and termination are preserved by the program modifications introduced. Finally, Chapter 4 concludes by discussing related work and directions for future work.
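To give a flavour of the rewriting model itself (only the basic "apply rules until exhaustion" idea, not the thesis's semantics, guards-with-history, or unfolding machinery), here is a toy Python rendition running CHR's classic gcd solver:

```python
from itertools import permutations

def rewrite(store, rules):
    """Naive CHR-style simplification: repeatedly replace a sub-multiset of
    constraints matched by some rule until no rule applies (exhaustion)."""
    changed = True
    while changed:
        changed = False
        for arity, guard, body in rules:
            # try every ordered choice of `arity` distinct constraints
            for combo in permutations(range(len(store)), arity):
                args = [store[i] for i in combo]
                if guard(*args):
                    for i in sorted(combo, reverse=True):
                        del store[i]          # remove the matched head
                    store.extend(body(*args)) # add the rule's body
                    changed = True
                    break
            if changed:
                break
    return store

# CHR's classic solver, as two simplification rules:
#   gcd(0) <=> true
#   gcd(n), gcd(m) <=> 0 < n <= m | gcd(n), gcd(m - n)
rules = [
    (1, lambda c: c == ("gcd", 0), lambda c: []),
    (2, lambda a, b: 0 < a[1] <= b[1], lambda a, b: [a, ("gcd", b[1] - a[1])]),
]
print(rewrite([("gcd", 12), ("gcd", 8)], rules))  # -> [('gcd', 4)]
```

A real CHR system additionally supports propagation rules (which keep their head in the store), and it is exactly those rules that require the history mechanism discussed above to avoid trivial non-termination.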

Relevance:

40.00%

Publisher:

Abstract:

The increasing precision of current and future experiments in high-energy physics requires a corresponding increase in the accuracy of theoretical predictions, in order to find evidence for possible deviations from the generally accepted Standard Model of elementary particles and interactions. Calculating the experimentally measurable cross sections of scattering and decay processes to higher accuracy directly translates into including higher-order radiative corrections in the calculation. The large number of particles and interactions in the full Standard Model results in an exponentially growing number of Feynman diagrams contributing to any given process at higher orders. Additionally, the appearance of multiple independent mass scales makes even the calculation of single diagrams non-trivial. For over two decades now, the only way to cope with these issues has been to rely on the assistance of computers. The aim of the xloops project is to provide the tools necessary to automate the calculation procedures as far as possible, including the generation of the contributing diagrams and the evaluation of the resulting Feynman integrals. The latter is based on the techniques developed in Mainz for solving one- and two-loop diagrams in a general and systematic way using parallel/orthogonal space methods. These techniques involve a considerable amount of symbolic computation. During the development of xloops it was found that conventional computer algebra systems were not a suitable implementation environment. For this reason, a new system called GiNaC has been created, which allows the development of large-scale symbolic applications in an object-oriented fashion within the C++ programming language. This system, which is now also in use for other projects besides xloops, is the main focus of this thesis. The implementation of GiNaC as a C++ library sets it apart from other algebraic systems.
Our results prove that a highly efficient symbolic manipulator can be designed in an object-oriented way, and that having a very fine granularity of objects is also feasible. The xloops-related parts of this work consist of a new implementation, based on GiNaC, of functions for calculating one-loop Feynman integrals that already existed in the original xloops program, as well as the addition of supplementary modules belonging to the interface between the library of integral functions and the diagram generator.
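GiNaC itself is a C++ library; as a language-neutral illustration of the object-oriented design idea (expressions as a class hierarchy whose nodes all answer the same methods), here is a minimal Python analogue supporting symbolic differentiation. The class names and scope are purely illustrative, far smaller than GiNaC's:

```python
class Expr:
    def diff(self, var): raise NotImplementedError
    def __add__(self, other): return Add(self, other)
    def __mul__(self, other): return Mul(self, other)

class Num(Expr):
    def __init__(self, v): self.v = v
    def diff(self, var): return Num(0)
    def eval(self, env): return self.v

class Var(Expr):
    def __init__(self, name): self.name = name
    def diff(self, var): return Num(1 if var == self.name else 0)
    def eval(self, env): return env[self.name]

class Add(Expr):
    def __init__(self, a, b): self.a, self.b = a, b
    def diff(self, var): return Add(self.a.diff(var), self.b.diff(var))
    def eval(self, env): return self.a.eval(env) + self.b.eval(env)

class Mul(Expr):
    def __init__(self, a, b): self.a, self.b = a, b
    def diff(self, var):  # product rule
        return Add(Mul(self.a.diff(var), self.b), Mul(self.a, self.b.diff(var)))
    def eval(self, env): return self.a.eval(env) * self.b.eval(env)

x = Var("x")
e = x * x + Num(3) * x             # x^2 + 3x
print(e.diff("x").eval({"x": 2}))  # d/dx = 2x + 3, at x = 2 -> 7
```

Adding a new node type (a function, a tensor index, a Feynman parameter) only requires a new class implementing the same small interface, which is the fine-granularity design point the thesis argues is feasible at high efficiency in C++.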

Relevance:

40.00%

Publisher:

Abstract:

In computer systems, and specifically in multithreaded, parallel, and distributed systems, a deadlock is both a very subtle problem, because it is difficult to prevent during system coding, and a very dangerous one: a deadlocked system easily becomes completely stuck, with consequences ranging from simple annoyances to life-threatening circumstances, with the non-negligible scenario of economic losses in between. How, then, can this problem be avoided? Many possible solutions have been studied, proposed, and implemented. In this thesis we focus on the detection of deadlocks with a static program analysis technique, i.e. an analysis performed without actually executing the program. To begin, we briefly present the static Deadlock Analysis Model developed for coreABS−− in Chapter 1, and then proceed by detailing the class-based coreABS−− language in Chapter 2. In Chapter 3 we lay the foundation for further discussion by analyzing the differences between coreABS−− and ASP, an untyped object-based calculus, so as to show how the Deadlock Analysis can be extended to object-based languages in general. In this regard, we make some hypotheses explicit in Chapter 4, first by presenting a possible, unproven type system for ASP, modeled after the Deadlock Analysis Model developed for coreABS−−. We then conclude by presenting a simpler hypothesis, which may allow us to circumvent the difficulties that arise from the definition of the "ad-hoc" type system discussed in the foregoing chapter.
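The thesis's analysis targets coreABS−−'s concurrency model; as a generic illustration of static deadlock detection (a classic lock-order technique, not the thesis's model), the sketch below builds an "acquired-before" graph from the lock-acquisition order of each code path and reports a potential deadlock when that graph contains a cycle:

```python
def lock_order_edges(paths):
    """Each path is the sequence of locks one thread's code acquires, in
    order; an edge (a, b) means a is held while b is requested."""
    edges = set()
    for path in paths:
        for i, a in enumerate(path):
            for b in path[i + 1:]:
                edges.add((a, b))
    return edges

def has_cycle(edges):
    """DFS cycle check on the acquired-before graph."""
    graph = {}
    for a, b in edges:
        graph.setdefault(a, set()).add(b)
    visiting, done = set(), set()

    def dfs(n):
        if n in visiting:
            return True          # back edge -> cycle
        if n in done:
            return False
        visiting.add(n)
        if any(dfs(m) for m in graph.get(n, ())):
            return True
        visiting.discard(n)
        done.add(n)
        return False

    return any(dfs(n) for n in list(graph))

# thread 1 takes A then B; thread 2 takes B then A -> classic deadlock shape
print(has_cycle(lock_order_edges([["A", "B"], ["B", "A"]])))  # -> True
print(has_cycle(lock_order_edges([["A", "B"], ["A", "B"]])))  # -> False
```

Like any static analysis, this over-approximates: a reported cycle is a *potential* deadlock, which is the trade-off for never having to execute the program.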

Relevance:

40.00%

Publisher:

Abstract:

Written text is an important component of knowledge acquisition and communication. Poorly written text fails to deliver clear ideas to the reader, no matter how revolutionary and ground-breaking those ideas are. Providing text with a good writing style is essential to transferring ideas smoothly. While we have sophisticated tools to check for stylistic problems in program code, we do not apply the same techniques to written text. In this paper we present TextLint, a rule-based tool that checks for common style errors in natural language. TextLint provides a structural model of written text and an extensible rule-based checking mechanism.
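To make the rule-based idea concrete, here is a minimal Python sketch (not TextLint's actual rules, model, or architecture): each rule pairs a pattern with a diagnostic message, and checking a text collects every rule's matches:

```python
import re

# each rule: (compiled pattern, diagnostic message); the rule set is extensible
RULES = [
    (re.compile(r"\b(\w+)\s+\1\b", re.IGNORECASE), "doubled word"),
    (re.compile(r"\bvery\b", re.IGNORECASE), "weak intensifier 'very'"),
    (re.compile(r"[ \t]+$", re.MULTILINE), "trailing whitespace"),
]

def check(text):
    """Run every rule over the text; return (message, snippet, offset) hits."""
    return [(msg, m.group(0).strip(), m.start())
            for pattern, msg in RULES
            for m in pattern.finditer(text)]

for msg, snippet, pos in check("This is is a very good idea."):
    print(f"{pos}: {msg}: {snippet!r}")
# -> 5: doubled word: 'is is'
# -> 13: weak intensifier 'very': 'very'
```

TextLint's actual design goes further, checking rules against a structural model of the document (sentences, paragraphs) rather than raw character streams.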