60 results for Subroutines in Procedural Programming Languages
in Repositório Institucional UNESP - Universidade Estadual Paulista "Júlio de Mesquita Filho"
Abstract:
Studies on ethics in information organization have contributed deeply to the recognition of the social dimension of Information Science. The subject approach to information is linked to an ethical dimension because one of its major concerns is its reliability and usefulness within a specific discursive community or knowledge domain. In this direction, we propose, through an exploratory research design with qualitative and inductive characteristics, to identify the specific terminology that Brazilian indexing languages provide for terms relating to male homosexuality. We also analyzed the terms assigned to papers published in the Journal of Homosexuality, Sexualities and Journal of Gay & Lesbian Mental Health between 2005 and 2009. From this analysis of the terms and the Brazilian indexing languages, we observe (1) the Brazilian context, (2) imprecision in the terminology, (3) indications of prejudices disseminated by political correctness, (4) biased representation of the subject matter, and (5) the presence of figures of speech.
Abstract:
This article investigates which semantic categories, as defined in Functional Discourse Grammar, formally manifest themselves in a sample of native languages of Brazil, and the extent to which the distribution of these manifestations across categories can be described systematically in terms of implicational hierarchies. The areas subjected to investigation are basic interrogative words, basic demonstrative words, and nominalization strategies.
Abstract:
This paper shows that the distribution of basic illocutions (defined as grammatical structures that can be related to a default communicative intention) within and across the indigenous languages of Brazil can be described systematically in terms of a set of implicational hierarchies, by means of which the existence of certain basic illocutions can be predicted from the existence of others. In doing so, a case is made for a major distinction between propositional and behavioural basic illocutions, the former having to do with the exchange of information, the latter with influencing behaviour.
Abstract:
The process of knowledge representation, as well as its procedures, tools and products, is not neutral in terms of values; instead, it implies moral values. In this context, bias in representation may arise, related to prejudice and discrimination, to gender issues, to dichotomous categorization in classification systems or thesauri, and to a lack of cultural warrant. Concerning the problem of bias in indexing languages, and starting from the initial theoretical reflections of Brey (1999), Berman (1993), Olson (1998; 2002), Lopez-Huertas Perez & Torres Ramirez (2005), Guimaraes (2006), Hjorland (2008) and Milani et al. (2009), the proposal is to present a preliminary categorization aimed at facilitating the identification of bias concerning feminine issues in indexing languages, to offer a contribution to the theoretical universe of the specific questions of knowledge organization, and to present a theme to be discussed by educators and professionals in the areas of cataloging, classification and indexing. If, in a society that intends to be politically correct, social attitudes towards stigmatized citizens are to be modified, then the universe of indexing languages, taken as tools of knowledge representation, is a fertile field in which to sow this reflection.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
In this work an image pre-processing module has been developed to extract quantitative information from plantation images with various degrees of infestation. Four filters comprise this module: the first smooths the image, the second removes the image background while enhancing plant leaves, the third removes isolated dots not removed by the previous filter, and the fourth highlights the leaves' edges. The filters were first tested in MATLAB, for quick visual feedback on their behavior, and then implemented in the C programming language. Finally, the module has been coded in VHDL for implementation on a Stratix II family FPGA. Tests were run and the results are shown in this paper. © 2008 Springer-Verlag Berlin Heidelberg.
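A minimal sketch of such a four-filter pipeline, assuming standard OpenCV/NumPy operations; the paper does not specify the actual kernels, so the excess-green threshold and the particular filter choices below are illustrative only:

    import cv2
    import numpy as np

    def preprocess(img_bgr):
        # Filter 1: smooth the image (Gaussian blur as a stand-in).
        smoothed = cv2.GaussianBlur(img_bgr, (5, 5), 0)
        # Filter 2: remove the background, enhancing plant leaves;
        # assumes leaves are the greenish foreground (excess-green index).
        img = smoothed.astype(np.int16)
        exg = 2 * img[:, :, 1] - img[:, :, 2] - img[:, :, 0]
        mask = (exg > 20).astype(np.uint8) * 255
        # Filter 3: remove isolated dots left by the previous filter.
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
        # Filter 4: highlight the leaves' edges.
        edges = cv2.Canny(mask, 100, 200)
        return mask, edges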
Abstract:
Since Sharir and Pnueli, algorithms for context-sensitivity have been defined in terms of 'valid' paths in an interprocedural flow graph. The definition of valid paths requires atomic call and ret statements, and encapsulated procedures. Thus, the resulting algorithms are not directly applicable when behavior similar to call and ret instructions may be realized using non-atomic statements, or when procedures do not have rigid boundaries, such as with programs in low-level languages like assembly or RTL. We present a framework for context-sensitive analysis that requires neither atomic call and ret instructions nor encapsulated procedures. The framework decouples the transfer-of-control semantics and the context-manipulation semantics of statements. A new definition of context-sensitivity, called stack contexts, is developed. A stack context, which is defined using trace semantics, is more general than Sharir and Pnueli's interprocedural-path-based calling-context. An abstract-interpretation-based framework is developed to reason about stack contexts and to derive analogues of calling-context-based algorithms using stack contexts. The framework is suitable for deriving algorithms for analyzing binary programs, such as malware, that employ obfuscations with the deliberate intent of defeating automated analysis. The framework is used to create a context-sensitive version of Venable et al.'s algorithm for analyzing x86 binaries without requiring that a binary conform to a standard compilation model for maintaining procedures, calls, and returns. Experimental results show that a context-sensitive analysis using stack contexts performs just as well for programs where the use of Sharir and Pnueli's calling-context produces correct approximations. However, if those programs are transformed to use call obfuscations, a context-sensitive analysis using stack contexts still provides the same correct results, without any additional overhead. © Springer Science+Business Media, LLC 2011.
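A toy illustration of the distinction the abstract draws, not the paper's formalism: a stack context is read off the runtime stack in the trace, so an atomic CALL and an obfuscated PUSH-then-JMP "call" yield the same context at the callee, whereas a calling-context keyed to atomic CALL/RET pairs would miss the latter. All instruction names here are illustrative:

    # Toy trace semantics over a tiny instruction set. A "call" may be the
    # atomic CALL instruction or the obfuscated pair PUSH ret_addr; JMP target.
    def stack_context(stack):
        # A stack context is the tuple of return addresses currently on the
        # stack, regardless of which instructions put them there.
        return tuple(stack)

    def run(program):
        pc, stack, contexts = 0, [], {}
        while pc < len(program):
            op, arg = program[pc]
            contexts.setdefault(pc, set()).add(stack_context(stack))
            if op == "CALL":                  # atomic call
                stack.append(pc + 1); pc = arg
            elif op == "PUSH":                # first half of an obfuscated call
                stack.append(arg); pc += 1
            elif op == "JMP":                 # second half of an obfuscated call
                pc = arg
            elif op == "RET":
                pc = stack.pop()
            elif op == "HALT":
                break
            else:                             # ordinary statement
                pc += 1
        return contexts

    # Obfuscated call to the "procedure" at address 3; RET still works, and
    # the context recorded at address 3 is the same as for a real CALL.
    prog = [("PUSH", 2), ("JMP", 3), ("HALT", None), ("NOP", None), ("RET", None)]
    print(run(prog)[3])   # {(2,)}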
Abstract:
This paper proposes a heuristic constructive multi-start algorithm (HCMA) for distribution system restoration in real time, considering distributed generators installed in the system. The problem is modeled as a nonlinear mixed-integer program and considers the two main goals of the restoration of distribution networks: minimizing the number of consumers without power and the number of switching operations. The proposed algorithm is implemented in the C++ programming language and tested on a large real-life distribution system. The results show that the proposed algorithm is able to provide a set of feasible, good-quality solutions in a time suitable for the problem. © 2011 IEEE.
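A minimal sketch of the constructive multi-start pattern named above, assuming a feasibility oracle that encapsulates the network constraints (radiality, capacity, distributed-generator limits); the function and parameter names are illustrative, not the paper's:

    import random

    def construct(out_zones, tie_switches, restorable):
        # One constructive pass: close tie switches in random order, keeping
        # only closures that feasibly re-energize de-energized zones.
        plan, unserved = [], set(out_zones)
        for sw in random.sample(tie_switches, len(tie_switches)):
            zones = restorable(sw, unserved)   # zones this closure picks up
            if zones:
                plan.append(sw)
                unserved -= zones
        return plan, unserved

    def hcma(out_zones, tie_switches, restorable, starts=50):
        # Multi-start: repeat the randomized construction, keeping the best
        # plan ranked first by zones (consumers) still without power and
        # then by the number of switching operations.
        best_key, best_plan = None, None
        for _ in range(starts):
            plan, unserved = construct(out_zones, tie_switches, restorable)
            key = (len(unserved), len(plan))
            if best_key is None or key < best_key:
                best_key, best_plan = key, plan
        return best_plan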
Abstract:
The present article describes the challenges programming apprentices face and identifies the elements and processes that set them apart from experienced programmers. It also explains why a conventional approach to teaching programming languages fails to map the programming mental model. The purpose of this discussion is to draw on ideas and cognitive philosophies that can be embedded in programming learning tools. Cognitive components are modeled as elements to be handled by the apprentices in tutoring systems while performing a programming task. In this process a mental-level solution (the mental model of the program) and an implementation-level solution (the program) are created. In this approach, the mapping between these representations is a path the student follows explicitly. © 2011 IEEE.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
The IEEE 1451 standard is intended to address the problem of smart transducer interfacing in networked environments. Proprietary hardware and software are usually a very efficient way to implement the IEEE 1451 specification, although they can be expensive and inflexible. In contrast, this paper proposes the use of open, standardized tools for implementing the IEEE 1451 specification. Tools such as the Java and Python programming languages, Linux, programmable logic technology, personal computer resources and the Ethernet architecture were integrated in order to construct a network node based on the IEEE 1451 standards. The node can be applied in systems based on the client-server communication model. An evaluation of the employed tools and experimental results are presented. © 2005 IEEE.
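A minimal sketch of the client-server interaction such a node supports, assuming a bare TCP exchange; the actual IEEE 1451 message formats (e.g. TEDS structures) are not reproduced here, and all names are illustrative:

    import socket

    def serve(read_sensor, host="0.0.0.0", port=5000):
        # Server side of the node: answer each "READ" request with the
        # current transducer value over Ethernet/TCP.
        srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen()
        while True:
            conn, _ = srv.accept()
            with conn:
                if conn.recv(64).strip() == b"READ":
                    conn.sendall(f"{read_sensor():.2f}\n".encode())

    def read_remote(host, port=5000):
        # Client side: request one reading from the node.
        with socket.create_connection((host, port)) as c:
            c.sendall(b"READ\n")
            return float(c.recv(64).decode())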
Abstract:
This paper presents the overall methodology that has been used to encode both the Brazilian Portuguese WordNet (WordNet.Br) standard language-independent conceptual-semantic relations (hyponymy, co-hyponymy, meronymy, cause, and entailment) and the so-called cross-lingual conceptual-semantic relations between different wordnets. Accordingly, after contextualizing the project and outlining the current lexical database structure and statistics, it describes the WordNet.Br editing GUI that was designed to aid the linguist in carrying out the tasks of building synsets, selecting sample sentences from corpora, writing synset concept glosses, and encoding both language-independent conceptual-semantic relations and cross-lingual conceptual-semantic relations between WordNet.Br and Princeton WordNet. © Springer-Verlag Berlin Heidelberg 2006.
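As an illustration of the kind of relation encoding described, a minimal sketch; the field names and relation labels follow the abstract, not the actual WordNet.Br schema:

    from dataclasses import dataclass, field

    @dataclass
    class Synset:
        words: list
        gloss: str = ""
        relations: dict = field(default_factory=dict)      # e.g. "hyponym_of" -> [Synset]
        cross_lingual: list = field(default_factory=list)  # Princeton WordNet synset ids

    def relate(src, rel, dst):
        # Encode a language-independent conceptual-semantic relation.
        src.relations.setdefault(rel, []).append(dst)

    animal = Synset(["animal"])
    cao = Synset(["cão", "cachorro"], gloss="domestic mammal, ...")
    relate(cao, "hyponym_of", animal)       # hyponymy within WordNet.Br
    cao.cross_lingual.append("dog.n.01")    # cross-lingual link to Princeton WordNet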
Abstract:
Regulatory authorities in many countries, in order to maintain an acceptable balance between appropriate customer service quality and costs, are introducing performance-based regulation. These regulations impose penalties, and in some cases rewards, which introduce a component of financial risk to an electric power utility due to the uncertainty associated with preserving a specific level of system reliability. In Brazil, for instance, one of the reliability indices receiving special attention from the utilities is the Maximum Continuous Interruption Duration per customer (MCID). This paper describes a chronological Monte Carlo simulation approach to evaluate probability distributions of reliability indices, including the MCID, and the corresponding penalties. In order to achieve the desired efficiency, modern computational techniques are used for modeling (UML - Unified Modeling Language) as well as for programming (Object-Oriented Programming). Case studies on a simple distribution network and on real Brazilian distribution systems are presented and discussed. © Copyright KTH 2006.
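A minimal sketch of the chronological idea for a single customer, assuming exponential up- and down-times; the rates, the regulatory limit and the penalty rule below are illustrative, not the paper's data:

    import random

    def simulate_year(mttf=4380.0, mttr=4.0, hours=8760.0):
        # Walk through one year chronologically, alternating time-to-failure
        # and repair durations, and record the longest single interruption.
        t, longest = 0.0, 0.0
        while True:
            t += random.expovariate(1.0 / mttf)     # time to next failure
            if t >= hours:
                return longest
            down = random.expovariate(1.0 / mttr)   # interruption duration
            longest = max(longest, down)
            t += down

    # Empirical distribution of the MCID over many simulated years, and the
    # probability of exceeding an (illustrative) regulatory limit of 8 hours.
    samples = [simulate_year() for _ in range(10000)]
    prob_penalty = sum(s > 8.0 for s in samples) / len(samples)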
Abstract:
ArcTech is software being developed, applied and improved with the aim of becoming an efficient sensitization tool to support the teaching-learning process in Architecture courses. The application initially deals with the thermal comfort of buildings. The output generated by the software shows whether a student is able to produce a pleasant environment, in terms of thermal sensation over a 24-hour period. Although the very same features can be found in fully developed commercial software, the reasons for creating ArcTech are the flexibility of the system to be adapted by the instructor and the need for simple tools to evaluate specific topics along the courses. The first part of ArcTech, dedicated to data management, was developed using the visual programming language Delphi 7 with Firebird as the database management system. The second part contains the parameters that can be changed by the system administrator and those related to project visualization. The interface of the system, in which the student learns how to implement and evaluate project alternatives, was built using Macromedia Flash. The software was applied to undergraduate students, revealing an interface that is easy to learn and easy to teach with.
Abstract:
This work describes a control and supervision application that exploits the advantages of virtual instrumentation to control and supervise industrial manufacturing stations belonging to the modular production system MPS® by Festo. These stations integrate sensors, actuators, a conveyor belt and other industrial elements. The focus of this approach was to replace programmable logic controllers with a computer equipped with a LabVIEW-based software application that performs the functions of both traditional instruments and PLCs. The manufacturing stations had their processes modeled and simulated in Petri nets, and the models were then implemented in the LabVIEW environment. Tests, and previous similar work on the MPS® installed in the Automation Laboratory at the UNESP Sorocaba campus, showed that the materials and methods used in this work allow the successful use of virtual instrumentation. The results indicate that the technology is an advantageous approach for the automation of industrial processes, with gains in flexibility and reductions in project cost. © 2011 IEEE.
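A minimal sketch of the Petri net modeling step mentioned above: places, transitions and the usual firing rule; the toy conveyor net is illustrative, not one of the actual station models:

    class PetriNet:
        def __init__(self, marking):
            self.marking = dict(marking)   # place -> token count
            self.transitions = {}          # name -> (input places, output places)

        def add_transition(self, name, inputs, outputs):
            self.transitions[name] = (inputs, outputs)

        def enabled(self, name):
            inputs, _ = self.transitions[name]
            return all(self.marking.get(p, 0) >= 1 for p in inputs)

        def fire(self, name):
            # Consume one token from each input place, produce one in each output.
            inputs, outputs = self.transitions[name]
            assert self.enabled(name), f"{name} is not enabled"
            for p in inputs:
                self.marking[p] -= 1
            for p in outputs:
                self.marking[p] = self.marking.get(p, 0) + 1

    # Toy conveyor step: a waiting part moves to the test station when the belt is free.
    net = PetriNet({"part_waiting": 1, "belt_free": 1})
    net.add_transition("move_part", ["part_waiting", "belt_free"], ["part_at_test"])
    net.fire("move_part")   # marking: part_waiting=0, belt_free=0, part_at_test=1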