968 results for object-oriented programming
Abstract:
Bloom filters are a data structure for storing data in a compressed form. They offer excellent space and time efficiency at the cost of some loss of accuracy (so-called lossy compression). This work presents the yes-no Bloom filter, a data structure consisting of two parts: the yes-filter, which is a standard Bloom filter, and the no-filter, which is another Bloom filter whose purpose is to represent those objects that were recognised incorrectly by the yes-filter (that is, to recognise the false positives of the yes-filter). By querying the no-filter after an object has been recognised by the yes-filter, we get a chance of rejecting it, which improves the accuracy of data recognition in comparison with a standard Bloom filter of the same total length. A further increase in accuracy is possible if the objects included in the no-filter are chosen so that the no-filter recognises as many false positives as possible but no true positives, thus producing the most accurate yes-no Bloom filter among all yes-no Bloom filters. This paper studies how optimization techniques can be used to maximize the number of false positives recognised by the no-filter, under the constraint that it recognises no true positives. To achieve this aim, an Integer Linear Program (ILP) is proposed for the optimal selection of false positives. In practice the problem size is normally large, making the optimal solution intractable. Exploiting the similarity of the ILP to the Multidimensional Knapsack Problem, an Approximate Dynamic Programming (ADP) model is developed that uses a reduced ILP for the value function approximation. Numerical results show that the ADP model performs best compared with a number of heuristics as well as the CPLEX built-in solver (B&B), and it is the approach we recommend for use in yes-no Bloom filters. In the wider context of the study of lossy compression algorithms, our research is an example of how the arsenal of optimization methods can be applied to improving the accuracy of compressed data.
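The query logic described above is easy to sketch in code. The following Python sketch is our own illustration, not the paper's implementation: the filter sizes, hash construction and class names are assumptions, and the hard part the paper actually studies, choosing which false positives to put into the no-filter, is left to the caller.

import hashlib


class BloomFilter:
    """Minimal Bloom filter: a bit array probed by k salted hashes."""

    def __init__(self, num_bits, num_hashes):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = [False] * num_bits

    def _positions(self, item):
        for salt in range(self.num_hashes):
            digest = hashlib.sha256(f"{salt}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.num_bits

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = True

    def __contains__(self, item):
        return all(self.bits[pos] for pos in self._positions(item))


class YesNoBloomFilter:
    """Yes-filter stores the set; no-filter stores selected false positives."""

    def __init__(self, yes_bits, no_bits, num_hashes):
        self.yes = BloomFilter(yes_bits, num_hashes)
        self.no = BloomFilter(no_bits, num_hashes)

    def add(self, item):
        self.yes.add(item)

    def add_false_positive(self, item):
        # Only objects known NOT to belong to the stored set should go here;
        # selecting the best such objects is the optimization problem above.
        self.no.add(item)

    def __contains__(self, item):
        # Accept only if the yes-filter matches and the no-filter does not.
        return item in self.yes and item not in self.no


ynbf = YesNoBloomFilter(yes_bits=1024, no_bits=256, num_hashes=4)
ynbf.add("stored-object")
print("stored-object" in ynbf)   # True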
Abstract:
Given a fixed set of identical or different-sized circular items, the problem we deal with consists in finding the smallest object within which the items can be packed. Circular, triangular, square, rectangular and also strip objects are considered. Moreover, 2D and 3D problems are treated. Twice-differentiable models for all these problems are presented. A strategy to reduce the complexity of evaluating the models is employed and, as a consequence, instances with a large number of items can be considered. Numerical experiments show the flexibility and reliability of the new unified approach. (C) 2007 Elsevier Ltd. All rights reserved.
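As an illustration of the kind of twice-differentiable model referred to above, the Python sketch below packs circles of given radii into the smallest enclosing circle using only squared-distance constraints and a general-purpose SciPy solver. It is our own simplified construction under those assumptions, not the authors' formulation or code.

import numpy as np
from scipy.optimize import minimize


def pack_circles_in_circle(radii, seed=0):
    """Find centres and the smallest radius R of a circle containing all items.

    Decision variables: z = [x1, y1, ..., xn, yn, R]. All constraints use
    squared distances, so the model is twice differentiable (no square roots).
    """
    radii = np.asarray(radii, dtype=float)
    n = len(radii)
    rng = np.random.default_rng(seed)

    constraints = []
    # Containment: (R - r_i)^2 - (x_i^2 + y_i^2) >= 0
    for i in range(n):
        constraints.append({
            "type": "ineq",
            "fun": lambda z, i=i: (z[-1] - radii[i]) ** 2
                                  - (z[2 * i] ** 2 + z[2 * i + 1] ** 2),
        })
    # Non-overlap: ||c_i - c_j||^2 - (r_i + r_j)^2 >= 0
    for i in range(n):
        for j in range(i + 1, n):
            constraints.append({
                "type": "ineq",
                "fun": lambda z, i=i, j=j: (z[2 * i] - z[2 * j]) ** 2
                                           + (z[2 * i + 1] - z[2 * j + 1]) ** 2
                                           - (radii[i] + radii[j]) ** 2,
            })

    # Keep R >= max r_i so the squared containment constraint stays valid.
    bounds = [(None, None)] * (2 * n) + [(radii.max(), None)]
    z0 = np.concatenate([rng.uniform(-1.0, 1.0, 2 * n), [radii.sum()]])
    result = minimize(lambda z: z[-1], z0, method="SLSQP",
                      bounds=bounds, constraints=constraints)
    return result.x[:-1].reshape(n, 2), result.x[-1]


centers, R = pack_circles_in_circle([1.0, 1.0, 0.5])
print(R)   # roughly 2.0: two unit circles side by side plus a small one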
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Models of different degrees of complexity are found in the literature for the estimation of lightning striking distances and the attractive radius of objects and structures. However, besides the oversimplifications of the physical nature of the lightning discharge on which most of them are based, until recently the three-dimensional structure configuration could not be considered. This is an important limitation, as edges and other details of the object affect the electric field and, consequently, upward leader initiation. Within this context, the Self-consistent leader initiation and propagation model (SLIM) proposed by Becerra and Cooray is a state-of-the-art leader inception and propagation model based on the physics of leader discharges which enables the three-dimensional geometry of the structure to be taken into account. In this paper, the model is used for estimating the striking distance and attractive radius of power transmission lines. The results are compared with those obtained from the electrogeometric and Eriksson's models. © 2003-2012 IEEE.
Abstract:
In this letter, a semiautomatic method for road extraction in object space is proposed that combines a stereoscopic pair of low-resolution aerial images with a digital terrain model (DTM) structured as a triangulated irregular network (TIN). First, we formulate an objective function in the object space to allow the modeling of roads in 3-D. In this model, the TIN-based DTM allows the search for the optimal polyline to be restricted along a narrow band that is overlaid upon it. Finally, the optimal polyline for each road is obtained by optimizing the objective function using the dynamic programming optimization algorithm. A few seed points need to be supplied by an operator. To evaluate the performance of the proposed method, a set of experiments was designed using two stereoscopic pairs of low-resolution aerial images and a TIN-based DTM with an average resolution of 1 m. The experimental results showed that the proposed method worked properly, even when faced with anomalies along roads, such as obstructions caused by shadows and trees.
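The dynamic programming step can be illustrated independently of the photogrammetric details. In the Python sketch below, which is our own simplification and not the authors' objective function, each road cross-section contributes a small set of candidate 3-D points inside the band, a hypothetical cost combines a per-point data term with a smoothness term between consecutive points, and the optimal polyline is recovered by a standard Viterbi-style recursion.

import numpy as np


def optimal_polyline(unary_cost, candidates, smooth_weight=1.0):
    """Pick one candidate point per cross-section, minimising the sum of
    unary (data) costs plus a smoothness penalty between neighbours.

    unary_cost: list of 1-D arrays; unary_cost[s][k] = data cost of
                candidate k at cross-section s (e.g. an image-based term).
    candidates: list of (n_s, 3) arrays of candidate 3-D points per section.
    """
    n_sections = len(candidates)
    # best[s][k] = minimal cost of a polyline ending at candidate k of section s
    best = [np.asarray(unary_cost[0], dtype=float)]
    back = []

    for s in range(1, n_sections):
        # Pairwise smoothness: squared distance between consecutive vertices.
        diff = candidates[s][None, :, :] - candidates[s - 1][:, None, :]
        pair = smooth_weight * (diff ** 2).sum(axis=2)   # (n_prev, n_cur)
        total = best[-1][:, None] + pair                 # accumulate costs
        back.append(total.argmin(axis=0))                # best predecessor
        best.append(total.min(axis=0) + np.asarray(unary_cost[s], dtype=float))

    # Backtrack the optimal sequence of candidate indices.
    idx = [int(best[-1].argmin())]
    for s in range(n_sections - 1, 0, -1):
        idx.append(int(back[s - 1][idx[-1]]))
    idx.reverse()
    return [candidates[s][k] for s, k in enumerate(idx)]


# Toy example: three cross-sections with two candidate points each.
cands = [np.array([[0.0, 0.0, 10.0], [0.0, 5.0, 10.0]]),
         np.array([[5.0, 0.0, 10.0], [5.0, 5.0, 10.0]]),
         np.array([[10.0, 0.0, 10.0], [10.0, 5.0, 10.0]])]
costs = [np.array([0.0, 1.0]), np.array([0.5, 0.0]), np.array([0.0, 1.0])]
print(optimal_polyline(costs, cands, smooth_weight=0.01))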
Abstract:
Service Oriented Computing is a new programming paradigm for addressing distributed system design issues. Services are autonomous computational entities which can be dynamically discovered and composed in order to form more complex systems able to achieve different kinds of tasks. E-government, e-business and e-science are some examples of the IT areas where Service Oriented Computing will be exploited in the coming years. At present, the most widely adopted Service Oriented Computing technology is that of Web Services, whose specifications are enriched day by day by industrial consortia without following a precise and rigorous approach. This PhD thesis aims, on the one hand, at modelling Service Oriented Computing in a formal way in order to precisely define the main concepts it is based upon and, on the other hand, at defining a new approach, called the bipolar approach, for addressing system design issues by synergically exploiting choreography and orchestration languages related by means of a mathematical relation called conformance. Choreography allows us to describe systems of services from a global viewpoint, whereas orchestration supplies a means for addressing the same issue from a local perspective. In this work we present SOCK, a process-algebra-based language inspired by the Web Service orchestration language WS-BPEL which captures the essentials of Service Oriented Computing. From the definition of SOCK we are able to define a general model for dealing with Service Oriented Computing where services and systems of services are related to the design of finite state automata and of process algebra concurrent systems, respectively. Furthermore, we introduce a formal language for dealing with choreography. Such a language is equipped with a formal semantics and it forms, together with a subset of the SOCK calculus, the bipolar framework. Finally, we present JOLIE, a Java implementation of a subset of the SOCK calculus which is part of the bipolar framework we intend to promote.
Abstract:
In the collective imagination a robot is a human-like machine, like the androids of science fiction. However, the type of robot you will encounter most frequently is machinery that does work that is too dangerous, boring or onerous. Most of the robots in the world are of this type. They can be found in the automotive, medical, manufacturing and space industries. A robot is therefore a system that contains sensors, control systems, manipulators, power supplies and software all working together to perform a task. The development and use of such systems is an active area of research, and one of the main problems is the development of interaction skills with the surrounding environment, which include the ability to grasp objects. To perform this task the robot needs to sense the environment and acquire information about the object, the physical attributes that may influence a grasp. Humans can solve this grasping problem easily thanks to their past experience, which is why many researchers approach it from a machine learning perspective, finding a grasp for an object using information about already known objects. But humans can select the best grasp from a vast repertoire not only by considering the physical attributes of the object to grasp but also in order to obtain a certain effect. This is why, in our case, the study in the area of robot manipulation is focused on grasping and on integrating symbolic tasks with data gained through sensors. The learning model is based on a Bayesian network that encodes the statistical dependencies between the data collected by the sensors and the symbolic task. This data representation has several advantages. It allows the uncertainty of the real world to be taken into account, dealing with sensor noise; it encodes the notion of causality; and it provides a unified network for learning. Since the network currently implemented is based on human expert knowledge, it is very interesting to implement an automated method to learn the structure, as in the future more tasks and object features may be introduced and a complex network design based only on human expert knowledge can become unreliable. Since structure learning algorithms present some weaknesses, the goal of this thesis is to analyze the real data used in the network modelled by the human expert, implement a feasible structure learning approach, and compare the results with the network designed by the expert in order to possibly enhance it.
Abstract:
Current trends in software development are pushing the need to face a multiplicity of diverse activities and interaction styles characterizing complex and distributed application domains, in such a way that the resulting dynamics exhibits some degree of order, i.e. in terms of evolution of the system and desired equilibrium. Autonomous agents and Multiagent Systems are argued in the literature to be one of the most immediate approaches for describing this kind of challenge. Indeed, agent research seems to converge towards the definition of renewed abstraction tools aimed at better capturing the new demands of open systems. Besides agents, which are assumed to be autonomous entities pursuing a series of design objectives, Multiagent Systems account for new notions as first-class entities, aimed above all at modeling institutional/organizational entities, introduced for normative regulation, interaction and teamwork management, as well as environmental entities, introduced as resources to further support and regulate agent work. The starting point of this thesis is the recognition that both organizations and environments can be rooted in a unifying perspective. Whereas recent research in agent systems seems to offer a set of diverse approaches, each specifically facing at least one of the aspects mentioned above, this work aims at proposing a unifying approach where both agents and their organizations can be straightforwardly situated in properly designed working environments. Along this line, this work pursues the reconciliation of environments with sociality, of social interaction with environment-based interaction, and of environmental resources with organizational functionalities, with the aim of smoothly integrating the various aspects of complex and situated organizations in a coherent programming approach. Rooted in the Agents and Artifacts (A&A) meta-model, which has recently been introduced both in the context of agent-oriented software engineering and programming, the thesis promotes the notion of Embodied Organizations, characterized by computational infrastructures attaining a seamless integration between agents, organizations and environmental entities.
Abstract:
Support systems for programming education are widely used, but common standards for the exchange of general (learning) content and tests do not meet the specific requirements of programming exercises, such as handling complex submissions consisting of multiple files or combining different (automatic) grading procedures. As a result, exercises cannot be exchanged between systems, which would be desirable given the high effort required to develop good exercises. This paper presents an extensible XML-based format for the exchange of programming exercises that is already being used prototypically by several systems. The specification of the exchange format is available online [PFMA].
Abstract:
The investigator conducted an action-oriented investigation of pregnancy and birth among the women of Mesa los Hornos, an urban squatter slum in Mexico City. Three aims guided the project: (1) to obtain information for improving prenatal and maternity service utilization; (2) to examine the utility of rapid ethnographic and epidemiologic assessment methodologies; (3) to cultivate community involvement in health development. Viewing service utilization as a culturally-bound decision, the study included a qualitative phase to explore women's cognition of pregnancy and birth, their perceived needs during pregnancy, and their criteria of service acceptability. A probability-based community survey delineated parameters of service utilization and pregnancy health events, and probed reasons for decisions to use medical services, lay midwives, or other sources of prenatal and labor and delivery assistance. A qualitative survey of service providers at relevant clinics, hospitals, and practices contributed information on service availability and access, and on coordination among private, social security, and public assistance health service sectors. The ethnographic approach to exploring the rationale for use or non-use of services provided a necessary complement to conventional barrier-based assessment, to inform the planning of culturally appropriate interventions. Information collection and interpretation were conducted under the aegis of an advisory committee of community residents and service agency representatives; the residents' committee formulated recommendations for action based on findings, and forwarded the mandate to governmental social and urban development offices. Recommendations were designed to inform and develop community participation in health care decision-making. Rapid research methods are powerful tools for achieving community-based empowerment toward the investigation and resolution of local health problems. But while ethnography works well in synergy with quantitative assessment approaches to strengthen the validity and richness of short-term field work, the author strongly urges caution in the application of Rapid Ethnographic Assessments. An ethnographic sensibility is essential to the research enterprise for the development of an active and cooperative community base, the design and use of quantitative instruments, the appropriate use of qualitative techniques, and the interpretation of culturally-oriented information. However, prescribed and standardized Rapid Ethnographic Assessment techniques are counter-productive if used as research short-cuts before locale- and subject-specific cultural understanding is achieved.
Abstract:
On that date, the Spanish affiliate offered its customers, for the first time, courses oriented toward the user rather than the product, within the area of programming, in the subarea of application programming.
Abstract:
We introduce a dominance intensity measuring method to derive a ranking of alternatives to deal with incomplete information in multi-criteria decision-making problems on the basis of multi-attribute utility theory (MAUT) and fuzzy set theory. We consider the situation where there is imprecision concerning decision-makers' preferences, and imprecise weights are represented by trapezoidal fuzzy weights. The proposed method is based on the dominance values between pairs of alternatives. These values can be computed by linear programming, as an additive multi-attribute utility model is used to rate the alternatives. Dominance values are then transformed into dominance intensity measures, used to rank the alternatives under consideration. Distances between fuzzy numbers based on the generalization of the left and right fuzzy numbers are utilized to account for fuzzy weights. An example concerning the selection of intervention strategies to restore an aquatic ecosystem contaminated by radionuclides illustrates the approach. Monte Carlo simulation techniques have been used to show that the proposed method performs well for different imprecision levels in terms of a hit ratio and a rank-order correlation measure.
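The linear-programming computation of a pairwise dominance value is easy to sketch. The Python code below is our own simplification under stated assumptions: it uses crisp interval bounds on the weights instead of the trapezoidal fuzzy weights of the paper, and the utilities and bounds in the example are made up for illustration.

import numpy as np
from scipy.optimize import linprog


def dominance_value(u_i, u_k, w_lower, w_upper):
    """Worst-case weighted utility advantage of alternative i over k.

    Solves: minimise sum_j w_j * (u_i[j] - u_k[j])
    subject to sum_j w_j = 1 and w_lower <= w <= w_upper.
    (A crisp-interval simplification of the fuzzy weights in the abstract.)
    """
    c = np.asarray(u_i, dtype=float) - np.asarray(u_k, dtype=float)
    n = len(c)
    res = linprog(
        c,
        A_eq=np.ones((1, n)),
        b_eq=[1.0],
        bounds=list(zip(w_lower, w_upper)),
        method="highs",
    )
    if not res.success:
        raise ValueError("weight constraints are infeasible")
    return float(res.fun)


# If the value is >= 0, alternative i is at least as good as k for every
# admissible weight vector, i.e. i dominates k in the pairwise sense.
print(dominance_value([0.8, 0.6], [0.5, 0.7], [0.3, 0.3], [0.7, 0.7]))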
Abstract:
Histograms of Oriented Gradients (HOGs) provide excellent results in object detection and verification. However, their demanding processing requirements limit their applicability in some critical real-time scenarios, such as video-based on-board vehicle detection systems. In this work, an efficient HOG configuration for pose-based on-board vehicle verification is proposed, which reduces both the processing requirements and the required feature vector length without reducing classification performance. The impact of some critical configuration and processing parameters on classification is analyzed in depth in order to propose a baseline efficient descriptor. Based on the analysis of the contribution of its cells to classification, new view-dependent cell-configuration patterns are proposed, resulting in reduced descriptors that provide an excellent balance between performance and computational requirements, yielding higher verification rates than other works in the literature.
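For reference, a baseline HOG descriptor of the kind being tuned here can be computed in a few lines with scikit-image. The parameter values in the sketch below (9 orientation bins, 8x8-pixel cells, 2x2-cell blocks) are common defaults chosen for illustration, not the configuration proposed in the paper, and the input patch is synthetic.

import numpy as np
from skimage.feature import hog

# A synthetic 64x64 grayscale patch stands in for a candidate vehicle window.
rng = np.random.default_rng(0)
patch = rng.random((64, 64))

# 9 orientation bins, 8x8-pixel cells, 2x2-cell blocks with L2-Hys
# normalisation: a common baseline configuration (not the paper's).
descriptor = hog(
    patch,
    orientations=9,
    pixels_per_cell=(8, 8),
    cells_per_block=(2, 2),
    block_norm="L2-Hys",
    feature_vector=True,
)

# Reducing cells or bins shrinks this vector, which is the trade-off the
# paper exploits with view-dependent cell-configuration patterns.
print(descriptor.shape)   # (1764,) for this configuration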
Abstract:
We present a general approach to forming structure-activity relationships (SARs). This approach is based on representing chemical structure by atoms and their bond connectivities, in combination with the inductive logic programming (ILP) algorithm PROGOL. Existing SAR methods describe chemical structure using attributes, which are general properties of an object. It is not possible to map chemical structure directly to attribute-based descriptions, as such descriptions have no internal organization. A more natural and general way to describe chemical structure is to use a relational description, where the internal construction of the description maps that of the object described. Our atom and bond connectivity representation is a relational description. ILP algorithms can form SARs with relational descriptions. We have tested the relational approach by investigating the SARs of 230 aromatic and heteroaromatic nitro compounds. These compounds had previously been split into two subsets: 188 compounds that were amenable to regression and 42 that were not. For the 188 compounds, a SAR was found that was as accurate as the best statistical or neural network-generated SARs. The PROGOL SAR has the advantages that it did not need any indicator variables handcrafted by an expert, and that the generated rules were easily comprehensible. For the 42 compounds, PROGOL formed a SAR that was significantly (P < 0.025) more accurate than linear regression, quadratic regression, and back-propagation. This SAR is based on an automatically generated structural alert for mutagenicity.
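The contrast between attribute-based and relational descriptions can be made concrete in a few lines. The Python sketch below is our own illustrative encoding, not PROGOL's syntax or the paper's data: atom and bond facts preserve the internal structure of a made-up molecule fragment, while a flat attribute vector does not, and a toy relational rule shows the kind of condition only the relational form can express.

# Relational description: each fact keeps the molecule's internal structure.
# atom(molecule_id, atom_id, element, partial_charge) -- values are made up.
atoms = [
    ("m1", "a1", "c", -0.12),
    ("m1", "a2", "c", -0.12),
    ("m1", "a3", "n", 0.38),
]
# bond(molecule_id, atom_id1, atom_id2, bond_type); 7 stands for "aromatic" here.
bonds = [
    ("m1", "a1", "a2", 7),
    ("m1", "a2", "a3", 1),
]

# Attribute-based description: a fixed-length vector of global properties,
# with no way to refer back to individual atoms or their connectivity.
attributes = {"m1": {"logP": 1.8, "num_nitro_groups": 1, "mol_weight": 123.1}}


def has_aromatic_c_n(atoms, bonds, mol):
    """Toy relational rule: some carbon is aromatically bonded to a nitrogen."""
    element = {(m, a): e for m, a, e, _ in atoms}
    return any(
        m == mol and t == 7 and {element[(m, a1)], element[(m, a2)]} == {"c", "n"}
        for m, a1, a2, t in bonds
    )


# The aromatic bond above is carbon-carbon, so the rule does not fire.
print(has_aromatic_c_n(atoms, bonds, "m1"))   # False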
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06