905 results for end-user programming, component, message-based


Relevance:

100.00%

Publisher:

Abstract:

The task underlying this dissertation can be described briefly as the investigation of component-based concepts for use in software development by end users. Over the last 20 to 30 years, the technical environment in which a large proportion of employees carry out their daily tasks has changed fundamentally. The computer, formerly the exclusive domain of specialists in the form of a mainframe, is now a natural part of everyday work. Working with application programs that allow the user to define new functionality of their own within certain limits is so commonplace in many areas that many of these activities are not consciously perceived as programming. Since these users are not necessarily trained in software development, they need appropriate support for these activities. This makes clear the practical relevance of research in this area. To build a programming system for end users, a flexible application framework is first developed that is suitable as a basis for creating such systems. In software projects, changing requirements and the needs that result from them are an important aspect. This is taken into account in the design of the framework through concepts for providing reusable functionality through the framework and means of adapting and extending the existing functionality. Notable here are, on the one hand, the use of a service-oriented architecture within the application and, on the other, a component-oriented variant of the command pattern. In addition, a concept for encapsulating end-user programming models in components is developed. This approach makes it possible to use different models as the foundation of the designed development environment. In the further course of the thesis, a programming model is designed and implemented using the aforementioned framework. For this model to be suitable for use by end users, the level of abstraction used to describe a software system must be raised. This is achieved through the use of components and a message-based composition mechanism. The resulting implementation is not yet tied to specific application families; these adaptations are made in a further step for two different application domains.
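To make the composition idea concrete, here is a minimal Python sketch written for this summary rather than taken from the dissertation: components expose named ports, are wired together purely by message passing, and the wiring itself is wrapped in a command-style action object (all class and port names are hypothetical).

```python
from typing import Callable, Dict, List


class Component:
    """A unit offering named ports; composition happens only via messages."""

    def __init__(self, name: str):
        self.name = name
        self.handlers: Dict[str, Callable[[object], None]] = {}
        self.subscribers: Dict[str, List[Callable[[object], None]]] = {}

    def on(self, in_port: str, handler: Callable[[object], None]) -> None:
        self.handlers[in_port] = handler

    def connect(self, out_port: str, target: "Component", in_port: str) -> None:
        self.subscribers.setdefault(out_port, []).append(
            lambda payload: target.receive(in_port, payload))

    def emit(self, out_port: str, payload: object) -> None:
        for deliver in self.subscribers.get(out_port, []):
            deliver(payload)

    def receive(self, in_port: str, payload: object) -> None:
        self.handlers[in_port](payload)


class ConnectCommand:
    """Command-pattern style action object: wiring two components becomes an
    explicit, reusable operation instead of hard-wired code."""

    def __init__(self, src: Component, out_port: str, dst: Component, in_port: str):
        self.src, self.out_port, self.dst, self.in_port = src, out_port, dst, in_port

    def execute(self) -> None:
        self.src.connect(self.out_port, self.dst, self.in_port)


# Example composition: a sensor component feeding a logger component.
sensor, logger = Component("sensor"), Component("logger")
logger.on("value", lambda v: print(f"logger received {v}"))
ConnectCommand(sensor, "reading", logger, "value").execute()
sensor.emit("reading", 42.0)          # prints: logger received 42.0
```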

Relevance:

40.00%

Publisher:

Abstract:

[EN] Programming software for controlling robotic systems in order to build working systems that perform adequately according to their design requirements remains a task that requires a significant development effort. Currently, there are no clear programming paradigms for robotic systems, and the programming techniques in common use today are not adequate to deal with the complexity associated with these systems. The work presented in this document describes a programming tool, concretely a framework, that should be considered a first step towards a tool for dealing with the complexity present in robotic systems. In this framework, the software that controls a system is viewed as a dynamic network of units of execution interconnected by means of data paths. Each of these units of execution, called a component, is a port automaton which provides a given functionality, hidden behind an external interface that specifies clearly which data it needs and which data it produces. Components, once defined and built, may be instantiated, integrated and used as many times as needed in other systems. The framework provides the infrastructure necessary to support this component concept and the intercommunication between components by means of data paths (port connections), which can be established and de-established dynamically. Moreover, considering that the more robust the components that make up a system are, the more robust the system is, the framework provides the necessary infrastructure to control and monitor the components that integrate a system at any given instant of time.
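The following Python sketch illustrates the component/data-path idea in a toy form; it is not the framework's actual API (class names, ports and threading details are invented here). Each component is a unit of execution that hides its functionality behind an input port and a set of output connections that can be established and removed at run time.

```python
import queue
import threading
import time


class Component(threading.Thread):
    """A unit of execution: consumes data from its input port, applies its hidden
    functionality, and pushes the result along every currently established path."""

    def __init__(self, name, transform):
        super().__init__(daemon=True)
        self.name = name
        self.transform = transform          # functionality hidden behind the ports
        self.in_port = queue.Queue()        # external interface: the data it needs
        self.out_connections = []           # data paths currently established
        self.running = True                 # exposed so a monitor can control/observe it

    def run(self):
        while self.running:
            try:
                data = self.in_port.get(timeout=0.1)
            except queue.Empty:
                continue
            result = self.transform(data)
            for target in self.out_connections:
                target.in_port.put(result)

    def connect_to(self, other):            # establish a port connection
        self.out_connections.append(other)

    def disconnect_from(self, other):       # de-establish it at run time
        self.out_connections.remove(other)


# Example: a doubling filter feeding a sink; the path could later be removed
# with flt.disconnect_from(snk) while both components keep running.
flt = Component("filter", lambda x: x * 2)
snk = Component("sink", lambda x: print("sink received", x))
flt.connect_to(snk)
flt.start()
snk.start()
flt.in_port.put(21)
time.sleep(0.5)                             # give the pipeline time to run
```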

Relevance:

40.00%

Publisher:

Abstract:

Software development methodologies are becoming increasingly abstract, progressing from low-level assembly and implementation languages such as C and Ada, to component-based approaches that can be used to assemble applications using technologies such as JavaBeans and the .NET framework. Meanwhile, model-driven approaches emphasise the role of higher-level models and notations, and embody a process of automatically deriving lower-level representations and concrete software implementations. The relationship between data and software is also evolving. Modern data formats are becoming increasingly standardised, open and empowered in order to support a growing need to share data in both academia and industry. Many contemporary data formats, most notably those based on XML, are self-describing, able to specify valid data structure and content, and can also describe data manipulations and transformations. Furthermore, while applications of the past have made extensive use of data, the runtime behaviour of future applications may be driven by data, as demonstrated by the field of dynamic data-driven application systems. The combination of empowered data formats and high-level software development methodologies forms the basis of modern game development technologies, which drive software capabilities and runtime behaviour using empowered data formats describing game content. While low-level libraries provide optimised runtime execution, content data is used to drive a wide variety of interactive and immersive experiences. This thesis describes the Fluid project, which combines component-based software development and game development technologies in order to define novel component technologies for the description of data-driven component-based applications. The thesis makes explicit contributions to the fields of component-based software development and visualisation of spatiotemporal scenes, and also describes potential implications for game development technologies. The thesis also proposes a number of developments in dynamic data-driven application systems in order to further empower the role of data in this field.
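As a hedged illustration of data-driven behaviour (the format and component names here are hypothetical, not the Fluid project's actual content schema), the sketch below assembles and configures runtime components purely from a self-describing content document.

```python
import json

# A self-describing content document: data, not code, decides what runs.
CONTENT = json.loads("""
{
  "scene": "demo",
  "components": [
    {"type": "mover",   "params": {"speed": 2.0}},
    {"type": "emitter", "params": {"rate": 5}}
  ]
}
""")

# Low-level "engine" building blocks; the content data selects and configures them.
REGISTRY = {
    "mover":   lambda speed: f"mover advancing {speed} units per frame",
    "emitter": lambda rate: f"emitter spawning {rate} particles per second",
}

def build(description):
    """Assemble the running scene purely from the data description."""
    return [REGISTRY[c["type"]](**c["params"]) for c in description["components"]]

for behaviour in build(CONTENT):
    print(behaviour)
```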

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a methodology for reconfiguring distribution networks in the presence of outages in order to choose the reconfiguration with the lowest power losses. The methodology is based on statistical failure and repair data for the distribution power system components and uses fuzzy-probabilistic modelling of the component outage parameters. Fuzzy membership functions of the component outage parameters are obtained from statistical records. A hybrid method of fuzzy sets and Monte Carlo simulation based on the fuzzy-probabilistic models captures both the randomness and the fuzziness of the component outage parameters. Once the system states have been obtained by Monte Carlo simulation, a logical programming algorithm is applied to obtain all possible reconfigurations for every system state. To evaluate the line flows and bus voltages and to identify any overloading and/or voltage violations, a distribution power flow is applied to select the feasible reconfiguration with the lowest power losses. To illustrate the application of the proposed methodology to a practical case, the paper includes a case study that considers a real distribution network.
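The following Python fragment is only a rough illustration of the hybrid idea, with invented component data rather than the paper's models: each outage rate is a triangular fuzzy number, an alpha-cut supplies a crisp rate interval (the fuzziness), and Monte Carlo sampling of component up/down states supplies the randomness.

```python
import random

# Triangular fuzzy failure rates (low, mode, high) in failures/year, as they
# might be built from statistical failure records of two invented components.
FUZZY_FAILURE_RATE = {"line_1": (0.10, 0.20, 0.40), "line_2": (0.05, 0.10, 0.20)}

def alpha_cut(tri, alpha):
    """Interval of rates whose membership degree is at least alpha."""
    low, mode, high = tri
    return (low + alpha * (mode - low), high - alpha * (high - mode))

def sample_system_state(alpha):
    """One Monte Carlo draw of which components are out of service."""
    state = {}
    for comp, tri in FUZZY_FAILURE_RATE.items():
        lo, hi = alpha_cut(tri, alpha)
        rate = random.uniform(lo, hi)            # fuzziness: crisp rate from the cut
        p_out = min(1.0, rate / 365.0)           # crude daily outage probability
        state[comp] = "out" if random.random() < p_out else "in"   # randomness
    return state

random.seed(0)
states = [sample_system_state(alpha=0.5) for _ in range(1000)]
print(sum(s["line_1"] == "out" for s in states), "of 1000 sampled states have line_1 out")
```

Each sampled system state would then be fed to the reconfiguration enumeration and power-flow check described in the paper.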

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a methodology for choosing the distribution network reconfiguration with the lowest power losses. The proposed methodology is based on statistical failure and repair data for the distribution power system components and uses fuzzy-probabilistic modeling of the component outage parameters. The proposed hybrid method, using fuzzy sets and Monte Carlo simulation based on the fuzzy-probabilistic models, captures both the randomness and the fuzziness of the component outage parameters. Once the system states have been obtained by Monte Carlo simulation, a logic programming algorithm is applied to obtain all possible reconfigurations for each system state. To evaluate the line flows and bus voltages and to identify any overloading and/or voltage violations, an AC load flow is applied to select the feasible reconfiguration with the lowest power losses. To illustrate the application of the proposed methodology, the paper includes a case study that considers a 115-bus distribution network.

Relevance:

30.00%

Publisher:

Abstract:

Managing programming exercises requires several heterogeneous systems, such as evaluation engines, learning object repositories and exercise resolution environments. Coordinating networks of such disparate systems is rather complex. These tools would be too specific to incorporate into an e-Learning platform. Even if they could be provided as pluggable components, the burden of maintaining them would be prohibitive for institutions with few courses in those domains. This work presents a standards-based approach for coordinating a network of e-Learning systems participating in the automatic evaluation of programming exercises. The proposed approach uses a pivot component to orchestrate the interaction among all the systems using communication standards. This approach was validated through its effective use in the classroom, and we present some preliminary results.
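A minimal sketch of the pivot idea is shown below; the system roles, message fields and evaluation logic are hypothetical stand-ins, not the communication standards used in the paper. The point is simply that every participating system talks only to the pivot, which sequences the evaluation of a submission.

```python
class Pivot:
    """Single coordination point: every e-Learning system talks only to the pivot,
    which sequences the automatic evaluation of a programming exercise."""

    def __init__(self):
        self.systems = {}

    def register(self, role, handler):
        # e.g. "repository" (learning objects), "evaluator" (engine), "lms"
        self.systems[role] = handler

    def evaluate_submission(self, exercise_id, attempt):
        exercise = self.systems["repository"](exercise_id)          # fetch learning object
        report = self.systems["evaluator"]((exercise, attempt))     # run the evaluation engine
        self.systems["lms"]((exercise_id, report))                  # post the result back
        return report


# Toy wiring: the "standards" are reduced to plain Python tuples and dicts here.
pivot = Pivot()
pivot.register("repository", lambda eid: {"id": eid, "answer": "4"})
pivot.register("evaluator", lambda job: {"passed": job[1].strip() == job[0]["answer"]})
pivot.register("lms", lambda msg: print("recorded result for", msg[0], "->", msg[1]["passed"]))

print(pivot.evaluate_submission("ex-01", " 4 "))
```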

Relevance:

30.00%

Publisher:

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia da Universidade Nova de Lisboa to obtain the degree of Master in Informatics Engineering (Engenharia Informática).

Relevance:

30.00%

Publisher:

Abstract:

The CATCH Kids Club (CKC) is an after-school intervention that has attempted to address the growing obesity and physical inactivity concerns publicized in the current literature. Using a Self-Determination Theory (SDT: Deci & Ryan, 1985) perspective, this study's main research objective was to assess, while controlling for gender and age, whether there were significant differences between the treatment (CKC program participants) and control (non-CKC) groups in their perceptions of need satisfaction, intrinsic motivation and optimal challenge after four months of participation and after eight months of participation. For this study, data were collected from 79 participants with a mean age of 9.3, using the Situational Affective State Questionnaire (SASQ: Mandigo et al., 2008). In order to determine the common factors present in the data, a principal component analysis was conducted. The analysis resulted in an appropriate three-factor solution, with 14 items loading onto the three factors identified as autonomy, competence and intrinsic motivation. Initially, a multivariate analysis of covariance (MANCOVA) was conducted and found no significant differences or effects (p > 0.05). To further assess the differences between groups, six analyses of covariance (ANCOVAs) were conducted, which also found no significant differences (p > 0.025). These findings suggest that the CKC program is able to maintain the self-determined motivational experiences of its participants, and does not thwart need satisfaction or self-determined motivation through its programming. However, the literature suggests that the CKC program and other PA interventions could be further improved by fostering participants' self-determined motivational experiences, which can lead to the persistence of healthy PA behaviours (Kilpatrick, Hebert & Jacobsen, 2002).

Relevance:

30.00%

Publisher:

Abstract:

Genetic programming is known to provide good solutions for many problems, such as the evolution of network protocols and distributed algorithms. In such cases it is most likely a hardwired module of a design framework that assists the engineer in optimizing specific aspects of the system to be developed, providing its results in a fixed format through an internal interface. In this paper we show how the utility of genetic programming can be increased remarkably by isolating it as a component and integrating it into the model-driven software development process. Our genetic programming framework produces XMI-encoded UML models that can easily be loaded into widely available modeling tools, which in turn possess code generation as well as additional analysis and test capabilities. We use the evolution of a distributed election algorithm as an example to illustrate how genetic programming can be combined with model-driven development. This example clearly illustrates the advantages of our approach: the generation of source code in different programming languages.
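For illustration only, the sketch below runs a simplified evolutionary loop, a plain genetic algorithm on bit strings rather than true genetic programming on program trees, behind a small component-style interface; in the paper's setting the result would be exported as an XMI-encoded UML model rather than printed.

```python
import random

TARGET = [1, 0, 1, 1, 0, 1, 0, 0]          # stand-in for a desired protocol behaviour

def fitness(candidate):
    """Count matching positions against the target behaviour."""
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(candidate, rate=0.1):
    """Flip each gene with a small probability."""
    return [1 - g if random.random() < rate else g for g in candidate]

def evolve(pop_size=30, generations=60):
    """Evolutionary search isolated behind one call; the caller only sees the result."""
    population = [[random.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]
        population = parents + [mutate(random.choice(parents)) for _ in parents]
    return max(population, key=fitness)

random.seed(1)
best = evolve()
print("best individual", best, "fitness", fitness(best), "of", len(TARGET))
```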

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

A model is developed to represent the activity of a farm using linear programming. The model has two main components: the soil fertility balance and livestock nutrition. According to the first, the farm is assumed to have a total nitrogen requirement, which can be met either from internal sources (manure) or from external sources (fertilisers). The second component describes the animal husbandry as having a nutritional requirement which must be satisfied through the internal production of arable crops or the purchase of feed from the market. The farmer is assumed to maximise the total net income from the agricultural and zootechnical activities by choosing one rotation among those available for the local climate and slope. The perspective of the analysis is short term: the structure of the farm is assumed to be fixed, with no possibility of changing the allocation of permanent crops or the size of the animal husbandry. The model is integrated with an environmental module that describes the role of the farm within the carbon-nitrogen cycle. On the one hand, the farm stores carbon through the photosynthesis of the plants and the accumulation of carbon in the soil; on the other, some farm activities emit greenhouse gases into the atmosphere. The model is tested on some representative farms of the Emilia-Romagna region, proving capable of producing different results for conventional and organic farming and providing first results concerning their different atmospheric impact. Relevant data about the representative farms and the feasible rotations are extracted from the FADN database, complemented with coefficients from the literature.
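A toy version of such a model is sketched below using SciPy's linprog; all coefficients are invented for illustration and are not taken from the FADN data used in the study. The three constraints mirror the land limit, the nitrogen balance (crop demand minus purchased fertiliser must not exceed the manure supply) and the feed requirement.

```python
from scipy.optimize import linprog

# Decision variables: [wheat_ha, fodder_ha, feed_bought_t, fertiliser_N_kg]
c = [-700.0, 0.0, 180.0, 1.0]          # minimise the negative of net income

A_ub = [
    [1.0,   1.0,  0.0,  0.0],          # land:   wheat + fodder area        <= 50 ha
    [120.0, 80.0, 0.0, -1.0],          # N:      crop N need - bought N     <= manure N (2000 kg)
    [0.0,  -8.0, -1.0,  0.0],          # feed:   fodder yield + bought feed >= herd need (200 t)
]
b_ub = [50.0, 2000.0, -200.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4)
wheat, fodder, feed, fert = res.x
print(f"wheat {wheat:.1f} ha, fodder {fodder:.1f} ha, "
      f"bought feed {feed:.1f} t, fertiliser {fert:.0f} kg N, "
      f"net income {-res.fun:.0f}")
```

With these invented numbers the optimum grows fodder instead of buying feed, purchasing only the nitrogen the manure cannot cover; a real run would replace the coefficients with farm-specific FADN data and one row per feasible rotation.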

Relevance:

30.00%

Publisher:

Abstract:

The United States' Federal and State laws differentiate between acceptable (legal) and unacceptable (illegal) behavior by prescribing restrictive punishment to citizens and/or groups that violate these established rules. These regulations are written to treat every person equally and to fairly serve justice; furthermore, the sanctions placed on offenders seek to reform illegal behavior through limitations on freedoms and rehabilitative programs. Despite the effort to treat all offenders fairly regardless of social identity categories (e.g., sex, race, ethnicity, socioeconomic status, age, ability, and gender and sexual orientation) and to humanely eliminate illegal behavior, the American penal system perpetuates de facto discrimination against a multitude of peoples. Furthermore, soaring recidivism rates caused by unsuccessful re-entry of incarcerated offenders put economic stress on Federal and State budgets. For these reasons, offenders, policy-makers, and law-abiding citizens should all have a vested interest in reforming the prison system. This thesis focuses on the failure of the United States corrections system to adequately address the gender-specific needs of non-violent female offenders. Several factors contribute to the gender-specific discrimination that women experience in the criminal justice system: 1) Trends in female criminality that skew women's crime towards drug-related crimes, prostitution, and property offenses; 2) Mandatory minimum sentences for drug crimes that are disproportionate to the crime committed; 3) So-called "gender-neutral" educational, vocational, substance abuse, and mental health programming that intends to rehabilitate men and women equally, but in fact favors men; and 4) The isolating nature of prison structures that inhibits smooth re-entry into society. I argue that a shift in the placement and treatment of non-violent female offenders is necessary for effective rehabilitation and for reducing recidivism rates. The first component of this shift is the design and implementation of gender-responsive treatment (GRT) rather than gender-neutral approaches in rehabilitative programming. The second shift is the utilization of alternatives to incarceration, which provide both more humane treatment of offenders and smoother reintegration into society. Drawing on recent scholarship, information from prison advocacy organizations, and research with men in an alternative program, I provide a critical analysis of current policies and alternative programs, and suggest several proposals for future gender-responsive programs in prisons and in place of incarceration. I argue that the expansion of gender-responsive programming and alternatives to incarceration respond to the marginalization of female offenders, address concerns about the financial sustainability of the United States criminal justice system, and tackle high recidivism rates.

Relevance:

30.00%

Publisher:

Abstract:

Community-based participatory research necessitates that community members act as partners in decision making and mutual learning and discovery. In the same light, for programs and issues involving youth, youth should be partners in knowledge sharing and evaluation (Checkoway & Richards-Schuster, 2004). This study is a youth-focused empowerment evaluation of the Successful Youth program. Successful Youth is a multi-component youth development after-school program for Latino middle school youth, created with the goal of reducing teen pregnancy. An empowerment evaluation is collaborative and participatory (Balcazar & Harper, 2003). The three steps of an empowerment evaluation are: (1) defining the mission, (2) taking stock, and (3) planning for the future (Fetterman, 2001). In a program where youth are developing leadership skills, making choices, and learning how to self-reflect and evaluate, the empowerment evaluation could not be more aligned with promoting and enhancing these skills. In addition, an empowerment evaluation is designed to "foster improvement and self-determination" and "build capacity" (Fetterman, 2001). Four empowerment groups were conducted with approximately 6-9 Latino 7th grade students per group. All participants were enrolled in the Successful Youth program. Results indicate points where students' perceptions of the program were aligned with the program's mission and where gaps were identified. Students offered recommendations for program improvements. Additionally, students enjoyed expressing their feelings about the program and appreciated that their opinions were valued. Youth recommendations will be brought to program staff and, where possible, gaps will be addressed. Empowerment evaluations with youth will continue for the duration of the program so that youth involvement and input remain integral to the evaluation and to ascertain whether the program's goals are being met.

Relevance:

30.00%

Publisher:

Abstract:

Testing is nowadays the most widely used technique to validate software and assess its quality. It is integrated into all practical software development methodologies and plays a crucial role in the success of any software project. From the smallest units of code to the most complex components, their integration into a software system and later deployment, all pieces of a software product must be tested thoroughly before the product can be released. The main limitation of software testing is that it remains a mostly manual task, representing a large fraction of the total development cost. In this scenario, test automation is paramount to alleviate such high costs. Test case generation (TCG) is the process of automatically generating test inputs that achieve high coverage of the system under test. Among a wide variety of approaches to TCG, this thesis focuses on structural (white-box) TCG, where one of the most successful enabling techniques is symbolic execution. In symbolic execution, the program under test is executed with its input arguments being symbolic expressions rather than concrete values. This thesis relies on a previously developed constraint-based TCG framework for imperative object-oriented programs (e.g., Java), in which the imperative program under test is first translated into an equivalent constraint logic program, and then the translated program is symbolically executed by relying on the standard evaluation mechanisms of Constraint Logic Programming (CLP), extended with special treatment for dynamically allocated data structures. Improving the scalability and efficiency of symbolic execution constitutes a major challenge. It is well known that symbolic execution quickly becomes impractical due to the large number of paths that must be explored and the size of the constraints that must be handled. Moreover, symbolic-execution-based TCG tends to produce an unnecessarily large number of test cases when applied to medium or large programs. The contributions of this dissertation can be summarized as follows. (1) A compositional approach to CLP-based TCG is developed which overcomes the inter-procedural path explosion by separately analyzing each component (method) of a program under test, storing the results as method summaries and incrementally reusing them to obtain whole-program results. A similar compositional strategy that relies on program specialization (partial evaluation) is also developed for the state-of-the-art symbolic execution tool Symbolic PathFinder (SPF). (2) Resource-driven TCG is proposed as a methodology to use resource consumption information to drive symbolic execution towards those parts of the program under test that comply with a user-provided resource policy, avoiding the exploration of those parts of the program that violate such a policy. (3) A generic methodology is proposed to guide symbolic execution towards the most interesting parts of a program, using abstractions as oracles to steer symbolic execution through those parts of the program under test that interest the programmer or tester most. (4) A new heap-constraint solver is proposed, which efficiently handles heap-related constraints and aliasing of references during symbolic execution and greatly outperforms the state-of-the-art standard technique known as lazy initialization. (5) All the techniques above have been implemented in the PET system (and some of them in the SPF tool). Experimental evaluation has confirmed that they considerably improve the scalability and efficiency of symbolic execution and TCG.
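As a minimal, self-contained illustration of white-box TCG by symbolic execution (a toy example using the z3-solver Python package, not the CLP-based engine developed in the thesis), the sketch below enumerates the path conditions of a two-branch function and asks a constraint solver for a concrete test input per feasible path.

```python
from z3 import Int, Solver, Not, sat

x, y = Int("x"), Int("y")

# Program under test, written out here as its two paths:
#   def classify(x, y):
#       if x > y:  return x - y
#       else:      return y - x
paths = [
    ("then-branch", x > y),
    ("else-branch", Not(x > y)),
]

for name, path_condition in paths:
    solver = Solver()
    solver.add(path_condition)
    if solver.check() == sat:                    # path feasible: derive a test input
        model = solver.model()
        print(f"{name}: x = {model.eval(x, model_completion=True)}, "
              f"y = {model.eval(y, model_completion=True)}")
```

The thesis's contributions address what this toy cannot: the number of paths and the size of the constraints explode on realistic programs, which is where compositional summaries, resource- and abstraction-guided exploration, and the heap-constraint solver come in.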

Relevance:

30.00%

Publisher:

Abstract:

The optimization of chemical processes where the flowsheet topology is not kept fixed is a challenging discrete-continuous optimization problem. Usually, this task has been performed using equation-based models. This approach presents several problems, such as tedious and complicated estimation of component properties or the handling of huge problems (with thousands of equations and variables). We propose a generalized disjunctive programming (GDP) approach as an alternative to MINLP models, coupled with a flowsheeting program. The novelty of this approach lies in using a commercial modular process simulator in which the superstructure is drawn directly on the graphical user interface of the simulator. This methodology takes advantage of modular process simulators (specially tailored numerical methods, reliability, and robustness) and of the flexibility of the GDP formulation for modeling and solution. The proposed optimization tool is successfully applied to the synthesis of a methanol plant where different alternatives are available for the streams, equipment and process conditions.
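The sketch below shows what a tiny GDP formulation can look like in Pyomo, purely as an illustration: the disjunction chooses between two hypothetical reactor alternatives, and the model is solved stand-alone rather than coupled to a modular process simulator as in the paper (unit data and solver choice are assumptions).

```python
from pyomo.environ import (ConcreteModel, Var, Constraint, Objective,
                           SolverFactory, TransformationFactory, minimize)
from pyomo.gdp import Disjunct, Disjunction

m = ConcreteModel()
m.flow = Var(bounds=(0, 100))        # stream processed by whichever unit is chosen
m.cost = Var(bounds=(0, 500))

# Two mutually exclusive flowsheet alternatives for the same task.
m.reactor_a = Disjunct()
m.reactor_a.cost_eq = Constraint(expr=m.cost == 2.0 * m.flow + 50)   # low capital, high variable cost
m.reactor_b = Disjunct()
m.reactor_b.cost_eq = Constraint(expr=m.cost == 0.5 * m.flow + 120)  # high capital, low variable cost
m.choose_reactor = Disjunction(expr=[m.reactor_a, m.reactor_b])

m.demand = Constraint(expr=m.flow >= 60)          # process requirement
m.obj = Objective(expr=m.cost, sense=minimize)

# Reformulate the disjunction (big-M) and solve the resulting MILP with any
# installed solver (GLPK is assumed here).
TransformationFactory("gdp.bigm").apply_to(m)
SolverFactory("glpk").solve(m)
chosen = "reactor B" if m.reactor_b.indicator_var.value else "reactor A"
print("selected", chosen, "with cost", m.cost.value)
```

In the paper's setting, the cost equations inside each disjunct would instead be evaluated by the modular simulator's unit models, with the GDP layer deciding which alternatives of the superstructure are active.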