954 results for implementation issues


Relevance:

60.00%

Abstract:

This report addresses speculative parallelism (the assignment of spare processing resources to tasks which are not known to be strictly required for the successful completion of a computation) at the user and application level. At this level, the execution of a program is seen as a (dynamic) tree, or more generally a graph. A solution for a problem is a traversal of this graph from the initial state to a node known to be the answer. Speculative parallelism then represents the assignment of resources to multiple branches of this graph even if they are not positively known to be on the path to a solution. In highly non-deterministic programs the branching factor can be very high, and a naive assignment will very soon use up all the resources. This report presents work assignment strategies other than the usual depth-first and breadth-first. Instead, best-first strategies are used. Since their definition is application-dependent, the application language contains primitives that allow the user (or application programmer) to a) indicate when intelligent OR-parallelism should be used, b) provide the functions that define "best," and c) indicate when to use them. An abstract architecture enables those primitives to perform the search in a "speculative" way, using several processors, synchronizing them, killing the siblings of the path leading to the answer, etc. The user is freed from worrying about these interactions. Several search strategies are proposed and their implementation issues are addressed. "Armageddon," a global pruning method, is introduced, together with both a software and a hardware implementation for it. The concepts presented are applicable to areas of Artificial Intelligence such as extensive expert systems, planning, game playing, and, in general, to large search problems. The proposed strategies, although showing promise, have not been evaluated by simulation or experimentation.
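
The report's actual language primitives are not given in this abstract; the following is only a minimal sketch, in Python and with hypothetical names, of the core idea: a best-first search whose notion of "best" is supplied by the application programmer.

```python
import heapq

def best_first_search(start, expand, is_goal, score):
    """Explore OR-branches in order of a user-defined 'best' score,
    rather than depth-first or breadth-first.

    expand(node)  -> iterable of child nodes (the OR-branches)
    is_goal(node) -> True when the node is an answer
    score(node)   -> lower is better (the user-supplied "best" function)
    Nodes must be hashable.
    """
    counter = 0                          # tie-breaker so nodes never compare
    frontier = [(score(start), counter, start)]
    visited = set()
    while frontier:
        _, _, node = heapq.heappop(frontier)
        if node in visited:
            continue
        visited.add(node)
        if is_goal(node):
            # In the report's architecture, this is the point where the
            # siblings of the successful path would be killed (pruned).
            return node
        for child in expand(node):
            counter += 1
            heapq.heappush(frontier, (score(child), counter, child))
    return None
```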

Relevance:

60.00%

Abstract:

Partitioning is a common approach to developing mixed-criticality systems, where partitions are isolated from each other in both the temporal and the spatial domain in order to prevent low-criticality subsystems from compromising other subsystems with a high level of criticality in case of misbehaviour. The advent of many-core processors, on the other hand, opens the way to highly parallel systems in which all partitions can be allocated to dedicated processor cores. This trend will simplify processor scheduling, although other issues such as mutual interference in the temporal domain may arise as a consequence of memory and device sharing. The paper describes an architecture for multi-core partitioned systems including critical subsystems built with the Ada Ravenscar profile. Some implementation issues are discussed, and experience in implementing the ORK kernel on the XtratuM partitioning hypervisor is presented.

Relevance:

60.00%

Abstract:

This document is the result of a web development process to create a tool that will allow Cracow University of Technology to consult, create and manage timetables. The technologies chosen for this purpose are the Apache Tomcat server, MySQL Community Server, the JDBC driver, and Java Servlets and JSPs on the server side. The client side relies on JavaScript, jQuery, AJAX and CSS to provide the dynamic behaviour. The document justifies the choice of these technologies and explains the development tools that helped in the integration and development of all these elements: specifically, the NetBeans IDE and MySQL Workbench have been used. After explaining all the elements involved in the development of the web application, the architecture and the code developed are explained through UML diagrams. Some implementation details related to security are explained in more depth through sequence diagrams. As the source code of the application is provided, an installation manual has been written for running the project. In addition, as the platform is intended to be a beta that will grow, some unimplemented ideas for future development are also presented. Finally, some annexes with important files and scripts related to the initialization of the platform are attached. This project started from an existing tool that needed to be expanded. The main purpose of the project throughout its development has been to set the roots for a whole new platform that will replace the existing one. To this end, a deep inspection of existing web technologies was needed: a web server and an SQL database had to be chosen. Although there were many alternatives, Java technology was finally selected for the server because of the big community behind it, the ease of modelling the language through UML diagrams, and the fact that it is free of licensing costs. Apache Tomcat is the open-source server that can use Java Servlet and JSP technology. As for the SQL database, MySQL Community Server is the most popular open-source SQL server, with a big community behind it and quite a lot of tools for managing the server. JDBC is the driver needed to connect Java and MySQL. Once we chose the technologies that would be part of the platform, the development process started. After a detailed explanation of the development environment installation, we used UML use case diagrams to set the main tasks of the platform; UML class diagrams served to establish the relations between the classes created; the architecture of the platform was represented through UML deployment diagrams; and enhanced entity–relationship (EER) models were used to define the tables of the database and their relationships. Apart from these diagrams, some implementation issues were explained for a better understanding of the developed code; UML sequence diagrams helped to explain them. Once the whole platform was properly defined and developed, the performance of the application was demonstrated: it was shown that, in the current state of the code, the platform covers the use cases that were set as the main target. Nevertheless, some requirements needed for the proper working of the platform have been specified. As the project is intended to grow, some ideas that could not be added to this beta have been described so that they are not lost for future development.
Finally, some annexes containing important configuration details for the platform have been added after proper explanation, as well as an installation guide that will let a new developer get the project ready. In addition to this document, some other files related to the project are provided:
- Javadoc. The Javadoc containing the information on every Java class created, necessary for a better understanding of the source code.
- database_model.mwb. This file contains the model of the database for MySQL Workbench. Among other things, this model allows the MySQL script for the creation of the tables to be generated.
- ScheduleManager.war. The WAR file that allows the developed application to be loaded into the Tomcat server without using NetBeans.
- ScheduleManager.zip. The source code exported from the NetBeans project, containing all the Java packages, JSPs, JavaScript files and CSS files that are part of the platform.
- config.properties. The configuration file giving the names and credentials used to access the database, explained in Annex II (Example of config.properties file).
- db_init_script.sql. The SQL script that initializes the database, explained in Annex III (SQL statements for MySQL initialization).

Relevance:

60.00%

Abstract:

From the start of 2016, new rules for bank resolution have been in place across the EU, as spelled out in the Bank Recovery and Resolution Directive (BRRD), and a new authority (the Single Resolution Board, or SRB) is fully operational for resolving all banks in the eurozone. The implementation issues of the new regime are enormous. Banks need to develop recovery plans, and authorities need to create resolution plans as well as set the minimum required amount of own funds and eligible liabilities (MREL) for each bank. Given the diversity in bank structures and instruments at the EU and global levels, this will be a formidable challenge, above all with respect to internationally active banks. In order to explore ways in which the authorities and banks can meet this challenge, CEPS formed a Task Force composed of senior experts on banking sector reform and chaired by Thomas Huertas, Partner and Chair, EY Global Regulatory Network. This report contains its policy recommendations.

Relevance:

60.00%

Abstract:

The notion of compensation is widely used in advanced transaction models as a means of recovery from failure. Similar concepts are adopted to provide transaction-like behaviour for long-running business processes supported by workflow technology. In general, it is not trivial to design compensating tasks for tasks in the context of a workflow. In fact, a task in a workflow process need not be compensatable, in the sense that the ability to force reverse operations of the task is not always guaranteed by the application semantics. In addition, isolation requirements on data resources may make a task difficult to compensate. In this paper, we first look into the requirements that a compensating task has to satisfy. Then we introduce a new concept called confirmation. With the help of confirmation, we are able to modify most non-compensatable tasks so that they become compensatable. This can substantially increase the availability of shared resources and greatly improve backward recovery for workflow applications in case of failures. To effectively incorporate confirmation and compensation into a workflow management environment, a three-level bottom-up workflow design method is introduced. The implementation issues of this design are also discussed. (C) 2003 Elsevier Science Inc. All rights reserved.
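
As an illustration only (not code from the paper), the Python sketch below shows how a confirmation step can keep otherwise non-compensatable work compensatable: effects stay tentative, and therefore reversible, until an explicit confirm, and backward recovery compensates tentative steps in reverse order. All names are hypothetical.

```python
class Step:
    """A workflow task extended with confirmation and compensation.

    do()         runs the task tentatively (its resources stay reserved)
    confirm()    makes the effect permanent and releases the resources
    compensate() undoes a tentative effect; this is only guaranteed to
                 succeed as long as confirm() has not yet been called
    """
    def __init__(self, name, do, confirm, compensate):
        self.name = name
        self.do, self.confirm, self.compensate = do, confirm, compensate

def run_workflow(steps):
    done = []
    try:
        for step in steps:
            step.do()
            done.append(step)
    except Exception:
        # Backward recovery: compensate tentative steps in reverse order.
        for step in reversed(done):
            step.compensate()
        raise
    # Forward completion: only now are all effects made permanent, so
    # every step remained compensatable for the whole run.
    for step in done:
        step.confirm()
```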

Relevance:

60.00%

Abstract:

This paper gives a review of recent progress in the design of numerical methods for computing the trajectories (sample paths) of solutions to stochastic differential equations. We give a brief survey of the area, concentrating on application areas where approximations to strong solutions are important, with a particular focus on computational biology, and give the analytical tools necessary for understanding some of the important concepts associated with stochastic processes. We present the stochastic Taylor series expansion as the fundamental mechanism for constructing effective numerical methods, give general results that relate local and global order of convergence, and mention the Magnus expansion as a mechanism for designing methods that preserve the underlying structure of the problem. We also present various classes of explicit and implicit methods for strong solutions, based on the underlying structure of the problem. Finally, we discuss implementation issues relating to maintaining the Brownian path, efficient simulation of stochastic integrals, and variable-step-size implementations based on various types of control.
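
For concreteness, here is a minimal Python sketch (not taken from the paper) of the simplest strong scheme in this family, the Euler-Maruyama method; it also returns the Brownian increments, hinting at the path-maintenance issue the paper discusses.

```python
import numpy as np

def euler_maruyama(f, g, x0, t_end, n_steps, rng=None):
    """Euler-Maruyama scheme (strong order 0.5) for dX = f(X,t) dt + g(X,t) dW.

    The Brownian increments dW are returned so the same sample path can be
    reused, e.g. for strong-convergence checks with a refined step size.
    """
    rng = np.random.default_rng() if rng is None else rng
    dt = t_end / n_steps
    t = np.linspace(0.0, t_end, n_steps + 1)
    dW = rng.normal(0.0, np.sqrt(dt), n_steps)  # increments of one Brownian path
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        x[k + 1] = x[k] + f(x[k], t[k]) * dt + g(x[k], t[k]) * dW[k]
    return t, x, dW

# Example: geometric Brownian motion, dX = 0.05 X dt + 0.2 X dW, X(0) = 1
t, x, dW = euler_maruyama(lambda x, t: 0.05 * x, lambda x, t: 0.2 * x,
                          x0=1.0, t_end=1.0, n_steps=1000)
```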

Relevance:

60.00%

Abstract:

Mitarai [Phys. Fluids 17, 047101 (2005)] compared turbulent combustion models against homogeneous direct numerical simulations with extinction/reignition phenomena. The recently suggested multiple mapping conditioning (MMC) approach was not considered there and is simulated here for the same case, with favorable results. Implementation issues crucial for successful MMC simulations are also discussed.

Relevance:

60.00%

Abstract:

Collaborative working with the aid of computers is increasing rapidly due to the widespread use of computer networks, the geographic mobility of people, and small powerful personal computers. For the past ten years, research has been conducted into this use of computing technology from a wide variety of perspectives and for a wide range of uses. This thesis adds to that previous work by examining the area of collaborative writing amongst groups of people. The research brings together a number of disciplines: sociology for examining group dynamics, psychology for understanding individual writing and learning processes, and computer science for database, networking, and programming theory. The project initially looks at groups and how they form, communicate, and work together, progressing on to writing and the cognitive processes it entails for both composition and retrieval. The thesis then details a set of issues which need to be addressed in a collaborative writing system, and goes on to develop a model for collaborative writing, detailing an iterative process of co-ordination, writing and annotation, consolidation, and negotiation, based on a structured but extensible document model. Implementation issues for a collaborative application are then described, along with various methods of overcoming them. Finally, the design and implementation of a collaborative writing system, named Collaborwriter, is described in detail, concluding with some preliminary results from initial user trials and testing.

Relevance:

60.00%

Abstract:

The main purpose of the study is to develop an integrated framework for managing project risks by analyzing risk across the project, work package and activity levels, and developing responses. Design/methodology/approach: The study first reviews the literature on contemporary risk management frameworks in order to identify gaps in project risk management knowledge. It then develops a conceptual risk management framework that combines the analytic hierarchy process (AHP) with a risk map for managing project risks. The proposed framework has been applied to a 1500 km oil pipeline construction project in India in order to demonstrate its effectiveness. The project stakeholders concerned were involved through focus group discussions when applying the proposed risk management framework to the project under study. Findings: The combined AHP and risk map approach is very effective for managing project risks across the project, work package and activity levels. Risk factors at the project level are caused by external forces such as the business environment (e.g. customers, competitors, technological development, politics, socioeconomic environment). Risk factors at the work package and activity levels are operational in nature and arise from internal causes such as poor material and labor productivity, implementation issues, team ineffectiveness, etc. Practical implications: The suggested model can be applied to any complex project and helps manage risk throughout the project life cycle. Originality/value: Both business and operational risks constitute project risks. On the one hand, conventional project risk management frameworks emphasize managing business risks and often ignore operational risks. On the other hand, studies that deal with operational risk often do not link it to business risks. However, the two need to be addressed in an integrated way, as few risks affect only a single level. Hence, this study bridges these gaps. © 2010 Elsevier B.V. All rights reserved.
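
As an illustration of the AHP side of the framework only (the paper's actual comparison matrices are not reproduced here), the following Python sketch derives priority weights from a pairwise comparison matrix via the principal eigenvector and reports Saaty's consistency ratio.

```python
import numpy as np

def ahp_weights(A):
    """Priority weights from a pairwise comparison matrix (principal
    eigenvector method), plus Saaty's consistency ratio."""
    A = np.asarray(A, dtype=float)
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)               # principal eigenvalue lambda_max
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                           # normalised priorities
    n = A.shape[0]
    ci = (vals[k].real - n) / (n - 1)      # consistency index
    # Saaty's random indices (truncated table; 1.41 used beyond n = 7)
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}.get(n, 1.41)
    return w, ci / ri                      # weights, consistency ratio

# Example: three hypothetical risk factors compared on Saaty's 1-9 scale
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w, cr = ahp_weights(A)   # cr < 0.1 is conventionally acceptable
```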

Relevance:

60.00%

Abstract:

In recent decades, a number of sustainable strategies and policies have been created to protect and preserve our water environments from the impacts of growing communities. The Australian approach, Water Sensitive Urban Design (WSUD), defined as the integration of urban planning and design with urban water cycle management, has made considerable advances in design guidelines since 2000. WSUD stormwater management systems (e.g. wetlands, bioretention systems, porous pavements, etc.), also known as Best Management Practices (BMPs) or Low Impact Development (LID), are slowly gaining popularity across Australia, the USA and Europe. There have also been significant improvements in modelling the performance of WSUD technologies (e.g. the MUSIC software). However, the implementation issues of these WSUD practices relate mainly to ongoing institutional capacity. Some of the key problems are associated with limited awareness among urban planners and designers; in general, they have very little knowledge of these systems and their benefits to urban environments. At the same time, hydrological engineers should have a better understanding of building codes and master plans. Land use regulations are just as important as the physical site conditions in determining opportunities and constraints for implementing WSUD techniques. Procedures are needed that better link urban planners with WSUD engineering practice. Thus, this paper presents the development of a general framework for incorporating WSUD technologies into the site planning process. The study was applied at the lot scale in the Melbourne region, Australia. Results show the potential space available for fitting WSUD elements, according to building requirements and different types of housing densities. © 2011 WIT Press.

Relevance:

60.00%

Abstract:

A novel modeling approach is applied to karst hydrology. Long-standing problems in karst hydrology and solute transport are addressed using Lattice Boltzmann methods (LBMs). These methods contrast with other modeling approaches that have been applied to karst hydrology. The motivation of this dissertation is to develop new computational models for solving ground water hydraulics and transport problems in karst aquifers, which are widespread around the globe. This research tests the viability of the LBM as a robust alternative numerical technique for solving large-scale hydrological problems. The LB models applied in this research are briefly reviewed and there is a discussion of implementation issues. The dissertation focuses on testing the LB models. The LBM is tested for two different types of inlet boundary conditions for solute transport in finite and effectively semi-infinite domains. The LBM solutions are verified against analytical solutions. Zero-diffusion transport and Taylor dispersion in slits are also simulated and compared against analytical solutions. These results demonstrate the LBM's flexibility as a solute transport solver. The LBM is applied to simulate solute transport and fluid flow in porous media traversed by larger conduits. A LBM-based macroscopic flow solver (Darcy's law-based) is linked with an anisotropic dispersion solver. Spatial breakthrough curves in one and two dimensions are fitted against the available analytical solutions. This provides a steady flow model with capabilities routinely found in ground water flow and transport models (e.g., the combination of MODFLOW and MT3D). However, the new LBM-based model retains the ability to solve inertial flows that are characteristic of karst aquifer conduits. Transient flows in a confined aquifer are solved using two different LBM approaches. The analogy between Fick's second law (diffusion equation) and the transient ground water flow equation is used to solve the transient head distribution. An altered-velocity flow solver with source/sink term is applied to simulate a drawdown curve. Hydraulic parameters like transmissivity and storage coefficient are linked with LB parameters. These capabilities complete the LBM's effective treatment of the types of processes that are simulated by standard ground water models. The LB model is verified against field data for drawdown in a confined aquifer.
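
By way of illustration (this is not the dissertation's code), a minimal Python sketch of the LBM collide-and-stream cycle for one-dimensional diffusion on a D1Q3 lattice shows the basic machinery that solute transport solvers of this kind build on.

```python
import numpy as np

# Minimal D1Q3 lattice Boltzmann solver for 1-D diffusion dC/dt = D d2C/dx2.
# In lattice units (dx = dt = 1): D = cs2 * (tau - 0.5), with cs2 = 1/3 here.
n, tau, steps = 200, 1.0, 400
w = np.array([2/3, 1/6, 1/6])         # weights for velocities 0, +1, -1
D = (tau - 0.5) / 3.0

f = np.zeros((3, n))
f[:, n // 2] = w                      # unit point pulse at the centre

for _ in range(steps):
    C = f.sum(axis=0)                 # macroscopic concentration
    feq = w[:, None] * C              # local equilibrium distribution
    f += -(f - feq) / tau             # BGK collision (single relaxation time)
    f[1] = np.roll(f[1], 1)           # stream the +1 population right
    f[2] = np.roll(f[2], -1)          # stream the -1 population left (periodic)

# The result should approach the analytical Gaussian for a point pulse:
x = np.arange(n) - n // 2
exact = np.exp(-x**2 / (4 * D * steps)) / np.sqrt(4 * np.pi * D * steps)
```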

Relevance:

60.00%

Abstract:

Biodiversity offsets are increasingly advocated as a flexible approach to managing the ecological costs of economic development. Arguably, however, this remains an area where policy-making has run ahead of science. A growing number of studies identify limitations of offsets in achieving ecologically sustainable outcomes, pointing to ethical and implementation issues that may undermine their effectiveness. We develop a novel system dynamics modelling framework to analyze the no-net-loss objective of development and biodiversity offsets. The modelling framework considers a marine-based example, where resource abundance depends on a habitat that is affected by a sequence of development projects, and biodiversity offsets are understood as habitat restoration actions. The model is used to explore the implications for a regulator of four alternative offset management strategies, which differ in how net loss is measured and in whether and how the cumulative impacts of development are considered. Our results confirm that, when it comes to offsets as a conservation tool, the devil lies in the details. Approaches to determining the magnitude of offsets required, as well as their timing and allocation among multiple developers, can result in potentially complex and undesired sets of economic incentives, with direct impacts on the ability to meet the overall objective of ecologically sustainable development. The approach and insights are of direct interest to conservation policy design in a broad range of marine and coastal contexts.
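
Purely as an illustration of the kind of stock-and-flow reasoning involved (not the authors' model), the Python sketch below simulates a habitat stock hit by development projects whose offsets are delivered after a lag; both runs end with the same ledger, but the interim losses differ.

```python
# Illustrative stock-and-flow sketch: a habitat stock is reduced by a sequence
# of development projects, and each project owes an offset (restoration) that
# starts after a delivery lag and matures gradually. The same "no net loss"
# outcome on paper can hide very different interim losses.
def simulate(offset_lag, years=60, impacts={5: 10.0, 20: 10.0, 35: 10.0},
             restore_per_year=1.0):
    habitat, owed, path = 100.0, [], []
    for t in range(years):
        if t in impacts:
            habitat -= impacts[t]
            owed.append([t + offset_lag, impacts[t]])  # [start year, remaining debt]
        for debt in owed:
            if t >= debt[0] and debt[1] > 0:
                r = min(restore_per_year, debt[1])      # restoration matures slowly
                habitat += r
                debt[1] -= r
        path.append(habitat)
    return path

prompt_offsets = simulate(offset_lag=0)
lagged_offsets = simulate(offset_lag=10)   # same final ledger, larger interim loss
```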

Relevance:

60.00%

Abstract:

The high population density and tightly packed nature of some city centres make emergency planning for these urban spaces especially important, given the potential for human loss in case of disaster. Historic and recent events have made emergency service planners particularly conscious of the need for preparing evacuation plans in advance. This paper discusses a methodological approach for assisting decision-makers in designing urban evacuation plans. The approach aims at quickly and safely moving the population away from the danger zone into shelters. The plans include determining the number and location of rescue facilities, as well as the paths that people should take from their building to their assigned shelter in case of an occurrence requiring evacuation. The approach is thus of the location–allocation–routing type, through the existing streets network, and takes into account the trade-offs among different aspects of evacuation actions that inevitably come up during the planning stage. All the steps of the procedure are discussed and systematised, along with computational and practical implementation issues, in the context of a case study – the design of evacuation plans for the historical centre of an old European city.
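
As a toy illustration of the allocation step only (not the paper's method), the Python sketch below assigns buildings to their nearest shelter with remaining capacity over a street network, using networkx shortest paths. It assumes an undirected graph with an edge attribute 'length', and it ignores congestion and flow effects that a real plan must consider.

```python
import networkx as nx

def assign_to_shelters(G, buildings, shelters, capacity):
    """Greedy location-allocation sketch: walk (building, shelter) pairs in
    order of increasing street-network distance and assign each building to
    the nearest shelter that still has capacity. Returns routes as node paths.
    """
    pairs = []
    for s in shelters:
        dist, path = nx.single_source_dijkstra(G, s, weight="length")
        for b in buildings:
            if b in dist:
                pairs.append((dist[b], b, s, path[b]))
    pairs.sort()                                     # nearest pairs first
    assignment, left = {}, dict(capacity)
    for d, b, s, path in pairs:
        if b not in assignment and left[s] > 0:
            assignment[b] = (s, list(reversed(path)))  # building -> shelter route
            left[s] -= 1
    return assignment
```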

Relevance:

60.00%

Abstract:

Today the relationship between humanitarian logistics and agri-food security is very close; these two concepts constitute an extremely important element in situations that arise around the world today, for example in different kinds of natural disasters such as earthquakes, tsunamis, droughts and famines, among others, or simply in areas of extreme poverty. Humanitarian logistics is a novel concept born of the need to supply adequate resources to areas that have been affected by natural disasters, or by public order situations such as armed conflicts. These resources must ensure that the affected population can sustain its basic needs over time, and likewise that it can lead a dignified and safe life, which implies, among other things, agri-food security. As different natural disaster scenarios arise, humanitarian aid is needed worldwide, and response time plays an extremely important role. Accordingly, this research presents an approach that brings together the concepts of humanitarian logistics and agri-food security, in order to contextualize their evolution and the studies being carried out today.