867 results for Web based applications
Abstract:
There has been an increasing interest in the use of agent-based simulation and some discussion of the relative merits of this approach as compared to discrete-event simulation. There are differing views on whether agent-based simulation offers capabilities that discrete-event simulation cannot provide, or whether all agent-based applications can, at least in theory, be undertaken using a discrete-event approach. This paper presents a simple agent-based NetLogo model and corresponding discrete-event versions implemented in the widely used ARENA software. The two discrete-event versions presented use both a traditional process-flow approach normally adopted in discrete-event simulation software and an agent-based approach to the model build. In addition, a real-time spatial visual display facility is provided using a spreadsheet platform controlled by VBA code embedded within the ARENA model. Initial findings from this investigation are that discrete-event simulation can indeed be used to implement agent-based models and, with suitable integration elements such as VBA, can provide the spatial displays associated with agent-based software.
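The abstract's core claim, that an agent-based model can be driven by a discrete-event engine, can be sketched with a minimal future-event list: each agent's next move is simply an event on a global priority queue. This is a hedged illustration only; the class name, grid rules, and timing are assumptions, not taken from the paper's NetLogo or ARENA models.

```python
import heapq
import random

class DESAgentModel:
    """Agent-based model driven by a discrete-event scheduler:
    each agent's next move is an event on a global future-event list."""

    def __init__(self, n_agents, width, height, seed=0):
        self.rng = random.Random(seed)
        self.clock = 0.0
        self.width, self.height = width, height
        self.pos = {}
        self.events = []  # future-event list: (event_time, agent_id)
        for aid in range(n_agents):
            self.pos[aid] = (self.rng.randrange(width), self.rng.randrange(height))
            heapq.heappush(self.events, (self.rng.random(), aid))

    def step(self):
        """Process one event: move that agent one random step, then reschedule it."""
        self.clock, aid = heapq.heappop(self.events)
        x, y = self.pos[aid]
        dx, dy = self.rng.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
        self.pos[aid] = ((x + dx) % self.width, (y + dy) % self.height)
        heapq.heappush(self.events, (self.clock + self.rng.random(), aid))

    def run_until(self, t_end):
        """Classic discrete-event main loop: pop events in time order until t_end."""
        while self.events[0][0] <= t_end:
            self.step()

model = DESAgentModel(n_agents=5, width=10, height=10)
model.run_until(100.0)
```

The agent positions held in `model.pos` are exactly the state a spatial display (such as the spreadsheet visualisation the paper describes) would poll at each tick.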
Abstract:
In human society, people encounter various deontic conflicts every day. Deontic decisions are those that include moral, ethical, and normative aspects. Here, the concern is with deontic conflicts: decisions where all the alternatives lead to the violation of some norms. People think critically about these kinds of decisions. But just what they think about is not always clear. People use certain estimating factors or criteria to balance the tradeoffs when they encounter deontic conflicts, and it is unclear what subjective factors people use to make a deontic decision. An elicitation approach called the Open Factor Conjoint System is proposed, which applies an online elicitation methodology combining two well-known research methodologies: repertory grid and conjoint analysis. This new methodology is extended to be a web-based application. It seeks to elicit additional relevant (subjective) factors from people, which affect deontic decisions. The relative importance and utility values are used for the development of a decision model to predict people's decisions. Fundamentally, this methodology was developed and intended to be applicable for a wide range of elicitation applications with minimal experimenter bias. Compared with the traditional method, this online survey method reduces the limitations of time and space in data collection, and the methodology can be applied in many fields. Two possible applications were addressed: robotic vehicles and the choice of medical treatment. In addition, this method can be applied to many research-related disciplines in cross-cultural research due to its online availability with global reach.
Abstract:
Software product line engineering brings advantages over traditional software development regarding the mass customization of system components. However, there are scenarios in which maintaining separate clones of a software system seems an easier and more flexible way to manage variabilities than a software product line. This dissertation qualitatively evaluates an approach that aims to support the reconciliation of functionalities between cloned systems. The analyzed approach is based on mining data about the issues and source code of evolved cloned web systems. The next step is to process the merge conflicts collected by the approach, and not indicated by traditional version control systems, in order to identify potential integration problems in the cloned software systems. The results of the study show the feasibility of the approach for a systematic characterization and analysis of merge conflicts in large-scale web-based systems.
Abstract:
Product quality planning is a fundamental part of quality assurance in manufacturing. It is composed of the distribution of quality aims over each phase in product development and the deployment of quality operations and resources to accomplish these aims. This paper proposes a quality planning methodology based on risk assessment, in which the planning tasks of product development are translated into the evaluation of risk priorities. Firstly, a comprehensive model for quality planning is developed to address the deficiencies of traditional quality function deployment (QFD) based quality planning. Secondly, a novel failure knowledge base (FKB) based method is discussed. Then a mathematical method and algorithm of risk assessment is presented for target decomposition, measure selection, and sequence optimization. Finally, the proposed methodology has been implemented in a web-based prototype software system, QQ-Planning, to solve the problem of quality planning regarding the distribution of quality targets and the deployment of quality resources, in such a way that the product requirements are satisfied and the enterprise resources are highly utilized. © Springer-Verlag Berlin Heidelberg 2010.
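The abstract does not give the paper's actual risk formulas, but the idea of ranking planning measures by risk priority can be illustrated with the classic FMEA risk priority number (severity × occurrence × detection); the measure names and scores below are invented for the example.

```python
def risk_priority(severity, occurrence, detection):
    """Classic FMEA risk priority number (RPN); each factor on a 1-10 scale."""
    for v in (severity, occurrence, detection):
        if not 1 <= v <= 10:
            raise ValueError("each factor must be on a 1-10 scale")
    return severity * occurrence * detection

def rank_measures(measures):
    """Order candidate quality measures by descending RPN, so the
    highest-risk items receive quality resources first."""
    return sorted(
        measures,
        key=lambda m: risk_priority(m["S"], m["O"], m["D"]),
        reverse=True,
    )

# Hypothetical candidate measures with severity/occurrence/detection scores.
measures = [
    {"name": "weld seam inspection", "S": 8, "O": 3, "D": 4},   # RPN 96
    {"name": "paint thickness check", "S": 4, "O": 6, "D": 2},  # RPN 48
    {"name": "torque audit", "S": 9, "O": 5, "D": 5},           # RPN 225
]
ranked = rank_measures(measures)
```

A tool like QQ-Planning would then allocate resources down this ranked list until the budget is exhausted.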
Abstract:
Modern software applications are becoming more dependent on database management systems (DBMSs). DBMSs are usually used as black boxes by software developers. For example, Object-Relational Mapping (ORM) is one of the most popular database abstraction approaches that developers use nowadays. Using ORM, objects in Object-Oriented languages are mapped to records in the database, and object manipulations are automatically translated to SQL queries. As a result of such conceptual abstraction, developers do not need deep knowledge of databases; however, all too often this abstraction leads to inefficient and incorrect database access code. Thus, this thesis proposes a series of approaches to improve the performance of database-centric software applications that are implemented using ORM. Our approaches focus on troubleshooting and detecting inefficient database accesses (i.e., performance problems) in the source code, and we rank the detected problems based on their severity. We first conduct an empirical study on the maintenance of ORM code in both open source and industrial applications. We find that ORM performance-related configurations are rarely tuned in practice, and there is a need for tools that can help improve/tune the performance of ORM-based applications. Thus, we propose approaches along two dimensions to help developers improve the performance of ORM-based applications: 1) helping developers write more performant ORM code; and 2) helping developers configure ORM configurations. To provide tooling support to developers, we first propose static analysis approaches to detect performance anti-patterns in the source code. We automatically rank the detected anti-pattern instances according to their performance impacts. Our study finds that by resolving the detected anti-patterns, the application performance can be improved by 34% on average.
We then discuss our experience and lessons learned when integrating our anti-pattern detection tool into industrial practice. We hope our experience can help improve the industrial adoption of future research tools. However, as static analysis approaches are prone to false positives and lack runtime information, we also propose dynamic analysis approaches to further help developers improve the performance of their database access code. We propose automated approaches to detect redundant data access anti-patterns in the database access code, and our study finds that resolving such redundant data access anti-patterns can improve application performance by an average of 17%. Finally, we propose an automated approach to tune performance-related ORM configurations using both static and dynamic analysis. Our study shows that our approach can help improve application throughput by 27--138%. Through our case studies on real-world applications, we show that all of our proposed approaches can provide valuable support to developers and help improve application performance significantly.
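A typical instance of the redundant-data-access problems such a thesis targets is the "N+1 select" shape that ORM lazy loading silently generates: one query for the parent rows, then one query per row. As a hedged illustration (the schema is invented, and plain sqlite3 stands in for a real ORM), the anti-pattern and its single-join fix look like this:

```python
import sqlite3

# Illustrative schema standing in for ORM-mapped entities.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE author (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE post (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT);
    INSERT INTO author VALUES (1, 'ada'), (2, 'alan');
    INSERT INTO post VALUES (1, 1, 'p1'), (2, 1, 'p2'), (3, 2, 'p3');
""")

def titles_n_plus_one(conn):
    """Anti-pattern: one query for authors, then one query per author --
    the access shape ORM lazy loading produces."""
    out = {}
    for aid, name in conn.execute("SELECT id, name FROM author"):
        out[name] = [t for (t,) in conn.execute(
            "SELECT title FROM post WHERE author_id = ?", (aid,))]
    return out

def titles_single_join(conn):
    """Fix: fetch everything in one join (eager loading, in ORM terms)."""
    out = {}
    rows = conn.execute("""SELECT a.name, p.title
                           FROM author a JOIN post p ON p.author_id = a.id""")
    for name, title in rows:
        out.setdefault(name, []).append(title)
    return out

# Both produce the same result; the second issues one query instead of N+1.
assert titles_n_plus_one(conn) == titles_single_join(conn)
```

Static detection of this pattern amounts to spotting a query issued inside a loop over the results of another query; dynamic detection confirms it from the actual query log.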
Abstract:
Several studies in the past have revealed that network end-user devices are left powered up 24/7, even when idle, just for the sake of maintaining Internet connectivity. Network devices normally support low-power states but are kept active due to their inability to otherwise maintain network connectivity. The Network Connectivity Proxy (NCP) has recently been proposed as an effective mechanism to impersonate network connectivity on behalf of high-power devices and enable them to sleep when idle without losing network presence. The NCP can efficiently proxy basic networking protocols; proxying Internet-based applications, however, has no complete solution due to the dynamic and unpredictable nature of the packets they periodically send and receive. This paper proposes an approach for proxying Internet-based applications and presents the basic software architecture and capabilities. Further, this paper practically evaluates the proposed framework and analyzes the energy savings achievable under different realistic conditions.
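At its core, a connectivity proxy must decide, per incoming packet, whether to answer on the sleeping host's behalf, wake the host, or drop the traffic. The sketch below is an assumption-laden illustration of that rule engine (the class, rule format, and example rules are invented, not the paper's architecture):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Rule:
    proto: str            # e.g. "ICMP", "TCP", "UDP"
    port: Optional[int]   # destination port; None matches any port
    action: str           # "reply", "wake", or "drop"

class ConnectivityProxy:
    """Decides how to handle traffic addressed to a sleeping host.
    First matching rule wins; unmatched packets are dropped."""

    def __init__(self, rules):
        self.rules = rules
        self.wake_requests = 0

    def handle(self, proto, port=None):
        for r in self.rules:
            if r.proto == proto and r.port in (None, port):
                if r.action == "wake":
                    self.wake_requests += 1  # a real NCP would send Wake-on-LAN here
                return r.action
        return "drop"

proxy = ConnectivityProxy([
    Rule("ICMP", None, "reply"),   # answer pings to preserve network presence
    Rule("TCP", 22, "wake"),       # inbound SSH: wake the real host
    Rule("UDP", 5353, "reply"),    # mDNS heartbeats answered by the proxy
])
```

The hard part the paper addresses lies in the "reply" branch for application traffic, where the proxy must synthesize protocol-correct responses rather than fixed ones.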
Abstract:
Vision-based applications designed for human-machine interaction require fast and accurate hand detection. However, previous works in this field assume different constraints, such as a limit on the number of detected gestures, because hands are highly complex objects to locate. This paper presents an approach which changes the detection target without limiting the number of detected gestures. Using a cascade classifier, we detect hands based on their wrists. With this approach, we introduce two main contributions: (1) a reliable segmentation, independent of the gesture being made, and (2) a training phase faster than previous cascade-classifier-based methods. The paper includes experimental evaluations with different video streams that illustrate the efficiency and suitability of the approach for perceptual interfaces.
Abstract:
Lately, various programming frameworks have been developed for building web applications. These frameworks focus on improving the user experience through performance gains such as faster render and response times. One of these frameworks is React, which has introduced a completely new architectural pattern for managing both the state and the data flow of an application. React also offers support for native application development and makes server-side rendering possible, something that is difficult to accomplish with an application developed with Angular 1.5, which is used by the company Dewire today. The aim of this thesis was to compare React with an existing Angular project, in order to determine whether React could be a potential replacement for Angular. To gain knowledge about the subject, a theoretical study of web-based sources was made, while the practical part consisted of rebuilding a web application with React together with the Flux architecture, based on a view from the Angular project. The implementation process was repeated until the view was complete and the desired data flow, as in the Angular application, was reached. The resulting React application was then compared with the Angular application developed by the company, and the outcome of the comparison showed that React performed better than Angular in all tests. In conclusion, due to the timeframe of the project, only the most important parts of the Angular project were implemented in order to carry out the measurements that were of interest to the company. By recreating more of the functionality, or the entire Angular application, more interesting comparisons could have been made.
Abstract:
This keynote presentation will report some of our research work and experience on the development and applications of relevant methods, models, systems and simulation techniques in support of different types and various levels of decision making for business, management and engineering. In particular, the following topics will be covered:
- Modelling, multi-agent-based simulation and analysis of the allocation management of carbon dioxide emission permits in China (Nanfeng Liu & Shuliang Li)
- Agent-based simulation of the dynamic evolution of enterprise carbon assets (Yin Zeng & Shuliang Li)
- A framework & system for extracting and representing project knowledge contexts using topic models and dynamic knowledge maps: a big data perspective (Jin Xu, Zheng Li, Shuliang Li & Yanyan Zhang)
- Open innovation: intelligent model, social media & complex adaptive system simulation (Shuliang Li & Jim Zheng Li)
- A framework, model and software prototype for modelling and simulation of deshopping behaviour and how companies respond (Shawkat Rahman & Shuliang Li)
- Integrating multiple agents, simulation, knowledge bases and fuzzy logic for international marketing decision making (Shuliang Li & Jim Zheng Li)
- A Web-based hybrid intelligent system for combined conventional, digital, mobile, social media and mobile marketing strategy formulation (Shuliang Li & Jim Zheng Li)
- A hybrid intelligent model for Web & social media dynamics, and evolutionary and adaptive branding (Shuliang Li)
- A hybrid paradigm for modelling, simulation and analysis of brand virality in social media (Shuliang Li & Jim Zheng Li)
- Network configuration management: attack paradigms and architectures for computer network survivability (Tero Karvinen & Shuliang Li)
Abstract:
In many areas of simulation, a crucial component of efficient numerical computation is the use of solution-driven adaptive features: locally adapted meshing or re-meshing, and dynamically changing computational tasks. The full advantages of high performance computing (HPC) technology can thus only be exploited when efficient parallel adaptive solvers are realised. The resulting requirement for HPC software is dynamic load balancing, which for many mesh-based applications means dynamic mesh re-partitioning. The DRAMA project has been initiated to address this issue, with a particular focus on the requirements of industrial Finite Element codes, although codes using Finite Volume formulations will also be able to make use of the project results.
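The load-balancing requirement can be illustrated, in a heavily simplified sketch, as re-assigning weighted mesh elements so processor loads stay even after adaptation. This greedy longest-processing-time scheme is only a stand-in: real partitioners like those in the DRAMA project also account for mesh connectivity and data-migration cost, which this ignores.

```python
import heapq

def repartition(element_weights, n_procs):
    """Greedy longest-processing-time assignment: hand each element
    (heaviest first) to the currently lightest-loaded processor."""
    heap = [(0.0, p, []) for p in range(n_procs)]  # (load, proc_id, elements)
    heapq.heapify(heap)
    for w in sorted(element_weights, reverse=True):
        load, p, elems = heapq.heappop(heap)  # lightest processor so far
        elems.append(w)
        heapq.heappush(heap, (load + w, p, elems))
    return {p: (load, elems) for load, p, elems in heap}

# Hypothetical per-element costs after an adaptive refinement step.
parts = repartition([5, 3, 3, 2, 2, 1], n_procs=2)
loads = [load for load, _ in parts.values()]
```

After re-meshing changes the element weights, running the partitioner again restores balance; the open problem the project addresses is doing this incrementally and in parallel.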
Abstract:
Graphene has emerged as an extraordinary material, with its capability to accommodate an array of remarkable electronic, mechanical and chemical properties. An extra-large surface-to-volume ratio gives graphene a highly flexible morphology, giving rise to intriguing observations such as ripples, wrinkles and folds, as well as the potential to transform into other novel carbon nanostructures. Ultra-thin, mechanically tough, electrically conductive graphene films promise to enable a wealth of possible applications, ranging from hydrogen storage scaffolds and electronic transistors to bottom-up material designs. Enthusiasm for graphene-based applications aside, there are still significant challenges to their realization, largely due to the difficulty of precisely controlling graphene properties. Controlling the graphene morphology over large areas is crucial in enabling future graphene-based applications and material design. This dissertation aims to shed light on potential mechanisms to actively manipulate graphene morphology and properties, and thereby enable material design principles that deliver desirable mechanical and electronic functionalities of graphene and its derivatives.
Abstract:
This paper reports some experiments in using SVG (Scalable Vector Graphics), rather than the browser default of (X)HTML/CSS, as a potential Web-based rendering technology, in an attempt to create an approach that integrates the structural and display aspects of a Web document in a single XML-compliant envelope. Although the syntax of SVG is XML based, the semantics of its primitive graphic operations more closely resemble those of page description languages such as PostScript or PDF. The principal usage of SVG, so far, is for inserting complex graphic material into Web pages that are predominantly controlled via (X)HTML and CSS. The conversion of structured and unstructured PDF into SVG is discussed. It is found that unstructured PDF converts into pages of SVG with few problems, but difficulties arise when one attempts to map the structural components of a Tagged PDF into an XML skeleton underlying the corresponding SVG. These difficulties are not fundamentally syntactic; they arise largely because browsers are innately bound to (X)HTML/CSS as their default rendering model. Some suggestions are made for ways in which SVG could be more fully integrated into browser functionality, with the possibility that future browsers might be able to use SVG as their default rendering paradigm.
Abstract:
The surge of interest in graphene, as epitomized by the Nobel Prize in Physics in 2010, is attributed to its extraordinary properties. Graphene is ultrathin, mechanically tough, and has amendable surface chemistry. These features make graphene and graphene-based nanostructures ideal candidates for molecular mass manipulation. Controllable and programmable molecular mass manipulation is crucial in enabling future graphene-based applications, but is challenging to achieve. This dissertation studies several aspects of molecular mass manipulation, including mass transportation, patterning and storage. For molecular mass transportation, two methods based on carbon nanoscrolls are demonstrated to be effective: torsional buckling instability assisted transportation and surface energy induced radial shrinkage. To achieve more controllable transportation, a fundamental law of directional transport of molecular mass by straining the basal graphene is studied. For molecular mass patterning, we reveal a barrier effect of line defects in graphene, which can enable molecular confinement and patterning in a domain of desirable geometry. Such a strategy makes controllable patterning feasible for various types of molecules. For molecular mass storage, we propose a novel partially hydrogenated bilayer graphene structure which has a large capacity for mass uptake. The mass can also be released by simply stretching the structure, so uptake and release are reversible. This kind of structure is crucial in enabling hydrogen fuel based technology. Lastly, spontaneous nanofluidic channel formation enabled by patterned hydrogenation is studied. This novel strategy enables programmable channel formation with pre-defined complex geometry.