942 results for Model-driven Web engineering


Relevance: 40.00%

Abstract:

Modern software application testing, such as the testing of software driven by graphical user interfaces (GUIs) or leveraging event-driven architectures in general, requires paying careful attention to context. Model-based testing (MBT) approaches first acquire a model of an application, then use the model to construct test cases covering relevant contexts. A major shortcoming of state-of-the-art automated model-based testing is that many test cases proposed by the model are not actually executable. These infeasible test cases threaten the integrity of the entire model-based suite, and any coverage of contexts the suite aims to provide. In this research, I develop and evaluate a novel approach for classifying the feasibility of test cases. I identify a set of pertinent features for the classifier, and develop novel methods for extracting these features from the outputs of MBT tools. I use a supervised logistic regression approach to obtain a model of test case feasibility from a randomly selected training suite of test cases. I evaluate this approach with a set of experiments. The outcomes of this investigation are as follows: I confirm that infeasibility is prevalent in MBT, even for test suites designed to cover a relatively small number of unique contexts. I confirm that the frequency of infeasibility varies widely across applications. I develop and train a binary classifier for feasibility with average overall error, false positive, and false negative rates under 5%. I find that unique event IDs are key features of the feasibility classifier, while model-specific event types are not. I construct three types of features from the event IDs associated with test cases, and evaluate the relative effectiveness of each within the classifier. To support this study, I also develop a number of tools and infrastructure components for scalable execution of automated jobs, which use state-of-the-art container and continuous integration technologies to enable parallel test execution and the persistence of all experimental artifacts.
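
A minimal sketch of the kind of supervised logistic regression feasibility classifier described above, here built over bag-of-event-ID features; the event-ID sequences, labels, and encoding choice are hypothetical illustrations, not the dissertation's tooling:

```python
# Sketch (hypothetical data): classify test-case feasibility from event-ID
# features with supervised logistic regression.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix

# Each test case is a sequence of event IDs proposed by the MBT model.
test_cases = [
    ["e12", "e47", "e3"],      # hypothetical event-ID sequences
    ["e12", "e9"],
    ["e88", "e47", "e51"],
    ["e3", "e9", "e12"],
]
labels = [1, 0, 1, 0]          # 1 = feasible, 0 = infeasible (hypothetical)

# Bag-of-event-IDs features (one of several possible encodings).
vec = CountVectorizer(analyzer=lambda seq: seq)
X = vec.fit_transform(test_cases)

clf = LogisticRegression(max_iter=1000).fit(X, labels)

# Overall error, false positives, and false negatives can be read off
# the confusion matrix on a held-out suite; here it is the training data.
print(clf.predict(X))
print(confusion_matrix(labels, clf.predict(X)))
```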

Relevance: 40.00%

Abstract:

Graphs are powerful tools to describe social, technological, and biological networks, with nodes representing agents (people, websites, genes, etc.) and edges (or links) representing relations (or interactions) between agents. Examples of real-world networks include social networks, the World Wide Web, collaboration networks, protein networks, etc. Researchers often model these networks as random graphs. In this dissertation, we study a recently introduced social network model, the Multiplicative Attribute Graph (MAG) model, which takes into account the randomness of nodal attributes in the process of link formation (i.e., the probability of a link existing between two nodes depends on their attributes). Kim and Leskovec, who defined the model, have claimed that it exhibits some of the properties a real-world social network is expected to have. Focusing on a homogeneous version of this model, we investigate the existence of zero-one laws for graph properties, e.g., the absence of isolated nodes, graph connectivity, and the emergence of triangles. We obtain conditions on the parameters of the model under which these properties occur with high probability, or with vanishingly small probability, as the number of nodes becomes unboundedly large. In that regime, we also investigate the property of triadic closure and the nodal degree distribution.
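
A minimal sketch of sampling from a homogeneous MAG graph, assuming i.i.d. binary nodal attributes and a single 2x2 affinity matrix; the parameter values below are hypothetical:

```python
# Sketch: homogeneous Multiplicative Attribute Graph (MAG).  Each node carries
# num_attrs i.i.d. Bernoulli(p) binary attributes, and the probability of a
# link between u and v is the product of affinities theta[a_k(u)][a_k(v)]
# over all attributes k.
import numpy as np

rng = np.random.default_rng(0)

num_nodes, num_attrs, p = 200, 8, 0.5            # hypothetical parameters
theta = np.array([[0.9, 0.4],                    # hypothetical affinity matrix
                  [0.4, 0.2]])

attrs = rng.random((num_nodes, num_attrs)) < p   # binary nodal attributes

def edge_prob(u, v):
    """Edge probability as a product over per-attribute affinities."""
    return np.prod(theta[attrs[u].astype(int), attrs[v].astype(int)])

adj = np.zeros((num_nodes, num_nodes), dtype=bool)
for u in range(num_nodes):
    for v in range(u + 1, num_nodes):
        adj[u, v] = adj[v, u] = rng.random() < edge_prob(u, v)

# One of the properties studied above: absence of isolated nodes.
print("isolated nodes:", int((adj.sum(axis=1) == 0).sum()))
```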

Relevance: 40.00%

Abstract:

The research investigates the feasibility of using web-based project management systems for dredging. To achieve this objective, the research assessed both the positive and negative aspects of using web-based technology for the management of dredging projects. Information gained from a literature review and prior investigations of dredging projects revealed that project performance and the social, political, technical, and business aspects of the organization were important factors in deciding to use web-based systems for the management of dredging projects. These factors were used to develop the research assumptions. An exploratory case study methodology was used to gather the empirical evidence and perform the analysis. An operational prototype of the system was developed to help evaluate developmental and functional requirements, as well as the influence on performance and on the organization. The evidence gathered from three case study projects, and from a survey of 31 experts, was used to validate the assumptions. Baselines representing the assumptions were created as a reference against which to assess the responses and qualitative measures, and the deviation of the responses from these baselines was used in the analysis. Finally, the conclusions were drawn by validating the assumptions against the evidence derived from the analysis. The research findings are as follows: 1. The system would help improve project performance. 2. Resistance to implementation may be experienced if the system is implemented; therefore, resistance to implementation needs to be investigated further and more R&D work is needed in order to advance to the final design and implementation. 3. The system may be divided into standalone modules in order to simplify the system and facilitate incremental changes. 4. The QA/QC conceptual approach used by this research needs to be redefined during future R&D to satisfy both owners and contractors. Yin's (2009) Case Study Research: Design and Methods was used to develop the research approach, design, data collection, and analysis. Markus's (1983) resistance theory was used during the definition of the assumptions to predict potential problems in the implementation of web-based project management systems for the dredging industry. Keen's (1981) incremental changes and facilitative approach tactics were used as a basis to classify solutions and ways of overcoming resistance to implementation of the web-based project management system. Davis's (1989) Technology Acceptance Model (TAM) was used to assess the solutions needed to overcome resistance to the implementation of web-based management systems for dredging projects.

Relevance: 40.00%

Abstract:

The goal of this study is to provide a framework for future researchers to understand and use the FARSITE wildfire-forecasting model with data assimilation. Current wildfire models lack the ability to provide accurate predictions of fire front position faster than real time. When FARSITE is coupled with a recursive ensemble filter, the resulting data assimilation method improves the forecast. The scope includes an explanation of the standalone FARSITE application, technical details on FARSITE integration with a parallel program coupler called OpenPALM, and a demonstration of the FARSITE-Ensemble Kalman Filter software using the FireFlux I experiment by Craig Clements. The results show that the fire front forecast is improved with the proposed data-driven methodology compared with the standalone FARSITE model.
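
A minimal sketch of the stochastic Ensemble Kalman Filter analysis step on which such a recursive ensemble filter is based; the state and observation dimensions, the observation operator, and the noise levels are hypothetical, and this is not the FARSITE/OpenPALM coupling itself:

```python
# Sketch: one stochastic EnKF update, nudging each ensemble member toward
# perturbed observations (e.g. measured fire-front positions).
import numpy as np

rng = np.random.default_rng(1)

n_state, n_obs, n_ens = 50, 10, 20                    # hypothetical sizes
H = np.zeros((n_obs, n_state))
H[np.arange(n_obs), np.arange(0, n_state, 5)] = 1.0   # observe every 5th state point
R = 0.1 * np.eye(n_obs)                               # observation error covariance

ensemble = rng.normal(size=(n_state, n_ens))          # forecast ensemble
y = rng.normal(size=n_obs)                            # observations

def enkf_analysis(E, y, H, R, rng):
    """EnKF analysis step with perturbed observations."""
    X = E - E.mean(axis=1, keepdims=True)             # ensemble anomalies
    P = X @ X.T / (E.shape[1] - 1)                    # sample covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)      # Kalman gain
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, E.shape[1]).T
    return E + K @ (Y - H @ E)

analysis = enkf_analysis(ensemble, y, H, R, rng)
print(analysis.shape)                                 # updated ensemble, (n_state, n_ens)
```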

Relevance: 40.00%

Abstract:

Part 6: Engineering and Implementation of Collaborative Networks

Relevance: 40.00%

Abstract:

To analyze the characteristics and predict the dynamic behaviors of complex systems over time, comprehensive research is crucially needed to enable the development of systems that can intelligently adapt to evolving conditions and infer new knowledge with algorithms that are not predesigned. This dissertation research studies the integration of techniques and methodologies resulting from the fields of pattern recognition, intelligent agents, artificial immune systems, and distributed computing platforms to create technologies that can more accurately describe and control the dynamics of real-world complex systems. The need for such technologies is emerging in manufacturing, transportation, hazard mitigation, weather and climate prediction, homeland security, and emergency response. Motivated by the ability of mobile agents to dynamically incorporate additional computational and control algorithms into executing applications, mobile agent technology is employed in this research for adaptive sensing and monitoring in a wireless sensor network. Mobile agents are software components that can travel from one computing platform to another in a network and carry the programs and data states needed for performing the assigned tasks. To support the generation, migration, communication, and management of mobile monitoring agents, an embeddable mobile agent system (Mobile-C) is integrated with sensor nodes. Mobile monitoring agents visit distributed sensor nodes, read real-time sensor data, and perform anomaly detection using the pattern recognition algorithms they carry. The optimal control of agents is achieved by mimicking the adaptive immune response and by applying multi-objective optimization algorithms. The mobile agent approach has the potential to reduce the communication load and energy consumption in monitoring networks. The major research work of this dissertation project includes: (1) studying effective feature extraction methods for time series measurement data; (2) investigating the impact of the feature extraction methods and dissimilarity measures on the performance of pattern recognition; (3) researching the effects of environmental factors on the performance of pattern recognition; (4) integrating an embeddable mobile agent system with wireless sensor nodes; (5) optimizing agent generation and distribution using artificial immune system concepts and multi-objective algorithms; (6) applying mobile agent technology and pattern recognition algorithms to adaptive structural health monitoring and driving cycle pattern recognition; (7) developing a web-based monitoring network to enable the remote visualization and analysis of real-time sensor data. Techniques and algorithms developed in this dissertation project will contribute to research advances in networked distributed systems operating under changing environments.
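
A minimal sketch of how a monitoring agent might pair simple feature extraction with a dissimilarity-based anomaly check on sensor time series; the features, toy data, and threshold are hypothetical and unrelated to the Mobile-C implementation:

```python
# Sketch: extract simple time-domain features from a sensor window and flag
# anomalies by their dissimilarity from a healthy reference window.
import numpy as np

def extract_features(window):
    """Very simple time-domain features; real agents could carry richer ones."""
    return np.array([window.mean(), window.std(), window.max() - window.min()])

def dissimilarity(f1, f2):
    """Euclidean dissimilarity between feature vectors."""
    return float(np.linalg.norm(f1 - f2))

rng = np.random.default_rng(2)
baseline = rng.normal(0.0, 1.0, 500)      # healthy structural response (toy data)
current = rng.normal(0.0, 1.0, 500)
current[200:260] += 4.0                    # injected anomaly

ref = extract_features(baseline)
threshold = 1.0                            # hypothetical decision threshold
score = dissimilarity(extract_features(current), ref)
print("anomaly detected" if score > threshold else "normal", round(score, 3))
```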

Relevance: 40.00%

Abstract:

Inflammatory bowel disease (IBD) is a chronic inflammation that affects the gastrointestinal tract (GIT). One of the best ways to study the immunological mechanisms involved during the disease is the T cell transfer model of colitis. In this model, immunodeficient mice (RAG-/- recipients) are reconstituted with naive CD4+ T cells from healthy wild-type hosts. This model allows examination of the earliest immunological events leading to disease and chronic inflammation, when the gut inflammation perpetuates itself but does not depend on a defined antigen. To study the potential role of antigen-presenting cells (APCs) in the disease process, it is helpful to have an antigen-driven disease model, in which a defined commensal-derived antigen leads to colitis. An antigen-driven colitis model has therefore been developed. In this model, OT-II CD4+ T cells, which recognize only specific epitopes in the OVA protein, are transferred into RAG-/- hosts challenged with CFP-OVA-expressing E. coli. This model allows the examination of interactions between APCs and T cells in the lamina propria.

Relevance: 40.00%

Abstract:

Web applications in general have undergone major technological changes over the last two decades, and with them the habits and expectations of the so-called digital generation. Paradoxically, despite these technological and behavioural upheavals, teaching and learning software (LEA, from the French logiciels d'enseignement et d'apprentissage) has not followed quite the same curve of technological evolution. Indeed, its design model has remained so static that its pedagogical usefulness is questioned by education experts, according to whom current LEA do not take sufficient account of pedagogical theory. But how can these aspects be better taken into account in the LEA design process? Several approaches make it possible to design robust LEA. However, there is particular interest, among both education experts and software engineering experts, in using the pattern concept in this design process. Indeed, this concept makes it possible to capitalize on experts' experience and also simplifies the design process considerably, and thereby its cost. A comparison of works that use patterns to design LEA showed that no framework exists for synergy between the different actors of the design team, the education experts on one side and the software engineers on the other. Moreover, the life cycles proposed in these works are neither complete nor described rigorously enough to allow efficient LEA to be developed. Finally, the works compared do not show how pedagogical requirements can coexist with software requirements. Can the pattern concept help build robust LEA that satisfy pedagogical requirements? As a solution, this thesis proposes a pattern-based design approach for building LEA adapted to Web technologies. More specifically, the proposed methodical approach sets out the sequential steps to be followed to design an LEA that meets pedagogical requirements. In addition, a repository is presented that contains 110 catalogued patterns organized into packages. These patterns can easily be retrieved, using the search guide described, for use in the design process. The design approach was validated with two application examples, allowing the conclusion, on the one hand, that the LEA design approach is realistic and, on the other hand, that the patterns are valid and functional. The proposed LEA design approach is original and stands apart from those found in the literature because it is entirely based on the pattern concept. The approach also makes it possible to take pedagogical requirements into account. It is generic because it is independent of any software or hardware platform. However, the process of translating pedagogical requirements is not yet very intuitive or very linear. Further work must be carried out to complete the results obtained, so that the most complex and most abstract pedagogical requirements can be translated into artifacts usable by software engineers. As a follow-up to this thesis, an instantiation of the proposed patterns would be of interest, as would the definition of a pattern-based metamodel that could allow the specification of a modelling language specific to LEA.
The addition of patterns providing a semantic layer on top of LEA could also be considered. This semantic layer would make it possible not only to adapt pedagogical scenarios, but also to automate the process of adapting them to the needs of a particular learner. The transformation of the proposed patterns into ontologies could also be considered, which could facilitate the assessment of the learner's knowledge and the delivery of structured information that is useful for learning and matches the learner's needs.

Relevance: 40.00%

Abstract:

Idiopathic pulmonary fibrosis (IPF) is a chronic progressive disease with no curative pharmacological treatment. Animal models play an essential role in revealing the molecular mechanisms involved in the pathogenesis of the disease. Bleomycin (BLM)-induced lung fibrosis is the most widely used and best characterized model for anti-fibrotic drug screening. However, several issues have been reported, such as the identification of an optimal BLM dose and administration scheme, as well as gender specificity. Moreover, the balance between disease resolution, an appropriate time window for therapeutic intervention, and animal welfare remains a critical aspect yet to be fully elucidated. In this thesis, micro-CT imaging has been used as a tool to identify the ideal BLM dose regimen to induce sustained lung fibrosis in mice, as well as to assess the anti-fibrotic effect of nintedanib (NINT) treatment under this BLM administration regimen. In order to select the optimal BLM dose scheme, C57BL/6 male mice were treated with BLM via oropharyngeal aspiration (OA), following either a double or a triple BLM administration. The triple BLM administration proved the most promising scheme, able to balance disease resolution, an appropriate time window for therapeutic intervention, and animal welfare. Fibrosis progression was longitudinally assessed by micro-CT every 7 days for 5 weeks after BLM administration, and 5 animals were sacrificed at each timepoint for BALF and histological evaluation. The anti-fibrotic effect of NINT was assessed following different treatment regimens in this model. Herein, we have developed an optimized mouse model of pulmonary fibrosis, enabling a three-week therapeutic window to screen putative anti-fibrotic drugs. Micro-CT scanning allowed us to monitor the progression of lung fibrosis and the therapeutic response longitudinally in the same subject, drastically reducing the number of animals involved in the experiment.

Relevance: 40.00%

Abstract:

Imaging technologies are widely used in application fields such as the natural sciences, engineering, medicine, and the life sciences. A broad class of imaging problems reduces to solving ill-posed inverse problems (IPs). Traditional strategies to solve these ill-posed IPs rely on variational regularization methods, which are based on the minimization of suitable energies and make use of knowledge about the image formation model (forward operator) and prior knowledge about the solution, but do not incorporate knowledge directly from data. On the other hand, more recent learned approaches can easily learn the intricate statistics of images from a large set of data, but lack a systematic method for incorporating prior knowledge about the image formation model. The main purpose of this thesis is to discuss data-driven image reconstruction methods that combine the benefits of these two different reconstruction strategies for the solution of highly nonlinear ill-posed inverse problems. Mathematical formulations and numerical approaches for imaging IPs, including linear as well as strongly nonlinear problems, are described. More specifically, we address the Electrical Impedance Tomography (EIT) reconstruction problem by unrolling the regularized Gauss-Newton method and integrating regularization learned by a data-adaptive neural network. Furthermore, we investigate the solution of nonlinear ill-posed IPs by introducing a deep-PnP framework that integrates a graph convolutional denoiser into the proximal Gauss-Newton method, with a practical application to EIT, a recently introduced and promising imaging technique. Efficient algorithms are then applied to the solution of the limited-electrode problem in EIT, combining compressive sensing techniques and deep learning strategies. Finally, a transformer-based neural network architecture is adapted to restore the noisy solution of the Computed Tomography problem recovered using the filtered back-projection method.
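
A minimal sketch of a plug-and-play (PnP) proximal Gauss-Newton iteration in the spirit of the deep-PnP framework described above; the toy forward operator, its Jacobian, and the placeholder denoiser (standing in for a learned graph convolutional one) are hypothetical, and nothing here is an EIT implementation:

```python
# Sketch: damped Gauss-Newton steps on the data-fidelity term, with the
# regularizing proximal step replaced by a (here, hand-written) denoiser.
import numpy as np

def forward(x):
    """Hypothetical nonlinear forward operator F(x)."""
    return np.tanh(x)

def jacobian(x):
    """Jacobian of F at x (diagonal for this toy operator)."""
    return np.diag(1.0 - np.tanh(x) ** 2)

def denoiser(x):
    """Placeholder for a learned denoiser used as the proximal step."""
    return 0.9 * x + 0.1 * np.median(x)

def pnp_gauss_newton(y, x0, n_iter=20, damping=1e-3):
    x = x0.copy()
    for _ in range(n_iter):
        J = jacobian(x)
        r = forward(x) - y                                    # data residual
        dx = np.linalg.solve(J.T @ J + damping * np.eye(len(x)), -J.T @ r)
        x = denoiser(x + dx)                                  # learned "proximal" step
    return x

rng = np.random.default_rng(3)
x_true = rng.normal(size=16)
y = forward(x_true) + 0.01 * rng.normal(size=16)
x_rec = pnp_gauss_newton(y, np.zeros(16))
print("relative error:", np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))
```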

Relevance: 30.00%

Abstract:

The alkali-aggregate reaction (AAR) is a chemical reaction that causes a heterogeneous expansion of concrete and degrades important properties such as Young's modulus, leading to a reduction in the structure's useful life. In this study, a parametric model is employed to determine the spatial distribution of the concrete expansion, combining normalized factors that influence the reaction through an AAR expansion law. Optimization techniques were employed to fit the numerical results to observations from a real structure. A three-dimensional version of the model has been implemented in a commercial finite element package (ANSYS) and verified in the analysis of an accelerated mortar test. Comparisons were made between two AAR mathematical descriptions of the mechanical phenomenon, using the same methodology, and an expansion curve obtained from experiment. Some parametric studies are also presented. The numerical results compared very well with the experimental data, validating the proposed method.
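
A minimal sketch of the optimization step in such a study: fitting the parameters of an assumed sigmoidal (Larive-type) expansion law to observed expansions by least squares; the law, the data, and the parameter values are illustrative, not those of the paper:

```python
# Sketch (hypothetical data): least-squares adjustment of an AAR expansion law.
import numpy as np
from scipy.optimize import curve_fit

def expansion_law(t, eps_inf, tau_c, tau_l):
    """Larive-type sigmoidal expansion vs. time (example form only)."""
    return eps_inf * (1.0 - np.exp(-t / tau_c)) / (1.0 + np.exp(-(t - tau_l) / tau_c))

t_obs = np.array([0, 30, 60, 90, 120, 180, 240, 360])                  # days (hypothetical)
eps_obs = np.array([0.0, 0.01, 0.04, 0.09, 0.14, 0.19, 0.21, 0.22])    # % expansion

popt, _ = curve_fit(expansion_law, t_obs, eps_obs, p0=[0.2, 60.0, 80.0])
print("fitted eps_inf, tau_c, tau_l:", popt)
```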

Relevance: 30.00%

Abstract:

The implementation of confidential contracts between a container liner carrier and its customers, as a result of the Ocean Shipping Reform Act (OSRA) of 1998, demands a revision of the methodology applied in the carrier's marketing and sales planning. The marketing and sales planning process should be more scientific and make better use of operational research tools, since the selection of customers under contract, the duration of the contracts, the freight rates, and the container imbalances of these contracts are basic factors in the carrier's yield. This work aims to develop a decision support system based on a linear programming model to generate the business plan for a container liner carrier, maximizing the contribution margin of its freight.
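
A minimal sketch of the kind of linear program such a decision support system could solve, choosing accepted volumes per contract to maximize total contribution margin under capacity and container-imbalance constraints; the contracts, margins, and limits are hypothetical:

```python
# Sketch (hypothetical data): contract volume selection as a linear program.
import numpy as np
from scipy.optimize import linprog

margin = np.array([120.0, 95.0, 150.0, 60.0])    # contribution margin per TEU
direction = np.array([+1, -1, +1, -1])           # +1 headhaul, -1 backhaul (imbalance sign)
max_teu = np.array([400, 500, 250, 600])         # contractual volume caps

# Constraints: total volume within slot capacity; net imbalance within a limit.
A_ub = np.vstack([np.ones(4), direction, -direction])
b_ub = np.array([1000.0, 150.0, 150.0])

res = linprog(c=-margin,                         # linprog minimizes, so negate margins
              A_ub=A_ub, b_ub=b_ub,
              bounds=list(zip(np.zeros(4), max_teu)))
print("accepted TEU per contract:", res.x, "total margin:", -res.fun)
```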

Relevance: 30.00%

Abstract:

Compartmental epidemiological models have been developed since the 1920s and successfully applied to study the propagation of infectious diseases. Moreover, owing to their structure, an interesting version of these models was developed in the 1960s to clarify some aspects of rumor propagation, considering that spreading an infectious disease and disseminating information are analogous phenomena. Here, in analogy with the SIR (Susceptible-Infected-Removed) epidemiological model, the ISS (Ignorant-Spreader-Stifler) rumor spreading model is studied. Using concepts from dynamical systems theory, the stability of the equilibrium points is established according to the propagation parameters and initial conditions. Some numerical experiments are conducted in order to validate the model.
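
A minimal sketch of one common formulation of the ISS model, integrated numerically; the specific rate terms and parameter values are assumptions made here for illustration, not necessarily the paper's exact equations:

```python
# Sketch: Ignorant-Spreader-Stifler (ISS) dynamics, the rumor analogue of SIR.
# Ignorants become spreaders on contact with spreaders; spreaders become
# stiflers on contact with spreaders or stiflers.
import numpy as np
from scipy.integrate import odeint

def iss(y, t, lam, alpha):
    I, S, R = y
    dI = -lam * I * S
    dS = lam * I * S - alpha * S * (S + R)
    dR = alpha * S * (S + R)
    return [dI, dS, dR]

t = np.linspace(0.0, 30.0, 300)
y0 = [0.99, 0.01, 0.0]                      # initial fractions of the population
sol = odeint(iss, y0, t, args=(0.8, 0.3))   # hypothetical spreading/stifling rates

print("final ignorant fraction:", round(sol[-1, 0], 3))
print("final stifler fraction:", round(sol[-1, 2], 3))
```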

Relevance: 30.00%

Abstract:

This study proposes a simplified mathematical model to describe the processes occurring in an anaerobic sequencing batch biofilm reactor (ASBBR) treating lipid-rich wastewater. The reactor, subjected to rising organic loading rates, contained biomass immobilized on cubic polyurethane foam matrices and was operated at 32 ± 2 °C, using 24-h batch cycles. In the adaptation period, the reactor was fed a synthetic substrate for 46 days and was operated without agitation. When agitation was raised to 500 rpm, the organic loading rate (OLR) was increased from 0.3 g chemical oxygen demand (COD)·L⁻¹·day⁻¹ to 1.2 g COD·L⁻¹·day⁻¹. The ASBBR was then fed fat-rich wastewater (dairy wastewater) during an operation period lasting 116 days, in which four operational conditions (OCs) were tested: 1.1 ± 0.2 g COD·L⁻¹·day⁻¹ (OC1), 4.5 ± 0.4 g COD·L⁻¹·day⁻¹ (OC2), 8.0 ± 0.8 g COD·L⁻¹·day⁻¹ (OC3), and 12.1 ± 2.4 g COD·L⁻¹·day⁻¹ (OC4). The bicarbonate alkalinity (BA)/COD supplementation ratio was 1:1 at OC1, 1:2 at OC2, and 1:3 at OC3 and OC4. Total COD removal efficiencies were higher than 90%, with a constant production of bicarbonate alkalinity, in all OCs tested. After the process reached stability, temporal profiles of substrate consumption were obtained. Based on these experimental data, a simplified first-order model was fitted, making it possible to infer kinetic parameters. A simplified mathematical model correlating soluble COD with volatile fatty acids (VFA) was also proposed, and through it the consumption rates of intermediate products such as propionic and acetic acids were inferred. The results showed that the microbial consortium worked properly and high efficiencies were obtained, even with high initial substrate concentrations, which led to the accumulation of intermediate metabolites and caused low specific consumption rates.
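
A minimal sketch of fitting such a simplified first-order kinetic model to a temporal profile of soluble COD in order to infer the apparent rate constant; the concentrations, times, and residual-COD form are illustrative, not the study's data:

```python
# Sketch (hypothetical data): first-order substrate decay toward a residual
# concentration, C(t) = C_res + (C0 - C_res) * exp(-k1 * t), fitted by
# nonlinear least squares.
import numpy as np
from scipy.optimize import curve_fit

def first_order(t, c0, c_res, k1):
    """First-order decay of soluble COD toward a residual concentration."""
    return c_res + (c0 - c_res) * np.exp(-k1 * t)

t_h = np.array([0, 1, 2, 4, 6, 8, 12, 24])                      # time (h)
cod = np.array([3.0, 2.4, 2.0, 1.4, 1.1, 0.9, 0.7, 0.6])        # g COD/L

popt, _ = curve_fit(first_order, t_h, cod, p0=[3.0, 0.5, 0.3])
c0, c_res, k1 = popt
print(f"C0 = {c0:.2f} g/L, C_res = {c_res:.2f} g/L, k1 = {k1:.3f} 1/h")
```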