988 results for Design Event


Relevance:

30.00%

Publisher:

Abstract:

Symbolic execution is a powerful program analysis technique, but it is very challenging to apply to programs built using event-driven frameworks, such as Android. The main reason is that the framework code itself is too complex to symbolically execute. The standard solution is to manually create a framework model that is simpler and more amenable to symbolic execution. However, developing and maintaining such a model by hand is difficult and error-prone. We claim that we can leverage program synthesis to introduce a high degree of automation to the process of framework modeling. To support this thesis, we present three pieces of work. First, we introduced SymDroid, a symbolic executor for Android. While Android apps are written in Java, they are compiled to the Dalvik bytecode format. Instead of analyzing an app’s Java source, which may not be available, or decompiling from Dalvik back to Java, which requires significant engineering effort and introduces yet another source of potential bugs in an analysis, SymDroid works directly on Dalvik bytecode. Second, we introduced Pasket, a new system that takes a first step toward automatically generating Java framework models to support symbolic execution. Pasket takes as input the framework API and tutorial programs that exercise the framework. From these artifacts and Pasket's internal knowledge of design patterns, Pasket synthesizes an executable framework model by instantiating design patterns, such that the behavior of the synthesized model on the tutorial programs matches that of the original framework. Lastly, in order to scale program synthesis to framework models, we devised adaptive concretization, a novel program synthesis algorithm that combines the best of the two major synthesis strategies: symbolic search, i.e., using SAT or SMT solvers, and explicit search, e.g., stochastic enumeration of possible solutions. Adaptive concretization parallelizes multiple sub-synthesis problems by partially concretizing highly influential unknowns in the original synthesis problem. Thanks to adaptive concretization, Pasket can generate a large-scale model, e.g., thousands of lines of code. In addition, we have used an Android model synthesized by Pasket and found that the model is sufficient to allow SymDroid to execute a range of apps.
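To make the adaptive concretization idea more tangible, the sketch below is a hedged, hypothetical illustration rather than Pasket's actual implementation: it assumes a solver-backed `solve(spec, fixed)` routine and a precomputed `influence` estimate per unknown, concretizes the most influential unknowns to random values, and runs the resulting sub-problems in parallel.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def adaptive_concretization(spec, unknowns, influence, solve,
                            degree=4, workers=4, trials=32, domain=(0, 1, 2, 3)):
    """Hypothetical sketch of adaptive concretization (not Pasket's code).

    spec      -- the synthesis specification (constraints to satisfy)
    unknowns  -- names of the unknown holes in the synthesis problem
    influence -- map from unknown name to its estimated influence on the search
    solve     -- symbolic solver: solve(spec, fixed) returns a solution or None
    """
    # Concretize the unknowns whose fixing prunes the search space the most.
    hot = sorted(unknowns, key=lambda u: influence[u], reverse=True)[:degree]

    def subproblem(_):
        # Explicit-search component: guess concrete values for the hot unknowns,
        # then let the symbolic component solve the remaining, smaller problem.
        fixed = {u: random.choice(domain) for u in hot}
        return solve(spec, fixed)

    # Threads are used here for simplicity; the real algorithm runs sub-problems
    # as independent solver instances.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for solution in pool.map(subproblem, range(trials)):
            if solution is not None:
                return solution  # first successful sub-problem wins
    return None
```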

Relevance:

30.00%

Publisher:

Abstract:

Sequences of timestamped events are currently being generated across nearly every domain of data analytics, from e-commerce web logging to electronic health records used by doctors and medical researchers. Every day, this data type is reviewed by humans who apply statistical tests, hoping to learn everything they can about how these processes work, why they break, and how they can be improved upon. To further uncover how these processes work the way they do, researchers often compare two groups, or cohorts, of event sequences to find the differences and similarities between outcomes and processes. With temporal event sequence data, this task is complex because of the variety of ways single events and sequences of events can differ between the two cohorts of records: the structure of the event sequences (e.g., event order, co-occurring events, or frequencies of events), the attributes of the events and records (e.g., gender of a patient), or metrics about the timestamps themselves (e.g., duration of an event). Running statistical tests to cover all these cases and determining which results are significant becomes cumbersome. Current visual analytics tools for comparing groups of event sequences emphasize a purely statistical or purely visual approach for comparison. Visual analytics tools leverage humans' ability to easily see patterns and anomalies that they were not expecting, but are limited by uncertainty in findings. Statistical tools emphasize finding significant differences in the data, but often require researchers to have a concrete question and do not facilitate more general exploration of the data. Combining visual analytics tools with statistical methods leverages the benefits of both approaches for quicker and easier insight discovery. Integrating statistics into a visualization tool presents many challenges on the frontend (e.g., displaying the results of many different metrics concisely) and in the backend (e.g., scalability challenges with running various metrics on multi-dimensional data at once). I begin by exploring the problem of comparing cohorts of event sequences and understanding the questions that analysts commonly ask in this task. From there, I demonstrate that combining automated statistics with an interactive user interface amplifies the benefits of both types of tools, thereby enabling analysts to conduct quicker and easier data exploration, hypothesis generation, and insight discovery. The direct contributions of this dissertation are: (1) a taxonomy of metrics for comparing cohorts of temporal event sequences, (2) a statistical framework for exploratory data analysis with a method I refer to as high-volume hypothesis testing (HVHT), (3) a family of visualizations and guidelines for interaction techniques that are useful for understanding and parsing the results, and (4) a user study, five long-term case studies, and five short-term case studies which demonstrate the utility and impact of these methods in various domains: four in the medical domain, one in web log analysis, two in education, and one each in social networks, sports analytics, and security. My dissertation contributes an understanding of how cohorts of temporal event sequences are commonly compared and the difficulties associated with applying and parsing the results of these metrics. It also contributes a set of visualizations, algorithms, and design guidelines for balancing automated statistics with user-driven analysis to guide users to significant, distinguishing features between cohorts.
This work opens avenues for future research in comparing two or more groups of temporal event sequences, opening traditional machine learning and data mining techniques to user interaction, and extending the principles found in this dissertation to data types beyond temporal event sequences.
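As a hedged illustration of the high-volume hypothesis testing (HVHT) idea, the sketch below runs one non-parametric test per candidate metric across two cohorts and keeps only the differences that survive a Bonferroni correction; the metric names, data layout, and choice of test are assumptions for illustration, not the dissertation's implementation.

```python
from scipy import stats

def hvht(cohort_a, cohort_b, alpha=0.05):
    """Minimal sketch of high-volume hypothesis testing over two cohorts.

    cohort_a / cohort_b -- dicts mapping a metric name to the per-record values
    of that metric (e.g., 'duration of event X', 'count of event Y'). Returns
    the metrics whose difference stays significant after Bonferroni correction.
    """
    p_values = {}
    for metric in cohort_a.keys() & cohort_b.keys():
        # Non-parametric test, since durations and counts are rarely normal.
        _, p = stats.mannwhitneyu(cohort_a[metric], cohort_b[metric],
                                  alternative="two-sided")
        p_values[metric] = p

    # Correct for running many tests at once (Bonferroni; BH would also work).
    threshold = alpha / max(len(p_values), 1)
    return {m: p for m, p in p_values.items() if p <= threshold}

# Hypothetical usage: compare length-of-stay between two patient cohorts.
# significant = hvht({"stay_length": [3, 5, 7]}, {"stay_length": [9, 11, 10]})
```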

Relevance:

30.00%

Publisher:

Abstract:

No other technology has affected daily life at this level or seen as rapid adoption as the mobile phone. At the same time, mobile media has developed into a serious marketing tool for all kinds of businesses, and the industry has grown explosively in recent years. The objective of this thesis is to examine the mobile marketing process of an international event. This thesis is a qualitative case study. The chosen case is the mobile marketing process of the Falun2015 FIS Nordic World Ski Championships, owing to the researcher's interest in the topic and contacts with people around the event. The empirical findings were acquired by conducting two interviews with three experts from the case organisation and its partner organisation. The interviews were performed as semi-structured interviews utilising themes arising from the chosen theoretical framework. The framework distinguished six phases in the process: (i) campaign initiation, (ii) campaign design, (iii) campaign creation, (iv) permission management, (v) delivery, and (vi) evaluation and analysis. Phases one and five were not examined in this thesis because campaign initiation was not seen as purely part of the campaign implementation, and investigating phase five would have required a very technical viewpoint for the study. In addition to the interviews, some pre-established documents were used as supporting data. The empirical findings of this thesis mainly follow the theoretical framework utilised. However, some modifications to the model could be made, mainly related to the order of the different phases. In the revised model, the actions are categorised depending on the time at which they should be conducted, i.e. before, during or after the event. Regardless of the categorisation, the phases can occur in a different order and overlap. In addition, the business network was highly emphasised by the empirical findings and is thus added to the modified model. Five managerial recommendations can be drawn from the empirical findings of this thesis: (i) the importance of a business network should be highly valued in a mobile marketing process; (ii) clear goals should be defined for mobile marketing actions in order to make sure that everyone involved is aware of them; (iii) interactivity should be perceived as part of mobile marketing communication; (iv) enough time should be allowed for the development of a mobile marketing process in order to exploit all the potential it can offer; and (v) attention should be paid to measuring and analysing matters that are of relevance.

Relevance:

30.00%

Publisher:

Abstract:

With the eye-catching advances in sensing technologies, smart water networks have been attracting immense research interest in recent years. One of the most important tasks in smart water network management is the reduction of water loss (such as leaks and bursts in a pipe network). In this paper, we propose an efficient scheme to position water loss events based on water network topology. The state-of-the-art approach to this problem, however, utilizes only limited topology information of the water network, that is, a single shortest path between two sensor locations. Consequently, the accuracy of positioning water loss events remains unsatisfactory. To resolve this problem, our scheme consists of two key ingredients. First, we design a novel graph topology-based measure, which can recursively quantify the "average distances" for all pairs of sensor locations simultaneously in a water network. This measure substantially improves the accuracy of our positioning strategy by capturing the entire water network topology between every two sensor locations, yet without any sacrifice of computational efficiency. Then, we devise an efficient search algorithm that combines the "average distances" with the differences in the arrival times of the pressure variations detected at sensor locations. Experimental evaluations on a real-world test bed (WaterWiSe@SG) demonstrate that our proposed positioning scheme can identify water loss events more accurately than the best-known competitor.
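The sketch below is a simplified, hedged illustration of topology-based positioning: it substitutes a plain shortest-path distance for the paper's recursive "average distance" measure (which is not reproduced here) and picks the node whose predicted arrival-time difference best matches the observed one. The edge attribute name (`length`) and the `wave_speed` parameter are assumptions for illustration.

```python
import networkx as nx

def localize_leak(G, sensor1, sensor2, dt, wave_speed):
    """Hedged sketch of topology-based water-loss positioning.

    G          -- pipe network; edge attribute 'length' in metres
    sensor1/2  -- nodes where pressure sensors detected the transient
    dt         -- observed arrival-time difference t(sensor1) - t(sensor2), s
    wave_speed -- pressure-wave propagation speed, m/s
    """
    # Stand-in for the paper's "average distance": plain shortest-path distance.
    d1 = nx.single_source_dijkstra_path_length(G, sensor1, weight="length")
    d2 = nx.single_source_dijkstra_path_length(G, sensor2, weight="length")

    best, best_err = None, float("inf")
    for node in G.nodes:
        if node in d1 and node in d2:
            # Predicted difference in propagation time if the leak were here.
            predicted_dt = (d1[node] - d2[node]) / wave_speed
            err = abs(predicted_dt - dt)
            if err < best_err:
                best, best_err = node, err
    return best  # candidate node that best explains the observed delay
```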

Relevance:

30.00%

Publisher:

Abstract:

The literature emphasises the sparseness of research focused on collaborative and open approaches in the design conceptualisation stage, also known as the Fuzzy Front-End (FFE). Presently, the most challenging discussion arising from this field of research lies in understanding whether or not to structure this conceptual stage. Accordingly, the hypothesis behind this study is that a structured approach in the FFE would benefit the interdisciplinary dialogue. Therefore, two objectives support this study: to understand the benefits of an interdisciplinary approach in the FFE, and to test one proposed model for this conceptual stage. By means of a small-scale design experiment, this paper intends to make additional contributions to this area of research in the context of new product development (NPD). The general research supporting this specific study aims to conceptualise new and futuristic aircraft configurations. Hence, this same topic formed the basis of the conceptualisation process in the ideation sessions, which were conducted by five different teams of three members each. The results of the different ideation sessions reinforce the contemporary paradigm of Open Innovation (OI), which is based on trust and communication for better collaboration. The postulated hypothesis for this study is partially validated, as teams testing the proposed, structured model generally consider that its usage would benefit the integration of different disciplines. Moreover, there is a general feeling that a structured approach integrates different perspectives and gives creativity a focus. Nevertheless, the small scale of the design experiment imposes some limitations on this study, despite giving new insights into how to better organise forthcoming, more sustained studies. Interestingly, the importance of sketching as an interdisciplinary means of communication is underlined by the results obtained.

Relevance:

30.00%

Publisher:

Abstract:

In the near future, the LHC experiments will continue to be upgraded as the LHC luminosity increases from the design value of 10³⁴ cm⁻²s⁻¹ to 7.5 × 10³⁴ cm⁻²s⁻¹ with the HL-LHC project, to reach 3000 fb⁻¹ of accumulated statistics. After the end of a period of data collection, CERN will face a long shutdown to improve overall performance by upgrading the experiments and implementing more advanced technologies and infrastructures. In particular, ATLAS will upgrade parts of the detector, the trigger, and the data acquisition system. It will also implement new strategies and algorithms for processing and transferring the data to the final storage. This PhD thesis presents a study of a new pattern recognition algorithm to be used in the trigger system, which is software designed to provide the information necessary to select physics events from background data. The idea is to use the well-known Hough Transform as an algorithm for detecting particle trajectories. The effectiveness of the algorithm has already been validated in the past, independently of particle physics applications, to detect generic shapes in images. Here, a software emulation tool is proposed for the hardware implementation of the Hough Transform, to reconstruct the tracks in the ATLAS Trigger and Data Acquisition system. Until now, it has never been implemented in electronics in particle physics experiments, and as a hardware implementation it would provide overall latency benefits. A comparison between the simulated data and the physical system was performed on a Xilinx UltraScale+ FPGA device.
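As a didactic aid, the sketch below shows a generic straight-line Hough Transform over 2D hit positions in NumPy; it is not the ATLAS emulation tool or the FPGA firmware, only an illustration of the voting-and-peak-finding idea the thesis builds on.

```python
import numpy as np

def hough_lines(hits, n_theta=180, n_r=200):
    """Generic straight-line Hough Transform over 2D hit positions.

    Each hit (x, y) votes for all (theta, r) with r = x*cos(theta) + y*sin(theta);
    peaks in the accumulator correspond to candidate straight tracks.
    """
    hits = np.asarray(hits, dtype=float)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    r_max = np.abs(hits).max() * np.sqrt(2) + 1e-9
    r_edges = np.linspace(-r_max, r_max, n_r + 1)

    # Every hit votes along its sinusoid in (theta, r) parameter space.
    r_values = hits[:, 0:1] * np.cos(thetas) + hits[:, 1:2] * np.sin(thetas)
    accumulator = np.zeros((n_theta, n_r), dtype=int)
    for j in range(n_theta):
        counts, _ = np.histogram(r_values[:, j], bins=r_edges)
        accumulator[j] += counts

    # The bin with the most votes gives the dominant track candidate.
    j, k = np.unravel_index(accumulator.argmax(), accumulator.shape)
    return thetas[j], 0.5 * (r_edges[k] + r_edges[k + 1]), accumulator
```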

Relevance:

30.00%

Publisher:

Abstract:

In the metal industry, and more specifically in forging, scrap material is a crucial issue, and reducing it is an important goal. Not only would this help companies become more environmentally friendly and more sustainable, but it would also reduce energy use and lower costs. At the same time, Industry 4.0 techniques and the advancements in Artificial Intelligence (AI), especially in the field of Deep Reinforcement Learning (DRL), may play an important role in helping to achieve this objective. This document presents the thesis work, a contribution to the SmartForge project, which was performed during a semester abroad at Karlstad University (Sweden). This project aims at solving the aforementioned problem with a business case of the company Bharat Forge Kilsta, located in Karlskoga (Sweden). The thesis work includes the design and subsequent development of an event-driven architecture with microservices, to support the processing of data coming from sensors set up in the company's industrial plant, and eventually the implementation of an algorithm with DRL techniques to control the electrical power used in the plant.
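As a minimal, hypothetical sketch of the event-driven style described (not the actual SmartForge architecture), the code below shows one microservice consuming sensor events from a queue and forwarding a derived power set-point; a real deployment would replace the in-process queue with a message broker, and the event fields and thresholds are invented for illustration.

```python
import asyncio
import json

async def sensor_consumer(queue: asyncio.Queue, power_controller):
    """One microservice in an event-driven pipeline (illustrative only).

    Consumes sensor events from a queue and forwards a decision to a power
    controller; a DRL agent would normally decide the set-point here.
    """
    while True:
        raw = await queue.get()
        event = json.loads(raw)
        # Stub rule standing in for the DRL policy (fields are hypothetical).
        setpoint = 0.0 if event["temperature"] > 1250 else 100.0
        await power_controller(event["line_speed"], setpoint)
        queue.task_done()

async def main():
    queue: asyncio.Queue = asyncio.Queue()

    async def log_controller(line_speed, setpoint):
        print(f"line_speed={line_speed} -> power setpoint {setpoint} kW")

    consumer = asyncio.create_task(sensor_consumer(queue, log_controller))
    await queue.put(json.dumps({"temperature": 1300, "line_speed": 2.1}))
    await queue.join()   # wait until the event is processed
    consumer.cancel()

if __name__ == "__main__":
    asyncio.run(main())
```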

Relevance:

20.00%

Publisher:

Abstract:

Hybrid bioisostere derivatives from N-acylhydrazones and furoxan groups were designed with the objective of obtaining at least a dual mechanism of action: cruzain inhibition and nitric oxide (NO) releasing activity. Fifteen designed compounds were synthesized, varying the substitution in both the N-acylhydrazone and the furoxan group. Their anti-Trypanosoma cruzi activity against amastigote forms, NO-releasing potential, and cruzain inhibitory activity were evaluated. The two compounds most active against both the parasite amastigotes and the enzyme (6 and 14) contain a nitro group in the para position of the aromatic ring. Permeability screening in Caco-2 cells and cytotoxicity assays in human cells were performed for these most active compounds, and both proved less cytotoxic than the reference drug, benznidazole. Compound 6 was the most promising since, besides its activity, it showed good permeability and a selectivity index higher than that of the reference drug. Compound 6 was therefore considered a possible candidate for further studies.

Relevance:

20.00%

Publisher:

Abstract:

Split-plot design (SPD) and near-infrared chemical imaging were used to study the homogeneity of the drug paracetamol loaded in films prepared from mixtures of the biocompatible polymers hydroxypropyl methylcellulose, polyvinylpyrrolidone, and polyethyleneglycol. The study was split into two parts: first, a partial least-squares (PLS) model was developed for pixel-to-pixel quantification of the drug loaded into the films. Afterwards, an SPD was developed to study the influence of the polymeric composition of the films and the two process conditions related to their preparation (percentage of the drug in the formulations and curing temperature) on the homogeneity of the drug dispersed in the polymeric matrix. Chemical images of each formulation of the SPD were obtained by pixel-to-pixel predictions of the drug using the PLS model of the first part, and macropixel analyses were performed for each image to obtain the y-responses (homogeneity parameter). The design was modeled using PLS regression, allowing only the most relevant factors to remain in the final model. The interpretation of the SPD was enhanced by utilizing the orthogonal PLS algorithm, where the y-orthogonal variations in the design were separated from the y-correlated variation.
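A hedged sketch of the pixel-to-pixel PLS quantification step is shown below using scikit-learn; the array shapes, variable names, and number of latent variables are assumptions for illustration, not the published model.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def pixelwise_drug_map(calib_spectra, calib_content, image_cube, n_components=3):
    """Sketch of pixel-to-pixel PLS quantification of drug content.

    calib_spectra -- (n_samples, n_wavelengths) NIR spectra of calibration films
    calib_content -- (n_samples,) known drug content of those films
    image_cube    -- (rows, cols, n_wavelengths) hyperspectral image of one film

    Returns a (rows, cols) map of predicted drug content; the number of latent
    variables is an assumption and would normally be chosen by cross-validation.
    """
    pls = PLSRegression(n_components=n_components)
    pls.fit(calib_spectra, calib_content)

    rows, cols, n_wl = image_cube.shape
    predictions = pls.predict(image_cube.reshape(-1, n_wl))
    return np.asarray(predictions).reshape(rows, cols)
```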

Relevance:

20.00%

Publisher:

Abstract:

In Brazil, the consumption of extra-virgin olive oil (EVOO) is increasing annually, but there are no experimental studies concerning the phenolic compound contents of commercial EVOO. The aim of this work was to optimise the separation of 17 phenolic compounds already detected in EVOO. A Doehlert matrix experimental design was used, evaluating the effects of pH and electrolyte concentration. Resolution, runtime and migration time relative standard deviation values were evaluated. Derringer's desirability function was used to simultaneously optimise all 37 responses. The 17 peaks were separated in 19 min using a fused-silica capillary (50 μm internal diameter, 72 cm of effective length) with an extended light path and 101.3 mmol L⁻¹ of boric acid electrolyte (pH 9.15, 30 kV). The method was validated and applied to 15 EVOO samples found in Brazilian supermarkets.
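Derringer's desirability maps each response to an individual desirability d_i in [0, 1] and combines them as a geometric mean; the hedged sketch below illustrates this with made-up resolution and runtime values, not the paper's 37 responses.

```python
import numpy as np

def d_larger_is_better(y, low, high, weight=1.0):
    """Individual desirability for responses to maximise (e.g., resolution)."""
    d = (y - low) / (high - low)
    return np.clip(d, 0.0, 1.0) ** weight

def d_smaller_is_better(y, low, high, weight=1.0):
    """Individual desirability for responses to minimise (e.g., runtime)."""
    d = (high - y) / (high - low)
    return np.clip(d, 0.0, 1.0) ** weight

def overall_desirability(individual_ds):
    """Derringer's global desirability: geometric mean of the individual d_i."""
    d = np.asarray(individual_ds, dtype=float)
    return float(np.exp(np.mean(np.log(np.maximum(d, 1e-12)))))

# Illustrative values only (not the paper's data): two resolutions, one runtime.
ds = [d_larger_is_better(1.8, low=1.0, high=2.0),
      d_larger_is_better(1.2, low=1.0, high=2.0),
      d_smaller_is_better(19.0, low=10.0, high=30.0)]
print(round(overall_desirability(ds), 3))
```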

Relevance:

20.00%

Publisher:

Abstract:

Herein we describe the synthesis of a focused library of compounds based on the structure of goniothalamin (1) and the evaluation of the potential antitumor activity of the compounds. N-Acylation of aza-goniothalamin (2) restored the in vitro antiproliferative activity of this family of compounds. 1-(E)-But-2-enoyl-6-styryl-5,6-dihydropyridin-2(1H)-one (18) displayed enhanced antiproliferative activity. Both goniothalamin (1) and derivative 18 led to reactive oxygen species generation in PC-3 cells, which was probably a signal for caspase-dependent apoptosis. Treatment with derivative 18 promoted Annexin V/7-aminoactinomycin D double staining, which indicated apoptosis, and also led to G2/M cell-cycle arrest. In vivo studies in Ehrlich ascitic and solid tumor models confirmed the antitumor activity of goniothalamin (1), without signs of toxicity. However, derivative 18 exhibited an unexpectedly lower in vivo antitumor activity, despite the treatments being administered at the same site of inoculation. Contrary to its in vitro profile, aza-goniothalamin (2) inhibited Ehrlich tumor growth in both the ascitic and solid forms. Our findings highlight the importance of in vivo studies in the search for new candidates for cancer treatment.

Relevance:

20.00%

Publisher:

Abstract:

Cyclosporine, a drug used in immunosuppression protocols for hematopoietic stem cell transplantation that has a narrow therapeutic index, may cause various adverse reactions, including nephrotoxicity, which has a direct clinical impact on the patient. This study aims to summarize the available evidence in the scientific literature on the use of cyclosporine with respect to its role as a risk factor for the development of nephrotoxicity in patients undergoing hematopoietic stem cell transplantation. A systematic review was conducted using the following electronic databases: PubMed, Web of Science, Embase, Scopus, CINAHL, LILACS, SciELO and Cochrane BVS. The keywords used were: bone marrow transplantation OR stem cell transplantation OR grafting, bone marrow AND cyclosporine OR cyclosporin OR risk factors AND acute kidney injury OR acute kidney injuries OR acute renal failure OR acute renal failures OR nephrotoxicity. The level of scientific evidence of the studies was classified according to the Oxford Centre for Evidence-Based Medicine. The final sample was composed of 19 studies, most of which (89.5%) had an observational design and evidence level 2B, and pointed to an incidence of nephrotoxicity above 30%. The available evidence, considered to be of good quality and appropriate for the analyzed event, indicates that cyclosporine represents a risk factor for the occurrence of nephrotoxicity, particularly when combined with amphotericin B or aminoglycosides, agents commonly used in hematopoietic stem cell transplantation recipients.

Relevance:

20.00%

Publisher:

Abstract:

An HPLC-PAD method using a gold working electrode and a triple-potential waveform was developed for the simultaneous determination of streptomycin and dihydrostreptomycin in veterinary drugs. Glucose was used as the internal standard, and the triple-potential waveform was optimized using a factorial and a central composite design. The optimum potentials were as follows: amperometric detection, E1 = -0.15 V; cleaning potential, E2 = +0.85 V; and reactivation of the electrode surface, E3 = -0.65 V. For the separation of the aminoglycosides and the internal standard glucose, a CarboPac™ PA1 anion exchange column was used together with a mobile phase consisting of a 0.070 mol L⁻¹ sodium hydroxide solution in isocratic elution mode at a flow rate of 0.8 mL min⁻¹. The method was validated and applied to the determination of streptomycin and dihydrostreptomycin in veterinary formulations (injection, suspension and ointment) without any previous sample pretreatment, except for the ointments, for which a liquid-liquid extraction was required before HPLC-PAD analysis. The method showed adequate selectivity, with an accuracy of 98-107% and a precision of less than 3.9%.
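For illustration of the experimental-design step, the hedged sketch below generates a generic rotatable central composite design in coded units for k factors (e.g., the three detection potentials); the actual factor ranges and design used in the paper are not reproduced.

```python
import itertools
import numpy as np

def central_composite_design(k, n_center=3):
    """Generic rotatable central composite design in coded units.

    k        -- number of factors (e.g., the three detection potentials)
    n_center -- number of replicated centre points
    Returns an array of coded factor settings; each row is one experiment.
    """
    factorial = np.array(list(itertools.product([-1.0, 1.0], repeat=k)))
    alpha = (2 ** k) ** 0.25  # rotatability criterion: alpha = (2^k)^(1/4)
    axial = np.vstack([sign * alpha * np.eye(k)[i]
                       for i in range(k) for sign in (-1.0, 1.0)])
    center = np.zeros((n_center, k))
    return np.vstack([factorial, axial, center])

# Example: coded design for three potentials -> (8 + 6 + 3) runs of 3 factors.
print(central_composite_design(3).shape)
```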

Relevance:

20.00%

Publisher:

Abstract:

Maxillofacial trauma resulting from falls in elderly patients is a major social and health care concern. Most of these traumatic events involve mandibular fractures. The aim of this study was to analyze stress distributions from traumatic loads applied to the symphyseal, parasymphyseal, and mandibular body regions in the elderly edentulous mandible using finite-element analysis (FEA). Computerized tomographic analysis of an edentulous macerated human mandible of a patient approximately 65 years old was performed. The bone structure was converted into a 3-dimensional stereolithographic model, which was used to construct the computer-aided design (CAD) geometry for FEA. In the CAD model, the mechanical properties of cortical and cancellous bone were characterized as isotropic, elastic structures. The condyles were constrained to prevent free movement in the x-, y-, and z-axes during simulation. This enabled the simulation to include the presence of masticatory muscles during trauma. Three different simulations were performed. Loads of 700 N were applied perpendicular to the surface of the cortical bone in the symphyseal, parasymphyseal, and mandibular body regions. The simulation results were evaluated according to equivalent von Mises stress distributions. Traumatic load at the symphyseal region generated low stress levels in the mental region and high stress levels in the mandibular neck. Traumatic load at the parasymphyseal region concentrated the resulting stress close to the mental foramen. Traumatic load in the mandibular body generated extensive stress in the mandibular body, angle, and ramus. FEA enabled precise mapping of the stress distribution in an elderly edentulous human mandible (neck and mandibular angle) in response to 3 different traumatic load conditions. This knowledge can help guide emergency responders as they evaluate patients after a traumatic event.

Relevance:

20.00%

Publisher:

Abstract:

For centuries, specific instruments or regular toothbrushes have routinely been used to remove tongue biofilm and improve breath odor. Toothbrushes with a tongue scraper on the back of their head have recently been introduced to the market. The present study compared the effectiveness of a manual toothbrush with this new design, i.e., possessing a tongue scraper, and a commercial tongue scraper in improving breath odor and reducing the aerobic and anaerobic microbiota of the tongue surface. The evaluations occurred at 4 time points, at which the participants (n=30) had their halitosis quantified with a halimeter and scored according to a 4-point scoring system corresponding to different levels of intensity. Saliva was collected for counts of aerobic and anaerobic microorganisms. Data were analyzed statistically by Friedman's test (p<0.05). When differences were detected, the Wilcoxon test with Bonferroni correction was used for pairwise (group-to-group) multiple comparisons. The results confirmed the importance of mechanical cleaning of the tongue, since this procedure provided an improvement in halitosis and a reduction in aerobic and anaerobic counts. Regarding the evaluated methods, the toothbrush's tongue scraper and the conventional tongue scraper had a similar performance in terms of breath improvement and reduction of tongue microbiota, and may be indicated as effective methods for tongue cleaning.
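As a hedged sketch of the statistical pipeline described (Friedman's test followed by Bonferroni-adjusted pairwise Wilcoxon tests), the code below uses SciPy with placeholder halimeter readings; the data and group labels are invented for illustration.

```python
from itertools import combinations
from scipy import stats

def friedman_then_wilcoxon(measurements, alpha=0.05):
    """Friedman test across repeated measurements, then Bonferroni-adjusted
    pairwise Wilcoxon tests (illustrative sketch of the described analysis).

    measurements -- dict mapping a condition/moment label to the list of
    per-participant values (all lists aligned by participant).
    """
    labels = list(measurements)
    _, p_friedman = stats.friedmanchisquare(*measurements.values())
    if p_friedman >= alpha:
        return p_friedman, {}

    pairs = list(combinations(labels, 2))
    adjusted_alpha = alpha / len(pairs)  # Bonferroni correction
    significant = {}
    for a, b in pairs:
        _, p = stats.wilcoxon(measurements[a], measurements[b])
        if p < adjusted_alpha:
            significant[(a, b)] = p
    return p_friedman, significant

# Placeholder halimeter readings for 5 participants at 3 evaluation moments.
print(friedman_then_wilcoxon({"baseline": [150, 180, 200, 170, 160],
                              "after_scraper": [90, 120, 110, 100, 95],
                              "after_brush": [100, 110, 115, 105, 98]}))
```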