877 results for Analysis tools
Abstract:
Petri nets are a formal, graphical and executable modeling technique for the specification and analysis of concurrent and distributed systems, and they have been widely applied in computer science and many other engineering disciplines. Low level Petri nets are simple and useful for modeling control flows but are not powerful enough to define data and system functionality. High level Petri nets (HLPNs) have been developed to support data and functionality definitions, for example by using complex structured data as tokens and algebraic expressions as transition formulas. Compared to low level Petri nets, HLPNs yield compact system models that are easier to understand, and are therefore more useful for modeling complex systems. There are two issues in using HLPNs: modeling and analysis. Modeling concerns abstracting and representing the systems under consideration using HLPNs, and analysis deals with effective ways to study the behaviors and properties of the resulting HLPN models. In this dissertation, several modeling and analysis techniques for HLPNs are studied and integrated into a framework that is supported by a tool. For modeling, this framework integrates two formal languages: a type of HLPN called Predicate Transition Nets (PrT Nets) is used to model a system's behavior, and a first-order linear-time temporal logic (FOLTL) is used to specify the system's properties. The main modeling contribution of this dissertation is a software tool that supports the formal modeling capabilities in this framework. For analysis, the framework combines three complementary techniques: simulation, explicit-state model checking, and bounded model checking (BMC). Simulation is straightforward and fast, but it covers only some execution paths in an HLPN model. Explicit-state model checking covers all execution paths but suffers from the state explosion problem. BMC is a tradeoff: it provides a certain level of coverage while being more efficient than explicit-state model checking. The main analysis contribution of this dissertation is adapting BMC to analyze HLPN models and integrating the three complementary analysis techniques in a software tool that supports the formal analysis capabilities in this framework. The SAMTools developed for this framework integrates three tools: PIPE+ for HLPN behavioral modeling and simulation, SAMAT for hierarchical structural modeling and property specification, and PIPE+Verifier for behavioral verification.
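To make the simulation side concrete, here is a minimal Python sketch of random simulation of a predicate-transition-style net, where each transition carries a guard (its transition formula) over the tokens it consumes. The data structures and the producer/consumer example are hypothetical illustrations only, not the actual design of PIPE+.

```python
import random

# Minimal sketch (assumed structures, not PIPE+'s actual design) of simulating
# a predicate transition net: each transition names its input/output places,
# carries a guard over the consumed tokens, and produces new tokens.

class Transition:
    def __init__(self, name, inputs, outputs, guard=lambda *tokens: True):
        self.name = name        # transition label
        self.inputs = inputs    # input place names
        self.outputs = outputs  # list of (place name, token-producing function)
        self.guard = guard      # transition formula over the input tokens

def enabled_tokens(marking, t):
    """Return the tokens t would consume if enabled, else None (naive:
    only the first token of each input place is considered)."""
    if not all(marking[p] for p in t.inputs):
        return None
    tokens = [marking[p][0] for p in t.inputs]
    return tokens if t.guard(*tokens) else None

def simulate(marking, transitions, steps=6):
    """Random simulation: fires one enabled transition per step, so a single
    run explores only one execution path of the model."""
    for _ in range(steps):
        fireable = [(t, toks) for t in transitions
                    if (toks := enabled_tokens(marking, t)) is not None]
        if not fireable:
            break  # deadlock: no transition is enabled
        t, toks = random.choice(fireable)
        for p in t.inputs:                    # consume input tokens
            marking[p].pop(0)
        for p, produce in t.outputs:          # produce output tokens
            marking[p].append(produce(*toks))
        print(f"fired {t.name:8s} -> {marking}")

# Toy producer/consumer net: the consumer's guard admits only even tokens.
marking = {"ready": [0], "buffer": [], "done": []}
produce = Transition("produce", ["ready"],
                     [("buffer", lambda n: n), ("ready", lambda n: n + 2)])
consume = Transition("consume", ["buffer"], [("done", lambda n: n)],
                     guard=lambda n: n % 2 == 0)
simulate(marking, [produce, consume])
```

As the abstract notes, such a run visits only the paths its random choices happen to take, which is why the framework complements simulation with model checking for full coverage.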
Abstract:
Aberrant behavior of biological signaling pathways has been implicated in diseases such as cancer. Therapies have been developed to target proteins in these networks in the hope of curing the illness or bringing about remission. However, identifying targets for drug inhibition that exhibit a good therapeutic index has proven challenging, since signaling pathways have a large number of components and many interconnections such as feedback, crosstalk, and divergence. Unfortunately, characteristics of these pathways such as redundancy, feedback, and drug resistance reduce the efficacy of single-target therapy and necessitate employing more than one drug to target multiple nodes in the system. However, choosing multiple targets with a high therapeutic index poses further challenges, since the combinatorial search space can be huge. To cope with the complexity of these systems, computational tools such as ordinary differential equations have been used to successfully model some of these pathways. Regrettably, building these models requires experimentally measured initial concentrations of the components and reaction rates, which are difficult to obtain and, in very large networks, may not currently be available. Fortunately, other modeling tools exist which, though not as powerful as ordinary differential equations, do not need rates and initial conditions to model signaling pathways; Petri nets and graph theory are among them. In this thesis, we introduce a methodology based on Petri net siphon analysis and graph network centrality measures for identifying prospective targets for single and multiple drug therapies. In this methodology, potential targets are first identified in the Petri net model of a signaling pathway using siphon analysis. Then, graph-theoretic centrality measures are employed to prioritize the candidate targets. An algorithm is also developed to check whether the candidate targets can disable the intended outputs in the graph model of the system. We implement structural and dynamical models of ErbB1-Ras-MAPK pathways and use them to assess and evaluate this methodology. The identified drug targets, single and multiple, correspond to clinically relevant drugs. Overall, the results suggest that this methodology, using siphons and centrality measures, shows promise in identifying and ranking drug targets. Since the methodology uses only the structural information of the signaling pathways and needs neither initial conditions nor dynamical rates, it can be applied to larger networks.
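As a rough illustration of the centrality-ranking step, the sketch below builds a toy directed signaling graph, prioritizes nodes by combining betweenness and degree centrality, and checks whether removing a candidate set disconnects the pathway output. It uses the Python networkx library; the edge list, the combined score and the reachability check are illustrative assumptions, not the thesis's actual ErbB1-Ras-MAPK model or algorithm.

```python
import networkx as nx

# Toy fragment of an EGFR-type cascade, not the thesis's actual model.
G = nx.DiGraph([
    ("EGF", "EGFR"), ("EGFR", "Grb2"), ("Grb2", "SOS"), ("SOS", "Ras"),
    ("Ras", "Raf"), ("Raf", "MEK"), ("MEK", "ERK"), ("ERK", "SOS"),  # feedback
])

# Prioritize candidates (which siphon analysis would supply) by centrality.
betweenness = nx.betweenness_centrality(G)
degree = nx.degree_centrality(G)
score = {n: betweenness[n] + degree[n] for n in G}
for node in sorted(G, key=score.get, reverse=True):
    print(f"{node:5s} score = {score[node]:.3f}")

def disables(graph, candidates, source="EGF", output="ERK"):
    """Check whether removing the candidate targets leaves no path from the
    input stimulus to the intended output (cf. the thesis's algorithm)."""
    H = graph.copy()
    H.remove_nodes_from(candidates)
    return not (source in H and output in H and nx.has_path(H, source, output))

print(disables(G, {"Ras"}))  # True: removing Ras cuts the only route to ERK
```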
Abstract:
Modern software applications are becoming increasingly dependent on database management systems (DBMSs), which software developers usually treat as black boxes. For example, Object-Relational Mapping (ORM) is one of the most popular database abstraction approaches in use today. With ORM, objects in object-oriented languages are mapped to records in the database, and object manipulations are automatically translated into SQL queries. As a result of this conceptual abstraction, developers do not need deep knowledge of databases; all too often, however, the abstraction leads to inefficient and incorrect database access code. This thesis therefore proposes a series of approaches to improve the performance of database-centric software applications implemented using ORM. Our approaches focus on troubleshooting and detecting inefficient database accesses (i.e., performance problems) in the source code, and we rank the detected problems by severity. We first conduct an empirical study on the maintenance of ORM code in both open source and industrial applications. We find that ORM performance-related configurations are rarely tuned in practice and that there is a need for tools to help improve and tune the performance of ORM-based applications. We therefore propose approaches along two dimensions: 1) helping developers write more performant ORM code; and 2) helping developers tune ORM configurations. To provide tooling support, we first propose static analysis approaches to detect performance anti-patterns in the source code, automatically ranking the detected anti-pattern instances by their performance impact. Our study finds that resolving the detected anti-patterns improves application performance by 34% on average. We then discuss our experience and lessons learned from integrating our anti-pattern detection tool into industrial practice, in the hope of improving the industrial adoption of future research tools. However, since static analysis approaches are prone to false positives and lack runtime information, we also propose dynamic analysis approaches to further help developers improve the performance of their database access code. We propose automated approaches to detect redundant data access anti-patterns in database access code, and our study finds that resolving such anti-patterns improves application performance by 17% on average. Finally, we propose an automated approach that uses both static and dynamic analysis to tune performance-related ORM configurations. Our study shows that this approach can improve application throughput by 27-138%. Through case studies on real-world applications, we show that all of our proposed approaches provide valuable support to developers and can improve application performance significantly.
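As a concrete example of the kind of inefficient database access such approaches flag, the sketch below shows the classic "N+1 selects" anti-pattern and its eager-loading fix. It is written with Python's SQLAlchemy purely for illustration; the thesis's detectors are not tied to this library, and the Author/Book schema is hypothetical.

```python
from sqlalchemy import ForeignKey, create_engine, select
from sqlalchemy.orm import (DeclarativeBase, Mapped, Session, mapped_column,
                            relationship, selectinload)

class Base(DeclarativeBase):
    pass

class Author(Base):
    __tablename__ = "authors"
    id: Mapped[int] = mapped_column(primary_key=True)
    books: Mapped[list["Book"]] = relationship(back_populates="author")

class Book(Base):
    __tablename__ = "books"
    id: Mapped[int] = mapped_column(primary_key=True)
    author_id: Mapped[int] = mapped_column(ForeignKey("authors.id"))
    author: Mapped[Author] = relationship(back_populates="books")

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Author(id=1, books=[Book(id=1), Book(id=2)]))
    session.commit()

    # Anti-pattern: each author.books access lazily issues one more SELECT,
    # so N authors cost N+1 queries inside the loop.
    for author in session.scalars(select(Author)):
        print(len(author.books))

    # Fix: eager-load the relationship so the whole loop costs two queries.
    query = select(Author).options(selectinload(Author.books))
    for author in session.scalars(query):
        print(len(author.books))
```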
Abstract:
Veterinary medicines (VMs) from the agricultural industry can enter the environment in a number of ways. These include direct exposure through aquaculture, accidental spillage and disposal, and indirect entry by leaching from manure or runoff after treatment. Many compounds used in animal treatments have ecotoxic properties that may have chronic or sometimes lethal effects when they come into contact with non-target organisms. VMs enter the environment in mixtures, potentially having additive effects. Traditional ecotoxicology tests are used to determine the lethal and sometimes reproductive effects on freshwater and terrestrial organisms. However, the organisms used in ecotoxicology tests can be unrepresentative of the populations likely to be exposed to the compound in the environment. Most often, tests address single-compound toxicity, but mixture effects may be significant and should be included in ecotoxicology testing. This work investigates the use, measured environmental concentrations (MECs) and potential impact of sea lice treatments on salmon farms in Scotland. Alternative methods for ecotoxicology testing, including mixture toxicity, and the use of in silico techniques to predict the chronic impact of VMs on different species of aquatic organisms were also investigated. The Scottish Environment Protection Agency (SEPA) provided information on the use of five sea lice treatments on Scottish salmon farms from 2008-2011. This information was combined, using ArcGIS 10.1, with recently available data on sediment MECs for the years 2009-2012 provided by SEPA. In-depth analysis of these data showed that, of 55 sites in total, 30 had an MEC higher than the maximum allowable concentration (MAC) set by SEPA for emamectin benzoate, and 7 had an MEC higher than the MAC for teflubenzuron. A number of sites up to 16 km away from the nearest salmon farm reported as using either emamectin benzoate or teflubenzuron tested positive for the two treatments. There was no relationship between current direction and the distribution of the sea lice treatments, nor was there any evidence for alternative sources of the compounds (e.g. land treatments). The sites with MECs higher than the MAC could pose a risk to non-target organisms and disrupt the species dynamics of the area, and there was evidence that some marine protected sites might be at risk of exposure to these compounds. To complement this work, the acute mixture toxicity of the five sea lice treatments, plus one major metabolite, 3-phenoxybenzoic acid (3PBA), was measured using an assay based on the bioluminescent bacterium Aliivibrio fischeri. When exposed to the five sea lice treatments and 3PBA, A. fischeri responded to 3PBA, emamectin benzoate and azamethiphos, as well as to combinations of the three. In order to establish any additive effect of the sea lice treatments, the efficacy of two mixture prediction equations, concentration addition (CA) and independent action (IA), was tested using the results from single-compound dose-response curves. In this instance IA was the more effective prediction method, with a linear regression confidence interval of 82.6% compared with 22.6% for CA. In silico molecular docking was carried out to predict the chronic effects of 15 VMs (including the five used for sea lice control). Molecular docking has been proposed as an alternative screening method for the chronic effects of large animal treatments on non-target organisms.
The oestrogen receptor alpha (ERα) of seven non-target bony fish and of the African clawed frog Xenopus laevis was modelled using SwissModel. These models were then 'docked', using AutoDock 4, to oestradiol, the synthetic oestrogen ethinylestradiol, two known xenoestrogens dichlorodiphenyltrichloroethane (DDT) and bisphenol A (BPA), the anti-oestrogen breast cancer treatment tamoxifen, and the 15 VMs. Based on the results of this work, four VMs were identified as possible xenoestrogens or anti-oestrogens: cypermethrin, deltamethrin, fenbendazole and teflubenzuron. Further investigation of these four VMs using in vitro assays is suggested as future work. A modified recombinant yeast oestrogen screen (YES) was attempted using the cDNA of the ERα of the zebrafish Danio rerio and the rainbow trout Oncorhynchus mykiss; due to time constraints and difficulties with cloning protocols, this work could not be completed. Use of such in vitro assays would allow further investigation of the oestrogenic potential of the highlighted VMs. In conclusion, VMs used as sea lice treatments, such as teflubenzuron and emamectin benzoate, may be more persistent and have a wider range in the environment than previously thought. Mixtures of sea lice treatments have been found to persist together in the environment, and the effects of these mixtures on the bacterium A. fischeri can be predicted using the IA equation. Finally, molecular docking may be a suitable tool to predict chronic endocrine-disrupting effects and to identify varying degrees of impact on the ERα of nine species of aquatic organisms.
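For reference, the two mixture-prediction models tested above are commonly formulated as follows. These are the standard textbook forms (Loewe additivity for CA, Bliss independence for IA); the abstract does not spell out its exact parameterisation:

```latex
\[
  \text{CA:}\qquad
  \mathrm{EC}_{x,\mathrm{mix}} \;=\;
  \left( \sum_{i=1}^{n} \frac{p_i}{\mathrm{EC}_{x,i}} \right)^{-1}
\]
\[
  \text{IA:}\qquad
  E(c_{\mathrm{mix}}) \;=\; 1 - \prod_{i=1}^{n} \bigl( 1 - E(c_i) \bigr)
\]
```

Here p_i is the proportion of component i in the mixture, EC_{x,i} is the concentration at which component i alone produces effect level x, and E(c_i) is the effect of component i at its concentration c_i in the mixture.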
Abstract:
During the last twenty years (1995-2015), the world of commerce has expanded beyond the traditional brick-and-mortar high street to a global shop front accessible to billions of users via the World Wide Web (WWW). Consumers now use the web to immerse themselves in virtual shop fronts, using Social Media (SM) to communicate and share product ideas with friends and family. Retail organisations recognise the need to develop and adapt their strategies in response to the increasing use of SM, and new goals must be set to define how companies will integrate social media into current practices. This research aims to propose an advisable and comprehensive SM strategy for companies operating in the global retail sector, based on an exploratory analysis of the existing SM strategies of three multinational retail organisations, assessed in conjunction with a broader investigation into social media in the retail industry. From this, a strategy is devised to improve internal and external communication, as well as knowledge management, through the use of social media. Findings suggest that the use of SM within the retail industry has dramatically improved collaboration and communication processes, as organisations can now converse better with stakeholders, and that the tools are relatively simple to integrate and implement, as they benefit one another.
Abstract:
The Internet and the Web have changed the way companies communicate with their publics, improving relations between them and providing substantial benefits for organizations. This has led small and medium enterprises (SMEs) to develop corporate sites to establish relationships with their audiences. Applying the methodology of content analysis, this paper analyzes the main factors and tools that make websites usable and intuitive and that promote better relations between SMEs and their audiences. It also develops an index to measure the effectiveness of websites from the perspective of usability. The results indicate that the websites studied have, in general, appropriate levels of usability.
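A usability index of the kind described can be as simple as a weighted score over binary content-analysis criteria. The sketch below is a hypothetical Python illustration; the criteria and weights are invented for the example and are not the paper's actual index.

```python
# Hypothetical usability index: score each site on binary content-analysis
# criteria and combine the satisfied ones with weights summing to 1.
CRITERIA_WEIGHTS = {
    "has_site_search": 0.15,
    "loads_quickly": 0.20,
    "mobile_friendly": 0.25,
    "clear_navigation": 0.25,
    "contact_info_visible": 0.15,
}

def usability_index(audit: dict[str, bool]) -> float:
    """Weighted share of satisfied criteria, in [0, 1]."""
    return sum(w for criterion, w in CRITERIA_WEIGHTS.items()
               if audit.get(criterion))

print(usability_index({"has_site_search": True, "mobile_friendly": True,
                       "clear_navigation": True}))  # 0.65
```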
Abstract:
In this paper we analyze the set of Bronze Age bone tools recovered at the archaeological site of El Portalón of Cueva Mayor in the Sierra de Atapuerca (Burgos). The Bronze Age cultural period is the best represented in the cavity, and its study has forced us to unify the different excavation and stratigraphic criteria applied from the earliest archaeological excavations, led by J.M. Apellániz during the 1970s, to the excavations of the current research team (EIA) since 2000. We propose here, for the first time, a relationship between the initial system of “beds” used by Apellániz and our recent sedimentary sequence, which recognizes eleven stratigraphic levels radiometrically dated from the late Upper Pleistocene to the Middle Ages. Within the bone industry assemblage we recognize a large variety of utensils and ornamental elements, with native and allochthonous features, that make evident both regional and long-distance relationships of these populations of the interior of the Iberian Peninsula during recent Prehistory.
Abstract:
In this paper, 36 English and 38 Spanish news articles were selected from English and Spanish newspapers and magazines published in the U.S.A. from August 2014 to November 2014. All articles discuss the death of Michael Brown, the ensuing protests and police investigations. A discourse analysis shows that there are few differences between reporting by the mainstream and the Hispanic media. Like the mainstream media, the Hispanic media adopts a neutral point of view with regard to the African-American minority. However, it presents a negative opinion with regard to the police. It appears that the Hispanic media does not explicitly side with the African-American community, but rather agrees more with the mainstream media’s opinion and is substantially influenced by it.
Abstract:
AIMS: Diagnosis of soft tissue sarcomas can be difficult. It can be aided in many cases by detection of specific genetic aberrations. This study assessed the utility of a molecular genetics/cytogenetics service as part of the routine diagnostic service at the Royal Marsden Hospital. METHODS: A retrospective audit was performed over a 15-month period to evaluate the diagnostic usefulness of fluorescence in situ hybridisation (FISH) and reverse-transcriptase PCR (RT-PCR) on paraffin-embedded (PE) material for soft tissue sarcomas with translocations. Results were compared with histology and evaluated. RESULTS: Molecular investigations were performed on PE material in 158 samples (194 RT-PCR and 174 FISH tests in total), of which 85 were referral cases. Synovial sarcoma, Ewing sarcoma and low-grade fibromyxoid sarcoma were the most commonly tested tumours. Myxoid liposarcoma showed the best histological and molecular concordance, and alveolar rhabdomyosarcoma showed the best agreement between methods. FISH had a higher sensitivity for detecting tumours (73%, compared with 59% for RT-PCR) and a better success rate than RT-PCR, although the latter was specific in identifying the partner gene of each fusion. In particular, referral blocks for which the methods of tissue fixation and processing were uncertain resulted in higher RT-PCR failure rates. CONCLUSIONS: FISH and RT-PCR on PE tissue are practical and effective ancillary tools in the diagnosis of soft tissue sarcomas. They are useful in confirming doubtful histological diagnoses and excluding malignant diagnoses. PCR is less sensitive than FISH, and the use of both techniques is optimal for maximising the detection rate of translocation-positive sarcomas.
Abstract:
Background
Evidence-based practice advocates utilising the best current research evidence, while reflecting patient preference and clinical expertise, in decision making. Successfully incorporating this evidence into practice is a complex process. Based on the recommendations of existing guidelines and systematic evidence reviews conducted using the GRADE approach, treatment pathways for common spinal pain disorders were developed.
Aims
The aim of this study was to identify important potential facilitators to the integration of these pathways into routine clinical practice.
Methods
A 22-person stakeholder group consisting of patient representatives, clinicians, researchers and members of relevant clinical interest groups took part in a series of moderated focus groups, followed up with individual semi-structured interviews. Data were analysed using content analysis.
Results
Participants identified a number of issues, which were categorized into broad themes. Common facilitators of implementation included continual education, synthesis of research evidence reflective of everyday practice, and the use of clear, unambiguous messages in recommendations. Meeting additional training needs in new or extended areas of practice was also recognized as an important factor. Different stakeholders identified specific areas that could be associated with successful uptake. Patients frequently defined early involvement in a shared decision-making process as important. Clinicians identified case-based examples and information on important prognostic indicators as useful tools to aid decisions.
Conclusion
A number of potential implementation strategies were identified. Further work will examine the impact of these and other important factors on the integration of evidence-based treatment recommendations into clinical practice.
Abstract:
Android is becoming ubiquitous and currently has the largest share of the mobile OS market, with billions of application downloads from the official app market. It has also become the platform most targeted by mobile malware, which is growing more sophisticated in evading state-of-the-art detection approaches. Many Android malware families employ obfuscation techniques to avoid detection, which can defeat static analysis based approaches; dynamic analysis, on the other hand, can be used to overcome this limitation. Hence, in this paper we propose DynaLog, a dynamic analysis based framework for characterizing Android applications. The framework provides the capability to analyse the behaviour of applications based on an extensive number of dynamic features, and it provides an automated platform for mass analysis and characterization of apps that is useful for quickly identifying and isolating malicious applications. DynaLog leverages existing open source tools to extract and log high-level behaviours, API calls, and critical events that can be used to explore the characteristics of an application, thus providing an extensible dynamic analysis platform for detecting Android malware. DynaLog is evaluated using real malware samples and clean applications, demonstrating its capability for effective analysis and detection of malicious applications.
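The flavour of such log-driven feature extraction can be sketched in a few lines of Python: scan emulator logs for instrumented API-call and event entries and aggregate them into a per-app feature vector. The log format, tag name and regular expression below are assumptions for illustration only; DynaLog's real instrumentation and feature set are described in the paper.

```python
import re
from collections import Counter

# Hypothetical log tag and entry format, assumed for illustration.
LOG_ENTRY = re.compile(r"DYNALOG:\s*(API|EVENT):\s*([\w./$]+)")

def extract_features(log_lines):
    """Count occurrences of each logged API call and event in one app run."""
    features = Counter()
    for line in log_lines:
        match = LOG_ENTRY.search(line)
        if match:
            kind, name = match.groups()
            features[f"{kind}:{name}"] += 1
    return features

sample_log = [
    "03-01 10:02:11 DYNALOG: API: android.telephony.SmsManager.sendTextMessage",
    "03-01 10:02:12 DYNALOG: EVENT: BOOT_COMPLETED",
    "03-01 10:02:13 DYNALOG: API: android.telephony.SmsManager.sendTextMessage",
]
print(extract_features(sample_log))
```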
Abstract:
Aflatoxins are a group of carcinogenic compounds produced by Aspergillus fungi, which can grow on various agricultural crops. Both acute and chronic exposure to these mycotoxins can cause serious illness. Due to the high occurrence of aflatoxins in crops worldwide, fast and cost-effective analytical methods are required to identify contaminated agricultural commodities before they are processed into final products and placed on the market. To provide new tools for aflatoxin screening, two prototype fast ELISA methods were developed: one for the detection of aflatoxin B1 and the other for total aflatoxins. Seven monoclonal antibodies with uniquely high sensitivity and, at the same time, good cross-reactivity profiles were produced. The monoclonal antibodies were characterized, and two antibodies showing IC50 values of 0.037 ng/mL and 0.031 ng/mL for aflatoxin B1 were applied in simple and fast direct competitive ELISA tests. The methods were validated for peanut matrix, as this crop is among those most affected by aflatoxin contamination. The detection capabilities of the aflatoxin B1 and total aflatoxins ELISAs were 0.4 μg/kg and 0.3 μg/kg for aflatoxin B1, respectively, which are among the lowest reported values. The total aflatoxins ELISA was also validated for the detection of aflatoxins B2, G1 and G2. The application of the developed tests was demonstrated by screening 32 peanut samples collected from UK retailers. The total aflatoxins ELISA was further applied to analyse naturally contaminated maize porridge and distiller's dried grain with solubles samples, and the results were correlated with those obtained by a UHPLC-MS/MS method.
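IC50 values like those reported are conventionally read off a four-parameter logistic (4PL) fit to competitive-ELISA dose-response data. The Python sketch below shows that calculation on synthetic data points; the numbers are invented for illustration and are not the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, bottom, top, ic50, hill):
    """4PL curve: signal falls from `top` to `bottom` as the analyte
    displaces the labelled tracer; `ic50` is the midpoint concentration."""
    return bottom + (top - bottom) / (1.0 + (x / ic50) ** hill)

conc = np.array([0.001, 0.01, 0.03, 0.1, 0.3, 1.0, 10.0])      # ng/mL
signal = np.array([1.95, 1.80, 1.45, 0.95, 0.55, 0.30, 0.18])  # absorbance

params, _ = curve_fit(four_pl, conc, signal, p0=[0.1, 2.0, 0.05, 1.0])
print(f"estimated IC50 = {params[2]:.3f} ng/mL")
```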
Abstract:
Rotational moulding is a method for producing hollow plastic articles. Heating is normally carried out by placing the mould in a hot air oven, where the plastic material in the mould is heated; the most common cooling media are water and forced air. Due to the inefficient nature of conventional hot air ovens, most of the energy supplied by the oven does not go into heating the plastic, and as a consequence the process has very long cycle times. Direct oil heating is an effective alternative for achieving better energy efficiency and shorter cycle times. This research work has combined that technology with an innovative mould design that exploits the advantages of electroforming and rapid prototyping. Complex cavity geometries are manufactured by electroforming from a rapid prototyping mandrel. The approach involves conformal heating and cooling channels, in which the oil flows through a channel parallel to the electroformed cavity (nickel or copper). Because of this, the mould achieves high temperature uniformity through direct heating and cooling of the electroformed shell. Uniform heating and cooling is important not only for good quality parts but also for a uniform wall thickness distribution in the rotationally moulded part. Experimental work with the manufactured prototype mould has enabled analysis of the thermal uniformity in the cavity under different temperatures.
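The cycle-time argument for direct oil heating can be illustrated with a simple lumped-capacitance estimate: the shell approaches the medium temperature exponentially at a rate set by the heat transfer coefficient h, which is far higher for circulating oil than for oven air. All coefficients in the sketch below are assumed, order-of-magnitude values, not measurements from this work.

```python
import numpy as np

# Lumped-capacitance model: T(t) = T_medium - (T_medium - T0) * exp(-h*A*t/(m*c)).
def time_to_reach(T_target, T_medium, T0, h, A, m, c):
    """Time (s) for the shell to reach T_target under the lumped model."""
    return -(m * c) / (h * A) * np.log((T_medium - T_target) / (T_medium - T0))

# Rough values for a nickel shell: area 0.5 m^2, mass 8 kg, c = 444 J/(kg K).
common = dict(T_target=200.0, T_medium=250.0, T0=20.0, A=0.5, m=8.0, c=444.0)
print(f"hot air  (h ~  30 W/m^2K): {time_to_reach(h=30.0, **common)/60:.1f} min")
print(f"oil flow (h ~ 300 W/m^2K): {time_to_reach(h=300.0, **common)/60:.1f} min")
```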
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
In a world where students are increasingly digitally tethered to powerful, ‘always on’ mobile devices, new models of engagement and approaches to teaching and learning are required from educators. Serious Games (SG) have proven instructional potential, but there is still a lack of methodologies and tools not only for their design but also to support game analysis and assessment. This paper explores the use of SG to increase student engagement and retention. The development phase of the Circuit Warz game is presented to demonstrate how electronic engineering education can be radically reimagined to create immersive, highly engaging learning experiences that are problem-centered and pedagogically sound. The Learning Mechanics-Game Mechanics (LM-GM) framework for SG analysis is introduced, and its practical use in an educational game design scenario is shown as a case study.