935 results for Analysis Tools


Relevance:

30.00%

Publisher:

Abstract:

In the last decade, large numbers of social media services have emerged and become widely used in people's daily lives as important tools for sharing and acquiring information. With a substantial amount of user-contributed text data on social media, it has become necessary to develop methods and tools for analyzing this emerging type of text, in order to better utilize it to deliver meaningful information to users. Previous work on text analytics over the last several decades has focused mainly on traditional types of text such as emails, news, and academic literature, and several issues critical to text data on social media have not been well explored: 1) how to detect sentiment in text on social media; 2) how to make use of social media's real-time nature; 3) how to address information overload for flexible information needs. In this dissertation, we focus on these three problems. First, to detect the sentiment of text on social media, we propose a non-negative matrix tri-factorization (tri-NMF) based dual active supervision method that minimizes human labeling effort for this new type of data. Second, to exploit social media's real-time nature, we propose approaches to detect events in text streams on social media. Third, to address information overload for flexible information needs, we propose two summarization frameworks: a dominating-set-based framework and a learning-to-rank-based framework. The dominating-set-based framework can be applied to different types of summarization problems, while the learning-to-rank-based framework uses existing training data to guide new summarization tasks. In addition, we integrate these techniques in an application study of event summarization for sports games as an example of how to better utilize social media data.
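The tri-NMF decomposition at the core of the first contribution approximates a nonnegative data matrix X (e.g., document-term counts) by three nonnegative factors, X ≈ F S Gᵀ. As a rough illustration only — this sketches the generic multiplicative-update factorization, not the author's dual active supervision method:

```python
import numpy as np

def tri_nmf(X, k1, k2, iters=200, eps=1e-9, seed=0):
    """Approximate nonnegative X (m x n) as F @ S @ G.T with nonnegative
    F (m x k1), S (k1 x k2), G (n x k2), via multiplicative updates
    (element-wise scaling, so factors never go negative)."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    F = rng.random((m, k1))
    S = rng.random((k1, k2))
    G = rng.random((n, k2))
    for _ in range(iters):
        F *= (X @ G @ S.T) / (F @ S @ G.T @ G @ S.T + eps)
        S *= (F.T @ X @ G) / (F.T @ F @ S @ G.T @ G + eps)
        G *= (X.T @ F @ S) / (G @ S.T @ F.T @ F @ S + eps)
    return F, S, G

# Usage: reconstruction error should drop well below that of the random init.
X = np.abs(np.random.default_rng(1).random((20, 15)))
F, S, G = tri_nmf(X, 4, 3)
err = np.linalg.norm(X - F @ S @ G.T)
```

In the sentiment setting, the rows of F and G can be read as soft cluster memberships of documents and words, which is what makes the factorization a natural vehicle for injecting labels.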

Relevance:

30.00%

Publisher:

Abstract:

Finite-Difference Time-Domain (FDTD) algorithms are well-established tools of computational electromagnetism. Because of their practical implementation as computer codes, they are affected by many numerical artefacts and noise. In order to obtain better results, we propose using Principal Component Analysis (PCA), based on multivariate statistical techniques. PCA has been successfully used for the analysis of noise and spatio-temporal structure in sequences of images. It allows a straightforward discrimination between the numerical noise and the actual electromagnetic variables, and a quantitative estimation of their respective contributions. Moreover, the FDTD results can be filtered to remove the effect of the noise. In this contribution we show how the method can be applied to several FDTD simulations: the propagation of a pulse in vacuum, and the analysis of two-dimensional photonic crystals. In this last case, PCA has revealed hidden electromagnetic structures related to actual modes of the photonic crystal.
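The filtering idea can be illustrated with a truncated principal-component reconstruction of a sequence of field snapshots. This is a generic sketch of PCA denoising via SVD, not the authors' exact pipeline:

```python
import numpy as np

def pca_filter(frames, n_components):
    """frames: (T, N) array, one flattened field snapshot per row.
    Keep only the leading principal components and reconstruct,
    discarding the low-variance directions that carry numerical noise."""
    mean = frames.mean(axis=0)
    U, s, Vt = np.linalg.svd(frames - mean, full_matrices=False)
    k = n_components
    return mean + (U[:, :k] * s[:k]) @ Vt[:k]

# Usage: a rank-1 "signal" plus small random noise; the filtered
# sequence should sit closer to the clean signal than the noisy one does.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)
clean = np.outer(np.sin(2 * np.pi * t), np.cos(np.linspace(0, 3, 200)))
noisy = clean + 0.01 * rng.standard_normal(clean.shape)
filtered = pca_filter(noisy, 1)
```

The singular values themselves give the quantitative split the abstract mentions: the energy in the retained components estimates the physical signal, the rest the numerical noise.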

Relevance:

30.00%

Publisher:

Abstract:

Technology provides a range of tools which facilitate parts of the process of reading, analysis and writing in the humanities, but these tools are limited and poorly integrated. Methods of equipping students with the skills to make good use of a range of tools in an integrated, structured process of writing in the disciplines are examined, compared and critiqued. Tools for mindmapping and outlining are examined both as reading tools and as tools to structure knowledge and explore ontology creation. Interoperability between these and common word processors is examined in order to explore how students may be taught to develop a structured research and writing process using currently available tools. Requirements for future writing tools are suggested.

Relevance:

30.00%

Publisher:

Abstract:

Because of the role that DNA damage and depletion play in human disease, it is important to develop and improve tools to assess these endpoints. This unit describes PCR-based methods to measure nuclear and mitochondrial DNA damage and copy number. Long amplicon quantitative polymerase chain reaction (LA-QPCR) is used to detect DNA damage by measuring the number of polymerase-inhibiting lesions present based on the amount of PCR amplification; real-time PCR (RT-PCR) is used to calculate genome content. In this unit, we provide step-by-step instructions to perform these assays in Homo sapiens, Mus musculus, Rattus norvegicus, Caenorhabditis elegans, Drosophila melanogaster, Danio rerio, Oryzias latipes, Fundulus grandis, and Fundulus heteroclitus, and discuss the advantages and disadvantages of these assays.
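Lesion frequencies in LA-QPCR are commonly derived from the relative amplification of treated versus control samples under a Poisson assumption (the zero class — templates with no polymerase-blocking lesion — is what amplifies). A hedged sketch of that standard calculation:

```python
import math

def lesions_per_10kb(amp_treated, amp_control, amplicon_bp):
    """Estimate polymerase-blocking lesions per 10 kb from LA-QPCR
    amplification, assuming lesions follow a Poisson distribution so
    the undamaged fraction of templates equals A_treated / A_control."""
    rel = amp_treated / amp_control
    lesions_per_amplicon = -math.log(rel)  # Poisson zero-class inversion
    return lesions_per_amplicon * 10_000 / amplicon_bp

# Usage: half the control's amplification over a 10 kb amplicon
# corresponds to ln(2) ≈ 0.69 lesions per 10 kb.
freq = lesions_per_10kb(0.5, 1.0, 10_000)
```

Copy number from the real-time PCR step then normalizes mitochondrial results, since mtDNA damage per genome is only interpretable alongside genome content.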

Relevance:

30.00%

Publisher:

Abstract:

The first edition of Global Value Chain Analysis: A Primer was released five years ago (May 2011) to provide an overview of the key concepts and methodological tools used by Duke University’s Center on Globalization, Governance & Competitiveness (Duke CGGC), a university-based research center that focuses on innovative applications of the GVC framework, which was developed by Duke CGGC’s founding director, Gary Gereffi. The Second Edition of Global Value Chain Analysis: A Primer (July 2016) retains a simple, expository style and the use of recent research examples in order to offer an entry point for those wishing to better understand and use the GVC framework as a tool to analyze how local actors (firms, communities, workers) are linked to and affected by major transformations in the global economy. The GVC framework focuses on structural shifts in global industries, anchored by the core concepts of governance and upgrading. This Second Edition highlights some of the refinements in these concepts and introduces a number of new illustrations drawing on recent Duke CGGC research. The bibliography offers a sampling of the broad array of studies available on the Duke CGGC website and in related academic publications. We hope this work stimulates continued interest in and use of the GVC framework as a tool to promote more dynamic, inclusive and sustainable development outcomes for all economies and the local actors within them.

Relevance:

30.00%

Publisher:

Abstract:

© 2016 Springer Science+Business Media New York. Researchers studying mammalian dentitions from functional and adaptive perspectives have increasingly moved towards dental topography measures that can be estimated from 3D surface scans and do not require identification of specific homologous landmarks. Here we present molaR, a new R package designed to assist researchers in calculating four commonly used topographic measures from surface scans of teeth: Dirichlet Normal Energy (DNE), Relief Index (RFI), Orientation Patch Count (OPC), and Orientation Patch Count Rotated (OPCR), enabling a unified application of these informative new metrics. In addition to topographic measuring tools, molaR has complementary plotting functions enabling highly customizable visualization of results. This article gives a detailed description of the DNE measure, walks researchers through installing, operating, and troubleshooting molaR and its functions, and gives an example of a simple comparison that measured teeth of the primates Alouatta and Pithecia in molaR and other available software packages. molaR is a free and open-source software extension, which can be found at doi:10.13140/RG.2.1.3563.4961 (molaR v. 2.0) as well as in the Internet repository CRAN, which stores R packages.
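Of the four metrics, the Relief Index is the simplest to state: in its commonly used log-scaled form it compares the 3D crown surface area to the 2D area of the crown's planimetric footprint. A minimal Python sketch of that formula (the package itself is R, and this is an illustration of the measure, not of molaR's implementation):

```python
import math

def relief_index(area_3d, area_2d):
    """Log-scaled Relief Index: ln(sqrt(3D surface area) / sqrt(2D
    footprint area)). A flat crown gives 0; taller, sharper crowns
    have more 3D area per unit footprint and a larger index."""
    return math.log(math.sqrt(area_3d) / math.sqrt(area_2d))

# Usage: a perfectly flat surface scores 0; added relief raises the index.
flat = relief_index(100.0, 100.0)
crowned = relief_index(180.0, 100.0)
```

The landmark-free appeal mentioned in the abstract is visible here: both inputs are whole-surface quantities computed from the scan mesh, with no homologous points to digitize.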

Relevance:

30.00%

Publisher:

Abstract:

Developing innovative interventions that are in sync with a health promotion paradigm often represents a challenge for professionals working in local public health organizations. Thus, it is critical to have both professional development programs that favor new practices and tools to examine these practices. In this case study, we analyze the health promotion approach used in a pilot intervention addressing children’s vulnerability that was developed and carried out by participants enrolled in a public health professional development program. More specifically, we use a modified version of Guichard and Ridde’s (Une grille d’analyse des actions pour lutter contre les inégalités sociales de santé. In Potvin, L., Moquet, M.-J. and Jones, C. M. (eds), Réduire les Inégalités Sociales en Santé. INPES, Saint-Denis Cedex, pp. 297–312, 2010) analytical grid to assess deductively the program participants’ use of health promotion practices in the analysis and planning, implementation, evaluation, sustainability and empowerment phases of the pilot intervention. We also seek evidence of practices involving empowerment, participation, equity, holism, an ecological approach, intersectorality and sustainability in the intervention. The results are mixed: our findings reveal evidence of the application of several dimensions of health promotion (equity, holism, an ecological approach, intersectorality and sustainability), but also a lack of integration of two key dimensions, namely empowerment and participation, during various phases of the pilot intervention. These results show that the professional development program is associated with the adoption of a pilot intervention integrating multiple, but not all, dimensions of health promotion. We make recommendations to facilitate a more complete integration. This research also shows that the Guichard and Ridde grid is a thorough instrument for documenting the practices of participants.

Relevance:

30.00%

Publisher:

Abstract:

Petri nets are a formal, graphical and executable modeling technique for the specification and analysis of concurrent and distributed systems, and have been widely applied in computer science and many other engineering disciplines. Low-level Petri nets are simple and useful for modeling control flows, but not powerful enough to define data and system functionality. High-level Petri nets (HLPNs) have been developed to support data and functionality definitions, for example using complex structured data as tokens and algebraic expressions as transition formulas. Compared to low-level Petri nets, HLPNs yield compact system models that are easier to understand, and are therefore more useful for modeling complex systems. There are two issues in using HLPNs: modeling and analysis. Modeling concerns abstracting and representing the systems under consideration using HLPNs, and analysis deals with effective ways to study the behaviors and properties of the resulting HLPN models. In this dissertation, several modeling and analysis techniques for HLPNs are studied and integrated into a framework that is supported by a tool. For modeling, the framework integrates two formal languages: a type of HLPN called Predicate Transition Net (PrT Net) is used to model a system's behavior, and a first-order linear-time temporal logic (FOLTL) is used to specify the system's properties. The main contribution of this dissertation with regard to modeling is a software tool supporting the formal modeling capabilities of this framework. For analysis, the framework combines three complementary techniques: simulation, explicit-state model checking and bounded model checking (BMC). Simulation is straightforward and fast, but covers only some execution paths in an HLPN model. Explicit-state model checking covers all execution paths but suffers from the state explosion problem. BMC is a tradeoff: it provides a certain level of coverage while being more efficient than explicit-state model checking. The main contribution of this dissertation with regard to analysis is adapting BMC to analyze HLPN models and integrating the three complementary analysis techniques in a software tool that supports the formal analysis capabilities of this framework. The SAMTools suite developed for this framework integrates three tools: PIPE+ for HLPN behavioral modeling and simulation, SAMAT for hierarchical structural modeling and property specification, and PIPE+Verifier for behavioral verification.
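The contrast between low-level and high-level nets is easiest to see in the basic token game both build on. A minimal sketch of a low-level place/transition net and its firing rule — PrT nets extend exactly this with structured tokens and transition formulas:

```python
def enabled(marking, transition):
    """A transition is enabled when every input place holds a token."""
    return all(marking.get(p, 0) >= 1 for p in transition["in"])

def fire(marking, transition):
    """Consume one token from each input place and produce one on each
    output place, returning the new marking (the old one is untouched)."""
    assert enabled(marking, transition)
    m = dict(marking)
    for p in transition["in"]:
        m[p] -= 1
    for p in transition["out"]:
        m[p] = m.get(p, 0) + 1
    return m

# Usage: a two-place net where transition t moves the token from p1 to p2.
t = {"in": ["p1"], "out": ["p2"]}
m0 = {"p1": 1, "p2": 0}
m1 = fire(m0, t)
```

Simulation in the sense of the abstract is repeated application of `fire` along one chosen path; explicit-state model checking instead explores every enabled transition from every reachable marking, which is where the state explosion comes from.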

Relevance:

30.00%

Publisher:

Abstract:

Aberrant behavior of biological signaling pathways has been implicated in diseases such as cancer. Therapies have been developed to target proteins in these networks in the hope of curing the illness or bringing about remission. However, identifying targets for drug inhibition that exhibit a good therapeutic index has proven challenging, since signaling pathways have a large number of components and many interconnections such as feedback, crosstalk, and divergence. Unfortunately, characteristics of these pathways such as redundancy, feedback, and drug resistance reduce the efficacy of single-target therapy and necessitate the use of more than one drug to target multiple nodes in the system. However, choosing multiple targets with a high therapeutic index poses further challenges, since the combinatorial search space can be huge. To cope with the complexity of these systems, computational tools such as ordinary differential equations have been used to successfully model some of these pathways. Regrettably, building these models requires experimentally measured initial concentrations of the components and reaction rates, which are difficult to obtain and, in very large networks, may simply not be available. Fortunately, other modeling tools exist which, though not as powerful as ordinary differential equations, do not need rates and initial conditions to model signaling pathways; Petri nets and graph theory are among them. In this thesis, we introduce a methodology based on Petri net siphon analysis and graph-theoretic network centrality measures for identifying prospective targets for single and multiple drug therapies. In this methodology, potential targets are first identified in the Petri net model of a signaling pathway using siphon analysis. Then, graph-theoretic centrality measures are employed to prioritize the candidate targets. In addition, an algorithm is developed to check whether the candidate targets are able to disable the intended outputs in the graph model of the system. We implement structural and dynamical models of the ErbB1-Ras-MAPK pathway and use them to assess and evaluate this methodology. The identified drug targets, single and multiple, correspond to clinically relevant drugs. Overall, the results suggest that this methodology, using siphons and centrality measures, shows promise for identifying and ranking drug targets. Since it uses only the structural information of the signaling pathways and needs neither initial conditions nor dynamical rates, it can be applied to larger networks.
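The prioritization step — ranking candidate targets by how central they are in the pathway graph — can be illustrated with plain degree centrality on an adjacency list. This is a simplified stand-in for the fuller set of centrality measures used in the thesis:

```python
def degree_centrality(adj):
    """Normalized degree centrality for an undirected graph given as
    {node: set(neighbors)}: degree / (n - 1)."""
    n = len(adj)
    return {v: len(nbrs) / (n - 1) for v, nbrs in adj.items()}

def rank_targets(adj, candidates):
    """Order candidate drug targets, most central first."""
    c = degree_centrality(adj)
    return sorted(candidates, key=lambda v: c[v], reverse=True)

# Usage: a toy hub-and-spoke "pathway"; the hub node ranks first,
# matching the intuition that hitting a hub disrupts the most signaling.
adj = {
    "hub": {"a", "b", "c"},
    "a": {"hub", "c"},
    "b": {"hub"},
    "c": {"hub", "a"},
}
ranking = rank_targets(adj, ["a", "b", "hub"])
```

In the actual methodology the candidates come from siphon analysis first, so centrality only breaks ties among nodes already known to be able to starve the pathway of tokens.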

Relevance:

30.00%

Publisher:

Abstract:

Modern software applications are becoming increasingly dependent on database management systems (DBMSs), which are usually used as black boxes by software developers. For example, Object-Relational Mapping (ORM) is one of the most popular database abstraction approaches in use today. With ORM, objects in object-oriented languages are mapped to records in the database, and object manipulations are automatically translated into SQL queries. As a result of this conceptual abstraction, developers do not need deep knowledge of databases; however, all too often the abstraction leads to inefficient and incorrect database access code. This thesis therefore proposes a series of approaches to improve the performance of database-centric software applications implemented using ORM. Our approaches focus on troubleshooting and detecting inefficient database accesses (i.e., performance problems) in the source code, and we rank the detected problems by severity. We first conduct an empirical study on the maintenance of ORM code in both open-source and industrial applications. We find that ORM performance-related configurations are rarely tuned in practice, and that there is a need for tools that can help improve or tune the performance of ORM-based applications. We therefore propose approaches along two dimensions: 1) helping developers write more performant ORM code; and 2) helping developers tune ORM configurations. To provide tooling support, we first propose static analysis approaches to detect performance anti-patterns in the source code, automatically ranking the detected anti-pattern instances according to their performance impact. Our study finds that resolving the detected anti-patterns improves application performance by 34% on average. We then discuss our experience and lessons learned when integrating our anti-pattern detection tool into industrial practice, in the hope that this experience can improve the industrial adoption of future research tools. However, as static analysis is prone to false positives and lacks runtime information, we also propose dynamic analysis approaches to further help developers improve the performance of their database access code. We propose automated approaches to detect redundant data access anti-patterns in database access code, and find that resolving such anti-patterns improves application performance by an average of 17%. Finally, we propose an automated approach that tunes performance-related ORM configurations using both static and dynamic analysis; it improves application throughput by 27–138%. Through case studies on real-world applications, we show that all of our proposed approaches provide valuable support to developers and help improve application performance significantly.
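A classic redundant-data-access anti-pattern in ORM code (the "N+1 select") shows up at runtime as the same parameterized query issued many times in one request. A hedged sketch of how a dynamic analysis might flag it from a query log — illustrative only, not the thesis's tooling:

```python
import re
from collections import Counter

def find_n_plus_one(query_log, threshold=10):
    """Normalize literals out of each SQL statement and flag templates
    that repeat at least `threshold` times within one request, which
    usually indicates per-row queries issued inside a loop."""
    templates = Counter()
    for q in query_log:
        t = re.sub(r"\b\d+\b", "?", q)   # numeric literals -> placeholder
        t = re.sub(r"'[^']*'", "?", t)   # string literals  -> placeholder
        templates[t] += 1
    return [t for t, n in templates.items() if n >= threshold]

# Usage: fetching each user's orders in a loop repeats one template 25x,
# while the single users query stays below the threshold.
log = [f"SELECT * FROM orders WHERE user_id = {i}" for i in range(25)]
log.append("SELECT * FROM users")
flagged = find_n_plus_one(log)
```

A static analysis attacks the same problem from the other side, spotting the loop over entities whose body triggers a lazy-loaded association; the two views are complementary, which is the thesis's point.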

Relevance:

30.00%

Publisher:

Abstract:

Veterinary medicines (VMs) from the agricultural industry can enter the environment in a number of ways, including direct exposure through aquaculture, accidental spillage and disposal, and indirect entry by leaching from manure or runoff after treatment. Many compounds used in animal treatments have ecotoxic properties that may have chronic or sometimes lethal effects when they come into contact with non-target organisms. VMs enter the environment in mixtures, potentially having additive effects. Traditional ecotoxicology tests are used to determine the lethal and sometimes reproductive effects on freshwater and terrestrial organisms. However, the organisms used in ecotoxicology tests can be unrepresentative of the populations likely to be exposed to the compound in the environment, and the tests most often address single-compound toxicity, although mixture effects may be significant and should be included in ecotoxicology testing. This work investigates the use, measured environmental concentrations (MECs) and potential impact of sea lice treatments on salmon farms in Scotland. Alternative methods for ecotoxicology testing, including mixture toxicity, and the use of in silico techniques to predict the chronic impact of VMs on different species of aquatic organisms were also investigated. The Scottish Environment Protection Agency (SEPA) provided information on the use of five sea lice treatments on Scottish salmon farms from 2008 to 2011. This information was combined, using ArcGIS 10.1, with the recently available sediment MEC data for the years 2009-2012 provided by SEPA. In-depth analysis of these data showed that, of a total of 55 sites, 30 had a MEC higher than the maximum allowable concentration (MAC) set by SEPA for emamectin benzoate, and 7 sites had a MEC higher than the MAC for teflubenzuron.
A number of sites up to 16 km away from the nearest salmon farm reported as using either emamectin benzoate or teflubenzuron measured positive for the two treatments. There was no relationship between current direction and the distribution of the sea lice treatments, nor was there any evidence of alternative sources of the compounds, e.g. land treatments. The sites with MECs higher than the MAC could pose a risk to non-target organisms and disrupt the species dynamics of the area, and there was evidence that some marine protected sites might be at risk of exposure to these compounds. To complement this work, the acute mixture toxicity of the five sea lice treatments, plus one major metabolite, 3-phenoxybenzoic acid (3PBA), was measured using an assay based on the bioluminescent bacterium Aliivibrio fischeri. When exposed to the five sea lice treatments and 3PBA, A. fischeri responded to 3PBA, emamectin benzoate and azamethiphos, as well as to combinations of the three. To establish any additive effect of the sea lice treatments, the efficacy of two mixture prediction equations, concentration addition (CA) and independent action (IA), was tested using the results from single-compound dose-response curves. In this instance IA was the more effective prediction method, with a linear regression confidence interval of 82.6% compared with 22.6% for CA. In silico molecular docking was carried out to predict the chronic effects of 15 VMs (including the five used for sea lice control). Molecular docking has been proposed as an alternative screening method for the chronic effects of large animal treatments on non-target organisms. The oestrogen receptor alpha (ERα) of 7 non-target bony fish and of the African clawed frog Xenopus laevis were modelled using SwissModel.
These models were then ‘docked’ to oestradiol, the synthetic oestrogen ethinylestradiol, two known xenoestrogens, dichlorodiphenyltrichloroethane (DDT) and bisphenol A (BPA), the anti-oestrogen breast cancer treatment tamoxifen, and the 15 VMs, using AutoDock 4. Based on the results of this work, four VMs were identified as possible xenoestrogens or anti-oestrogens: cypermethrin, deltamethrin, fenbendazole and teflubenzuron. Further investigation of these four VMs using in vitro assays is suggested as future work. A modified recombinant yeast oestrogen screen (YES) was attempted using the cDNA of the ERα of the zebrafish Danio rerio and the rainbow trout Oncorhynchus mykiss; due to time constraints and difficulties with the cloning protocols, this work could not be completed. Such in vitro assays would allow further investigation of the oestrogenic potential of the highlighted VMs. In conclusion, VMs used as sea lice treatments, such as teflubenzuron and emamectin benzoate, may be more persistent and have a wider range in the environment than previously thought. Mixtures of sea lice treatments have been found to persist together in the environment, and the effects of these mixtures on the bacterium A. fischeri can be predicted using the IA equation. Finally, molecular docking may be a suitable tool to predict chronic endocrine-disrupting effects and to identify varying degrees of impact on the ERα of nine species of aquatic organisms.
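The two mixture models compared in this work have compact closed forms: concentration addition predicts a mixture effect concentration from the components' ECx values and their fractions of the total concentration, while independent action combines the single-compound fractional effects probabilistically. A sketch of both equations:

```python
def concentration_addition(fractions, ecx):
    """EC_x of the mixture under CA: 1 / sum(p_i / EC_x,i),
    where p_i is compound i's fraction of the total concentration
    and EC_x,i its single-compound effect concentration."""
    return 1.0 / sum(p / e for p, e in zip(fractions, ecx))

def independent_action(effects):
    """Mixture effect under IA: E = 1 - prod(1 - E_i),
    with each E_i the fractional effect of compound i alone."""
    prod = 1.0
    for e in effects:
        prod *= 1.0 - e
    return 1.0 - prod

# Usage: a 50:50 mix of two equally potent compounds keeps the same EC
# under CA, while two compounds each causing a 50% effect combine to
# a 75% effect under IA.
ec_mix = concentration_addition([0.5, 0.5], [2.0, 2.0])
e_mix = independent_action([0.5, 0.5])
```

CA assumes a shared mode of action (compounds act as dilutions of one another), IA assumes independent modes — which is consistent with IA fitting better here, since the five treatments span several chemical classes.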

Relevance:

30.00%

Publisher:

Abstract:

During the last twenty years (1995-2015), the world of commerce has expanded beyond the traditional brick-and-mortar high street to a global shop front accessible to billions of users via the World Wide Web (WWW). Consumers now use the web to immerse themselves in virtual shop fronts, using Social Media (SM) to communicate and share product ideas with friends and family. Retail organisations recognise the need to develop and adapt their strategies in response to the increasing use of SM, and new goals must be set to identify how companies will integrate social media into current practices. This research aims to suggest an advisable and comprehensive SM strategy for companies operating in the global retail sector, based on an exploratory analysis of the existing SM strategies of three multi-national retail organisations, assessed in conjunction with a broader investigation into social media in the retail industry. From this, a strategy is devised to improve internal and external communication as well as knowledge management through the use of social media. Findings suggest that the use of SM within the retail industry has dramatically improved collaboration and communication processes, as organisations are now able to converse better with stakeholders, and that the tools are relatively simple to integrate and implement.

Relevance:

30.00%

Publisher:

Abstract:

The Internet and the Web have changed the way companies communicate with their publics, improving relations between them and providing substantial benefits for organizations. This has led small and medium enterprises (SMEs) to develop corporate sites to establish relationships with their audiences. This paper, applying the methodology of content analysis, analyzes the main factors and tools that make websites usable and intuitive and that promote better relations between SMEs and their audiences. It also develops an index to measure the effectiveness of websites from the perspective of usability. The results indicate that the websites studied have, in general, appropriate levels of usability.
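An effectiveness index of the kind described can be sketched as a weighted checklist score. The criteria and weights below are hypothetical placeholders for illustration, not the factors the paper actually codes:

```python
def usability_index(checks, weights):
    """Score a site from 0 to 100 as the weighted fraction of
    usability criteria it meets. `checks` maps criterion -> bool,
    `weights` maps criterion -> relative importance."""
    total = sum(weights.values())
    met = sum(w for c, w in weights.items() if checks.get(c, False))
    return 100.0 * met / total

# Hypothetical criteria and weights, for illustration only.
weights = {"readable_text": 3, "clear_navigation": 3,
           "fast_load": 2, "site_search": 1}
checks = {"readable_text": True, "clear_navigation": True,
          "fast_load": False, "site_search": True}
score = usability_index(checks, weights)
```

The design choice such an index forces — which criteria count and how much — is exactly what a content-analysis coding scheme has to make explicit before scores across sites are comparable.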

Relevance:

30.00%

Publisher:

Abstract:

In this paper we analyze the set of Bronze Age bone tools recovered at the archaeological site of El Portalón of Cueva Mayor in the Sierra de Atapuerca (Burgos). The Bronze Age cultural period is the best represented in the cavity, and its study has required us to unify the different excavation and stratigraphic criteria applied from the earliest archaeological excavations, conducted by J.M. Apellániz during the 1970s, to those of the current research team (EIA) since 2000. We propose here, for the first time, a correspondence between the initial system of “beds” used by Apellániz and our recent sedimentary sequence, which recognizes eleven stratigraphic levels radiometrically dated from the late Upper Pleistocene to the Middle Ages. Within the bone industry assemblage we recognize a large variety of utensils and ornamental elements, with both native and allochthonous features, that make evident regional as well as long-distance relationships of these populations of the interior of the Iberian Peninsula during recent Prehistory.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, 36 English and 38 Spanish news articles were selected from English and Spanish newspapers and magazines published in the U.S.A. from August 2014 to November 2014. All articles discuss the death of Michael Brown, the ensuing protests and police investigations. A discourse analysis shows that there are few differences between reporting by the mainstream and the Hispanic media. Like the mainstream media, the Hispanic media adopts a neutral point of view with regard to the African-American minority. However, it presents a negative opinion with regard to the police. It appears that the Hispanic media does not explicitly side with the African-American community, but rather agrees more with the mainstream media’s opinion and is substantially influenced by it.