906 results for Content analysis (Communication) -- Data processing


Relevance: 100.00%

Abstract:

Formal Concept Analysis is an unsupervised learning technique for conceptual clustering. We introduce the notion of iceberg concept lattices and show their use in Knowledge Discovery in Databases (KDD). Iceberg lattices are designed for analyzing very large databases. In particular, they serve as a condensed representation of frequent patterns as known from association rule mining. In order to show the interplay between Formal Concept Analysis and association rule mining, we discuss the algorithm TITANIC. We show that iceberg concept lattices are a starting point for computing condensed sets of association rules without loss of information, and that they provide a visualization method for the resulting rules.
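
The abstract above does not come with code; as a rough illustration of what an iceberg concept lattice contains, the following Python sketch enumerates, for a small made-up formal context, the frequent closed attribute sets (the intents of the iceberg lattice) above a minimum support. It is a naive brute-force enumeration, not the TITANIC algorithm, which computes the same result far more efficiently in a level-wise fashion.

```python
from itertools import combinations

# Toy formal context: objects mapped to the attributes they have.
context = {
    "doc1": {"kdd", "fca", "lattice"},
    "doc2": {"kdd", "fca"},
    "doc3": {"kdd", "rules"},
    "doc4": {"kdd", "fca", "rules"},
}
attributes = sorted(set().union(*context.values()))

def extent(attrs):
    """Objects having all attributes in attrs."""
    return {o for o, a in context.items() if attrs <= a}

def intent(objs):
    """Attributes shared by all objects in objs."""
    return set(attributes) if not objs else set.intersection(*(context[o] for o in objs))

def iceberg_intents(min_support):
    """Frequent closed attribute sets: the intents of the iceberg concept lattice."""
    found = {}
    for r in range(len(attributes) + 1):
        for combo in combinations(attributes, r):
            attrs = set(combo)
            objs = extent(attrs)
            support = len(objs) / len(context)
            if support >= min_support:
                closed = frozenset(intent(objs))   # closure of the attribute set
                found[closed] = support
    return found

for intent_set, support in sorted(iceberg_intents(0.5).items(), key=lambda kv: -kv[1]):
    print(sorted(intent_set), f"support={support:.2f}")
```

The frequent closed itemsets printed here are exactly the condensed representation of frequent patterns mentioned in the abstract; only the enumeration strategy differs from TITANIC.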

Relevance: 100.00%

Abstract:

Among many other knowledge representation formalisms, ontologies and Formal Concept Analysis (FCA) aim at modeling ‘concepts’. We discuss how these two formalisms may complement one another from an application point of view. In particular, we will see how FCA can be used to support ontology engineering, and how ontologies can be exploited in FCA applications. The interplay of FCA and ontologies is studied along the life cycle of an ontology: (i) FCA can support the building of the ontology as a learning technique. (ii) The established ontology can be analyzed and navigated by using techniques of FCA. (iii) Last but not least, the ontology may be used to improve an FCA application.

Relevance: 100.00%

Abstract:

A key argument for modeling knowledge in ontologies is the easy re-use and re-engineering of the knowledge. However, besides consistency checking, current ontology engineering tools provide only basic functionalities for analyzing ontologies. Since ontologies can be considered as (labeled, directed) graphs, graph analysis techniques are a suitable answer to this need. Graph analysis has been performed by sociologists for over 60 years and has resulted in the vibrant research area of Social Network Analysis (SNA). While social network structures in general currently receive high attention in the Semantic Web community, there are only very few SNA applications so far, and virtually none for analyzing the structure of ontologies. We illustrate in this paper the benefits of applying SNA to ontologies and the Semantic Web, and discuss which research topics arise at the intersection of the two areas. In particular, we discuss how different notions of centrality describe the core content and structure of an ontology. From the rather simple notion of degree centrality, through betweenness centrality, to the more complex eigenvector centrality based on Hermitian matrices, we illustrate the insights these measures provide using two ontologies that differ in purpose, scope, and size.
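
To make the centrality notions concrete, the Python sketch below (using the networkx library on a small invented concept graph) computes degree, betweenness and eigenvector centrality. It is a generic SNA calculation for illustration; the paper's own eigenvector measure is based on Hermitian matrices for directed, labeled graphs.

```python
import networkx as nx

# Toy ontology graph: nodes are concepts, directed edges stand for relations
# such as subClassOf or domain/range links; purely illustrative.
G = nx.DiGraph()
G.add_edges_from([
    ("Student", "Person"), ("Professor", "Person"),
    ("Person", "Agent"), ("Organization", "Agent"),
    ("Professor", "Organization"),   # e.g. worksFor
    ("Student", "Organization"),     # e.g. enrolledIn
])

degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G)
# Eigenvector centrality on the undirected view; the paper uses a refined
# Hermitian-matrix variant for directed graphs instead.
eigenvector = nx.eigenvector_centrality(G.to_undirected(), max_iter=1000)

for concept in G.nodes:
    print(f"{concept:<13} degree={degree[concept]:.2f} "
          f"betweenness={betweenness[concept]:.2f} "
          f"eigenvector={eigenvector[concept]:.2f}")
```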

Relevance: 100.00%

Abstract:

In Germany and other European countries piglets are routinely castrated in order to avoid the occurrence of boar taint, an off-flavour and off-odour of pork. Sensory perception of boar taint varies; however, it is regarded as very unpleasant by many people. Surgical castration, which is an effective means against boar taint, has commonly been performed without anaesthesia or analgesia within the piglets’ first seven days of life. Piglet castration without anaesthesia has been heavily criticised, as the assumption that young piglets perceive less pain than older animals cannot be supported by scientific evidence. Consequently, since January 2012 surgical castration has only been allowed with anaesthesia and/or analgesia in organic farming throughout the European Union. Abandoning piglet castration without pain relief requires the implementation of alternative methods which improve animal welfare while maintaining sensory meat quality. There are three relevant alternatives: castration with anaesthesia and/or analgesia to reduce pain, a vaccination against boar taint (immunocastration), and the fattening of uncastrated male pigs (fattening of boars) combined with measures to reduce and detect boar taint in meat. Consumers’ attitudes and opinions are an important factor in the implementation of these alternatives, as consumers are ultimately the ones expected to buy the meat. The objective of this dissertation was to explore organic consumers’ attitudes, preferences and willingness-to-pay regarding piglet castration without pain relief and the three alternatives. Important aspects for the evaluation of the alternatives, as well as factors (e.g. information, taste) influencing preferences and willingness-to-pay, were also to be identified. In autumn 2009, nine focus group discussions were conducted, each followed by a Vickrey auction that included a tasting of boar salami. Overall, 89 consumers of organic pork participated in the study. Information on piglet castration and the alternatives (in three variants) was provided as a basis for discussion. The focus group data were analysed using qualitative content analysis. In order to compare the focus group results with those from the auctions, an innovative approach applying an adapted scoring model was used to further analyse the data set. The majority of participants were not aware that piglets are castrated without anaesthesia in organic farming. They reacted with shock and disappointment upon learning about this practice, which did not fit their image of animal welfare standards in organic farming. Overall, the results show that, for consumers of organic pork, castration with anaesthesia and analgesia as well as the fattening of boars may be acceptable alternatives in organic farming. Considering the strong food safety concerns regarding immunocastration, acceptance of this alternative may be questioned. Communication regarding alternatives to piglet castration without anaesthesia and analgesia should take into account that the relevance of animal welfare, food safety, taste and cost differs between the alternatives. Furthermore, it seems advisable not to address an unappetizing topic like piglet castration directly at the point of sale, so as not to deter consumers from buying organic pork.
The issue of piglet castration illustrates how important it is for the organic sector to implement and maintain high animal welfare standards and to communicate them appropriately, thereby preventing strong discrepancies between consumers’ expectations of animal husbandry in organic farming and actual conditions. In this way, consumer disappointment and a loss of image due to negative reports about animal welfare issues can be avoided.

Relevance: 100.00%

Abstract:

A presentation on the collection and analysis of data taken from SOES 6018. This module aims to ensure that MSc Oceanography, MSc Marine Science, Policy & Law and MSc Marine Resource Management students are equipped with the skills they need to function as professional marine scientists, in addition to / in conjunction with the skills training in other MSc modules. The module covers training in fieldwork techniques, communication & research skills, IT & data analysis, and professional development.

Relevance: 100.00%

Abstract:

Objectives: To determine the effect of human papillomavirus (HPV) quadrivalent vaccine on the risk of developing subsequent disease after an excisional procedure for cervical intraepithelial neoplasia or diagnosis of genital warts, vulvar intraepithelial neoplasia, or vaginal intraepithelial neoplasia.
Design: Retrospective analysis of data from two international, double blind, placebo controlled, randomised efficacy trials of quadrivalent HPV vaccine (protocol 013 (FUTURE I) and protocol 015 (FUTURE II)).
Setting: Primary care centres and university or hospital associated health centres in 24 countries and territories around the world.
Participants: Among 17 622 women aged 15–26 years who underwent 1:1 randomisation to vaccine or placebo, 2054 received cervical surgery or were diagnosed with genital warts, vulvar intraepithelial neoplasia, or vaginal intraepithelial neoplasia.
Intervention: Three doses of quadrivalent HPV vaccine or placebo at day 1, month 2, and month 6.
Main outcome measures: Incidence of HPV related disease from 60 days after treatment or diagnosis, expressed as the number of women with an end point per 100 person years at risk.
Results: A total of 587 vaccine and 763 placebo recipients underwent cervical surgery. The incidence of any subsequent HPV related disease was 6.6 and 12.2 in vaccine and placebo recipients respectively (46.2% reduction (95% confidence interval 22.5% to 63.2%) with vaccination). Vaccination was associated with a significant reduction in risk of any subsequent high grade disease of the cervix by 64.9% (20.1% to 86.3%). A total of 229 vaccine recipients and 475 placebo recipients were diagnosed with genital warts, vulvar intraepithelial neoplasia, or vaginal intraepithelial neoplasia, and the incidence of any subsequent HPV related disease was 20.1 and 31.0 in vaccine and placebo recipients respectively (35.2% reduction (13.8% to 51.8%)).
Conclusions: Previous vaccination with quadrivalent HPV vaccine among women who had surgical treatment for HPV related disease significantly reduced the incidence of subsequent HPV related disease, including high grade disease.
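
To relate the reported incidences and percentage reductions, the short sketch below shows the simplified arithmetic only (it is not the trial's actual person-time analysis). Using the rounded published incidences of 6.6 and 12.2 per 100 person-years gives roughly 46%, consistent with the reported 46.2% computed from the unrounded person-time data.

```python
def incidence_per_100py(events, person_years):
    """Number of women with an end point per 100 person-years at risk."""
    return 100.0 * events / person_years

def percent_reduction(rate_vaccine, rate_placebo):
    """Relative reduction in incidence associated with vaccination."""
    return 100.0 * (1.0 - rate_vaccine / rate_placebo)

# Published (rounded) incidences after cervical surgery, per 100 person-years.
rate_vaccine, rate_placebo = 6.6, 12.2
print(f"reduction ≈ {percent_reduction(rate_vaccine, rate_placebo):.1f}%")  # ≈ 45.9%
```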

Relevance: 100.00%

Abstract:

Social networks have gained remarkable attention in the last decade. Accessing social network sites such as Twitter, Facebook, LinkedIn and Google+ through the internet and Web 2.0 technologies has become more affordable. People are becoming more interested in and relying on social networks for information, news and the opinions of other users on diverse subject matters. This heavy reliance on social network sites causes them to generate massive data characterised by three computational issues, namely size, noise and dynamism. These issues often make social network data very complex to analyse manually, resulting in the pertinent use of computational means of analysing them. Data mining provides a wide range of techniques for detecting useful knowledge, such as trends, patterns and rules, from massive datasets [44]. Data mining techniques are used for information retrieval, statistical modelling and machine learning. These techniques employ data pre-processing, data analysis and data interpretation processes in the course of data analysis. This survey discusses different data mining techniques used in mining diverse aspects of social networks over the past decades, from historical techniques to up-to-date models, including our novel technique named TRCM. All the techniques covered in this survey are listed in Table 1, together with the tools employed and the names of their authors.

Relevance: 100.00%

Abstract:

In this paper, a new parametric method to deal with discrepant experimental results is developed. The method is based on the fit of a probability density function to the data. The paper also compares the characteristics of different methods used to deduce recommended values and uncertainties from a discrepant set of experimental data. The methods are applied to the published ¹³⁷Cs and ⁹⁰Sr half-lives, and special emphasis is given to the deduced confidence intervals. The obtained results are analyzed considering two fundamental properties expected from an experimental result: the probability content of confidence intervals and the statistical consistency between different recommended values. The recommended values and uncertainties for the ¹³⁷Cs and ⁹⁰Sr half-lives are 10,984 (24) days and 10,523 (70) days, respectively.
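
The paper's own contribution is a parametric fit of a probability density function to the discrepant measurements; for orientation, the sketch below implements one of the standard procedures such methods are usually compared against: an inverse-variance weighted mean whose uncertainty is inflated by the Birge ratio when the data set is discrepant. The input numbers are made-up placeholders, not the evaluated half-life data.

```python
import numpy as np

def weighted_mean_birge(values, uncertainties):
    """Inverse-variance weighted mean; the uncertainty is scaled by the Birge
    ratio when the reduced chi-square indicates a discrepant data set."""
    values = np.asarray(values, dtype=float)
    unc = np.asarray(uncertainties, dtype=float)
    w = 1.0 / unc**2
    mean = np.sum(w * values) / np.sum(w)
    sigma = np.sqrt(1.0 / np.sum(w))
    chi2_red = np.sum(w * (values - mean) ** 2) / (len(values) - 1)
    birge = np.sqrt(chi2_red)
    if birge > 1.0:                 # inflate the uncertainty for discrepant data
        sigma *= birge
    return mean, sigma, birge

# Hypothetical half-life measurements in days (illustrative numbers only).
half_lives = [10970, 11010, 10950, 11025]
sigmas = [15, 20, 25, 10]
print(weighted_mean_birge(half_lives, sigmas))
```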

Relevance: 100.00%

Abstract:

Background. Through a national policy agreement, over 167 million Euros will be invested in the Swedish National Quality Registries (NQRs) between 2012 and 2016. One of the policy agreement's intentions is to increase the use of NQR data for quality improvement (QI). However, the evidence is fragmented as to how the use of medical registries and the like leads to quality improvement, and little is known about non-clinical use. The aim was therefore to investigate the perspectives of Swedish politicians and administrators on quality improvement based on national registry data. Methods. Politicians and administrators from four county councils were interviewed. A qualitative content analysis guided by the Consolidated Framework for Implementation Research (CFIR) was performed. Results. The politicians' and administrators' perspectives on the use of NQR data for quality improvement were mainly assigned to three of the five CFIR domains. In the domain of intervention characteristics, data reliability and access within a reasonable time were not considered entirely satisfactory, making it difficult for the politico-administrative leaderships to initiate, monitor, and support timely QI efforts. Still, politicians and administrators trusted the idea of using the NQRs as a basis for quality improvement. In the domain of inner setting, the organizational structures were not sufficiently developed to utilize the advantages of the NQRs, and readiness for implementation appeared to be inadequate for two reasons. Firstly, the resources for data analysis and quality improvement were not considered sufficient at the politico-administrative or clinical level. Secondly, deficiencies in leadership engagement at multiple levels were described, and there was a lack of consensus on the politicians' role and level of involvement. Regarding the domain of outer setting, there was a lack of communication and cooperation between the county councils and the national NQR organizations. Conclusions. The Swedish experience shows that a government-supported national system of well-funded, well-managed, and reputable national quality registries needs favorable local politico-administrative conditions to be used for quality improvement; such conditions are not yet in place according to local politicians and administrators.

Relevance: 100.00%

Abstract:

Cassava starch has been shown to make transparent and colorless flexible films without any previous chemical treatment. The functional properties of edible films are influenced by starch properties, including chain conformation, molecular bonding, crystallinity, and water content. Fourier-transform infrared (FTIR) spectroscopy in combination with attenuated total reflectance (ATR) has been applied for the elucidation of the structure and conformation of carbohydrates. This technique, combined with chemometric data processing, can indicate the relationship between the structural parameters and the functional properties of cassava starch-based edible films. Successful prediction of the functional property values of the starch-based films was achieved by partial least squares regression of the spectral data. The results showed that the presence of the hydroxyl group on carbon 6 of the cyclic part of glucose is directly correlated with the functional properties of cassava starch films.
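
As a sketch of the chemometric step described above, the following uses scikit-learn's PLSRegression to relate spectra to a film property. The data here are synthetic stand-ins; the number of spectra, wavenumber points, latent variables and the property itself are assumptions, not values from the study.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in data: 40 films x 200 wavenumber points, and one functional
# property per film (e.g. tensile strength); purely synthetic.
X = rng.normal(size=(40, 200))
y = 2.0 * X[:, 50] - 1.5 * X[:, 120] + rng.normal(scale=0.1, size=40)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pls = PLSRegression(n_components=5)   # number of latent variables is a modeling choice
pls.fit(X_train, y_train)
print("R^2 on held-out films:", pls.score(X_test, y_test))
```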

Relevance: 100.00%

Abstract:

This paper addresses the problem of processing biological signals such as cardiac beats, as well as audio- and ultrasonic-range signals, by calculating wavelet coefficients in real time with the processor clock running at frequencies of present ASICs and FPGAs. The Parallel Filter Architecture for the DWT has been improved, calculating wavelet coefficients in real time with the hardware reduced to 60% of the original. The new architecture, which also processes the IDWT, is implemented with Radix-2 or Booth-Wallace constant multipliers. Including serial memory register banks, a single integrated-circuit signal analyzer for the ultrasonic range is presented.
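
For readers unfamiliar with the underlying computation, the sketch below shows one level of a discrete wavelet transform and its inverse in software, using the Haar filter pair. It illustrates the pairwise filter-and-downsample operation that the paper's parallel hardware architecture performs in real time; it is not a model of the architecture itself.

```python
import numpy as np

def haar_dwt_level(x):
    """One Haar DWT level on an even-length signal: scaled pairwise sums and differences."""
    x = np.asarray(x, dtype=float)
    even, odd = x[0::2], x[1::2]
    approx = (even + odd) / np.sqrt(2.0)
    detail = (even - odd) / np.sqrt(2.0)
    return approx, detail

def haar_idwt_level(approx, detail):
    """Inverse of one Haar level: reconstruct and interleave the samples."""
    even = (approx + detail) / np.sqrt(2.0)
    odd = (approx - detail) / np.sqrt(2.0)
    x = np.empty(2 * len(approx))
    x[0::2], x[1::2] = even, odd
    return x

signal = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])  # a short biosignal frame
a, d = haar_dwt_level(signal)
print(np.allclose(haar_idwt_level(a, d), signal))  # True: perfect reconstruction
```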

Relevance: 100.00%

Abstract:

Graduate Program in Nursing (professional master's degree) - FMB

Relevance: 100.00%

Abstract:

The Gaia space mission is a major project for the European astronomical community. As challenging as it is, the processing and analysis of the huge data flow coming from Gaia is the subject of thorough study and preparatory work by the DPAC (Data Processing and Analysis Consortium), in charge of all aspects of the Gaia data reduction. This PhD thesis was carried out in the framework of the DPAC, within the team based in Bologna. The task of the Bologna team is to define the calibration model and to build a grid of spectro-photometric standard stars (SPSS) suitable for the absolute flux calibration of the Gaia G-band photometry and the BP/RP spectrophotometry. Such a flux calibration can be performed by repeatedly observing each SPSS during the lifetime of the Gaia mission and by comparing the observed Gaia spectra to the spectra obtained by our ground-based observations. Because of both the different observing sites involved and the huge number of frames expected (≃100,000), it is essential to maintain maximum homogeneity in data quality, acquisition and treatment, and particular care has to be taken to test the capabilities of each telescope/instrument combination (through the “instrument familiarization plan”) and to devise methods to keep under control, and where necessary correct for, the typical instrumental effects that can affect the high precision required for the Gaia SPSS grid (a few % with respect to Vega). I contributed to the ground-based survey of Gaia SPSS in many respects: the observations, the instrument familiarization plan, the data reduction and analysis activities (both photometry and spectroscopy), and the maintenance of the data archives. However, the field I was personally responsible for was photometry, and in particular relative photometry for the production of short-term light curves. In this context I defined and tested a semi-automated pipeline which allows for the pre-reduction of imaging SPSS data and the production of aperture photometry catalogues ready to be used for further analysis. A series of semi-automated quality control criteria are included in the pipeline at various levels, from pre-reduction, to aperture photometry, to light curve production and analysis.
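
The pipeline itself is not reproduced here; as a minimal sketch of the aperture photometry step it automates, the following uses astropy and photutils on a hypothetical pre-reduced frame. The file name, star positions and aperture radii are placeholders, not values from the actual SPSS survey.

```python
from astropy.io import fits
from photutils.aperture import CircularAperture, CircularAnnulus, aperture_photometry

# Hypothetical pre-reduced (bias-subtracted, flat-fielded) SPSS frame.
image = fits.getdata("spss_frame_prereduced.fits")

# Placeholder pixel positions of the standard star and a few comparison stars.
positions = [(512.3, 498.7), (130.0, 220.5), (800.2, 650.1)]

apertures = CircularAperture(positions, r=8.0)               # source apertures
annuli = CircularAnnulus(positions, r_in=12.0, r_out=18.0)   # local sky estimate

phot = aperture_photometry(image, apertures)
sky = aperture_photometry(image, annuli)

# Subtract the mean sky per pixel scaled to the source aperture area.
sky_per_pixel = sky["aperture_sum"] / annuli.area
phot["net_flux"] = phot["aperture_sum"] - sky_per_pixel * apertures.area
print(phot)
```

Running such a step on every frame of every star, night after night, yields the per-epoch fluxes from which relative, short-term light curves can be built.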

Relevance: 100.00%

Abstract:

Apart from the article forming the main content, most HTML documents on the WWW contain additional content such as navigation menus, design elements or commercial banners. In the context of several applications it is necessary to distinguish the main content from the additional content automatically. Content extraction and template detection are the two approaches to this task. This thesis gives an extensive overview of existing algorithms from both areas. It contributes an objective way to measure and evaluate the performance of content extraction algorithms under different aspects. These evaluation measures make it possible to draw the first objective comparison of existing extraction solutions. The newly introduced content code blurring algorithm overcomes several drawbacks of previous approaches and currently proves to be the best content extraction algorithm. An analysis of methods to cluster web documents according to their underlying templates is the third major contribution of this thesis. In combination with a localised crawling process, this clustering analysis can be used to automatically create sets of training documents for template detection algorithms. As the whole process can be automated, it allows template detection to be performed on a single document, thereby combining the advantages of single- and multi-document algorithms.
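
The thesis's content code blurring algorithm is not reproduced here; the sketch below implements a heavily simplified variant of the underlying idea: mark each character of an HTML document as content or code, blur (smooth) the resulting binary vector, and keep the regions of high content density. The window size and threshold are illustrative assumptions.

```python
import re
import numpy as np

def content_code_vector(html):
    """1 for characters outside tags (content), 0 for characters inside tags (code)."""
    vec = np.ones(len(html))
    for match in re.finditer(r"<[^>]*>", html):
        vec[match.start():match.end()] = 0
    return vec

def extract_main_content(html, window=80, threshold=0.7):
    """Keep character regions whose blurred content ratio exceeds the threshold."""
    vec = content_code_vector(html)
    kernel = np.ones(window) / window
    blurred = np.convolve(vec, kernel, mode="same")   # the 'blurring' step
    keep = blurred >= threshold
    text = "".join(c for c, k, v in zip(html, keep, vec) if k and v)
    return re.sub(r"\s+", " ", text).strip()

html = ("<html><body><nav><a href='/'>Home</a></nav><p>"
        + "Main article text. " * 20 + "</p></body></html>")
print(extract_main_content(html)[:80])
```

Short text fragments surrounded by dense markup (such as the navigation link above) end up with a low blurred ratio and are dropped, while long runs of article text are kept.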

Relevance: 100.00%

Abstract:

The examination of traffic accidents is daily routine in forensic medicine. An important question in the analysis of the victims of traffic accidents, for example in collisions between motor vehicles and pedestrians or cyclists, is the impact situation. Apart from forensic medical examinations (external examination and autopsy), three-dimensional technologies and methods are gaining importance in forensic investigations. Besides post-mortem multi-slice computed tomography (MSCT) and magnetic resonance imaging (MRI) for the documentation and analysis of internal findings, highly precise 3D surface scanning is employed for the documentation of the external body findings and of injury-inflicting instruments. The correlation of the injuries of the body with the injury-inflicting object and the accident mechanism is of great importance. The applied methods include documentation of the external and internal body, of the involved vehicles and of the inflicting tools, as well as the analysis of the acquired data. The body surface and the accident vehicles with their damage were digitized by 3D surface scanning. For the internal findings of the body, post-mortem MSCT and MRI were used. The analysis included the processing of the obtained data into 3D models, determination of the driving direction of the vehicle, correlation of injuries to the vehicle damage, geometric determination of the impact situation, and evaluation of further findings of the accident. In the following article, the benefits of 3D documentation and of computer-assisted, drawn-to-scale 3D comparisons of the relevant injuries with the damage to the vehicle in the analysis of the course of accidents, especially with regard to the impact situation, are demonstrated using two examined cases.