53 results for temporal visualization techniques


Relevance: 20.00%

Abstract:

In this study, novel methodologies for the determination of antioxidative compounds in herbs and beverages were developed. Antioxidants are compounds that can reduce, delay or inhibit oxidative events. They are a part of the human defense system and are obtained through the diet. Antioxidants are naturally present in several types of foods, e.g. in fruits, beverages, vegetables and herbs. Antioxidants can also be added to foods during manufacturing to suppress lipid oxidation and formation of free radicals under conditions of cooking or storage and to reduce the concentration of free radicals in vivo after food ingestion. There is growing interest in natural antioxidants, and effective compounds have already been identified from antioxidant classes such as carotenoids, essential oils, flavonoids and phenolic acids. The wide variety of sample matrices and analytes poses a considerable challenge for the development of analytical techniques, and growing demands have been placed on sample pretreatment. In this study, three novel extraction techniques, namely supercritical fluid extraction (SFE), pressurised hot water extraction (PHWE) and dynamic sonication-assisted extraction (DSAE), were studied. SFE was used for the extraction of lycopene from tomato skins and PHWE was used in the extraction of phenolic compounds from sage. DSAE was applied to the extraction of phenolic acids from Lamiaceae herbs. In the development of extraction methodologies, the main parameters of the extraction were studied and the recoveries were compared to those achieved by conventional extraction techniques. In addition, the stability of lycopene was monitored under different storage conditions. For the separation of the antioxidative compounds in the extracts, liquid chromatographic (LC) methods were utilised.
Two novel LC techniques, namely ultra performance liquid chromatography (UPLC) and comprehensive two-dimensional liquid chromatography (LCxLC), were studied and compared with conventional high performance liquid chromatography (HPLC) for the separation of antioxidants in beverages and Lamiaceae herbs. In LCxLC, the selection of LC mode, column dimensions and flow rates was studied and optimised to obtain efficient separation of the target compounds. In addition, the separation powers of HPLC, UPLC, HPLCxHPLC and HPLCxUPLC were compared. To exploit the benefits of an integrated system, in which sample preparation and final separation are performed in a closed unit, dynamic sonication-assisted extraction was coupled on-line to a liquid chromatograph via a solid-phase trap. The increased sensitivity was utilised in the extraction of phenolic acids from Lamiaceae herbs. The results were compared to those achieved by the LCxLC system.

Relevance: 20.00%

Abstract:

Being at the crossroads of the Old World continents, Western Asia has a unique position through which the dispersal and migration of mammals and the interaction of faunal bioprovinces occurred. Despite its critical position, the record of Miocene mammals in Western Asia is sporadic and there are large spatial and temporal gaps between the known fossil localities. Although the development of the mammalian faunas in the Miocene of the Old World is well known and there is ample evidence for environmental shifts in this epoch, efforts toward quantification of habitat changes and development of chronofaunas based on faunal compositions have been mostly neglected. Advances in chronological, paleoclimatological, and paleogeographical reconstruction tools and techniques, together with the increased number of new discoveries in recent decades, have created the need to update and refine our understanding. We undertook fieldwork and a systematic study of mammalian trace and body fossils from the northwestern parts of Iran, along with analysis of large mammal data from the NOW database. The data analysis was used to study the provinciality, relative abundance, and distribution history of the closed- and open-adapted taxa and chronofaunas in the Miocene of the Old World and Western Asia. The provinciality analysis was carried out using locality clustering, and the relative abundance of the closed- and open-adapted taxa was surveyed at the family level. The distribution history of the chronofaunas was studied using faunal resemblance indices and new mapping techniques, together with humidity analysis based on mean ordinated hypsodonty. Paleoichnological studies revealed an abundance of mammalian footprints in several parts of the basins studied, which are normally not fossiliferous in terms of body fossils. The systematic study and biochronology of the newly discovered mammalian fossils in northwestern Iran indicate their close affinities with middle Turolian faunas.
Large cranial remains of hipparionine horses, previously unknown in Iran and Western Asia, are among the material studied. The initiation of a new field project in the famous Maragheh locality also brings new opportunities to address questions regarding the chronology and paleoenvironment of this classical site. Provinciality analysis refined our previous understanding, indicating the interaction of four provinces in Western Asia. The development of these provinces was apparently due to the presence of high mountain ranges in the area, which affected the dispersal of mammals and also climatic patterns. Higher temperatures and possibly higher CO2 levels in the Middle Miocene Climatic Optimum apparently favored the development of the closed forested environments that supported the dominance of the closed-adapted taxa. The increased seasonality and the progressive cooling and drying of the midlatitudes toward the Late Miocene maintained the dominance of open-adapted faunas. It appears that the late Middle Miocene was the time of transition from a more forested to a less forested world. The distribution history of the closed- and open-adapted chronofaunas shows the presence of cosmopolitan and endemic faunas in Western Asia. The closed-adapted faunas, such as the Arabian chronofauna of the late Early‒early Middle Miocene, demonstrated a rapid buildup and gradual decline. The open-adapted chronofaunas, such as the Late Miocene Maraghean fauna, climaxed gradually by filling the opening environments and moving in response to changes in humidity patterns. They declined abruptly due to the demise of their favored environments. The Siwalikan chronofauna of the early Late Miocene remained endemic and restricted throughout its history. This study highlights the importance of field investigations and indicates that new surveys in the vast areas of Western Asia, which are poorly sampled in terms of fossil mammal localities, can still be promising.
Clustering of the localities supports the consistency of formerly known patterns and augments them. Although the quantitative approach to the relative abundance history of the closed- and open-adapted mammals dates back more than half a century, its application here is a novel technique that provides robust results. Tracking the history of the chronofaunas in space and time by means of new computational and illustration methods is also a new practice that can be expanded to new areas and time spans.
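The abstract mentions "faunal resemblance indices" without naming one; a common choice in mammalian paleobiogeography is Simpson's coefficient (shared taxa divided by the size of the smaller fauna). A minimal sketch in Python, using hypothetical genus lists that are illustrative only, not data from this study:

```python
def simpson_fri(fauna_a, fauna_b):
    """Simpson's faunal resemblance index: shared taxa divided by the
    size of the smaller fauna. Ranges from 0 (no overlap) to 1."""
    a, b = set(fauna_a), set(fauna_b)
    return len(a & b) / min(len(a), len(b))

# Hypothetical genus-level locality faunas, for illustration only.
maragheh = {"Hipparion", "Gazella", "Choerolophodon", "Ictitherium"}
samos = {"Hipparion", "Gazella", "Samotherium", "Ictitherium"}
print(simpson_fri(maragheh, samos))  # 3 shared / min(4, 4) = 0.75
```

Computed pairwise over many localities, such an index yields the similarity matrix that locality clustering and distribution mapping can operate on.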

Relevance: 20.00%

Abstract:

The Taita Hills in southeastern Kenya form the northernmost part of Africa’s Eastern Arc Mountains, which have been identified by Conservation International as one of the top ten biodiversity hotspots on Earth. As with many areas of the developing world, over recent decades the Taita Hills have experienced significant population growth leading to associated major changes in land use and land cover (LULC), as well as escalating land degradation, particularly soil erosion. Multi-temporal medium resolution multispectral optical satellite data, such as imagery from the SPOT HRV, HRVIR, and HRG sensors, provides a valuable source of information for environmental monitoring and modelling at a landscape level at local and regional scales. However, utilization of multi-temporal SPOT data in quantitative remote sensing studies requires the removal of atmospheric effects and the derivation of surface reflectance factor. Furthermore, for areas of rugged terrain, such as the Taita Hills, topographic correction is necessary to derive comparable reflectance throughout a SPOT scene. Reliable monitoring of LULC change over time and modelling of land degradation and human population distribution and abundance are of crucial importance to sustainable development, natural resource management, biodiversity conservation, and understanding and mitigating climate change and its impacts. The main purpose of this thesis was to develop and validate enhanced processing of SPOT satellite imagery for use in environmental monitoring and modelling at a landscape level, in regions of the developing world with limited ancillary data availability. 
The Taita Hills formed the application study site, whilst the Helsinki metropolitan region was used as a control site for validation and assessment of the applied atmospheric correction techniques, where multiangular reflectance field measurements were taken and where horizontal visibility meteorological data concurrent with image acquisition were available. The proposed historical empirical line method (HELM) for absolute atmospheric correction was found to be the only applied technique that could derive surface reflectance factor within an RMSE of < 0.02 ρs in the SPOT visible and near-infrared bands; an accuracy level identified as a benchmark for successful atmospheric correction. A multi-scale segmentation/object relationship modelling (MSS/ORM) approach was applied to map LULC in the Taita Hills from the multi-temporal SPOT imagery. This object-based procedure was shown to derive significant improvements over a uni-scale maximum-likelihood technique. The derived LULC data was used in combination with low cost GIS geospatial layers describing elevation, rainfall and soil type, to model degradation in the Taita Hills in the form of potential soil loss, utilizing the simple universal soil loss equation (USLE). Furthermore, human population distribution and abundance were modelled with satisfactory results using only SPOT and GIS derived data and non-Gaussian predictive modelling techniques. The SPOT derived LULC data was found to be unnecessary as a predictor because the first and second order image texture measurements had greater power to explain variation in dwelling unit occurrence and abundance. The ability of the procedures to be implemented locally in the developing world using low-cost or freely available data and software was considered. The techniques discussed in this thesis are considered equally applicable to other medium- and high-resolution optical satellite imagery, as well as to the utilized SPOT data.
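The universal soil loss equation (USLE) mentioned above is a simple product of erosion factors. A minimal sketch, with illustrative (not measured) parameter values that are assumptions for this example:

```python
def usle_soil_loss(R, K, LS, C, P):
    """Universal Soil Loss Equation: mean annual soil loss A (t/ha/yr)
    as the product of rainfall erosivity R, soil erodibility K, slope
    length-steepness LS, cover-management C, and support practice P."""
    return R * K * LS * C * P

# Illustrative (not measured) factor values for a single grid cell.
print(usle_soil_loss(R=300.0, K=0.3, LS=2.5, C=0.2, P=1.0))  # ~45 t/ha/yr
```

In a GIS workflow each factor is a raster layer (rainfall, soil type, DEM-derived slope, LULC-derived cover), and the equation is evaluated cell by cell to produce a potential soil loss map.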

Relevance: 20.00%

Abstract:

Topic detection and tracking (TDT) is an area of information retrieval research that focuses on news events. The problems TDT deals with relate to segmenting news text into cohesive stories, detecting something new and previously unreported, tracking the development of a previously reported event, and grouping together news stories that discuss the same event. The performance of the traditional information retrieval techniques based on full-text similarity has remained inadequate for online production systems. It has been difficult to make the distinction between same and similar events. In this work, we explore ways of representing and comparing news documents in order to detect new events and track their development. First, however, we put forward a conceptual analysis of the notions of topic and event. The purpose is to clarify the terminology and align it with the process of news-making and the tradition of story-telling. Second, we present a framework for document similarity that is based on semantic classes, i.e., groups of words with similar meaning. We adopt people, organizations, and locations as semantic classes in addition to general terms. As each semantic class can be assigned its own similarity measure, document similarity can make use of ontologies, e.g., geographical taxonomies. The documents are compared class-wise, and the outcome is a weighted combination of class-wise similarities. Third, we incorporate temporal information into document similarity. We formalize the natural language temporal expressions occurring in the text, and use them to anchor the rest of the terms onto the time-line. Upon comparing documents for event-based similarity, we look not only at matching terms, but also at how near their anchors are on the time-line. Fourth, we experiment with an adaptive variant of the semantic class similarity system.
News reflects changes in the real world, and in order to keep up, the system has to change its behavior based on the contents of the news stream. We put forward two strategies for rebuilding the topic representations and report experiment results. We ran experiments with three annotated TDT corpora. The use of semantic classes increased the effectiveness of topic tracking by 10-30% depending on the experimental setup. The gain in spotting new events remained lower, around 3-4%. Anchoring the text to a time-line based on the temporal expressions gave a further 10% increase in the effectiveness of topic tracking. The gains in detecting new events, again, remained smaller. The adaptive systems did not improve the tracking results.
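As a rough illustration of the class-wise and temporal ideas above, here is a minimal Python sketch. The class weights, the Jaccard measure per class, the multiplicative temporal decay, and the half-life are all assumptions for illustration, not the thesis's actual formulation:

```python
from datetime import date

# Illustrative class weights (assumed, not taken from the thesis).
WEIGHTS = {"persons": 0.3, "organizations": 0.2, "locations": 0.2, "terms": 0.3}

def jaccard(a, b):
    """Overlap between two term sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def temporal_similarity(d1, d2, half_life_days=7.0):
    """Decays with the distance between the documents' temporal anchors."""
    gap = abs((d1 - d2).days)
    return 0.5 ** (gap / half_life_days)

def event_similarity(doc1, doc2):
    """Weighted combination of class-wise similarities, scaled by how
    near the documents' anchors are on the time-line."""
    content = sum(w * jaccard(doc1[c], doc2[c]) for c, w in WEIGHTS.items())
    return content * temporal_similarity(doc1["date"], doc2["date"])

d1 = {"persons": {"Smith"}, "organizations": {"UN"}, "locations": {"Helsinki"},
      "terms": {"summit", "talks"}, "date": date(2001, 5, 1)}
d2 = {"persons": {"Smith"}, "organizations": {"UN"}, "locations": {"Helsinki"},
      "terms": {"summit", "agreement"}, "date": date(2001, 5, 8)}
print(event_similarity(d1, d2))  # ~0.4 (0.8 content x 0.5 temporal decay)
```

Because each class has its own similarity function, the Jaccard measure for, say, locations could be swapped for one informed by a geographical taxonomy without touching the rest of the combination.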

Relevance: 20.00%

Abstract:

Event-based systems are seen as good candidates for supporting distributed applications in dynamic and ubiquitous environments because they support decoupled and asynchronous many-to-many information dissemination. Event systems are widely used, because asynchronous messaging provides a flexible alternative to RPC (Remote Procedure Call). They are typically implemented using an overlay network of routers. A content-based router forwards event messages based on filters that are installed by subscribers and other routers. The filters are organized into a routing table in order to forward incoming events to proper subscribers and neighbouring routers. This thesis addresses the optimization of content-based routing tables organized using the covering relation and presents novel data structures and configurations for improving local and distributed operation. Data structures are needed for organizing filters into a routing table that supports efficient matching and runtime operation. We present novel results on dynamic filter merging and the integration of filter merging with content-based routing tables. In addition, the thesis examines the cost of client mobility using different protocols and routing topologies. We also present a new matching technique called temporal subspace matching. The technique combines two new features. The first feature, temporal operation, supports notifications, or content profiles, that persist in time. The second feature, subspace matching, allows more expressive semantics, because notifications may contain intervals and be defined as subspaces of the content space. We also present an application of temporal subspace matching pertaining to metadata-based continuous collection and object tracking.
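As an illustration of the covering relation that organizes the routing tables, the sketch below models filters as conjunctions of interval constraints over attributes. This is a simplification for illustration; real content-based filters support richer predicates:

```python
def covers(f1, f2):
    """True if filter f1 covers f2: every notification matching f2 also
    matches f1. Filters map attribute -> (low, high) interval; an
    attribute absent from a filter is unconstrained."""
    for attr, (lo1, hi1) in f1.items():
        if attr not in f2:
            return False  # f2 allows any value here, f1 does not
        lo2, hi2 = f2[attr]
        if not (lo1 <= lo2 and hi2 <= hi1):
            return False  # f2's interval is not inside f1's
    return True

broad = {"price": (0, 100)}
narrow = {"price": (10, 50), "volume": (0, 10)}
print(covers(broad, narrow))  # True: narrow's matches are a subset
print(covers(narrow, broad))  # False
```

A router that already forwards events for `broad` need not propagate `narrow` upstream, which is what makes covering-based routing tables compact and why dynamic filter merging (creating a new covering filter for several existing ones) pays off.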

Relevance: 20.00%

Abstract:

This thesis, which consists of an introduction and four peer-reviewed original publications, studies the problems of haplotype inference (haplotyping) and local alignment significance. The problems studied here belong to the broad area of bioinformatics and computational biology. The presented solutions are computationally fast and accurate, which makes them practical in high-throughput sequence data analysis. Haplotype inference is a computational problem where the goal is to estimate haplotypes from a sample of genotypes as accurately as possible. This problem is important as the direct measurement of haplotypes is difficult, whereas the genotypes are easier to quantify. Haplotypes are the key players when studying, for example, the genetic causes of diseases. In this thesis, three methods are presented for the haplotype inference problem, referred to as HaploParser, HIT, and BACH. HaploParser is based on a combinatorial mosaic model and hierarchical parsing that together mimic recombinations and point-mutations in a biologically plausible way. In this mosaic model, the current population is assumed to have evolved from a small founder population. Thus, the haplotypes of the current population are recombinations of the (implicit) founder haplotypes with some point-mutations. HIT (Haplotype Inference Technique) uses a hidden Markov model for haplotypes, and efficient algorithms are presented to learn this model from genotype data. The model structure of HIT is analogous to the mosaic model of HaploParser with founder haplotypes. Therefore, it can be seen as a probabilistic model of recombinations and point-mutations. BACH (Bayesian Context-based Haplotyping) utilizes a context tree weighting algorithm to efficiently sum over all variable-length Markov chains to evaluate the posterior probability of a haplotype configuration. Algorithms are presented that find haplotype configurations with high posterior probability.
BACH is the most accurate method presented in this thesis and has comparable performance to the best available software for haplotype inference. Local alignment significance is a computational problem where one is interested in whether the local similarities in two sequences are due to the fact that the sequences are related or just by chance. Similarity of sequences is measured by their best local alignment score, and from that a p-value is computed. This p-value is the probability that two sequences picked from the null model have an equally good or better best local alignment score. Local alignment significance is used routinely, for example, in homology searches. In this thesis, a general framework is sketched that allows one to compute a tight upper bound for the p-value of a local pairwise alignment score. Unlike the previous methods, the presented framework is not affected by so-called edge-effects and can handle gaps (deletions and insertions) without troublesome sampling and curve fitting.
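The "best local alignment score" referred to above is conventionally computed with the Smith-Waterman dynamic program. A minimal sketch with a linear gap penalty and illustrative scoring parameters; the thesis's p-value framework builds on such scores but is not reproduced here:

```python
def smith_waterman(s, t, match=1, mismatch=-1, gap=-1):
    """Best local alignment score of s and t (Smith-Waterman with a
    linear gap penalty). Cells are floored at 0, so the optimum may
    start and end anywhere in the two sequences."""
    m, n = len(s), len(t)
    H = [[0] * (n + 1) for _ in range(m + 1)]
    best = 0
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            diag = H[i - 1][j - 1] + (match if s[i - 1] == t[j - 1] else mismatch)
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best

print(smith_waterman("GATTACA", "ATTA"))  # 4: "ATTA" aligns exactly
```

Assessing significance then amounts to asking how often the null model (e.g. random sequences of the same lengths and letter frequencies) produces a score at least this high; the zero-floor is also what causes the edge-effects that the thesis's framework avoids.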

Relevance: 20.00%

Abstract:

Information visualization is a process of constructing a visual presentation of abstract quantitative data. The characteristics of visual perception enable humans to recognize patterns, trends and anomalies inherent in the data with little effort in a visual display. Such properties of the data are likely to be missed in a purely text-based presentation. Visualizations are therefore widely used in contemporary business decision support systems. Visual user interfaces called dashboards are tools for reporting the status of a company and its business environment to facilitate business intelligence (BI) and performance management activities. In this study, we examine the research on the principles of human visual perception and information visualization as well as the application of visualization in a business decision support system. A review of current BI software products reveals that the visualizations included in them are often quite ineffective in communicating important information. Based on the principles of visual perception and information visualization, we summarize a set of design guidelines for creating effective visual reporting interfaces.

Relevance: 20.00%

Abstract:

This study addresses the following question: How to think about ethics in a technological world? The question is treated first thematically by framing central issues in the relationship between ethics and technology. This relationship has three distinct facets: i) technological advance poses new challenges for ethics, ii) traditional ethics may become poorly applicable in a technologically transformed world, and iii) the progress in science and technology has altered the concept of rationality in ways that undermine ethical thinking itself. The thematic treatment is followed by the description and analysis of three approaches to the questions framed. First, Hans Jonas's thinking on the ontology of life and the imperative of responsibility is studied. In Jonas's analysis, modern culture is found to be nihilistic because it is unable to understand organic life, to find meaning in reality, and to justify morals. At the root of nihilism Jonas finds dualism, the traditional Western way of seeing consciousness as radically separate from the material world. Jonas attempts to create a metaphysical grounding for an ethic that would take the technologically increased human powers into account and make the responsibility for future generations meaningful and justified. The second approach is Albert Borgmann's philosophy of technology, which mainly assesses the ways in which technological development has affected everyday life. Borgmann admits that modern technology has liberated humans from toil, disease, danger, and sickness. Furthermore, liberal democracy, possibilities for self-realization, and many of the freedoms we now enjoy would not be possible on a large scale without technology. Borgmann, however, argues that modern technology in itself does not provide a whole and meaningful life. In fact, technological conditions are often detrimental to the good life.
Integrity in life, according to him, is to be sought among things and practices that evade technoscientific objectification and commodification. Larry Hickman's Deweyan philosophy of technology is the third approach under scrutiny. Central in Hickman's thinking is a broad definition of technology that is nearly equal to Deweyan inquiry. Inquiry refers to the reflective and experiential way humans adapt to their environment by modifying their habits and beliefs. In Hickman's work, technology consists of all kinds of activities that through experimentation and/or reflection aim at improving human techniques and habits. Thus, in addition to research and development, many arts and political reforms are technological for Hickman. He argues for recasting such distinctions as fact/value, poiesis/praxis/theoria, and individual/society. Finally, Hickman does not admit a categorical difference between ethics and technology: moral values and norms need to be submitted to experiential inquiry like all other notions. This study mainly argues for an interdisciplinary approach to the ethics of technology. This approach should make use of the potentialities of the research traditions in applied ethics, the philosophy of technology, and the social studies of science and technology, and attempt to overcome their limitations. This study also advocates an endorsement of mid-level ethics that concentrates on the practices, institutions, and policies of temporal human life. Mid-level describes the realm between the instantaneous and individualistic micro-level and the universal and global macro-level.