830 results for Gradient-based approaches


Relevance: 80.00%

Abstract:

Lexicon-based approaches to Twitter sentiment analysis are gaining much popularity due to their simplicity, domain independence, and relatively good performance. These approaches rely on sentiment lexicons, in which a collection of words is marked with fixed sentiment polarities. However, a word's sentiment orientation (positive, neutral, negative) and/or sentiment strength can change depending on context and targeted entities. In this paper we present SentiCircle, a novel lexicon-based approach that takes into account the contextual and conceptual semantics of words when calculating their sentiment orientation and strength in Twitter. We evaluate our approach on three Twitter datasets using three different sentiment lexicons. Results show that our approach significantly outperforms two lexicon baselines. Results are competitive but inconclusive when compared to the state-of-the-art SentiStrength, and vary from one dataset to another. SentiCircle outperforms SentiStrength in accuracy on average, but falls marginally behind in F-measure. © 2014 Springer International Publishing.
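As a point of reference, the kind of fixed-polarity lexicon baseline that SentiCircle improves upon can be sketched in a few lines. This is not the SentiCircle algorithm itself; the lexicon entries, the one-word negation rule, and the example tweets are invented for illustration.

```python
# Minimal sketch of a fixed-polarity lexicon sentiment scorer, the kind
# of baseline the abstract contrasts with SentiCircle. The lexicon and
# negator list below are toy examples, not a real sentiment lexicon.

LEXICON = {"great": 0.8, "good": 0.5, "bad": -0.6, "terrible": -0.9}
NEGATORS = {"not", "never", "no"}

def lexicon_sentiment(tweet: str) -> float:
    """Sum fixed lexicon polarities, flipping sign after a negator."""
    score, negate = 0.0, False
    for token in tweet.lower().split():
        if token in NEGATORS:
            negate = True
            continue
        polarity = LEXICON.get(token, 0.0)
        score += -polarity if negate else polarity
        negate = False  # negation scope: the single following word
    return score

print(lexicon_sentiment("the film was great"))      # positive score
print(lexicon_sentiment("the film was not great"))  # negative score
```

The weakness the abstract points at is visible here: every word carries one fixed polarity regardless of context or target entity, which is exactly what a contextual-semantics approach revises.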

Relevance: 80.00%

Abstract:

Learning to Research, Researching to Learn explores the integration of research into teaching and learning at all levels of higher education. The chapters draw on the long and ongoing debate about the teaching-research nexus in universities. Although the vast majority of academics believe that there is an important and valuable link between teaching and research, the precise nature of this relationship continues to be contested. The book includes chapters that showcase innovative ways of learning to research: how research is integrated into coursework teaching, how students learn the processes of research, and how universities are preparing students to engage with the world. The chapters also showcase innovative ways of researching to learn, exploring how students learn through doing research, how they conceptualise the knowledge of their fields of study through the processes of doing research, and how students experiment with and reflect on the results produced. These are the key issues addressed by this anthology, as it brings together analyses of the ways in which university teachers are developing research skills in their students, creating enquiry-based approaches to teaching, and engaging in education research themselves. The studies here explore the links between teaching, learning and research in a range of contexts, from pre-enrolment through to academic staff development, in Australia, the UK, the US, Singapore and Denmark. Through a rich array of theoretical and methodological approaches, the collection seeks to further our understanding of how universities can play an effective role in educating graduates suited to the twenty-first century.

Relevance: 80.00%

Abstract:

We propose that key concepts from clinical psychotherapy can inform science-based initiatives aimed at building tolerance and community cohesion. Commonalities in social and clinical psychology are identified regarding (1) distorted thinking (intergroup bias and cognitive bias), (2) stress and coping (at the intergroup and intrapersonal levels), and (3) anxiety (intergroup anxiety and pathological anxiety). On this basis we introduce a new cognitive-behavioral model of social change. Mental imagery is the conceptual point of synthesis, and anxiety is at the core, through which new treatment-based approaches to reducing prejudice can be developed. More generally, we argue that this integration is illustrative of a broader potential for cross-disciplinary integration in the social and clinical sciences, and could open up new possibilities and opportunities for both disciplines.

Relevance: 80.00%

Abstract:

Phospholipid oxidation can generate reactive and electrophilic products that are capable of modifying proteins, especially at cysteine, lysine and histidine residues. Such lipoxidation reactions are known to alter protein structure and function, with both gain-of-function and loss-of-activity effects. As well as being potentially important in the redox regulation of cell behaviour, lipoxidation products in plasma could also be useful biomarkers for stress conditions. Although studies with antibodies suggested the occurrence of lipoxidation adducts on ApoB-100, these products had not previously been characterized at a molecular level. We have developed new mass spectrometry-based approaches to detect and locate adducts of oxidized phospholipids in plasma proteins, as well as direct oxidative modifications of proteins, which avoid some of the problems typically encountered with database search engines that lead to erroneous identifications of oxidative PTMs. This approach uses accurate-mass extracted ion chromatograms (XICs) of fragment ions from peptides containing oxPTMs, and allows multiple modifications to be examined regardless of the protein that contains them. For example, a reporter ion at m/z 184.074 corresponding to phosphocholine indicated the presence of oxidized phosphatidylcholine adducts, while two reporter ions at m/z 100.078 and 82.025 were selective for allysine. ApoB-100-oxidized phospholipid adducts were detected even in healthy human samples, as well as in LDL from patients with inflammatory disease. Lipidomic studies showed that more than 350 different species of lipid were present in LDL, and that these were altered in disease conditions. LDL clearly represents a very complex carrier system, and one that offers a rich source of information about systemic conditions, with potential as an indicator of oxidative damage in ageing or inflammatory diseases.
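The reporter-ion logic described above can be illustrated with a toy sketch: scan the fragment peaks of an MS/MS spectrum for diagnostic ions within a tight accurate-mass tolerance. Only the three reporter masses come from the text; the example spectrum, the 0.005 Da tolerance, and the function names are assumptions for illustration, and real XIC extraction of course operates on instrument data files rather than hand-typed peak lists.

```python
# Illustrative sketch of accurate-mass reporter-ion matching. The
# reporter m/z values are from the abstract; the tolerance and the
# example spectrum are invented for demonstration purposes.

REPORTERS = {
    "phosphocholine": 184.074,  # oxidized phosphatidylcholine adducts
    "allysine_a": 100.078,
    "allysine_b": 82.025,
}
TOLERANCE = 0.005  # Da; assumed accurate-mass window

def find_reporters(peaks):
    """Return labels of reporter ions matched by (m/z, intensity) peaks."""
    hits = set()
    for mz, intensity in peaks:
        for label, ref_mz in REPORTERS.items():
            if abs(mz - ref_mz) <= TOLERANCE and intensity > 0:
                hits.add(label)
    return hits

spectrum = [(86.096, 1200.0), (184.0733, 9800.0), (577.519, 300.0)]
print(find_reporters(spectrum))  # matches the phosphocholine reporter
```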

Relevance: 80.00%

Abstract:

Astrocytes are now increasingly acknowledged as having fundamental and sophisticated roles in brain function and dysfunction. Unravelling the complex mechanisms that underlie human brain astrocyte-neuron interactions is therefore an essential step on the way to understanding how the brain operates. Insights into astrocyte function to date have almost exclusively been derived from studies conducted using murine or rodent models. Whilst these have led to significant discoveries, preliminary work with human astrocytes has revealed a hitherto unknown range of astrocyte types with potentially greater functional complexity and increased neuronal interaction relative to animal astrocytes. It is becoming apparent, therefore, that many important functions of astrocytes will only be discovered by direct physiological interrogation of human astrocytes. Recent advancements in the field of stem cell biology have provided a source of human-based models, which will provide a platform to facilitate our understanding of normal astrocyte functions as well as their role in CNS pathology. A number of recent studies have demonstrated that stem cell-derived astrocytes exhibit a range of properties suggesting that they may be functionally equivalent to their in vivo counterparts. Further validation against in vivo models will ultimately confirm the future utility of these stem cell-based approaches in fulfilling the need for human-based cellular models for basic and clinical research. In this review we discuss the roles of astrocytes in the brain and highlight the extent to which human stem cell-derived astrocytes have demonstrated functional activities equivalent to those observed in vivo.

Relevance: 80.00%

Abstract:

The treatment of presbyopia has been the focus of much scientific and clinical research over recent years, not least due to an increasingly aging population but also the desire for spectacle independence. Many lens- and nonlens-based approaches have been investigated, and with advances in biomaterials and improved surgical methods, removable corneal inlays have been developed. One such development is the KAMRA™ inlay, in which a small entrance pupil is exploited to create a pinhole-type effect that increases the depth of focus and enables improvement in near visual acuity. Short- and long-term clinical studies have all reported significant improvement in near and intermediate vision compared to preoperative measures following monocular implantation (nondominant eye), with a large proportion of patients achieving Jaeger (J) 2 to J1 (~0.00 logMAR to ~0.10 logMAR) at the final follow-up. Although distance acuity is reduced slightly in the treated eye, binocular visual acuity and function remain very good (mean 0.10 logMAR or better). The safety of the inlay is well established, and it is easily removable; although some patients have developed corneal changes, these are clinically insignificant, and their incidence appears to reduce markedly with advancements in KAMRA design, implantation technique, and femtosecond laser technology. This review aims to summarize the currently published peer-reviewed studies on the safety and efficacy of the KAMRA inlay and discusses the surgical and clinical outcomes with respect to the patient's visual function.

Relevance: 80.00%

Abstract:

We propose a novel template matching approach for the discrimination of handwritten and machine-printed text. We first pre-process the scanned document images by performing denoising, circles/lines exclusion and word-block level segmentation. We then align and match characters in a flexibly sized gallery with the segmented regions, using parallelised normalised cross-correlation. The experimental results over the Pattern Recognition & Image Analysis Research Lab-Natural History Museum (PRImA-NHM) dataset show remarkably high robustness of the algorithm in classifying cluttered, occluded and noisy samples, in addition to those with substantial missing data. The algorithm, which achieves an 84.0% classification rate with a false positive rate of 0.16 over the dataset, does not require training samples and generates compelling results compared to the training-based approaches that have used the same benchmark.
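Normalised cross-correlation, the matching score this approach parallelises, can be sketched in a few lines; scores lie in [-1, 1], with 1 meaning a perfect match up to brightness and contrast. The 3×3 patches below are toy data, not drawn from the PRImA-NHM dataset, and the paper's actual alignment and parallelisation are not reproduced here.

```python
# Minimal sketch of normalised cross-correlation (NCC) between two
# equally sized image patches, flattened to 1-D lists of pixel values.
import math

def ncc(a, b):
    """Normalised cross-correlation of two equally sized patches."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    da = [x - mean_a for x in a]
    db = [x - mean_b for x in b]
    num = sum(x * y for x, y in zip(da, db))
    den = math.sqrt(sum(x * x for x in da) * sum(y * y for y in db))
    return num / den if den else 0.0

template = [10, 20, 10, 20, 40, 20, 10, 20, 10]   # toy 3x3 "character"
same     = [30, 60, 30, 60, 120, 60, 30, 60, 30]  # same pattern, 3x brighter
other    = [40, 10, 25, 5, 30, 12, 44, 2, 19]     # unrelated patch

print(round(ncc(template, same), 3))   # 1.0: invariant to uniform gain
print(round(ncc(template, other), 3))  # much lower score
```

The gain invariance shown here is why NCC copes well with the varying ink density and illumination of scanned museum documents.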

Relevance: 80.00%

Abstract:

The study examines the role of risk and risk assessment in the external audit of annual reports (financial statements). A modern audit, owing to its internal and external limitations, cannot exist without an assessment of the business risks of the entity being audited; indeed, the national and international standards that lay down the profession's fundamental rules mandate obtaining an understanding of clients' business risks. This is not an end in itself but the very core of the audit: risk assessment, as part of planning, is both the basis and the guiding thread of the whole auditing process. The author first outlines the relationship between auditing and risk and how the problem of risk arises in auditing in the first place. The different risk-based approaches to auditing are then discussed, together with the embeddedness of the risk concept in professional regulation, illustrated through a few key elements. Finally, as a test of the theory, some practical aspects of the risk model are discussed through the lens of earlier empirical research carried out mostly in the US.
The study concludes that although risk-based models of auditing have many weaknesses, they still yield the most effective and efficient high-quality audits.
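The abstract does not spell out the risk model it refers to, but the classical audit risk model in the professional literature is AR = IR × CR × DR (audit risk as the product of inherent, control and detection risk). The sketch below shows how an auditor would solve it for the allowable detection risk; the numeric assessments are purely hypothetical.

```python
# The classical audit risk model: AR = IR * CR * DR. Auditors fix an
# acceptable overall audit risk and solve for detection risk, which then
# drives the extent of substantive testing. All values below are
# illustrative planning assessments, not figures from the study.

def detection_risk(audit_risk: float, inherent_risk: float,
                   control_risk: float) -> float:
    """Solve AR = IR * CR * DR for the allowable detection risk DR."""
    return audit_risk / (inherent_risk * control_risk)

AR = 0.05  # acceptable audit risk
IR = 0.80  # assessed inherent risk
CR = 0.50  # assessed control risk

DR = detection_risk(AR, IR, CR)
print(f"allowable detection risk: {DR:.3f}")  # 0.125
```

A lower allowable detection risk means the auditor must gather more substantive evidence, which is the practical mechanism by which the risk assessment guides the audit.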

Relevance: 80.00%

Abstract:

The North Atlantic Treaty Organization (NATO) is a product of the Cold War through which its members organized their military forces for the purpose of collective defense against the common threat of Soviet-backed aggression. Employing the terminology of regime theory, the creation of NATO can be viewed as the introduction of an international security regime. Throughout the Cold War, NATO member states preserved their commitment to mutual defense while increasingly engaging in activities aimed at overcoming the division of Europe and promoting regional stability. The end of the Cold War has served as the catalyst for a new period of regime change as the Alliance introduced elements of a collective security regime by expanding its mandate to address new security challenges and reorganizing both its political and military organizational structures. This research involves an interpretive analysis of NATO's evolution applying ideal theoretical constructs associated with distinct approaches to regime analysis. The process of regime change is investigated over several periods throughout the history of the Alliance in an effort to understand the Alliance's changing commitment to collective security. This research involves a review of regime theory literature, consisting of an examination of primary source documentation, including official documents and treaties, as well as a review of numerous secondary sources. This review is organized around a typology of power-based, organization-based, and norm-based approaches to regime analysis. This dissertation argues that the process of regime change within NATO is best understood by examining factors associated with multiple theoretical constructs. Relevant factors provide insights into the practice of collective security among NATO member states within Europe, while accounting for the inability of the NATO allies to build on the experience gained within Europe to play a more central role in operations outside of this region.
This research contributes to a greater understanding of the nature of international regimes and the process of regime change, while offering recommendations aimed at increasing NATO's viability as a source of greater security and more meaningful international cooperation.

Relevance: 80.00%

Abstract:

The great interest in nonlinear system identification is mainly due to the fact that a large number of real systems are complex and need to have their nonlinearities considered so that their models can be successfully used in applications of control, prediction and inference, among others. This work evaluates the application of Fuzzy Wavelet Neural Networks (FWNN) to identify nonlinear dynamical systems subjected to noise and outliers. Generally, these elements cause negative effects on the identification procedure, resulting in erroneous interpretations regarding the dynamical behavior of the system. The FWNN combines in a single structure the ability of fuzzy logic to deal with uncertainties, the multiresolution characteristics of wavelet theory, and the learning and generalization abilities of artificial neural networks. Usually, the learning procedure of these neural networks is realized by a gradient-based method, which uses the mean squared error as its cost function. This work proposes the replacement of this traditional function by an Information Theoretic Learning similarity measure called correntropy. With the use of this similarity measure, higher-order statistics can be considered during the FWNN training process. For this reason, this measure is more suitable for non-Gaussian error distributions and makes the training less sensitive to the presence of outliers. In order to evaluate this replacement, FWNN models are obtained in two identification case studies: a real nonlinear system, consisting of a multisection tank, and a simulated system based on a model of the human knee joint. The results demonstrate that the application of correntropy as the cost function of the error backpropagation algorithm makes the identification procedure using FWNN models more robust to outliers. However, this is only achieved if the Gaussian kernel width of correntropy is properly adjusted.

Relevance: 80.00%

Abstract:

Data integration systems offer uniform access to a set of autonomous and heterogeneous data sources. One of the main challenges in data integration is reconciling semantic differences among data sources. Approaches that have been used to solve this problem can be categorized as schema-based and attribute-based. Schema-based approaches use schema information to identify the semantic similarity in data; furthermore, they focus on reconciling types before reconciling attributes. In contrast, attribute-based approaches use statistical and structural information about attributes to identify the semantic similarity of data in different sources. This research examines an approach to semantic reconciliation based on integrating properties expressed at different levels of abstraction or granularity, using the concept of property precedence. Property precedence reconciles the meaning of attributes by identifying similarities between attributes based on what these attributes represent in the real world. In order to use property precedence for semantic integration, we need to identify the precedence of attributes within and across data sources. The goal of this research is to develop and evaluate a method and algorithms that identify precedence relations among attributes and build a property precedence graph (PPG) that can be used to support integration.
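The abstract does not define the PPG construction algorithm, so the sketch below only illustrates one plausible way to store precedence relations among attributes as a directed graph and query its transitive closure. The class name, method names, and the attribute hierarchy ("location" preceding "city" preceding "zip_code") are all invented for illustration.

```python
# Illustrative directed-graph store for attribute precedence relations,
# loosely in the spirit of a property precedence graph (PPG). The
# construction method and attribute names are assumptions, not the
# thesis's algorithm.
from collections import defaultdict

class PropertyPrecedenceGraph:
    def __init__(self):
        # coarse (more abstract) attribute -> set of finer attributes
        self.edges = defaultdict(set)

    def add_precedence(self, coarse: str, fine: str) -> None:
        """Record that `coarse` precedes (is more abstract than) `fine`."""
        self.edges[coarse].add(fine)

    def refinements(self, attr: str) -> set:
        """All attributes reachable below `attr` (transitive closure)."""
        seen, stack = set(), [attr]
        while stack:
            for nxt in self.edges[stack.pop()]:
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        return seen

ppg = PropertyPrecedenceGraph()
ppg.add_precedence("location", "city")   # hypothetical relations
ppg.add_precedence("city", "zip_code")
print(ppg.refinements("location"))  # {'city', 'zip_code'}
```

Given such a graph, two sources exposing "city" and "zip_code" could be matched under the shared coarse property "location", which is the kind of cross-granularity reconciliation the abstract describes.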

Relevance: 80.00%

Abstract:

The purpose of this research is to explore the use of modelling in the field of Purchasing and Supply Management (P/SM). We are particularly interested in identifying the specific areas of P/SM where there are opportunities for the use of modelling-based methods. The paper starts with an overview of the main types of modelling and also provides a categorisation of the main P/SM research themes. Our research shows that there are many opportunities for using descriptive, predictive and prescriptive modelling approaches in all areas of P/SM research, from those focused on the actual function from a purely operational and execution perspective (e.g. purchasing processes and behaviour) to those focused on the organisational level from a more strategic perspective (e.g. strategy and policy). We conclude that future P/SM research needs to explore the value of modelling not just at the functional or operational level, but also at the organisational and strategic levels. We also acknowledge that while using empirical results to inform and improve models has advantages, there are also drawbacks, which relate to the value, practical relevance and generalisability of modelling-based approaches.

Relevance: 80.00%

Abstract:

© 2016, Springer-Verlag Berlin Heidelberg. Nanoparticles are being explored in many different applications due to the unique properties offered by quantum effects. To broaden the scope of these applications, the deposition of nanoparticles onto substrates in a simple and controlled way is highly desired. In this study, we use resonant infrared matrix-assisted pulsed laser evaporation (RIR-MAPLE) for the deposition of metallic silver nanoparticles for plasmonic applications. We find that RIR-MAPLE, a simple and versatile approach, is able to deposit silver nanoparticles as large as 80 nm onto different substrates with good adhesion, regardless of substrate properties. In addition, the nanoparticle surface coverage of the substrates, which results from the random distribution of nanoparticles across the substrate per laser pulse, can be simply and precisely controlled by RIR-MAPLE. Polymer films of poly(3-hexylthiophene-2,5-diyl) (P3HT) are also deposited by RIR-MAPLE on top of the deposited silver nanoparticles in order to demonstrate enhanced absorption due to the localized surface plasmon resonance effect. The reported features of RIR-MAPLE nanoparticle deposition indicate that this tool can enable efficient processing of nanoparticle thin films for applications that require specific substrates or configurations not easily achieved using solution-based approaches.

Relevance: 80.00%

Abstract:

High-throughput next generation sequencing, together with advanced molecular methods, has considerably enhanced the field of food microbiology. By overcoming biases associated with culture-dependent approaches, it has become possible to achieve novel insights into the nature of food-borne microbial communities. In this thesis, several different sequencing-based approaches were applied with a view to better understanding microbe-associated quality defects in cheese. Initially, a literature review provides an overview of microbe-associated cheese quality defects as well as molecular methods for profiling complex microbial communities. Following this, 16S rRNA sequencing revealed temporal and spatial differences in microbial composition due to the time during the production day at which specific commercial cheeses were manufactured. A novel Ion PGM sequencing approach, focusing on decarboxylase genes rather than 16S rRNA genes, was then successfully employed to profile the biogenic amine-producing cohort of a series of artisanal cheeses. Investigations into the phenomenon of cheese pinking formed the basis of a joint 16S rRNA and whole genome shotgun sequencing approach, leading to the identification of Thermus species and, more specifically, the pathway involved in the production of lycopene, a red-coloured carotenoid. Finally, using a more traditional approach, the effect of the addition of a facultatively heterofermentative Lactobacillus (Lactobacillus casei) to a Swiss-type cheese, in which starter activity was compromised, was investigated from the perspective of its ability to promote gas defects and irregular eye formation. X-ray computed tomography was used to visualise, non-destructively, the consequences of the undesirable gas formation that resulted.
Ultimately this thesis has demonstrated that the application of molecular techniques, such as next generation sequencing, can provide a detailed insight into defect-causing microbial populations present and thereby may underpin approaches to optimise the quality and consistency of a wide variety of cheeses.

Relevance: 80.00%

Abstract:

This article suggests a new approach to teaching two grammatical structures, the pasiva refleja (reflexive passive) and the impersonal "se", in university Spanish-as-a-foreign-language (E/LE) classes. Specifically, it argues that both should be treated as passive constructions, based on a lexical-functional analysis of them with a contrastive-linguistics focus. Even for E/LE instruction, a contrastive approach is recommended that targets both metalinguistic reflection and the student's competence in the L2. In particular, the use of linguistic corpora in the classroom forms an integral part of the instruction. Using a corpus stimulates students' curiosity, exposes them to authentic language material, and promotes independent inductive reflection.