54 results for Hierarchy of text classifiers


Relevance: 100.00%

Abstract:

Background: Psychotic phenomena appear to form a continuum with normal experience and beliefs, and may build on common emotional interpersonal concerns. Aims: We tested predictions that paranoid ideation is exponentially distributed and hierarchically arranged in the general population, and that persecutory ideas build on more common cognitions of mistrust, interpersonal sensitivity and ideas of reference. Method: Items were chosen from the Structured Clinical Interview for DSM-IV Axis II Disorders (SCID-II) questionnaire and the Psychosis Screening Questionnaire in the second British National Survey of Psychiatric Morbidity (n = 8580), to test a putative hierarchy of paranoid development using confirmatory factor analysis, latent class analysis and factor mixture modelling analysis. Results: Different types of paranoid ideation ranged in frequency from less than 2% to nearly 30%. Total scores on these items followed an almost perfect exponential distribution (r = 0.99). Our four a priori first-order factors were corroborated (interpersonal sensitivity; mistrust; ideas of reference; ideas of persecution). These mapped onto four classes of individual respondents: a rare, severe, persecutory class with high endorsement of all item factors, including persecutory ideation; a quasi-normal class with infrequent endorsement of interpersonal sensitivity, mistrust and ideas of reference, and no ideas of persecution; and two intermediate classes, characterised respectively by relatively high endorsement of items relating to mistrust and to ideas of reference. Conclusions: The paranoia continuum has implications for the aetiology, mechanisms and treatment of psychotic disorders, while confirming the lack of a clear distinction from normal experiences and processes.
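The reported exponential distribution of total scores can be illustrated with a short sketch: an exponential frequency profile is a straight line on a log scale, so a least-squares fit to the log-frequencies checks the claim. The counts below are synthetic (the survey data are not reproduced here), and the decay rate 0.55 is an arbitrary assumption of this illustration.

```python
import numpy as np

# Synthetic counts (not the survey data): total paranoia scores whose
# frequencies decay exponentially, as reported for the n = 8580 sample.
scores = np.arange(10)                       # total score on the items
freq = 3000.0 * np.exp(-0.55 * scores)       # assumed decay rate 0.55

# An exponential distribution is a straight line on a log scale, so the
# fit is checked by least squares on the log-frequencies and by the
# correlation coefficient (the paper reports r = 0.99 for its data).
log_freq = np.log(freq)
slope, intercept = np.polyfit(scores, log_freq, 1)
r = np.corrcoef(scores, log_freq)[0, 1]      # exactly -1 here: noiseless decay
```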

Relevance: 100.00%

Abstract:

FAMOUS fills an important role in the hierarchy of climate models: it explicitly resolves atmospheric and oceanic dynamics, yet is sufficiently computationally efficient that either very long simulations or large ensembles are possible. FAMOUS is a reduced-resolution, and hence faster-running, version of the Hadley Centre climate model HadCM3. An improved set of carbon cycle parameters for this model has been found using a perturbed physics ensemble technique, an important step towards building the "Earth System" modelling capability of FAMOUS. Two separate 100-member perturbed parameter ensembles were performed: one for the land surface and one for the ocean. The land surface scheme was tested against present-day and past representations of vegetation, and the ocean ensemble was tested against observations of nitrate. An advantage of using a relatively fast climate model is that a large number of simulations can be run, so the model parameter space (a large source of climate model uncertainty) can be more thoroughly sampled. This has the associated benefit of allowing the sensitivity of model results to changes in each parameter to be assessed. The climatologies of surface and tropospheric air temperature and precipitation are improved relative to previous versions of FAMOUS. The improved representation of upper-atmosphere temperatures is driven by improved ozone concentrations near the tropopause and better upper-level winds.

Relevance: 100.00%

Abstract:

We present a mathematical model describing the inward solidification of a slab, a circular cylinder and a sphere of binary melt kept below its equilibrium freezing temperature. The thermal and physical properties of the melt and solid are assumed to be identical. An asymptotic method, valid in the limit of large Stefan number is used to decompose the moving boundary problem for a pure substance into a hierarchy of fixed-domain diffusion problems. Approximate, analytical solutions are derived for the inward solidification of a slab and a sphere of a binary melt which are compared with numerical solutions of the unapproximated system. The solutions are found to agree within the appropriate asymptotic regime of large Stefan number and small time. Numerical solutions are used to demonstrate the dependence of the solidification process upon the level of impurity and other parameters. We conclude with a discussion of the solutions obtained, their stability and possible extensions and refinements of our study.
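For readers unfamiliar with the technique, a minimal sketch of the large-Stefan-number decomposition for a pure substance in planar geometry (my own notation, not necessarily the paper's): the one-phase problem for the solid temperature θ(x, t), with front x = s(t) and Stefan number β, reads

```latex
\begin{aligned}
&\frac{\partial \theta}{\partial t} = \frac{\partial^{2} \theta}{\partial x^{2}},
  \qquad 0 < x < s(t),\\[2pt]
&\theta(0,t) = -1, \qquad \theta(s(t),t) = 0, \qquad
  \beta \,\frac{\mathrm{d}s}{\mathrm{d}t}
  = \left.\frac{\partial \theta}{\partial x}\right|_{x = s(t)}.
\end{aligned}
```

For β ≫ 1 the front moves on the slow time τ = t/β, and expanding θ = θ₀ + β⁻¹θ₁ + … makes each order satisfy a diffusion problem on a domain that is fixed at that order; the leading term is quasi-steady (θ₀ = x/s − 1, giving s ≈ √(2t/β)). This is the sense in which the moving boundary problem becomes a hierarchy of fixed-domain diffusion problems.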

Relevance: 100.00%

Abstract:

Background: Research in aphasia has focused on acquired dyslexias at the single word level, with a paucity of assessment techniques and rehabilitation approaches for individuals with difficulty at the text level. A rich literature from research with paediatric populations and healthy non-brain damaged, skilled adult readers allows the component processes that are important for text reading to be defined and more appropriate assessments to be devised. Aims: To assess the component processes of text reading in a small group of individuals with aphasia who report difficulties reading at the text level. Do assessments of component processes in reading comprehension reveal distinct profiles of text comprehension? To what extent are text comprehension difficulties caused by underlying linguistic and/or cognitive deficits? Methods & Procedures: Four individuals with mild aphasia who reported difficulties reading at the text level took part in a case-series study. Published assessments were used to confirm the presence of text comprehension impairment. Participants completed a range of assessments to provide a profile of their linguistic and cognitive skills, focusing on processes known to be important for text comprehension. We identified the following areas for assessment: reading speed, language skills (single word and sentence), inferencing, working memory and metacognitive skills (monitoring and strategy use). Outcomes & Results: Performance was compared against age-matched adult control data. One participant presented with a trend for impaired abilities in inferencing, with all other assessed skills being within normal limits. The other three had identified linguistic and working memory difficulties. One presented with a residual deficit in accessing single word meaning that affected text comprehension. The other two showed no clear link between sentence processing difficulties and text comprehension impairments. 
Across these three, data suggested a link between verbal working memory capacity and specific inferencing skills. Conclusions: Successful text reading relies on a number of component processes. In this paper we have made a start in defining those component processes and devising tasks suitable to assess them. From our results, assessment of verbal working memory and inferencing appears to be critical for understanding text comprehension impairments in aphasia. It is possible that rehabilitation input can capitalize on key meta-cognitive skills (monitoring, strategy use) to support functional reading in the face of existing linguistic, text comprehension and memory impairments.

Relevance: 100.00%

Abstract:

Advances in hardware technologies allow data to be captured and processed in real time, and the resulting high-throughput data streams require novel data mining approaches. The research area of Data Stream Mining (DSM) is developing data mining algorithms that allow these continuous streams of data to be analysed in real time. The creation and real-time adaptation of classification models from data streams is one of the most challenging DSM tasks. Current classifiers for streaming data address this problem by using incremental learning algorithms. However, even though these algorithms are fast, they are challenged by high-velocity data streams, where data instances arrive at a fast rate. This is problematic for applications that require little or no delay between a change in the patterns of the stream and the absorption of that change by the classifier. The scalability problems of traditional data mining algorithms for static (non-streaming) datasets on Big Data have been addressed through the development of parallel classifiers. However, there is very little work on the parallelisation of data stream classification techniques. In this paper we investigate K-Nearest Neighbours (KNN) as the basis for a real-time, adaptive and parallel methodology for scalable data stream classification tasks.
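A minimal sketch of the kind of adaptive KNN stream classifier such work builds on: a sliding window bounds memory and lets old concepts fall out, and the per-instance distance computations are the natural unit to distribute in a parallel implementation. The class and the toy drift scenario below are my own illustration, not the paper's methodology.

```python
import numpy as np
from collections import deque

class SlidingWindowKNN:
    """Sliding-window KNN stream classifier (illustrative sketch, not the
    paper's algorithm). The bounded window keeps memory constant and lets
    the model adapt: the oldest instance is evicted as new labelled
    instances arrive, so recent patterns dominate the vote."""

    def __init__(self, k=3, window_size=200):
        self.k = k
        self.window = deque(maxlen=window_size)  # (features, label) pairs

    def learn(self, x, y):
        """Absorb one labelled instance from the stream."""
        self.window.append((np.asarray(x, dtype=float), y))

    def predict(self, x):
        """Majority vote among the k nearest stored instances. In a
        parallel setting these distance computations, partitioned over
        the window, are the natural unit to distribute across workers."""
        x = np.asarray(x, dtype=float)
        neighbours = sorted(self.window,
                            key=lambda item: np.linalg.norm(x - item[0]))
        votes = [label for _, label in neighbours[:self.k]]
        return max(set(votes), key=votes.count)

# Toy stream with concept drift: two well-separated clusters whose
# labels swap half-way through; the window absorbs the new concept.
rng = np.random.default_rng(0)
clf = SlidingWindowKNN(k=3, window_size=50)
centres = {0: np.array([0.0, 0.0]), 1: np.array([1.0, 1.0])}
for _ in range(100):                             # concept A
    c = int(rng.random() < 0.5)
    clf.learn(centres[c] + rng.normal(0, 0.05, 2), c)
before = clf.predict([0.0, 0.0])                 # label under concept A
for _ in range(100):                             # concept B: labels swapped
    c = int(rng.random() < 0.5)
    clf.learn(centres[c] + rng.normal(0, 0.05, 2), 1 - c)
after = clf.predict([0.0, 0.0])                  # the window has adapted
```

Because the window only ever holds the most recent 50 instances, the prediction for the same query point flips once the drifted concept has filled the window.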

Relevance: 100.00%

Abstract:

Sclera segmentation is shown to be of significant importance for eye and iris biometrics. However, it has not been extensively researched as a separate topic, being mainly treated as a component of a broader task. This paper proposes a novel sclera segmentation algorithm for colour images which operates at pixel level. By exploring various colour spaces, the proposed approach is made robust to image noise and to different gaze directions. The algorithm's robustness is further enhanced by a two-stage classifier: at the first stage, a set of simple classifiers is employed, while at the second stage, a neural network classifier operates on the probability space generated by the stage-1 classifiers. The proposed method was ranked first in the Sclera Segmentation Benchmarking Competition 2015, part of BTAS 2015, with a precision of 95.05% at a recall of 94.56%.
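A toy sketch of the two-stage structure: stage 1 derives simple per-pixel probabilities from colour cues, and stage 2 combines them in that probability space. The colour thresholds are invented for illustration, and a fixed logistic unit stands in for the paper's neural network classifier.

```python
import numpy as np

def stage1_probabilities(rgb):
    """Stage 1: simple per-pixel classifiers built on colour cues
    (illustrative thresholds, not the paper's features). `rgb` is an
    (N, 3) array in [0, 1]; returns (N, 2) pseudo-probabilities."""
    brightness = rgb.mean(axis=1)                   # sclera tends to be bright...
    saturation = rgb.max(axis=1) - rgb.min(axis=1)  # ...and weakly saturated
    p_bright = np.clip(brightness, 0.0, 1.0)
    p_unsat = np.clip(1.0 - 3.0 * saturation, 0.0, 1.0)
    return np.column_stack([p_bright, p_unsat])

def stage2_combine(p, w=(2.0, 2.0), b=-2.0):
    """Stage 2: operates on the probability space produced by stage 1.
    A fixed logistic unit stands in for the paper's neural network."""
    return 1.0 / (1.0 + np.exp(-(p @ np.asarray(w) + b)))

pixels = np.array([[0.95, 0.95, 0.92],   # near-white pixel: likely sclera
                   [0.45, 0.25, 0.15]])  # dark brown pixel: iris or skin
scores = stage2_combine(stage1_probabilities(pixels))
mask = scores > 0.5                      # per-pixel sclera decision
```

The point of the two-stage design is that stage 2 never sees raw colours, only the low-dimensional probability vector, which makes the combiner cheap to train and robust to the quirks of any single colour cue.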

Relevance: 100.00%

Abstract:

The strength of the Antarctic Circumpolar Current (ACC) is believed to depend on the westerly wind stress blowing over the Southern Ocean, although the exact relationship between winds and circumpolar transport is yet to be determined. Here we show, based on theoretical arguments and a hierarchy of numerical modeling experiments, that the global pycnocline depth and the baroclinic ACC transport are set by an integral measure of the wind stress over the path of the ACC, taking into account its northward deflection. Our results assume that the mesoscale eddy diffusivity is independent of the mean flow; while the relationship between wind stress and ACC transport will be more complicated in an eddy-saturated regime, our conclusion that the ACC is driven by winds over the circumpolar streamlines is likely to be robust.

Relevance: 100.00%

Abstract:

This article examines two genres of text which were extremely popular in the late-medieval and early modern periods, and it pays particular attention to women users. The printed almanacs of sixteenth-century England were enormously influential; yet their contents are so formulaic and repetitive as to appear almost empty of valuable information. Their most striking feature is their astrological guidance for the reader, and this has led to them being considered 'merely' the repository of popular superstition. Only in the last decade have themes of gender and medicine been given serious consideration in relation to almanacs; but this work has focused on the seventeenth century. This chapter centres on a detailed analysis of sixteenth-century English almanacs, and the various kinds of scientific and household guidance they offered to women readers. Both compilers and users needed to chart a safe course through the religious and scientific battles of the time; and the complexities involved are demonstrated by considering the almanacs in relation to competing sources of guidance. These latter are Books of Hours and 'scientific' works such as medical calendars compiled by Oxford scholars in the late middle ages. A key feature of this chapter is that it gives practical interpretations of this complex information, for the guidance of modern readers unfamiliar with astrology.

Relevance: 100.00%

Abstract:

During the past 15 years, a number of initiatives have been undertaken at national level to develop ocean forecasting systems operating at regional and/or global scales. The co-ordination between these efforts has been organized internationally through the Global Ocean Data Assimilation Experiment (GODAE). The French MERCATOR project is one of the leading participants in GODAE. The MERCATOR systems routinely assimilate a variety of observations such as multi-satellite altimeter data, sea-surface temperature and in situ temperature and salinity profiles, focusing on high-resolution scales of the ocean dynamics. The assimilation strategy in MERCATOR is based on a hierarchy of methods of increasing sophistication including optimal interpolation, Kalman filtering and variational methods, which are progressively deployed through the Système d'Assimilation MERCATOR (SAM) series. SAM-1 is based on a reduced-order optimal interpolation which can be operated using ‘altimetry-only’ or ‘multi-data’ set-ups; it relies on the concept of separability, assuming that the correlations can be separated into a product of horizontal and vertical contributions. The second release, SAM-2, is being developed to include new features from the singular evolutive extended Kalman (SEEK) filter, such as three-dimensional, multivariate error modes and adaptivity schemes. The third one, SAM-3, considers variational methods such as the incremental four-dimensional variational algorithm. Most operational forecasting systems evaluated during GODAE are based on least-squares statistical estimation assuming Gaussian errors. In the framework of the EU MERSEA (Marine EnviRonment and Security for the European Area) project, research is being conducted to prepare the next-generation operational ocean monitoring and forecasting systems. The research effort will explore nonlinear assimilation formulations to overcome limitations of the current systems.
This paper provides an overview of the developments conducted in MERSEA with the SEEK filter, the Ensemble Kalman filter and the sequential importance re-sampling filter.
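The least-squares estimation underlying these systems shares one linear analysis update, which a few lines of numpy can illustrate. The dimensions are toy stand-ins; the operational SAM schemes use reduced-order and adaptive forms of the background covariance B.

```python
import numpy as np

# Analysis update shared by optimal interpolation and the Kalman filter:
#   x_a = x_b + K (y - H x_b),   K = B H^T (H B H^T + R)^{-1}
# Toy dimensions stand in for an ocean state and sparse observations.
n, m = 5, 2
idx = np.arange(n)
x_b = np.zeros(n)                                     # background state
B = np.exp(-0.5 * (idx[:, None] - idx[None, :])**2)   # correlated background errors
H = np.zeros((m, n)); H[0, 1] = H[1, 3] = 1.0         # observe components 1 and 3
R = 0.1 * np.eye(m)                                   # observation error covariance
y = np.array([1.0, -0.5])                             # observations

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)          # gain matrix
x_a = x_b + K @ (y - H @ x_b)                         # analysis state
# The correlations in B spread the two observations to neighbouring
# components, which is how sparse data constrain a full state vector.
```

The methods in the SAM hierarchy differ mainly in how B (or its reduced-order, evolving counterpart) is represented and updated, not in the form of this update itself.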

Relevance: 100.00%

Abstract:

This paper describes a framework architecture for the automated re-purposing and efficient delivery of multimedia content stored in CMSs. It deploys specifically designed templates as well as adaptation rules based on a hierarchy of profiles to accommodate user, device and network requirements, invoked as constraints in the adaptation process. The user profile provides information in accordance with the opt-in principle, while the device and network profiles provide operational constraints such as resolution and bandwidth limitations. The profile hierarchy ensures that the adaptation privileges the user's preferences. As part of the adaptation, we took into account support for users' special needs, and therefore adopted a template-based approach that simplifies the adaptation process by integrating accessibility-by-design in the template.
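A minimal sketch of how such a profile hierarchy can privilege the user's preferences: profiles are merged in priority order, so user values override device and network defaults where they overlap, while non-conflicting constraints pass through. The field names are hypothetical, not the framework's schema.

```python
# Profiles ordered lowest to highest priority; fields are hypothetical.
network_profile = {"max_bandwidth_kbps": 512}
device_profile = {"resolution": "640x480"}
user_profile = {"resolution": "320x240", "captions": True}  # opt-in preferences

def resolve(*profiles):
    """Merge profiles lowest-priority first so that later (higher
    priority) values override earlier ones: the user profile wins."""
    merged = {}
    for profile in profiles:
        merged.update(profile)
    return merged

constraints = resolve(network_profile, device_profile, user_profile)
# resolution comes from the user profile, bandwidth from the network one
```

For hard operational limits a real system would combine rather than override (for example, take the minimum of two bandwidth caps); the sketch shows only the precedence that privileges the user.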

Relevance: 100.00%

Abstract:

The results from a range of different signal processing schemes used for the further processing of THz transients are contrasted. The performance of different classifiers after adopting these schemes is also discussed.

Relevance: 100.00%

Abstract:

Typeface design: collaborative work commissioned by Adobe Inc. Published but unreleased. The Adobe Devanagari typefaces were commissioned from Tiro Typeworks and collaboratively designed by Tim Holloway, Fiona Ross and John Hudson, beginning in 2005. The types were officially released in 2009. The design brief was to produce a typeface for modern business communications in Hindi and other languages, to be legible both in print and on screen. Adobe Devanagari was designed to be highly readable in a range of situations including quite small sizes in spreadsheets and in continuous text setting, as well as at display sizes, where the full character of the typeface reveals itself. The construction of the letters is based on traditional penmanship but possesses less stroke contrast than many Devanagari types, in order to maintain strong, legible forms at smaller sizes. To achieve a dynamic, fluid style the design features a rounded treatment of distinguishing terminals and stroke reversals, open counters that also aid legibility at smaller sizes, and delicately flaring strokes. Together, these details reveal an original hand and provide a contemporary approach that is clean, clear and comfortable to read whether in short or long passages of text. This new approach to a traditional script is intended to counter the dominance of rigid, staccato-like effects of straight verticals and horizontals in earlier types and many existing fonts. OpenType Layout features in the fonts provide both automated and discretionary access to an extensive glyph set, enabling sophisticated typography. Many conjuncts preferred in classical literary texts and particularly in some North Indian languages are included; these literary conjuncts may be substituted by specially designed alternative linear forms and fitted half forms. The length of the ikars—ि and ी—varies automatically according to adjacent letter or conjunct width. Regional variants of characters and numerals (e.g. 
Marathi forms) are included as alternates. Careful attention has been given to the placements of all vowel signs and modifiers. The fonts include both proportional and tabular numerals in Indian and European styles. Extensive kerning covers several thousand possible combinations of half forms and full forms to anticipate arbitrary conjuncts in foreign loan words.

Relevance: 100.00%

Abstract:

This article critically examines the nature and quality of governance in community representation and civil society engagement in the context of trans-national large-scale mining, drawing on experiences in the Anosy Region of south-east Madagascar. An exploration of functional relationships between government, mining business and civil society stakeholders reveals an equivocal legitimacy of certain civil society representatives, created by state manipulation, which contributes to community disempowerment. The appointment of local government officials, rather than their election, creates a hierarchy of upward dependencies and a culture in which the majority of officials express similar views and political alliances. As a consequence, community resistance is suppressed. Voluntary mechanisms such as Corporate Social Responsibility (CSR) and the Extractive Industries Transparency Initiative (EITI) advocate community stakeholder engagement in decision-making processes as a measure to achieve public accountability. In many developing countries, where there is a lack of transparency and high levels of corruption, the value of this engagement, however, is debatable. Findings from this study indicate that the power relationships which exist between stakeholders in the highly lucrative mining industry override efforts to achieve "good governance" through voluntary community engagement. The continuing challenge lies in identifying where responsibility sits in order to address this power struggle and achieve fair representation.

Relevance: 100.00%

Abstract:

In the year 1702 two books were published, in Oxford and Paris, that can now be seen as defining the presses that produced them. In Paris, the Imprimerie Royale issued the Médailles sur les principaux évènements du règne de Louis le Grand, a large folio of text and plates intended to glorify the regime of Louis XIV. In Oxford, the first, large format volume of Clarendon’s The history of the rebellion appeared; painstakingly edited at Christ Church, it brought prestige and profit to the University. Both were considerable statements of publishing intent in graphic form: both were sumptuous, and both used types and decorations reserved to their respective presses. But the French book points the way to future developments in typography, particularly in the design of type, while the Oxford book is a summation of the past, and its types and page design would be abandoned by the Oxford press in little more than thirty years. Tracing the printed pages of Oxford books from the late sixteenth to the mid-eighteenth century shows changes that parallel wider developments in English and European typography, but from a distinctly Oxford perspective.