915 results for state-space methods
Abstract:
Background: The majority of studies investigating the neural mechanisms underlying treatment in people with aphasia have examined task-based brain activity. However, resting-state fMRI provides another method of examining the brain mechanisms responsible for treatment-induced recovery, and allows for investigation of connectivity within complex functional networks. Methods: Eight people with aphasia underwent 12 treatment sessions that aimed to improve object naming. Half the sessions employed a phonologically based task and half employed a semantically based task, with resting-state fMRI conducted pre- and post-treatment. Brain regions in which the amplitude of low-frequency fluctuations (ALFF) correlated with treatment outcomes were used as seeds for functional connectivity (FC) analysis. FC maps were compared from pre- to post-treatment, as well as with a group of 12 healthy older controls. Results: Pre-treatment ALFF in the right middle temporal gyrus (MTG) correlated with better outcomes for the phonological treatment, with a shift to the left MTG and supramarginal gyrus, as well as the right inferior frontal gyrus, post-treatment. When compared to controls, participants with aphasia showed both normalization and up-regulation of connectivity within language networks post-treatment, predominantly in the left hemisphere. Conclusions: The results provide preliminary evidence that treatments for naming impairments affect the FC of language networks, and may aid in understanding the neural mechanisms underlying the rehabilitation of language post-stroke.
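At its core, the seed-based FC analysis described above reduces to correlating a seed region's mean time series with every other voxel's time series. The sketch below illustrates this on synthetic data; the array sizes and the 10-voxel "seed" are illustrative placeholders, not the study's actual regions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_timepoints, n_voxels = 150, 500
data = rng.normal(size=(n_timepoints, n_voxels))   # stand-in BOLD signals
seed_ts = data[:, :10].mean(axis=1)                # mean of a 10-voxel "seed"

# Pearson correlation of the seed with every voxel yields the FC map.
seed_z = (seed_ts - seed_ts.mean()) / seed_ts.std()
data_z = (data - data.mean(axis=0)) / data.std(axis=0)
fc_map = (data_z * seed_z[:, None]).mean(axis=0)
print(fc_map.shape)
```

In a real analysis, thresholded FC maps from pre- and post-treatment sessions would then be compared voxel-wise.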
Abstract:
Several genetic variants are thought to influence white matter (WM) integrity, measured with diffusion tensor imaging (DTI). Voxel-based methods can test genetic associations, but heavy multiple-comparisons corrections are required to adjust for searching the whole brain and for all genetic variants analyzed. Thus, genetic associations are hard to detect even in large studies. Using a recently developed multi-SNP analysis, we examined the joint predictive power of a group of 18 cholesterol-related single nucleotide polymorphisms (SNPs) on WM integrity, measured by fractional anisotropy. To boost power, we limited the analysis to brain voxels that showed significant associations with total serum cholesterol levels. From this space, we identified two genes with effects that replicated in individual voxel-wise analyses of the whole brain. Multivariate analyses of genetic variants on a reduced anatomical search space may help to identify the SNPs with the strongest effects on the brain from a broad panel of genes.
Abstract:
In recent years, rapid advances in information technology have led to various data collection systems which are enriching the sources of empirical data for use in transport systems. Currently, traffic data are collected through various sensors, including loop detectors, probe vehicles, cell phones, Bluetooth, video cameras, remote sensing and public transport smart cards. It has been argued that combining the complementary information from multiple sources will generally result in better accuracy, increased robustness and reduced ambiguity. Despite substantial advances in data assimilation techniques to reconstruct and predict the traffic state from multiple data sources, such methods are generally data-driven and do not fully utilize the power of traffic models. Furthermore, the existing methods are still limited to freeway networks and are not yet applicable in the urban context due to the increased complexity of the flow behavior. The main traffic phenomena on urban links are generally caused by the boundary conditions at intersections, signalized or un-signalized, at which the switching of the traffic lights and the turning maneuvers of the road users lead to shock-wave phenomena that propagate upstream of the intersections. This paper develops a new model-based methodology for building a real-time traffic prediction model for arterial corridors using data from multiple sources, particularly loop detectors and partial observations from Bluetooth and GPS devices.
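The data assimilation mentioned above can be illustrated, in its simplest form, by a scalar Kalman filter in which a toy traffic model propagates the state (say, a link's vehicle density) and a noisy detector reading corrects it. All dynamics and noise values below are illustrative assumptions, not calibrated to any real corridor.

```python
import numpy as np

rng = np.random.default_rng(0)
a, q, r = 0.95, 0.05, 0.5        # model dynamics, process and observation noise variances
truth, x, p = 10.0, 5.0, 1.0     # true density, state estimate, estimate variance
errors = []
for _ in range(50):
    truth = a * truth + 2.0 + rng.normal(scale=q ** 0.5)   # "real" traffic evolution
    z = truth + rng.normal(scale=r ** 0.5)                 # noisy detector reading
    x, p = a * x + 2.0, a * a * p + q                      # model prediction step
    k = p / (p + r)                                        # Kalman gain
    x, p = x + k * (z - x), (1 - k) * p                    # measurement update
    errors.append(abs(x - truth))
print(f"mean absolute error over last 10 steps: {np.mean(errors[-10:]):.2f}")
```

A model-based urban method would replace the scalar dynamics with a traffic-flow model capturing shock waves and signal timing, and fuse several observation streams instead of one.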
Abstract:
Lateralization of temporal lobe epilepsy (TLE) is critical for successful outcome of surgery to relieve seizures. TLE affects brain regions beyond the temporal lobes and has been associated with aberrant brain networks, based on evidence from functional magnetic resonance imaging. We present here a machine learning-based method for determining the laterality of TLE, using features extracted from resting-state functional connectivity of the brain. A comprehensive feature space was constructed to include network properties within local brain regions, between brain regions, and across the whole network. Feature selection was performed based on a random forest, and a support vector machine was employed to train a linear model to predict the laterality of TLE in unseen patients. A leave-one-patient-out cross-validation was carried out on 12 patients, and a prediction accuracy of 83% was achieved. The importance of the selected features was analyzed to demonstrate the contribution of resting-state connectivity attributes at the voxel, region, and network levels to TLE lateralization.
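A minimal sketch of the pipeline this abstract describes (random-forest feature ranking followed by a linear SVM under leave-one-patient-out cross-validation) might look as follows. scikit-learn is assumed; the connectivity features are synthetic stand-ins and all sizes are illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneOut
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_patients, n_features = 12, 50
X = rng.normal(size=(n_patients, n_features))   # stand-in connectivity features
y = np.array([0, 1] * 6)                        # 0 = left-lateralized, 1 = right
X[y == 1, :5] += 2.0                            # make a few features informative

correct = 0
for train, test in LeaveOneOut().split(X):
    # Rank features on the training fold only, to avoid leakage.
    rf = RandomForestClassifier(n_estimators=100, random_state=0)
    rf.fit(X[train], y[train])
    top = np.argsort(rf.feature_importances_)[-10:]   # keep the 10 best features
    svm = LinearSVC(dual=False)
    svm.fit(X[train][:, top], y[train])
    correct += int(svm.predict(X[test][:, top])[0] == y[test][0])

accuracy = correct / n_patients
print(f"leave-one-patient-out accuracy: {accuracy:.2f}")
```

With only 12 patients, performing the feature selection inside each fold, as above, is essential for an honest accuracy estimate.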
Abstract:
This project was a step forward in applying statistical methods and models to provide new insights for more informed decision-making at large spatial scales. The model has been designed to address complicated effects of ecological processes that govern the state of populations and uncertainties inherent in large spatio-temporal datasets. Specifically, the thesis contributes to better understanding and management of the Great Barrier Reef.
Abstract:
Governments and intergovernmental organisations have long recognised that space communities – the ultimate ‘settlements at the edge’ – will exist one day and have based their first plans for these on another region ‘at the edge’, the Antarctic. United States President Eisenhower proposed to the United Nations in 1960 that the principles of the Antarctic Treaty be applied to outer space and celestial bodies (State Department, n.d.). Three years later the UN adopted the Declaration of Legal Principles Governing the Activities of States in the Exploration and Use of Outer Space, and in 1967 that became the Outer Space Treaty. According to the UN Office for Outer Space Affairs, ‘the Treaty was opened for signature by the three depository Governments (the Russian Federation, the United Kingdom and the United States of America) in January 1967, and it entered into force in October 1967’ (Office for Outer Space Affairs, n.d.). The status of the treaty (at time of writing) was 89 signatories and 102 parties (Office for Disarmament Affairs, n.d.). Other related instruments include the Rescue Agreement, the Liability Convention, the Registration Convention and the Moon Agreement (Office for Outer Space Affairs, n.d.-a). Jumping to the present, a news agency reported in July 2014 (Reuters, 2014) that the British Government had shortlisted eight aerodromes in its search for a potential base for the UK’s first spaceplane flights, which Ministers want to happen by 2018 (UK Space Agency, 2014). The United States already has a spaceport, in New Mexico (Cokley, Rankin, Heinrich, & McAuliffe, 2013)...
Abstract:
Interaction of tetrathiafulvalene (TTF) and tetracyanoethylene (TCNE) with few-layer graphene samples prepared by the exfoliation of graphite oxide (EG), conversion of nanodiamond (DG) and arc-evaporation of graphite in hydrogen (HG) has been investigated by Raman spectroscopy to understand the role of the graphene surface. The position and full-width at half maximum of the Raman G-band are affected on interaction with TTF and TCNE and the effect is highest with EG and least with HG. The effect of TTF and TCNE on the 2D-band is also maximum with EG. The magnitude of interaction between the donor/acceptor molecules varies in the same order as the surface areas of the graphenes.
Abstract:
l-Lysine acetate crystallises in the monoclinic space group P21 with a = 5.411(1), b = 7.562(1), c = 12.635(2) Å and β = 91.7(1)°. The crystal structure was solved by direct methods and refined to an R value of 0.049 using the full-matrix least-squares method. The conformation and the aggregation of lysine molecules in the structure are similar to those found in the crystal structure of l-lysine l-aspartate. A conspicuous similarity between the crystal structures of l-arginine acetate and l-lysine acetate is that in both cases the strongly basic side chain, although having the largest pK value, interacts with the weakly acidic acetate group, leaving the α-amino and the α-carboxylate groups to take part in head-to-tail sequences. These structures thus indicate that electrostatic effects are strongly modulated by other factors so as to give rise to head-to-tail sequences, which have earlier been shown to be an almost universal feature of amino acid aggregation in the solid state.
Abstract:
Vermicular graphite cast iron is a new addition to the family of cast irons. Various methods for producing vermicular graphite cast iron are briefly discussed in this paper. The mechanical and physical properties of cast irons with vermicular graphite have been found to be intermediate between those of gray and ductile irons. Other properties such as casting characteristics, scaling resistance, damping capacity and machinability have been compared with those of gray and ductile irons. Probable applications of vermicular graphite cast irons are suggested.
Abstract:
Many active pharmaceutical ingredients (APIs) have both anhydrate and hydrate forms. Due to the different physicochemical properties of solid forms, changes in the solid state may result in therapeutic, pharmaceutical, legal and commercial problems. In order to obtain good solid dosage form quality and performance, there is a constant need to understand and control these phase transitions during manufacturing and storage. Thus it is important to detect and also quantify the possible transitions between the different forms. In recent years, vibrational spectroscopy has become an increasingly popular tool to characterise the solid-state forms and their phase transitions. It offers several advantages over other characterisation techniques, including the ability to obtain molecular-level information, minimal sample preparation, and the possibility of monitoring changes non-destructively in-line. Dehydration is a phase transition of hydrates that is frequently encountered during dosage form production and storage. The aim of the present thesis was to investigate the dehydration behaviour of diverse pharmaceutical hydrates by near infrared (NIR), Raman and terahertz pulsed spectroscopic (TPS) monitoring together with multivariate data analysis. The goal was to reveal new perspectives for investigation of dehydration at the molecular level. Solid-state transformations were monitored during dehydration of diverse hydrates on a hot stage. The results obtained from qualitative experiments were used to develop a method and perform the quantification of the solid-state forms during process-induced dehydration in a fluidised bed dryer. Both in situ and in-line process monitoring and quantification were performed. This thesis demonstrated the utility of vibrational spectroscopy techniques and multivariate modelling to monitor and investigate dehydration behaviour in situ and during fluidised bed drying.
All three spectroscopic methods proved complementary in the study of dehydration. NIR spectroscopy models could quantify the solid-state forms in the binary system, but were unable to quantify all the forms in the quaternary system. Raman spectroscopy models, on the other hand, could quantify all four solid-state forms that appeared upon isothermal dehydration. The speed of spectroscopic methods makes them applicable for monitoring dehydration, and the quantification of multiple forms was performed during the phase transition. Thus solid-state structural information at the molecular level was directly obtained. TPS detected the intermolecular phonon modes, whereas Raman spectroscopy detected mostly changes in intramolecular vibrations. Both techniques revealed information about crystal structure changes. NIR spectroscopy, on the other hand, was more sensitive to the water content and the hydrogen bonding environment of water molecules. This study provides a basis for real-time process monitoring using vibrational spectroscopy during pharmaceutical manufacturing.
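One simple multivariate route to the kind of quantification described above is a classical least-squares fit of a measured spectrum against pure-component spectra. The Gaussian "bands" below are synthetic stand-ins for NIR or Raman data, not results from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)                              # stand-in wavenumber axis
pure = np.stack([np.exp(-((x - 0.3) ** 2) / 0.002),     # "hydrate" band
                 np.exp(-((x - 0.7) ** 2) / 0.002)])    # "anhydrate" band

true_frac = 0.35                                        # hydrate fraction of the sample
spectrum = true_frac * pure[0] + (1 - true_frac) * pure[1]
spectrum += rng.normal(scale=0.01, size=x.size)         # measurement noise

# Least-squares fit of the mixture spectrum against the pure spectra.
coefs, *_ = np.linalg.lstsq(pure.T, spectrum, rcond=None)
est = coefs[0] / coefs.sum()                            # normalised hydrate fraction
print(f"estimated hydrate fraction: {est:.3f}")
```

Methods such as PLS regression generalise this idea to cases where pure-component spectra overlap heavily or are not known in advance.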
Abstract:
This dissertation is a theoretical study of finite-state based grammars used in natural language processing. The study is concerned with certain varieties of finite-state intersection grammars (FSIG) whose parsers define regular relations between surface strings and annotated surface strings. The study focuses on the following three aspects of FSIGs: (i) Computational complexity of grammars under limiting parameters. In the study, the computational complexity in practical natural language processing is approached through performance-motivated parameters on structural complexity. Each parameter splits some grammars in the Chomsky hierarchy into an infinite set of subset approximations. When the approximations are regular, they seem to fall into the logarithmic-time hierarchy and the dot-depth hierarchy of star-free regular languages. This theoretical result is important and possibly relevant to grammar induction. (ii) Linguistically applicable structural representations. Related to the linguistically applicable representations of syntactic entities, the study contains new bracketing schemes that cope with dependency links, left- and right-branching, crossing dependencies and spurious ambiguity. New grammar representations that resemble the Chomsky-Schützenberger representation of context-free languages are presented in the study, and they include, in particular, representations for mildly context-sensitive non-projective dependency grammars whose performance-motivated approximations are parseable in linear time. (iii) Compilation and simplification of linguistic constraints. Efficient compilation methods for certain regular operations such as generalized restriction are presented. These include an elegant algorithm that has already been adopted as the approach in a proprietary finite-state tool. In addition to the compilation methods, an approach to on-the-fly simplifications of finite-state representations for parse forests is sketched.
These findings are tightly coupled with each other under the theme of locality. I argue that the findings help us to develop better, linguistically oriented formalisms for finite-state parsing and to develop more efficient parsers for natural language processing. Keywords: syntactic parsing, finite-state automata, dependency grammar, first-order logic, linguistic performance, star-free regular approximations, mildly context-sensitive grammars
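The finite-state intersection at the heart of FSIG parsing can be illustrated with the classic product construction on two DFAs; the toy languages over {a, b} below are purely illustrative, not grammars from the dissertation.

```python
def intersect(dfa1, dfa2):
    """Product DFA accepting exactly the strings both inputs accept.
    A DFA is (start, accepting_set, transitions{(state, symbol): state})."""
    (s1, f1, t1), (s2, f2, t2) = dfa1, dfa2
    start = (s1, s2)
    trans, accept, todo, seen = {}, set(), [start], {start}
    while todo:
        p, q = todo.pop()
        if p in f1 and q in f2:
            accept.add((p, q))
        for sym in {a for (_, a) in t1}:
            if (p, sym) in t1 and (q, sym) in t2:
                nxt = (t1[(p, sym)], t2[(q, sym)])
                trans[((p, q), sym)] = nxt
                if nxt not in seen:
                    seen.add(nxt)
                    todo.append(nxt)
    return start, accept, trans

def accepts(dfa, word):
    state, acc, trans = dfa
    for sym in word:
        if (state, sym) not in trans:
            return False
        state = trans[(state, sym)]
    return state in acc

# "even number of a's" intersected with "ends in b"
even_a = (0, {0}, {(0, 'a'): 1, (1, 'a'): 0, (0, 'b'): 0, (1, 'b'): 1})
ends_b = (0, {1}, {(0, 'a'): 0, (1, 'a'): 0, (0, 'b'): 1, (1, 'b'): 1})
both = intersect(even_a, ends_b)
print(accepts(both, "aab"), accepts(both, "ab"))
```

An FSIG parser applies the same idea with many constraint automata intersected against the automaton encoding the sentence's readings, which is why on-the-fly simplification of intermediate results matters.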
Abstract:
National anniversaries such as independence days demand precise coordination in order to make citizens change their routines to forgo work and spend the day at rest or at festivities that provide social focus and spectacle. The complex social construction of national days is taken for granted and operates as a given in the news media, which are the main agents responsible for coordinating these planned disruptions of normal routines. This study examines the language used in the news to construct the rather unnatural idea of national days and to align people in observing them. The data for the study consist of news stories about the Fourth of July in the New York Times, sampled over 150 years, and are supplemented by material from other sources and other countries. The study is multidimensional, applying concepts from pragmatics (speech acts, politeness, information structure), systemic functional linguistics (the interpersonal metafunction and the Appraisal framework) and cognitive linguistics (frames, metaphor), as well as journalism and communications, to arrive at an interdisciplinary understanding of how resources for meaning are used by writers and readers of the news stories. The analysis shows that on national anniversaries, nations tend to be metaphorized as persons having birthdays, to whom politeness should be shown. The face of the nation is to be respected in the sense of identifying the nation's interests as one's own (positive face) and speaking of citizen responsibilities rather than rights (negative face). Resources are available for both positive and negative evaluations of events and participants, and the newspaper deftly changes footings (Goffman 1981) to demonstrate the required politeness while also heteroglossically allowing for a certain amount of disattention and even protest - within limits, for state holidays are almost never construed as Bakhtinian festivals, as they tend to reaffirm the hierarchy rather than invert it.
Celebrations are evaluated mainly for impressiveness, and for the essentially contested quality of appropriateness, which covers norms of predictability, size, audience response, aesthetics, and explicit reference to the past. Events may also be negatively evaluated as dull ("banal") or inauthentic ("hoopla"). Audiences are evaluated chiefly in terms of their enthusiasm, or production of appropriate displays of emotional response, for national days are supposed to be occasions of flooding-out of nationalistic feeling. By making these evaluations, the newspaper reinforces its powerful position as an independent critic, while at the same time playing an active role in the construction and reproduction of the emotional order embodied in "the nation's birthday." As an occasion for mobilization and demonstrations of power, national days may be seen to stand to war in the relation of play to fighting (Bateson 1955). Evidence from the newspaper's coverage of recent conflicts is adduced to support this analysis. In the course of the investigation, methods are developed for analyzing large collections of newspaper content, particularly topical soft news and feature materials that have hitherto been considered less influential and less worthy of study than so-called hard news. In his work on evaluation in newspaper stories, White (1998) proposed that the classic hard news story is focused on an event that threatens the social order, but news of holidays and celebrations in general does not fit this pattern; in fact, its central event is a reproduction of the social order. Thus in the system of news values (Galtung and Ruge 1965), national holiday news draws on "ground" news values such as continuity and predictability rather than "figure" news values such as negativity and surprise.
It is argued that this ground helps form a necessary space for hard news to be seen as important, similar to the way in which the information structure of language is seen to rely on the regular alternation of given and new information (Chafe 1994).
Abstract:
Domain-invariant representations are key to addressing the domain shift problem where the training and test examples follow different distributions. Existing techniques that have attempted to match the distributions of the source and target domains typically compare these distributions in the original feature space. This space, however, may not be directly suitable for such a comparison, since some of the features may have been distorted by the domain shift, or may be domain specific. In this paper, we introduce a Domain Invariant Projection approach: an unsupervised domain adaptation method that overcomes this issue by extracting the information that is invariant across the source and target domains. More specifically, we learn a projection of the data to a low-dimensional latent space where the distance between the empirical distributions of the source and target examples is minimized. We demonstrate the effectiveness of our approach on the task of visual object recognition and show that it outperforms state-of-the-art methods on a standard domain adaptation benchmark dataset.
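The distance between empirical distributions that such a projection minimizes is commonly instantiated as the maximum mean discrepancy (MMD). The numpy sketch below computes a Gaussian-kernel MMD between source and target samples; the bandwidth and the synthetic data are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def mmd2(X, Y, sigma=1.0):
    """Squared maximum mean discrepancy under a Gaussian (RBF) kernel."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)   # pairwise sq. distances
        return np.exp(-d2 / (2.0 * sigma ** 2))
    return k(X, X).mean() + k(Y, Y).mean() - 2.0 * k(X, Y).mean()

rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, size=(200, 3))       # "source domain" features
tgt_near = rng.normal(0.1, 1.0, size=(200, 3))  # mild domain shift
tgt_far = rng.normal(2.0, 1.0, size=(200, 3))   # severe domain shift
print(round(mmd2(src, tgt_near), 4), round(mmd2(src, tgt_far), 4))
```

A domain adaptation method would then search for a projection W that makes mmd2(src @ W, tgt @ W) small while preserving discriminative structure.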
Abstract:
The crystal and molecular structure of the compound 4-cyanobiphenyl-4'-heptylbiphenyl carboxylate (7CBB), which exhibits both monolayer smectic A and nematic phases, has been determined by direct methods using single-crystal X-ray diffraction data. The structure is monoclinic with the space group P21/c and Z = 4. The unit cell parameters are a = 16.9550(5) Å, b = 5.5912(18) Å, c = 27.5390(9) Å, α = 90.000°, β = 93.986(6)°, and γ = 90.000°. The packing of the molecules is found to be a precursor to the SmC phase, although the SmA1 phase is observed on melting. Several strong van der Waals interactions are observed in the core part of neighbouring molecular pairs. The crystal-to-mesophase transition is probably of a reconstitutive nature. The geometry, packing, and nature of the crystal-mesophase transition are compared to those in 6CBB.
Abstract:
Between-subject and within-subject variability is ubiquitous in biology and physiology, and understanding and dealing with it is one of the biggest challenges in medicine. At the same time, it is difficult to investigate this variability by experiments alone. A recent modelling and simulation approach, known as population of models (POM), allows this exploration to take place by building a mathematical model consisting of multiple parameter sets calibrated against experimental data. However, finding such sets within the high-dimensional parameter space of complex electrophysiological models is computationally challenging. By placing the POM approach within a statistical framework, we develop a novel and efficient algorithm based on sequential Monte Carlo (SMC). We compare the SMC approach with Latin hypercube sampling (LHS), a method commonly adopted in the literature for obtaining the POM, in terms of efficiency and output variability in the presence of a drug block, through an in-depth investigation of the Beeler-Reuter cardiac electrophysiological model. We show improved efficiency via SMC, and that it produces responses similar to LHS when making out-of-sample predictions in the presence of a simulated drug block.
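Latin hypercube sampling, the baseline against which the SMC approach is compared, can be sketched in a few lines of numpy: each parameter's range is cut into n strata and each stratum is sampled exactly once. The parameter bounds and the acceptance rule below are illustrative placeholders, not those of the Beeler-Reuter model.

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """n samples in [0, 1)^d with exactly one sample per stratum per dimension."""
    samples = np.empty((n, d))
    for j in range(d):
        # permute the n strata, then jitter uniformly within each stratum
        samples[:, j] = (rng.permutation(n) + rng.random(n)) / n
    return samples

rng = np.random.default_rng(0)
n_models, n_params = 100, 4
lower = np.array([0.5, 0.1, 1.0, 0.0])            # illustrative parameter bounds
upper = np.array([1.5, 0.9, 3.0, 2.0])
candidates = lower + latin_hypercube(n_models, n_params, rng) * (upper - lower)

# A population of models keeps only candidates whose simulated output matches
# experimental data; a trivial stand-in acceptance rule is used here.
accepted = candidates[candidates[:, 0] < 1.0]
print(candidates.shape, len(accepted))
```

In the POM setting, the acceptance step would run the electrophysiological model for each candidate and compare simulated biomarkers against experimental calibration ranges, which is exactly the expensive step that SMC aims to make more efficient.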