859 results for REPRESENTATIONS OF PARTIALLY ORDERED SETS
Abstract:
The relationships and phylogeny of the western Palearctic harvestman family Trogulidae are investigated. The traditional system of seven genera and approximately 40 species appeared to be artificially composed, and a phylogenetic approach and a comprehensive revision have long been sought after. Species are poorly characterised due to their uniform morphology, and species evaluation is furthermore complicated by the variability of the few characters used for species delineation. To meet these demands, a molecular genetic analysis was carried out using the nuclear 28S rRNA gene and the mitochondrial cytochrome b gene. This analysis incorporates most genera and species of Trogulidae as well as a comprehensive set of Nemastomatidae and Dicranolasmatidae as outgroup taxa. Phylogenetic results of Bayesian analysis, Maximum Parsimony, Maximum Likelihood and Neighbor Joining are compared with distributional data, morphological characters and results of canonical discriminant analysis of morphometric characters, and general congruence of these data sets is shown. To demonstrate the applicability of this method, the revision of two species-groups within Trogulus is set out in detail. The Trogulus hirtus species-group and the Trogulus coriziformis species-group are revised. The former is distributed in the central and north-western Balkan Peninsula. T. tricarinatus ssp. hirtus is raised to species level and four new species are described (T. karamanorum [man.n.], T. melitensis [man.n.], T. pharensis [man.n.], T. thaleri [man.n.]). The Trogulus coriziformis species-group is confined to the western Mediterranean area. T. coriziformis and T. aquaticus are re-described, T. cristatus and T. lusitanicus are re-established, and four species are described as new (T. balearicus, T. huberi, T. prietoi, T. pyrenaicus). In both species-groups two further cryptic species probably exist but were not described.
The species-groups are shown to represent different phylogenetic levels, and this information is used for the revisional work on the genus Trogulus as well as for the generic system of Trogulidae. Family status of Dicranolasmatidae is rejected, and Dicranolasma is shown to be best incorporated within Trogulidae. Calathocratus, Platybessobius and Trogulocratus appear to be polyphyletic and are best united under Calathocratus, the oldest name of this set. The cryptic diversity within Trogulidae, especially in Trogulus and the merged genus Calathocratus, amounts to 150-235% and is thereby remarkably high for a group of the generally well-researched European fauna. Genetic features of the group, such as heteroplasmy, the possibility of major gene rearrangements and the usability of the cytochrome b gene for phylogenetic studies in Opiliones, are outlined.
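Species delineation from cytochrome b data of the kind described above typically begins with pairwise sequence divergence. A minimal sketch of the uncorrected p-distance, with hypothetical aligned fragments (not sequences from the thesis):

```python
# Uncorrected pairwise p-distance, a simple first screen for deep (possibly
# cryptic) divergence in mitochondrial data. The two fragments below are
# invented placeholders, not data from the study.
def p_distance(seq_a: str, seq_b: str) -> float:
    assert len(seq_a) == len(seq_b), "aligned fragments required"
    diffs = sum(1 for x, y in zip(seq_a, seq_b) if x != y)
    return diffs / len(seq_a)

frag_1 = "ATGACCAACATCCGAAAAACCCACCCACTA"
frag_2 = "ATGACTAACATTCGAAAAACCCATCCGCTA"
print(round(p_distance(frag_1, frag_2), 3))   # 0.133: 4 mismatches over 30 sites
```

In practice such raw distances would be computed over full alignments and fed into the distance-based methods mentioned in the abstract (e.g., Neighbor Joining).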
Abstract:
In this work, supramolecular organic systems based on rigid pi-conjugated building blocks and flexible side chains were studied via solid-state NMR spectroscopy. Specifically, these studies focussed on phenylene ethynylene based macrocycles, polymer systems including polythiophenes, and rod-coil copolymers of oligo(p-benzamide) and poly(ethylene glycol). All systems were studied in terms of local order and mobility. The central topic of this dissertation was to elucidate the role of the flexible side chains in the interplay of different non-covalent interactions, such as pi-pi-stacking and hydrogen bonding. Combining the results of this work, it can be concluded that the ratio of the rigid block to the attached alkyl side chains can be crucial for the design of an ordered pi-conjugated supramolecular system. Through alkyl side chains it is also possible to introduce liquid-crystalline phases into the system, which can foster its local order. Moreover, in the studied systems, longer unbranched alkyl side chains are better suited to stabilizing the corresponding aggregation than shorter, branched ones. The combination of non-covalent interactions such as pi-pi-stacking and hydrogen bonding plays an important role in structure formation. However, the effect of pi-pi-stacking interactions is much weaker than that of hydrogen bonding and is only observed in systems with a suitable local order. Hence, pi-pi-stacking interactions are often not strong enough to control the local order. In contrast, hydrogen bonds predominantly influence the structural organization and packing. In comparison, the size of the alkyl side chains is of only minor importance. The suppression of certain hydrogen bonds can lead to completely different structures and can induce a specific aggregation behavior.
Thus, in the design of an ordered supramolecular system, the presence of hydrogen bonding efficiently stabilizes the corresponding structure, but the ratio of hydrogen-bond-forming groups should be kept low in order to be able to influence the structure selectively.
Abstract:
Among all possible realizations of quark and antiquark assemblies, the nucleon (the proton and the neutron) is the most stable of all hadrons and has consequently been the subject of intensive study. Its mass, shape, radius and more complex representations of its internal structure have been measured for several decades using different probes. The proton (spin 1/2) is described by the electric GE and magnetic GM form factors, which characterise its internal structure. The simplest way to measure the proton form factors consists in measuring the angular distribution of electron-proton elastic scattering, accessing the so-called Space-Like region where q2 < 0. Using the crossed channel antiproton proton <--> e+e-, one accesses another kinematical region, the so-called Time-Like region where q2 > 0. However, due to the antiproton proton <--> e+e- threshold q2th, only the kinematical domain q2 > q2th > 0 is available. To access the unphysical region, one may use the antiproton proton --> pi0 e+ e- reaction, where the pi0 takes away a part of the system energy, allowing q2 to be varied between q2th and almost 0. This thesis aims to show the feasibility of such measurements with the PANDA detector, which will be installed on the new high-intensity antiproton ring at the FAIR facility at Darmstadt. To describe the antiproton proton --> pi0 e+ e- reaction, a Lagrangian-based approach is developed. The 5-fold differential cross section is determined and related to linear combinations of hadronic tensors. Under the assumption of one-nucleon exchange, the hadronic tensors are expressed in terms of the 2 complex proton electromagnetic form factors. An extraction method is developed which provides access to the proton electromagnetic form factor ratio R = |GE|/|GM| and, for the first time in an unpolarized experiment, to the cosine of the phase difference. Such measurements have never before been performed in the unphysical region.
Extended simulations were performed to show how the ratio R and the cosine can be extracted from the positron angular distribution. Furthermore, a model is developed for the antiproton proton --> pi0 pi+ pi- background reaction, considered the most dangerous one. The background-to-signal cross section ratio was estimated under different combinations of cuts on the particle identification information from the different detectors and on the kinematic fits. The background contribution can be reduced to the percent level or even less, with a corresponding signal efficiency ranging from a few % to 30%. The precision of the determination of the ratio R and of the cosine is evaluated from the expected counting rates via the Monte Carlo method. A part of this thesis is also dedicated to more technical work: the study of a prototype of the electromagnetic calorimeter and the determination of its resolution.
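The basic idea of extracting a form factor ratio from an angular distribution can be sketched as a two-parameter linear fit. The angular form, the normalization, and all numbers below are assumptions for illustration; they mimic, but do not reproduce, the thesis's 5-fold differential cross section:

```python
import math
import random

# Schematic Time-Like angular distribution (assumed toy form):
#   dN/dcos(theta) ~ a*(1 + cos^2) + b*(1 - cos^2),  with R^2 = tau * b / a,
# where tau stands for q^2/(4*M^2). Hypothetical inputs throughout.
random.seed(1)
tau = 1.5                       # assumed kinematic point above threshold
R_true = 0.8                    # assumed |GE|/|GM|
a_true, b_true = 1.0, R_true**2 / tau

cos_t = [-0.9 + 1.8 * i / 99 for i in range(100)]
f1 = [1 + c * c for c in cos_t]           # magnetic-like angular term
f2 = [1 - c * c for c in cos_t]           # electric-like angular term
y = [a_true * u + b_true * v + random.gauss(0, 0.005) for u, v in zip(f1, f2)]

# Closed-form least squares for the two linear coefficients a and b
s11 = sum(u * u for u in f1)
s12 = sum(u * v for u, v in zip(f1, f2))
s22 = sum(v * v for v in f2)
t1 = sum(u * w for u, w in zip(f1, y))
t2 = sum(v * w for v, w in zip(f2, y))
det = s11 * s22 - s12 * s12
a_fit = (s22 * t1 - s12 * t2) / det
b_fit = (s11 * t2 - s12 * t1) / det

R_fit = math.sqrt(tau * b_fit / a_fit)    # recovered ratio estimate
```

With realistic counting statistics the same fit would be weighted by the expected event counts per angular bin, which is what drives the precision study mentioned above.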
Abstract:
In this thesis, anodic aluminum oxide (AAO) membranes, which provide well-aligned uniform mesoscopic pores with adjustable pore parameters, were fabricated and successfully utilized as templates for the fabrication of functional organic nanowires, nanorods and the respective well-ordered arrays. The template-assisted patterning technique was successfully applied to several objectives.
High-density and well-ordered arrays of hole-conducting nanorods composed of cross-linked triphenylamine (TPA) and tetraphenylbenzidine (TPD) derivatives on conductive substrates like ITO/glass were successfully fabricated. By applying a freeze-drying technique to remove the aqueous medium after the wet-chemical etching of the template, aggregation and collapse of the rods were prevented and macroscopic areas of perfectly freestanding nanorods were feasible. Based on the hole-conducting nanorod arrays and their subsequent embedding into an electron-conducting polymer matrix via spin-coating, a novel concept for the fabrication of well-ordered all-organic bulk heterojunctions for organic photovoltaic applications was successfully demonstrated. The increased donor/acceptor interface of the fabricated devices resulted in a remarkable increase of the photoluminescence quenching compared to a planar bilayer morphology. Further, the fundamental working principle of the templating approach for the solution-based all-organic photovoltaic device was demonstrated for the first time.
Furthermore, in order to broaden the applicability of patterned surfaces, which are feasible via the template-based patterning of functional materials, AAO membranes with hierarchically branched pores were fabricated and utilized as templates.
By pursuing the common templating process, hierarchical polymeric replicas, which show remarkable similarities with interesting biostructures such as the surface of the lotus leaf and the feet of a gecko, were successfully prepared.
In contrast to the direct infiltration of organic functional materials, a novel route for the fabrication of functional nanowires via post-modification of reactive nanowires was established. For this purpose, reactive nanowires based on cross-linked pentafluorophenyl esters were fabricated by utilizing AAO templates. Their post-modification with fluorescent dyes was demonstrated. Furthermore, reactive wires were converted into well-dispersed poly(N-isopropylacrylamide) (PNIPAM) hydrogels, which exhibit a thermo-responsive reversible phase transition. In the swollen state, the thermo-responsive PNIPAM nanowires were more than 50% longer than in the collapsed state.
Last but not least, the shape-anisotropic pores of AAO were utilized to uniformly align the mesogens of a nematic liquid crystalline elastomer. Liquid crystalline nanowires with a narrow size distribution and uniform orientation of the liquid crystalline material were fabricated. It was shown that during the transition from the nematic to the isotropic phase the rods' length shortened by roughly 40 percent. As such, these liquid crystalline elastomeric nanowires may find application as wire-shaped nanoactuators in various fields of research, such as lab-on-chip systems, microfluidics and biomimetics.
Abstract:
Within the reflection on processes of reproduction and intergenerational transmission of social inequalities, this doctoral thesis proposes an intersectional analysis of the paths towards housing independence of young working-class people from Bologna, and of the role played by their families in these paths. The theoretical part reconstructs the national and international sociological debate on the social and housing inequalities of the younger generations, as well as on the study of social class and the main approaches present in the literature. The fieldwork focuses, in particular, on the ways in which families belonging to a given social class, understood in the sense Bourdieu gave the term, negotiate and support their children's transition to housing independence. The empirical part consists of a retrospective longitudinal qualitative study, conducted in the Bologna area in 2013-2014 on a sample of working-class families. The young adults involved and their parents were asked to reconstruct their biographies through life histories. The research highlights the emergence of specific and creative familial "micro economic systems". The multiple forms of parental support in the children's paths to autonomy, identified in the analysis of the collected material, are thus embedded in a wider system of intergenerational support that continues even after the children leave the family of origin, confirming the close ties that characterize Italian families. The study pays particular attention to the logics of legitimation of parental support, adopted by families to orient their interventions of help.
Finally, by comparing the young adults' conception of independence with that of their parents, the study reveals an apparent contradiction between the representations of independence and housing autonomy and the behaviours enacted in everyday life, a contradiction that is resolved in the negotiation, between the two generations, of the very concept of independence.
Abstract:
Urban centers significantly contribute to anthropogenic air pollution, although they cover only a minor fraction of the Earth's land surface. Since the worldwide degree of urbanization is steadily increasing, the contribution of urban centers to air pollution is expected to become more substantial in future air quality assessments. The main objective of this thesis was to obtain a more profound insight into the dispersion and deposition of aerosol particles from 46 individual major population centers (MPCs), as well as their regional and global influence on the atmospheric distribution of several aerosol types. For the first time, this was assessed in one model framework, for which the global model EMAC was applied with different representations of aerosol particles. First, in an approach with passive tracers and a setup in which the results depend only on the source location and on the size and solubility of the tracers, several metrics and a regional climate classification were used to quantify the major outflow pathways, both vertically and horizontally, and to compare the balance between pollution export away from and pollution build-up around the source points. Ten different mono-modal passive aerosol tracers were continuously released at the same constant rate at each emission point. Then, in a more comprehensive approach, the anthropogenic emissions of key trace species were changed at the MPC locations to determine the cumulative impact of the MPC emissions on the atmospheric aerosol burdens of black carbon, particulate organic matter, sulfate, and nitrate. The results clearly showed that on average about five times more mass is advected quasi-horizontally at low levels than is exported into the upper troposphere. The strength of the low-level export is mainly determined by the location of the source, while the vertical transport is mainly governed by the lifting potential and the solubility of the tracers.
Similar to insoluble gas phase tracers, the low-level export of aerosol tracers is strongest at middle and high latitudes, while the regions of strongest vertical export differ between aerosol (temperate winter dry) and gas phase (tropics) tracers. The emitted mass fraction that is kept around MPCs is largest in regions where aerosol tracers have short lifetimes; this mass is also critical for assessing the impact on humans. However, the number of people who live in a strongly polluted region around urban centers depends more on the population density than on the size of the area which is affected by strong air pollution. Another major result was that fine aerosol particles (diameters smaller than 2.5 micrometer) from MPCs undergo substantial long-range transport, with about half of the emitted mass being deposited beyond 1000 km away from the source. In contrast to this diluted remote deposition, there are areas around the MPCs which experience high deposition rates, especially in regions which are frequently affected by heavy precipitation or are situated in poorly ventilated locations. Moreover, most MPC aerosol emissions are removed over land surfaces. In particular, forests experience more deposition from MPC pollutants than other land ecosystems. In addition, it was found that the generic treatment of aerosols has no substantial influence on the major conclusions drawn in this thesis. Moreover, in the more comprehensive approach, it was found that emissions of black carbon, particulate organic matter, sulfur dioxide, and nitrogen oxides from MPCs influence the atmospheric burden of various aerosol types very differently, with impacts generally being larger for secondary species, sulfate and nitrate, than for primary species, black carbon and particulate organic matter. 
While the changes in the burdens of sulfate, black carbon, and particulate organic matter show an almost linear response to changes in the emission strength, the formation of nitrate was found to depend on many more factors (e.g., the abundance of sulfuric acid) than the strength of the nitrogen oxide emissions alone. The generic tracer experiments were further extended to conduct the first risk assessment of the cumulative risk of contamination from multiple nuclear reactor accidents on the global scale. For this, several factors had to be taken into account: the probability of major accidents, the cumulative deposition field of the radionuclide cesium-137, and a threshold value that defines contamination. By collecting the necessary data and after accounting for uncertainties, it was found that the risk is highest in western Europe, the eastern US, and Japan, where on average contamination by major accidents is expected about every 50 years.
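The recurrence-time logic behind such a risk estimate can be illustrated with a back-of-envelope expectation value. Every number below is a hypothetical placeholder, not the thesis's data:

```python
# Back-of-envelope recurrence time for regional contamination from reactor
# accidents. All inputs are assumed placeholders for illustration only.
n_reactors = 440           # assumed number of operating reactors worldwide
p_accident = 1 / 3000      # assumed major-accident probability per reactor-year
f_region = 0.14            # assumed fraction of accidents that deposit
                           # cesium-137 above threshold in the region considered

events_per_year = n_reactors * p_accident * f_region
recurrence_years = 1 / events_per_year   # expected years between events
print(round(recurrence_years, 1))        # on the order of 50 years here
```

The actual assessment replaces the single fraction `f_region` with a modeled global deposition field and propagates the uncertainties of each input.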
Abstract:
In the present work, a novel access to a large variety of polymer structures based on the clinically approved polymer poly(N-(2-hydroxypropyl)methacrylamide) (PHPMA) was developed. The synthetic route relies on the use of reactive-ester polymers on the one hand and on the reversible addition fragmentation chain transfer (RAFT) polymerization method on the other. This form of controlled radical polymerization made it possible to prepare not only better-defined homopolymers but also statistical and block copolymers. The reactive-ester polymers can be converted into HPMA-based systems by simple aminolysis. They can therefore be regarded as a promising basis for the synthesis of extensive polymer libraries. The polymers prepared combine different functionalities at a constant degree of polymerization. This allows optimization towards a specific application without changing the chain-length parameter.
Furthermore, RAFT polymerization made it possible to prepare partially biodegradable block copolymers based on polylactides and HPMA, by coupling a chain transfer agent (CTA) to a well-defined polylactide homopolymer. These structures were varied in composition, equipped with recognition units (folates) and labels (fluorescent dyes and positron-emitting radionuclides), and subsequently evaluated in vitro and in vivo.
Owing to these achievements, it was possible to investigate the influence of the polymer microstructure on the aggregation behaviour by means of light scattering and fluorescence correlation spectroscopy. It could be shown that only this information about superstructure formation can explain the kinetics of cellular uptake.
The important role of structure-property relationships was thus demonstrated. In addition to the synthesis, characterization and first biological evaluations, this work thereby contributes to a better understanding of the interaction of polymeric particles with biological systems.
Abstract:
Graphene nanoribbons (GNRs), which are defined as nanometer-wide strips of graphene, are attracting increasing attention as one of the most promising materials for future nanoelectronics. Unlike zero-bandgap graphene, which cannot be switched off in transistors, GNRs possess open bandgaps that critically depend on their width and edge structures. GNRs have predominantly been prepared through “top-down” methods such as “cutting” of graphene and “unzipping” of carbon nanotubes, but these methods cannot precisely control the structure of the resulting GNRs. In contrast, the “bottom-up” chemical synthesis approach enables the fabrication of structurally defined and uniform GNRs from tailor-made polyphenylene precursors. Nevertheless, the width and length of the GNRs obtainable by this method have been considerably limited. In this study, lateral as well as longitudinal extension of the GNRs was achieved while preserving the high structural definition, based on bottom-up solution synthesis. Initially, wider (~2 nm) GNRs were synthesized by using laterally expanded monomers through AA-type Yamamoto polymerization, which proved more efficient than the conventional A2B2-type Suzuki polymerization. The wider GNRs showed a broad absorption profile extending to the near-infrared region with a low optical bandgap of 1.12 eV, which indicated the potential of such GNRs for application in photovoltaic cells. Next, high longitudinal extension of narrow (~1 nm) GNRs to over 600 nm was accomplished based on AB-type Diels–Alder polymerization, which provided the corresponding polyphenylene precursors with weight-average molecular weights larger than 600,000 g/mol. Bulky alkyl chains densely installed on the peripheral positions of these GNRs enhanced their liquid-phase processability, which allowed the formation of highly ordered self-assembled monolayers. Furthermore, non-contact time-resolved terahertz spectroscopy measurements demonstrated high charge-carrier mobility within individual GNRs.
Remarkably, lateral extension of the AB-type monomer enabled the fabrication of wider (~2 nm) and long (>100 nm) GNRs through the Diels–Alder polymerization. Such longitudinally extended and structurally well-defined GNRs are expected to allow the fabrication of single-ribbon transistors for the fundamental studies on the electronic properties of the GNRs as well as contribute to the development of future electronic devices.
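The link stated above between a 1.12 eV optical bandgap and absorption extending into the near-infrared follows from the standard photon energy-wavelength conversion:

```python
# Photon energy-to-wavelength conversion: lambda(nm) = h*c/E ≈ 1239.84 / E(eV).
def onset_wavelength_nm(bandgap_ev: float) -> float:
    return 1239.84 / bandgap_ev

lam = onset_wavelength_nm(1.12)   # optical bandgap reported for the wider GNRs
print(round(lam))                 # 1107 nm, well beyond the visible (~780 nm)
```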
Abstract:
Our growing understanding of the human mind and cognition and the development of neurotechnology have triggered debate around cognitive enhancement in neuroethics. This dissertation examines the normative issues of memory enhancement and focuses on two issues: (1) the distinction between memory treatment and enhancement; and (2) how the issue of authenticity concerns memory interventions, including memory treatments and enhancements.
The first part consists of a conceptual analysis of the concepts required for normative considerations. First, the representational nature and the function of memory are discussed. Memory is regarded as a special form of self-representation resulting from a constructive process. Next, the concepts of selfhood, personhood, and identity are examined and a conceptual tool—the autobiographical self-model (ASM)—is introduced. An ASM is a collection of mental representations of the system’s relations with its past and potential future states. Third, the debate between objectivist and constructivist views of health is considered. I argue for a phenomenological account of health, which is based on the primacy of illness and negative utilitarianism.
The second part presents a synthesis of the relevant normative issues based on the conceptual tools developed. I argue that memory enhancement can be distinguished from memory treatment by a demarcation regarding the existence of memory-related suffering. That is, memory enhancements are interventions that aim to manipulate memory function based on the self-interests of the individual, under standard circumstances and without any unwilling suffering or potential suffering resulting from the alteration of memory functions. I then consider the issue of authenticity, namely whether memory intervention or enhancement endangers “one’s true self”.
By analyzing two conceptions of authenticity—authenticity as self-discovery and authenticity as self-creation—I propose that authenticity should be understood in terms of the satisfaction of the functional constraints of an ASM: synchronic coherence, diachronic coherence, and global veridicality. This framework provides clearer criteria for considering the relevant concerns and allows us to examine the moral values of authenticity.
Abstract:
With the increasing use of medical imaging in forensics, as well as the technological advances in rapid prototyping, we suggest combining these techniques to generate displays of forensic findings. We used computed tomography (CT), CT angiography, magnetic resonance imaging (MRI) and surface scanning with photogrammetry in conjunction with segmentation techniques to generate 3D polygon meshes. Based on these data sets, a 3D printer created colored models of the anatomical structures. Using this technique, we could create models of bone fractures, vessels, cardiac infarctions, ruptured organs as well as bitemark wounds. The final models are anatomically accurate, fully colored representations of bones, vessels and soft tissue, and they demonstrate radiologically visible pathologies. The models are more easily understood by laypersons than volume rendering or 2D reconstructions. Therefore, they are suitable for presentations in courtrooms and for educational purposes.
Abstract:
Depending on one’s personal biography, social background and the resultant degree of affectedness, a person has certain ideas about the meaning of, in our example, a World Heritage Site (WHS), what he or she can expect from it, and what his or her relation to it can and should be. The handling of potentially different meaningful spaces is decisive when it comes to the negotiation of pathways towards the sustainable development of a WHS region. Because multiple realities exist in a pluralistic world, they have to be taken seriously and adequately addressed. In this article we identified the ways the Jungfrau-Aletsch WHS was constructed by exploring the visual and verbal representations of the WHS during the decision-making process (1998-2001). The results demonstrate that the visual representations (images) largely presented the WHS as an unspoiled natural environment, similar to a tourist promotion brochure. Such a ‘picture-book’-like portrait has no direct link to the population’s daily needs, their questions and their anxieties about the consequences of a WHS label. By contrast, the verbal representations (articles, letters-to-the-editor, comments) were dominated by issues concerning the economic development of the region, fears of dispossession, and different views on nature. Although the visual and verbal representations differ significantly, their combination might have contributed to the final decision of the majority of the people concerned to support the application for inscription of the Jungfrau-Aletsch-Bietschhorn region in the World Heritage list. The prominence of economic arguments and narratives about intergenerational responsibility in the verbal representations, combined with the aesthetic appeal of the natural environment in the visual representations, might have built a common meaningful space for one part of the population.
Abstract:
Pictorial representations of three-dimensional objects are often used to investigate animal cognitive abilities; however, investigators rarely evaluate whether the animals conceptualize the two-dimensional image as the object it is intended to represent. We tested for picture recognition in lion-tailed macaques by presenting five monkeys with digitized images of familiar foods on a touch screen. Monkeys viewed images of two different foods and learned that they would receive a piece of the one they touched first. After the monkeys demonstrated that they would reliably select images of their preferred foods on one set of foods, they were transferred to images of a second set of familiar foods. We assumed that if the monkeys recognized the images, they would spontaneously select images of their preferred foods from the second set. Three monkeys selected images of their preferred foods significantly more often than chance in their first transfer session. In an additional test of the monkeys' picture recognition abilities, the animals were presented with pairs of food images containing a medium-preference food paired with either a high-preference food or a low-preference food. The same three monkeys selected the medium-preference foods significantly more often when they were paired with low-preference foods and significantly less often when the same foods were paired with high-preference foods. Our novel design provided convincing evidence that macaques recognized the content of two-dimensional images on a touch screen. The results also suggested that the animals understood the connection between the two-dimensional images and the three-dimensional objects they represented.
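Whether a monkey's first-session choices exceed chance is the kind of question a two-choice exact binomial test answers. A minimal sketch with hypothetical trial counts (the study's actual statistics are not reproduced here):

```python
from math import comb

def binom_tail(k: int, n: int, p: float = 0.5) -> float:
    """One-sided exact test: P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical first-transfer session: 24 two-choice trials in which the
# preferred-food image was chosen 19 times (counts invented for illustration).
p_value = binom_tail(19, 24)
print(p_value < 0.05)   # True: 19/24 is well above the chance rate of 12/24
```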
Abstract:
During the 1870s and 1880s, several British women writers traveled by transcontinental railroad across the American West via Salt Lake City, Utah, the capital of the Church of Jesus Christ of Latter-day Saints, or Mormons. These women subsequently wrote books about their travels for a home audience with a taste for adventures in the American West, and particularly for accounts of Mormon plural marriage, which was sanctioned by the Church before 1890. "The plight of the Mormon woman," a prominent social reform and literary theme of the period, situated Mormon women at the center of popular representations of Utah during the second half of the nineteenth century. "The Mormon question" thus lends itself to an analysis of how a stereotyped subaltern group was represented by elite British travelers. These residents of western American territories, however, differed in important respects from the typical subaltern subjects discussed by Victorian travelers. These white, upwardly mobile, and articulate Mormon plural wives attempted to influence observers' representations of them through a variety of narrative strategies. Both the British women travel writers and the Mormon women wrote from the margins of power and credibility, and as interpreters of the Mormon scene were concerned to establish their representational authority.
Abstract:
The generality of findings implicating secondary auditory areas in auditory imagery was tested by using a timbre imagery task with fMRI. Another aim was to test whether activity in supplementary motor area (SMA) seen in prior studies might have been related to subvocalization. Participants with moderate musical background were scanned while making similarity judgments about the timbre of heard or imagined musical instrument sounds. The critical control condition was a visual imagery task. The pattern of judgments in perceived and imagined conditions was similar, suggesting that perception and imagery access similar cognitive representations of timbre. As expected, judgments of heard timbres, relative to the visual imagery control, activated primary and secondary auditory areas with some right-sided asymmetry. Timbre imagery also activated secondary auditory areas relative to the visual imagery control, although less strongly, in accord with previous data. Significant overlap was observed in these regions between perceptual and imagery conditions. Because the visual control task resulted in deactivation of auditory areas relative to a silent baseline, we interpret the timbre imagery effect as a reversal of that deactivation. Despite the lack of an obvious subvocalization component to timbre imagery, some activity in SMA was observed, suggesting that SMA may have a more general role in imagery beyond any motor component.
Abstract:
Mr. Kubon's project was inspired by the growing need for an automatic syntactic analyser (parser) of Czech, which could be used in the syntactic processing of large amounts of text. Mr. Kubon notes that such a tool would be very useful, especially in the field of corpus linguistics, where creating a large-scale "tree bank" (a collection of syntactic representations of natural language sentences) is a very important step towards the investigation of the properties of a given language. The work involved in syntactically parsing a whole corpus in order to get a representative set of syntactic structures would be almost inconceivable without the help of some kind of robust (semi)automatic parser. The need for the automatic natural language parser to be robust increases with the size of the linguistic data in the corpus or in any other kind of text which is going to be parsed. Practical experience shows that apart from syntactically correct sentences, there are many sentences which contain a "real" grammatical error. These sentences may be corrected in small-scale texts, but not generally in a whole corpus. In order to complete the overall project, it was necessary to address a number of smaller problems. These were: 1. the adaptation of a suitable formalism able to describe the formal grammar of the system; 2. the definition of the structure of the system's dictionary containing all relevant lexico-syntactic information, and the development of a formal grammar able to robustly parse Czech sentences from the test suite; 3. filling the syntactic dictionary with sample data allowing the system to be tested and debugged during its development (about 1000 words); 4. the development of a set of sample sentences containing a reasonable amount of grammatical and ungrammatical phenomena covering some of the most typical syntactic constructions used in Czech. Of these, developing the formal grammar was the main task of the project.
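To make the dictionary task concrete, the following is a minimal sketch of what a lexico-syntactic dictionary entry might look like. It is written in Python purely for illustration; the field names, feature set, and the example entry are assumptions, not Mr. Kubon's actual dictionary format.

```python
from dataclasses import dataclass, field

# Hypothetical lexico-syntactic dictionary entry; the field names and
# feature inventory are illustrative, not the project's actual format.
@dataclass
class LexEntry:
    lemma: str                                    # base form of the word
    pos: str                                      # part of speech, e.g. "N", "V", "A"
    features: dict = field(default_factory=dict)  # morphosyntactic features
    valency: list = field(default_factory=list)   # e.g. cases governed by a verb

# A toy entry for the Czech verb "dát" (to give), which governs an
# accusative object and a dative recipient — the kind of valency
# information the abstract notes is hardest to assign to verbs.
dat = LexEntry("dát", "V", {"aspect": "perfective"}, valency=["acc", "dat"])

print(dat.lemma, dat.valency)
```

Entries of roughly this shape would let the parser look up, for each word form, the syntactic frames it can participate in, which is why verbs (with their rich valency) dominate the effort.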
The grammar is of course far from complete (Mr. Kubon notes that it is debatable whether any formal grammar describing a natural language may ever be complete), but it covers the most frequent syntactic phenomena, allowing for the representation of the syntactic structure of simple clauses and also of certain types of complex sentences. The stress was not so much on building a wide-coverage grammar as on the description and demonstration of a method, which takes an approach similar to that of grammar-based grammar checking. The problem of reconstructing the "correct" form of the syntactic representation of a sentence is closely related to the problem of localisation and identification of syntactic errors. Without precise knowledge of the nature and location of syntactic errors it is not possible to make a reliable estimate of a "correct" syntactic tree. The incremental way of building the grammar used in this project is also an important methodological point. Experience from previous projects showed that building a grammar as one huge block of metarules is more complicated than the incremental method, which begins with metarules covering the most common syntactic phenomena and adds less frequent ones later; this is especially advantageous for testing and debugging the grammar. The sample of the syntactic dictionary containing lexico-syntactic information (task 3) now has slightly more than 1000 lexical items representing all classes of words. During the creation of the dictionary it turned out that assigning complete and correct lexico-syntactic information to verbs is a very complicated and time-consuming process which would itself be worth a separate project. The final task undertaken in this project was the development of a method allowing effective testing and debugging of the grammar during the process of its development.
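The idea that error localisation underpins the reconstruction of a "correct" tree can be illustrated with a deliberately tiny sketch. This is not the project's formalism: the rule, feature names, and tokens below are invented for illustration, and a single agreement check stands in for a full set of metarules.

```python
# Illustrative sketch of grammar-based error localisation: each "metarule"
# checks one constraint and reports the position where it is violated, so
# that a "correct" tree can later be estimated from that information.

def check_agreement(tokens, rules):
    """Return (rule_name, position) for every violated rule."""
    errors = []
    for name, rule in rules:
        pos = rule(tokens)
        if pos is not None:
            errors.append((name, pos))
    return errors

# Toy metarule: an adjective must agree in gender with the noun it precedes.
def adj_noun_gender(tokens):
    for i in range(len(tokens) - 1):
        a, n = tokens[i], tokens[i + 1]
        if a["pos"] == "A" and n["pos"] == "N" and a["gender"] != n["gender"]:
            return i  # position of the offending adjective
    return None

rules = [("adj-noun gender agreement", adj_noun_gender)]

# "velký město" (masculine adjective + neuter noun) is ungrammatical in
# Czech; the correct form would be "velké město".
sentence = [{"form": "velký", "pos": "A", "gender": "M"},
            {"form": "město", "pos": "N", "gender": "N"}]

print(check_agreement(sentence, rules))  # → [('adj-noun gender agreement', 0)]
```

The incremental method the abstract describes amounts to growing the `rules` list one metarule at a time, re-testing after each addition, rather than writing the whole block at once.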
The problem of the consistency of new and modified rules of the formal grammar with the rules already existing is one of the crucial problems of every project aiming at the development of a large-scale formal grammar of a natural language. This method allows for the detection of any discrepancy or inconsistency of the grammar with respect to a test-bed of sentences containing all syntactic phenomena covered by the grammar. This is not only the first robust parser of Czech, but also one of the first robust parsers of a Slavic language. Since Slavic languages display a wide range of common features, it is reasonable to claim that this system may serve as a pattern for similar systems in other languages. To transfer the system to another language it is only necessary to revise the grammar and to change the data contained in the dictionary (but not necessarily the structure of the primary lexico-syntactic information). The formalism and methods used in this project can thus be applied to other Slavic languages without substantial changes.
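The test-bed idea described above can be sketched as a simple regression check: after every grammar change, the parser is re-run over a fixed suite of sentences with known grammaticality, and any disagreement flags an inconsistency. The "parser" below is a stand-in predicate and the suite is invented; the real system's grammar and test sentences are of course far richer.

```python
# Sketch of test-bed regression checking for a grammar under development.
# A stand-in predicate replaces the actual Czech parser.

def run_testbed(parse_ok, suite):
    """Return every sentence whose parse result disagrees with expectation."""
    return [(s, expected) for s, expected in suite if parse_ok(s) != expected]

# Stand-in "parser": accepts any sentence not containing the marker token
# "ERR" (real grammaticality checking is, naturally, far harder).
def toy_parse_ok(sentence):
    return "ERR" not in sentence.split()

suite = [
    ("Petr čte knihu", True),   # grammatical test sentence
    ("Petr ERR knihu", False),  # sentence with a planted error
]

failures = run_testbed(toy_parse_ok, suite)
print(failures)  # an empty list means the grammar is consistent with the suite
```

Because the suite covers all phenomena the grammar claims to handle, a newly added rule that silently breaks an older one shows up immediately as a non-empty failure list, which is exactly the consistency guarantee the abstract attributes to the method.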