980 results for Syntactic Projection


Relevance:

10.00%

Publisher:

Abstract:

Research on the molecular mechanisms of carcinogenesis plays an important role in diagnosing and treating gastric cancer. Metabolic profiling may offer an opportunity to understand the molecular mechanisms of carcinogenesis and help to non-invasively identify potential biomarkers for the early diagnosis of human gastric cancer. The aims of this study were to explore the underlying metabolic mechanisms of gastric cancer and to identify biomarkers associated with morbidity. Gas chromatography/mass spectrometry (GC/MS) was used to analyze the serum metabolites of 30 Chinese gastric cancer patients and 30 healthy controls. Diagnostic models for gastric cancer were constructed using orthogonal partial least squares discriminant analysis (OPLS-DA). The acquired metabolomic data were analyzed with the nonparametric Wilcoxon test to find serum metabolic biomarkers of gastric cancer. The OPLS-DA model showed adequate discrimination between the cancer and non-cancer cohorts, while it failed to discriminate among the pathological stages (I-IV) of the gastric cancer patients. A total of 44 endogenous metabolites, such as amino acids, organic acids, carbohydrates, fatty acids, and steroids, were detected, of which 18 differential metabolites showed significant differences. Thirteen variables made the greatest contribution to the discriminating OPLS-DA model [variable importance in the projection (VIP) value >1.0], and 11 of these metabolites were identified using both the VIP criterion (VIP >1) and the Wilcoxon test. These metabolites potentially reveal perturbations of glycolysis and of amino acid, fatty acid, cholesterol, and nucleotide metabolism in gastric cancer patients. These results suggest that serum metabolic profiling of gastric cancer has great potential for detecting this disease and for helping to understand its metabolic mechanisms.
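
The following Python sketch illustrates the selection step described above: fit a PLS-based discriminant model, compute variable importance in the projection (VIP) scores, and keep variables with VIP > 1 that also pass a rank-sum (Wilcoxon/Mann-Whitney) test. The data are simulated, and plain PLS-DA via scikit-learn stands in for OPLS-DA, which the library does not provide; this is a sketch of the idea, not the study's pipeline.

```python
# Hypothetical sketch: candidate biomarker selection by a VIP > 1 filter plus a
# rank-sum test, on simulated data (30 "patients", 30 "controls", 44 variables).
import numpy as np
from scipy.stats import mannwhitneyu
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_per_group, n_metabolites = 30, 44
X = rng.normal(size=(2 * n_per_group, n_metabolites))
X[:n_per_group, :10] += 1.0                       # pretend the first 10 variables differ
y = np.r_[np.ones(n_per_group), np.zeros(n_per_group)]

pls = PLSRegression(n_components=2).fit(X, y)

def vip_scores(pls_model):
    """Variable importance in the projection for a fitted PLS model."""
    t = pls_model.x_scores_                       # (n, A) scores
    w = pls_model.x_weights_                      # (p, A) weights
    q = pls_model.y_loadings_.ravel()             # (A,) y-loadings (single response)
    p = w.shape[0]
    ss = np.sum(t ** 2, axis=0) * q ** 2          # y-variance explained per component
    w_norm = w / np.linalg.norm(w, axis=0)
    return np.sqrt(p * (w_norm ** 2 @ ss) / ss.sum())

vip = vip_scores(pls)
pvals = np.array([mannwhitneyu(X[y == 1, j], X[y == 0, j]).pvalue
                  for j in range(n_metabolites)])
candidates = np.flatnonzero((vip > 1.0) & (pvals < 0.05))
print(f"{candidates.size} candidate biomarkers:", candidates)
```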

Relevance:

10.00%

Publisher:

Abstract:

The single photon emission microscope (SPEM) is an instrument developed to obtain high spatial resolution single photon emission computed tomography (SPECT) images of small structures inside the mouse brain. SPEM consists of two independent imaging devices, which combine a multipinhole collimator, a high-resolution, thallium-doped cesium iodide [CsI(Tl)] columnar scintillator, a demagnifying/intensifier tube, and an electron-multiplying charge-coupled device (CCD). Collimators have 300- and 450-µm diameter pinholes on tungsten slabs, in hexagonal arrays of 19 and 7 holes. Projection data are acquired with a photon-counting strategy, in which CCD frames are stored at 50 frames per second, with a radius of rotation of 35 mm and a magnification factor of one. The image reconstruction software is based on the maximum likelihood algorithm. Our aim was to evaluate the spatial resolution and sensitivity attainable with the seven-pinhole imaging device, together with the linearity for quantification on the tomographic images, and to test the instrument in obtaining tomographic images of different mouse organs. A spatial resolution better than 500 µm and a sensitivity of 21.6 counts·s⁻¹·MBq⁻¹ were reached, as well as a correlation coefficient between activity and intensity better than 0.99, when imaging 99mTc sources. Images of the thyroid, heart, lungs, and bones of mice were registered using 99mTc-labeled radiopharmaceuticals in times appropriate for routine preclinical experimentation of <1 h per projection data set. Detailed experimental protocols and images of the aforementioned organs are shown. We plan to extend the instrument's field of view to fit larger animals and to combine data from both detectors to reduce the acquisition time or the applied activity.
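
The abstract notes that reconstruction is based on the maximum likelihood algorithm; the sketch below shows the standard multiplicative ML-EM update that such reconstructions typically iterate. The system matrix here is a toy random stand-in, not the SPEM multipinhole model.

```python
# Minimal ML-EM sketch for photon-counting projection data. A[i, j] is the
# (assumed) probability that activity in voxel j is detected in projection
# bin i; here it is random, purely for illustration.
import numpy as np

rng = np.random.default_rng(1)
n_bins, n_voxels = 200, 100
A = rng.random((n_bins, n_voxels))
x_true = rng.random(n_voxels)
y = rng.poisson(A @ x_true)                  # simulated photon counts

x = np.ones(n_voxels)                        # uniform initial estimate
sens = A.sum(axis=0)                         # sensitivity image, A^T 1
for _ in range(50):
    proj = A @ x                             # forward projection
    ratio = np.divide(y, proj, out=np.zeros_like(proj), where=proj > 0)
    x *= (A.T @ ratio) / sens                # multiplicative ML-EM update
```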

Relevance:

10.00%

Publisher:

Abstract:

In this work, the INCA Feature image analysis software used for analyzing particle size distributions was tested. Particle size distributions were determined from electron microscope images, using the INCA Feature software on projection images of the particles, for talc used as a coating pigment and for two different carbonate grades. In addition, particle size distributions were determined for silica and alumina particles used as auxiliary agents in filtration and purification. The particle size distributions obtained with the image analysis software were compared with those measured with the SediGraph 5100 analyzer, which is based on particle settling velocity (sedimentation), and with the Coulter LS 230 method, which is based on laser diffraction. The SediGraph 5100 and the image analysis software gave very similar mean values for the size distribution of the talc particles, whereas the mean given by the Coulter LS 230 instrument deviated from these. All of the particle size distribution methods in the comparison ranked the particles of the different samples in the same size order. However, the results of the methods cannot be compared numerically with one another, because each of the analysis methods measures particle size on the basis of a different property of the particle. Based on this work, all of the tested analysis methods are suitable for determining the particle size distributions of paper pigments. This work also established the number of particles required for image analysis to give a reliable result: at least 300 particles must be analyzed. Too large a sample amount increases the scatter of the size distribution and extends the analysis time to several hours. Sample preparation still requires further study, since it is the most important and most critical step in particle size analysis performed with SEM and image analysis software. The spread of automated microscopes makes the analyses easier and faster to perform, so the popularity of the method will also grow in paper pigment research. The high price of the instruments and the special expertise required of the user will, at least for now, restrict their use to research institutes.
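
By way of illustration, the sketch below estimates a particle size distribution from a binary SEM projection image by labeling particles and converting each projected area to an equivalent circular diameter, and checks the 300-particle reliability criterion mentioned above. The image, threshold, and pixel size are synthetic assumptions; this is not the INCA Feature workflow.

```python
# Illustrative particle sizing from a (toy) binary projection image.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
image = rng.random((512, 512)) > 0.995                   # toy binary "particle" image
pixel_size_um = 0.05                                     # assumed pixel size, micrometres

labels, n_particles = ndimage.label(image)
areas_px = ndimage.sum(image, labels, index=np.arange(1, n_particles + 1))
diam_um = 2.0 * np.sqrt(areas_px / np.pi) * pixel_size_um  # equivalent circle diameter

if n_particles < 300:                                    # reliability criterion from the thesis
    print(f"only {n_particles} particles; at least 300 are needed for a reliable result")
print("mean equivalent diameter (um):", diam_um.mean())
```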

Relevance:

10.00%

Publisher:

Abstract:

Title variant(s): Mémoire sur la projection des cartes géographiques, adoptée au Dépôt général de la Guerre

Relevance:

10.00%

Publisher:

Abstract:

Abstract: Nietzsche's Will-to-Power Ontology: An Interpretation of Beyond Good and Evil § 36. By: Mark Minuk. Will-to-power is the central component of Nietzsche's philosophy, and passage 36 of Beyond Good and Evil is essential to coming to an understanding of it. I argue for and defend the thesis that will-to-power constitutes Nietzsche's ontology, and offer a new understanding of what that means. Nietzsche's ontology can be talked about as though it were a traditional substance ontology (i.e., a world made up of forces; a duality of conflicting forces described as 'towards which' and 'away from which'). However, I argue that what defines this ontology is an understanding of valuation as ontologically fundamental: the basis of interpretation, and that from which a substance ontology emerges. In the second chapter, I explain Nietzsche's ontology, as reflected in this passage, through a discussion of Heidegger's two ontological categories in Being and Time (readiness-to-hand and presence-at-hand). In a nutshell, it means that the world of our desires and passions (the most basic of which is the desire for power) is ontologically more fundamental than the material world, or any other interpretation; that is, the material world emerges out of the world of our desires and passions. In the first chapter, I address the problematic form of the passage, reflected in its first sentence. The passage is written in a hypothetical style that makes no claim to positive knowledge or truth and, superficially, looks like the Schopenhauerian position on the metaphysics of the will, which Nietzsche rejects. I argue that the hypothetical form of the passage is a matter of style, namely, the style of a free spirit for whom the question of truth is reframed as a question of values. In the third and final chapter, I address the charge that Nietzsche's interpretation is a conscious anthropomorphic projection. I suggest that the charge rests on a distinction (between nature and man) that Nietzsche rejects. I also address the problem of the causality of the will for Nietzsche by suggesting that an alternative, perspectival form of causality is possible.

Relevance:

10.00%

Publisher:

Abstract:

An ascending cholinergic projection, which originates in the laterodorsal tegmental nucleus (LDT), has been implicated in the initiation of ultrasonic vocalization. The goal of this study was to histochemically examine the activity of the LDT following ultrasonic calls induced by two methods. It was hypothesized that cholinergic LDT cells would be more active during air puff-induced vocalization than during carbachol-induced vocalization. Choline acetyltransferase (ChAT) and cFos protein were visualized histochemically as markers of cholinergic cells and cellular activity, respectively. Results indicated that animals vocalizing after carbachol, but not after air puff, had a significantly higher number of Fos-labeled nuclei within the LDT than non-vocalizing controls. A significantly higher number of double-labeled neurons was found in the LDT of vocalizing animals (in both groups) compared with control conditions. Thus, there were significantly more active cholinergic cells in the LDT of vocalizing than non-vocalizing rats for both methods of call induction.

Relevance:

10.00%

Publisher:

Abstract:

The streams flowing through the Niagara Escarpment are paved by coarse carbonate and sandstone sediments which have originated from the escarpment units and can be traced downstream from their source. Fifty-nine sediment samples were taken from five streams, over distances of 3,000 to 10,000 feet (915 to 3,050 m), to determine downstream changes in sediment composition, textural characteristics and sorting. In addition, fluorometric velocity measurements were used in conjunction with measured discharge and flow records to estimate the frequency of sediment movement. The frequency of sediments of a given lithology changes downstream in direct response to the outcrop position of the formations in the channels. Clasts derived from a single stratigraphic unit usually reach a maximum frequency within the first 1,000 feet (305 m) of transport. Sediments derived from formations at the top of waterfalls reach a modal frequency farther downstream than material originating at the base of waterfalls. Downstream variations in sediment size over the lengths of the study reaches reflect the changes in channel morphology and in the lithologic composition of the sediment samples. Linear regression analyses indicate that there is a decrease in the axial lengths between the initial and final samples and that the long axis decreases in length more rapidly than the intermediate axis, while the short axis remains almost constant. Carbonate sediments from coarse-grained, fossiliferous units are more variable in size than fine-grained dolostones and sandstones. The average sphericity for carbonates and sandstones increases from 0.65 to 0.67, while maximum projection sphericity remains nearly constant with an average value of 0.52. Pebble roundness increases more rapidly than either of the sphericity parameters, and the sediments change from subrounded to rounded. The Hjulstrom diagram indicates that the velocities required to initiate transport of sediments with an average intermediate diameter of 10 cm range from 200 cm/s to 300 cm/s (6.6 ft/s to 9.8 ft/s). From the modal velocity-discharge relations, the flows corresponding to these velocities are greater than 3,500 cfs (99 m³/s). These discharges occur less than 0.01 per cent (0.4 days) of the time and correspond to a discharge occurring during the spring flood.
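
For reference, the shape measures reported above can be computed from the three pebble axes a ≥ b ≥ c. Maximum projection sphericity is the Sneed and Folk measure (c²/(ab))^(1/3); the plain "sphericity" is assumed here to be Krumbein's intercept sphericity (bc/a²)^(1/3), which may differ from the definition used in the thesis. The axis values in the sketch are toy numbers.

```python
# Pebble shape measures from measured axes (toy values, in cm).
import numpy as np

long_ax  = np.array([9.8, 7.1, 3.2])    # a axis (long)
inter_ax = np.array([7.0, 5.5, 2.5])    # b axis (intermediate)
short_ax = np.array([3.1, 2.6, 1.4])    # c axis (short)

intercept_sphericity = np.cbrt((inter_ax * short_ax) / long_ax ** 2)       # Krumbein-type
max_projection_sphericity = np.cbrt(short_ax ** 2 / (long_ax * inter_ax))  # Sneed & Folk

print("intercept sphericity:       ", intercept_sphericity.round(2))
print("max projection sphericity:  ", max_projection_sphericity.round(2))
```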

Relevance:

10.00%

Publisher:

Abstract:

In this study, I build upon my previous research, in which I focus on religious doctrine as a gendered disciplinary apparatus, and examine the witch trials in early modern England and Italy in light of socio-economic issues relating to gender and class. This project examines the witch hunts/trials and early modern visual representations of witches, and what I suggest is an attempt to create docile bodies out of members of society who are deemed unruly, problematic and otherwise 'undesirable'; it is the witch's body that is deemed counternormative. This study demonstrates that it is neighbours and other acquaintances of accused witches who take on the role of the invisible guard of Bentham's Panopticon. As someone who is trained in the study of English literature and literary theory, my approach is one that is informed by this methodology. It is my specialization in early modern British literature that first exposed me to witch-hunting manuals and tales of the supernatural, and it is for this reason that my research commences with a study of representations of witches and witchcraft in early modern England. From my initial exposure to such materials I proceed to examine the similarities and differences in the cultural significance of the supernatural vis-à-vis women's activities in early modern Italy. The subsequent discussion of visual representations of witches involves a predominance of Germanic artists, as the seminal work on the discernment of witches and the application of punishment known as the Malleus Maleficarum was written in Germany circa 1486. Textual accounts of witch trials such as "A Pitiless Mother" (1616), "The Wonderful Discovery of the Witchcrafts of Margaret and Philippa Flower" (1619), "Magic and Poison: The Trial of Chiaretta and Fedele" (circa 1550), and "The Case of Benvegnuda Pincinella: Medicine Woman or Witch" (1518), together with witch-hunting manuals such as the Malleus Maleficarum and Strix, will be put in direct dialogue with visual representations of witches in light of historical discourses pertaining to gender performance and gendered expectations. Issues relating to class will be examined as they pertain to the material conditions of presumed witches. The dominant group in any temporal or geographic location possesses the tools of representation. Therefore, it is not surprising that the physical characteristics, sexual habits and social material conditions attributed to suspected witches are attributes that can be deemed deviant by the ruling class. The research will juxtapose the social material conditions of suspected witches with the guilt, anxiety, and projection of fear that the dominant groups experienced in light of the changing economic landscape of the Renaissance. The shift from feudalism to primitive accumulation and capitalism saw a rise in people living in poverty, and therefore an increased dependence upon the good will of others. I will discuss the social material conditions of accused witches as informed by what Robyn Wiegman terms a "minoritizing discourse" (210). People of higher economic standing often blamed their social, medical, and/or economic difficulties on the less fortunate, resulting in accusations of witchcraft.

Relevance:

10.00%

Publisher:

Abstract:

The initial timing of face-specific effects in event-related potentials (ERPs) is a point of contention in face processing research. Although effects during the time of the N170 are robust in the literature, inconsistent effects during the time of the P100 challenge the interpretation of the N170 as the initial face-specific ERP effect. The early P100 effects are often attributed to low-level differences between face stimuli and a host of other image categories. Research using sophisticated controls for low-level stimulus characteristics (Rousselet, Husk, Bennett, & Sekuler, 2008) reports robust face effects starting at around 130 ms following stimulus onset. The present study examines the independent components (ICs) of the P100 and N170 complex in the context of a minimally controlled low-level stimulus set and a clear P100 effect for faces versus houses at the scalp. Results indicate that four ICs account for the ERPs to faces and houses in the first 200 ms following stimulus onset. The IC that accounts for the majority of the scalp N170 (icN1a) begins dissociating stimulus conditions at approximately 130 ms, closely replicating the scalp results of Rousselet et al. (2008). The scalp effects at the time of the P100 are accounted for by two constituent ICs (icP1a and icP1b). The IC that projects the greatest voltage at the scalp during the P100 (icP1a) shows a face-minus-house effect over the period of the P100 that is less robust than the N170 effect of icN1a when measured as the average of single-subject differential activation robustness. The second constituent process of the P100 (icP1b), although projecting a smaller voltage to the scalp than icP1a, shows a more robust effect for the face-minus-house contrast starting prior to 100 ms following stimulus onset. Further, the effect expressed by icP1b takes the form of a larger negative projection to medial occipital sites for houses than for faces, partially canceling the larger projection of icP1a and thereby enhancing the face positivity at this time. These findings have three main implications for ERP research on face processing. First, the ICs that constitute the face-minus-house P100 effect are independent of the ICs that constitute the N170 effect, which suggests that the P100 effect and the N170 effect are anatomically independent. Second, the timing of the N170 effect can be recovered from scalp ERPs that have spatio-temporally overlapping effects possibly associated with low-level stimulus characteristics. This unmixing of the EEG signals may reduce the need for highly constrained stimulus sets, a characteristic that is not always desirable for a topic that is highly coupled to ecological validity. Third, by unmixing the constituent processes of the EEG signals, new analysis strategies become available. In particular, exploring the relationship between cortical processes over the period of the P100 and N170 ERP complex (and beyond) may provide previously inaccessible answers to questions such as: is the face effect a special relationship between low-level and high-level processes along the visual stream?
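
A rough sketch of the kind of analysis described above: unmix multichannel ERP data into independent components and compare one component's activation between face and house trials over the P100 window. The data, montage, component count, and component choice below are simulated stand-ins, not the study's pipeline.

```python
# Illustrative IC-based condition contrast on simulated two-condition ERP data.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
n_trials, n_channels, n_times, n_ics = 200, 32, 150, 10   # 150 samples at 250 Hz ~ 600 ms
times_ms = np.arange(n_times) * 4.0
X = rng.normal(size=(n_trials, n_channels, n_times))
condition = rng.integers(0, 2, n_trials)                  # 0 = house, 1 = face

# Fit ICA on the concatenated (samples x channels) data, then reshape sources.
ica = FastICA(n_components=n_ics, random_state=0)
sources = ica.fit_transform(X.transpose(0, 2, 1).reshape(-1, n_channels))
sources = sources.reshape(n_trials, n_times, n_ics)

p100_mask = (times_ms >= 80) & (times_ms <= 130)          # assumed P100 window
ic_index = 0                                              # component of interest (chosen by hand)
face_mean = sources[condition == 1][:, p100_mask, ic_index].mean()
house_mean = sources[condition == 0][:, p100_mask, ic_index].mean()
print("face-minus-house P100 activation for IC", ic_index, ":", face_mean - house_mean)
```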

Relevance:

10.00%

Publisher:

Abstract:

We study the problem of measuring the uncertainty of CGE (or RBC)-type model simulations associated with parameter uncertainty. We describe two approaches for building confidence sets on model endogenous variables. The first one uses a standard Wald-type statistic. The second approach assumes that a confidence set (sampling or Bayesian) is available for the free parameters, from which confidence sets are derived by a projection technique. The latter has two advantages: first, confidence set validity is not affected by model nonlinearities; second, we can easily build simultaneous confidence intervals for an unlimited number of variables. We study conditions under which these confidence sets take the form of intervals and show they can be implemented using standard methods for solving CGE models. We present an application to a CGE model of the Moroccan economy to study the effects of policy-induced increases of transfers from Moroccan expatriates.
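
A minimal sketch of the projection technique, under the assumption that a Wald-type confidence ellipsoid for the free parameters is available: an interval for any endogenous variable g(θ) is obtained by minimizing and maximizing g over the ellipsoid. The quadratic g and the numbers below are toy stand-ins for solving a CGE model.

```python
# Projection-based confidence interval for a nonlinear function of parameters,
# given a (toy) 95% Wald ellipsoid for theta.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

theta_hat = np.array([0.5, 1.2])                      # point estimates (toy)
V = np.array([[0.04, 0.01], [0.01, 0.09]])            # estimated covariance (toy)
V_inv = np.linalg.inv(V)
crit = chi2.ppf(0.95, df=theta_hat.size)              # ellipsoid radius

def g(theta):                                         # stand-in for solving the model
    return theta[0] * theta[1] ** 2

ellipsoid = {"type": "ineq",
             "fun": lambda th: crit - (th - theta_hat) @ V_inv @ (th - theta_hat)}

lo = minimize(g, theta_hat, method="SLSQP", constraints=[ellipsoid]).fun
hi = -minimize(lambda th: -g(th), theta_hat, method="SLSQP", constraints=[ellipsoid]).fun
print(f"projection-based 95% interval for g(theta): [{lo:.3f}, {hi:.3f}]")
```

Because the interval is the image of the whole parameter set, its coverage does not rely on linearizing g, which is the nonlinearity-robustness advantage noted above.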

Relevance:

10.00%

Publisher:

Abstract:

We propose finite sample tests and confidence sets for models with unobserved and generated regressors as well as various models estimated by instrumental variables methods. The validity of the procedures is unaffected by the presence of identification problems or "weak instruments", so no detection of such problems is required. We study two distinct approaches for various models considered by Pagan (1984). The first one is an instrument substitution method which generalizes an approach proposed by Anderson and Rubin (1949) and Fuller (1987) for different (although related) problems, while the second one is based on splitting the sample. The instrument substitution method uses the instruments directly, instead of generated regressors, in order to test hypotheses about the "structural parameters" of interest and build confidence sets. The second approach relies on "generated regressors", which allows a gain in degrees of freedom, and a sample split technique. For inference about general, possibly nonlinear transformations of model parameters, projection techniques are proposed. A distributional theory is obtained under the assumptions of Gaussian errors and strictly exogenous regressors. We show that the various tests and confidence sets proposed are (locally) "asymptotically valid" under much weaker assumptions. The properties of the tests proposed are examined in simulation experiments. In general, they outperform the usual asymptotic inference methods in terms of both reliability and power. Finally, the techniques suggested are applied to a model of Tobin's q and to a model of academic performance.
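
As a concrete illustration of the instrument substitution idea in its simplest Anderson-Rubin form: to test H0: β = β0 in y = Xβ + u with instruments Z, regress y − Xβ0 on the instruments and test that all instrument coefficients are zero with an F statistic. The sketch below uses simulated data with one endogenous regressor; the paper's procedures cover more general generated-regressor settings.

```python
# Anderson-Rubin-type test on simulated data with one endogenous regressor.
import numpy as np
from scipy.stats import f as f_dist

rng = np.random.default_rng(4)
n, k = 200, 3                                      # observations, instruments
Z = rng.normal(size=(n, k))
u = rng.normal(size=n)
X = Z @ np.array([0.3, 0.2, 0.1]) + 0.5 * u + rng.normal(size=n)   # endogenous regressor
y = X * 1.0 + u                                    # true beta = 1

def anderson_rubin_pvalue(y, X, Z, beta0):
    resid0 = y - X * beta0                         # residuals under H0
    Zc = np.column_stack([np.ones(len(y)), Z])     # include an intercept
    coef, *_ = np.linalg.lstsq(Zc, resid0, rcond=None)
    ssr_unrestricted = np.sum((resid0 - Zc @ coef) ** 2)
    ssr_restricted = np.sum((resid0 - resid0.mean()) ** 2)
    df1, df2 = Z.shape[1], len(y) - Zc.shape[1]
    F = ((ssr_restricted - ssr_unrestricted) / df1) / (ssr_unrestricted / df2)
    return 1.0 - f_dist.cdf(F, df1, df2)

print("p-value at the true beta:", anderson_rubin_pvalue(y, X, Z, 1.0))
print("p-value at beta0 = 0:   ", anderson_rubin_pvalue(y, X, Z, 0.0))
```

Collecting all values of β0 whose p-value exceeds 0.05 gives a confidence set whose validity does not depend on instrument strength.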

Relevance:

10.00%

Publisher:

Abstract:

This paper proposes exact inference methods (tests and confidence regions) for linear regression models with autocorrelated errors following a second-order autoregressive process [AR(2)], which may be nonstationary. The proposed approach generalizes the one described in Dufour (1990) for a regression model with AR(1) errors and proceeds in three steps. First, an exact confidence region is built for the vector of autoregressive coefficients (φ); this region is obtained by inverting tests of error independence, applied to a transformed form of the model, against alternatives of dependence at lags one and two. Second, exploiting the duality between tests and confidence regions (test inversion), a joint confidence region is determined for the vector φ and a vector of interest M of linear combinations of the model's regression coefficients. Third, a projection method yields "marginal" confidence intervals as well as exact bounds tests for the components of M. These methods are applied to models of the U.S. money stock (M2) and price level (implicit GNP deflator).
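
A toy illustration of the first and third steps, with a simple regression-based check standing in for the paper's exact independence tests: grid over (φ1, φ2), quasi-difference the data, keep the points at which no residual dependence at lags one and two is detected, and project the retained set onto each coordinate to obtain marginal intervals.

```python
# Test inversion over a (phi1, phi2) grid followed by projection (toy version).
import numpy as np
from scipy.stats import f as f_dist

rng = np.random.default_rng(5)
n = 120
x = rng.normal(size=n)
e = np.zeros(n)
for t in range(2, n):                                   # AR(2) errors, phi = (0.5, -0.3)
    e[t] = 0.5 * e[t - 1] - 0.3 * e[t - 2] + rng.normal()
y = 1.0 + 2.0 * x + e

def not_rejected(phi1, phi2, level=0.05):
    # Quasi-difference the model, regress y* on X*, then check whether lagged
    # residuals still explain the current residual (dependence at lags 1 and 2).
    ys = y[2:] - phi1 * y[1:-1] - phi2 * y[:-2]
    xs = np.column_stack([np.ones(n - 2), x[2:] - phi1 * x[1:-1] - phi2 * x[:-2]])
    beta, *_ = np.linalg.lstsq(xs, ys, rcond=None)
    r = ys - xs @ beta
    R = np.column_stack([np.ones(len(r) - 2), r[1:-1], r[:-2]])
    g, *_ = np.linalg.lstsq(R, r[2:], rcond=None)
    ssr1 = np.sum((r[2:] - R @ g) ** 2)
    ssr0 = np.sum((r[2:] - r[2:].mean()) ** 2)
    F = ((ssr0 - ssr1) / 2) / (ssr1 / (len(r) - 5))
    return 1 - f_dist.cdf(F, 2, len(r) - 5) > level

grid = [(p1, p2) for p1 in np.linspace(-1.5, 1.5, 31)
                 for p2 in np.linspace(-1.0, 1.0, 21) if not_rejected(p1, p2)]
phi1_vals, phi2_vals = zip(*grid)
print("projected interval for phi1:", (min(phi1_vals), max(phi1_vals)))
print("projected interval for phi2:", (min(phi2_vals), max(phi2_vals)))
```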

Relevance:

10.00%

Publisher:

Abstract:

We discuss statistical inference problems associated with identification and testability in econometrics, and we emphasize the common nature of the two issues. After reviewing the relevant statistical notions, we consider in turn inference in nonparametric models and recent developments on weakly identified models (or weak instruments). We point out that many hypotheses, for which test procedures are commonly proposed, are not testable at all, while some frequently used econometric methods are fundamentally inappropriate for the models considered. Such situations lead to ill-defined statistical problems and are often associated with a misguided use of asymptotic distributional results. Concerning nonparametric hypotheses, we discuss three basic problems for which such difficulties occur: (1) testing a mean (or a moment) under (too) weak distributional assumptions; (2) inference under heteroskedasticity of unknown form; (3) inference in dynamic models with an unlimited number of parameters. Concerning weakly identified models, we stress that valid inference should be based on proper pivotal functions (a condition not satisfied by standard Wald-type methods based on standard errors), and we discuss recent developments in this field, mainly from the viewpoint of building valid tests and confidence sets. The techniques discussed include alternative proposed statistics, bounds, projection, split-sampling, conditioning, and Monte Carlo tests. The possibility of deriving a finite-sample distributional theory, robustness to the presence of weak instruments, and robustness to the specification of a model for endogenous explanatory variables are stressed as important criteria for assessing alternative procedures.
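
As an example of the Monte Carlo test technique listed above: when the null distribution of a test statistic can be simulated exactly, the p-value p = (1 + #{simulated ≥ observed}) / (N + 1) yields an exact finite-sample test. The statistic and the i.i.d. standard normal null used below are toy choices.

```python
# Exact Monte Carlo p-value for a pivotal statistic under a simulable null.
import numpy as np

rng = np.random.default_rng(6)
data = rng.normal(loc=0.3, scale=1.0, size=25)          # sample to test

def stat(x):                                            # |t|-type statistic
    return abs(x.mean()) / (x.std(ddof=1) / np.sqrt(len(x)))

N = 999                                                 # chosen so (N + 1) * 0.05 is an integer
s0 = stat(data)
sims = np.array([stat(rng.normal(size=data.size)) for _ in range(N)])
p_mc = (1 + np.sum(sims >= s0)) / (N + 1)
print("Monte Carlo p-value:", p_mc)
```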

Relevance:

10.00%

Publisher:

Abstract:

A document accompanies the thesis and is available for consultation at the Centre de conservation des bibliothèques of the Université de Montréal (http://www.bib.umontreal.ca/conservation/).

Relevance:

10.00%

Publisher:

Abstract:

Signal relay by guidance receptors at the axonal growth cone is a process essential for the assembly of a functional nervous system. We investigated the in vivo function of Src family kinases (SFKs) as growth cone guidance signaling intermediates in the context of spinal lateral motor column (LMC) motor axon projection toward the ventral or dorsal limb mesenchyme. Using in situ mRNA detection, we determined that Src and Fyn are expressed in LMC motor neurons of chick and mouse embryos at the time of limb trajectory selection. Inhibition of SFK activity by C-terminal Src kinase (Csk) overexpression in chick LMC axons using in ovo electroporation resulted in LMC axons selecting an inappropriate dorsoventral trajectory within the limb mesenchyme, with medial LMC axons projecting into the dorsal and ventral limb nerves with apparently random incidence. We also detected LMC axon trajectory choice errors in Src mutant mice, demonstrating a nonredundant role for Src in motor axon guidance, in agreement with the gain and loss of Src function in chick LMC neurons, which led to the redirection of LMC axons. Finally, Csk-mediated SFK inhibition attenuated the retargeting of LMC axons caused by EphA or EphB overexpression, implying the participation of SFKs in Eph-mediated LMC motor axon guidance. In summary, our findings demonstrate that SFKs are essential for motor axon guidance and suggest that they play an important role in relaying ephrin:Eph signals that mediate the selection of motor axon trajectory in the limb.