873 results for Coarse-to-fine processing


Relevance:

100.00%

Publisher:

Abstract:

Hole 1256C was cored 88.5 m into basement, and Hole 1256D, the deep reentry hole, was cored 502 m into basement during Ocean Drilling Program Leg 206. Hole 1256D is located ~30 m south of Hole 1256C (Wilson, Teagle, Acton, et al., 2003, doi:10.2973/odp.proc.ir.206.2003). A thick massive flow drilled in both holes, Units 1256C-18 and 1256D-1, consists of a single cooling unit of cryptocrystalline to fine-grained basalt, interpreted as a ponded lava, 32 m and at least 74.2 m thick, respectively. This ponded flow provides a unique opportunity to examine textural variations from the glassy, folded crust of the lava pond recovered at the top of Unit 1256C-18, through the coarse-grained, thick massive lava body, to the unusually recrystallized and deformed base cored in Unit 1256C-18. Detailed descriptions of the textures and grain-size variations through the lava pond (Units 1256C-18 and 1256D-1) are presented here, with special reference to the recrystallization at the base of Unit 1256C-18.

Relevance:

100.00%

Publisher:

Abstract:

We recently put forth a model of a protochlorophyllide (Pchlide) light-harvesting complex operative during angiosperm seedling de-etiolation (Reinbothe, C., Lebedev, N., and Reinbothe, S. (1999) Nature 397, 80–84). This model, which was based on in vitro reconstitution experiments with zinc analogs of Pchlide a and Pchlide b and the two NADPH:protochlorophyllide oxidoreductases (PORs), PORA and PORB, of barley, predicted a 5-fold excess of Pchlide b, relative to Pchlide a, in the prolamellar body of etioplasts. Recent work (Scheumann, V., Klement, H., Helfrich, M., Oster, U., Schoch, S., and Rüdiger, W. (1999) FEBS Lett. 445, 445–448), however, contradicted this model and reported that Pchlide b is not present in etiolated plants. Here we demonstrate that Pchlide b is an abundant pigment in barley etioplasts but is metabolically rather unstable. It is rapidly converted to Pchlide a by 7-formyl reductase, an enzyme that had previously been implicated in the chlorophyll (Chl) b to Chl a reaction cycle. Our findings suggest that etiolated plants make use of 7-formyl reductase to fine-tune the levels of Pchlide b and Pchlide a and may thereby regulate the steady-state level of light-harvesting POR-Pchlide complexes.

Relevance:

100.00%

Publisher:

Abstract:

Neuronal morphology is a key feature in the study of brain circuits, as it is highly related to information processing and functional identification. Neuronal morphology affects the process of integrating inputs from other neurons and determines which neurons receive the neuron's output. Different parts of a neuron can operate semi-independently according to the spatial location of the synaptic connections. As a result, there is considerable interest in the analysis of the microanatomy of nervous cells, since it constitutes an excellent tool for better understanding cortical function. However, the morphologies, molecular features and electrophysiological properties of neuronal cells are extremely variable. Except for some special cases, this variability makes it hard to find a set of features that unambiguously define a neuronal type. In addition, there are distinct types of neurons in particular regions of the brain. This morphological variability makes the analysis and modeling of neuronal morphology a challenge. Uncertainty is a key feature of many complex real-world problems. Probability theory provides a framework for modeling and reasoning with uncertainty. Probabilistic graphical models combine statistical theory and graph theory to provide a tool for managing domains with uncertainty. In particular, we focus on Bayesian networks, the most commonly used probabilistic graphical model. In this dissertation, we design new methods for learning Bayesian networks and apply them to the problem of modeling and analyzing morphological data from neurons. The morphology of a neuron can be quantified using a number of measurements, e.g., the length of the dendrites and the axon, the number of bifurcations, the direction of the dendrites and the axon, etc. These measurements can be modeled as discrete or continuous data. The continuous data can be linear (e.g., the length or the width of a dendrite) or directional (e.g., the direction of the axon). These data may follow complex probability distributions and may not fit any known parametric distribution. Modeling this kind of problem using hybrid Bayesian networks with discrete, linear and directional variables poses a number of challenges regarding learning from data, inference, etc. In this dissertation, we propose a method for modeling and simulating basal dendritic trees of pyramidal neurons using Bayesian networks to capture the interactions between the variables in the problem domain. A complete set of variables is measured from the dendrites, and a learning algorithm is applied to find the structure and estimate the parameters of the probability distributions included in the Bayesian networks. Then, a simulation algorithm is used to build virtual dendrites by sampling values from the Bayesian networks, and a thorough evaluation is performed to show the model's ability to generate realistic dendrites. In this first approach, the variables are discretized so that discrete Bayesian networks can be learned and simulated. Then, we address the problem of learning hybrid Bayesian networks with different kinds of variables. Mixtures of polynomials have been proposed as a way of representing probability densities in hybrid Bayesian networks. We present a method for learning mixtures-of-polynomials approximations of one-dimensional, multidimensional and conditional probability densities from data. The method is based on basis spline interpolation, where a density is approximated as a linear combination of basis splines.
The proposed algorithms are evaluated using artificial datasets. We also use the proposed methods as a non-parametric density estimation technique in Bayesian network classifiers. Next, we address the problem of including directional data in Bayesian networks. These data have special properties that rule out the use of classical statistics. Therefore, different distributions and statistics, such as the univariate von Mises and the multivariate von Mises–Fisher distributions, should be used to deal with this kind of information. In particular, we extend the naive Bayes classifier to the case where the conditional probability distributions of the predictive variables given the class follow either of these distributions. We consider the simple scenario, where only directional predictive variables are used, and the hybrid case, where discrete, Gaussian and directional distributions are mixed. The classifier decision functions and their decision surfaces are studied at length. Artificial examples are used to illustrate the behavior of the classifiers, and the proposed classifiers are empirically evaluated over real datasets. We also study the problem of interneuron classification. An extensive group of experts is asked to classify a set of neurons according to their most prominent anatomical features, and a web application is developed to retrieve the experts' classifications. We compute agreement measures to analyze the consensus between the experts when classifying the neurons. Using Bayesian networks and clustering algorithms on the resulting data, we investigate the suitability of the anatomical terms and neuron types commonly used in the literature. Additionally, we apply supervised learning approaches to automatically classify interneurons using the values of their morphological measurements. Then, a methodology for building a model which captures the opinions of all the experts is presented. First, one Bayesian network is learned for each expert, and we propose an algorithm for clustering Bayesian networks corresponding to experts with similar behaviors. Then, a Bayesian network which represents the opinions of each group of experts is induced. Finally, a consensus Bayesian multinet which models the opinions of the whole group of experts is built. A thorough analysis of the consensus model identifies different behaviors between the experts when classifying the interneurons in the experiment. A set of characterizing morphological traits for the neuronal types can be defined by performing inference in the Bayesian multinet. These findings are used to validate the model and to gain some insights into neuron morphology. Finally, we study a classification problem where the true class label of the training instances is not known; instead, a set of class labels is available for each instance. This setting is inspired by the neuron classification problem, where a group of experts is asked to individually provide a class label for each instance. We propose a novel approach for learning Bayesian networks using count vectors which represent the number of experts who selected each class label for each instance. These Bayesian networks are evaluated using artificial datasets from supervised learning problems.
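
As an illustration of the directional naive Bayes idea described above, here is a minimal sketch of a classifier whose class-conditional densities for a single angular predictor (e.g., an axon direction) follow the univariate von Mises distribution. It uses moment-based parameter estimates and hypothetical toy data; it is an assumption-laden sketch, not the dissertation's implementation.

```python
import numpy as np
from scipy.stats import vonmises

class VonMisesNB:
    """Naive Bayes with one angular predictor (radians) whose
    class-conditional density is univariate von Mises."""

    def fit(self, theta, y):
        self.classes_ = np.unique(y)
        self.params_, self.priors_ = {}, {}
        for c in self.classes_:
            x = theta[y == c]
            # Moment estimates: mean direction mu and concentration kappa.
            C, S = np.cos(x).sum(), np.sin(x).sum()
            mu = np.arctan2(S, C)
            rbar = np.hypot(C, S) / len(x)
            kappa = rbar * (2 - rbar**2) / (1 - rbar**2)  # Fisher's approx.
            self.params_[c] = (mu, kappa)
            self.priors_[c] = len(x) / len(theta)
        return self

    def predict(self, theta):
        # Class score = log prior + class-conditional log density.
        scores = np.column_stack([
            np.log(self.priors_[c])
            + vonmises.logpdf(theta, kappa=self.params_[c][1],
                              loc=self.params_[c][0])
            for c in self.classes_])
        return self.classes_[scores.argmax(axis=1)]

# Toy data: two classes of directions concentrated around 0 and pi/2.
rng = np.random.default_rng(1)
theta = np.concatenate([
    vonmises.rvs(4, loc=0.0, size=100, random_state=rng),
    vonmises.rvs(4, loc=np.pi / 2, size=100, random_state=rng)])
y = np.repeat([0, 1], 100)
print(VonMisesNB().fit(theta, y).predict(np.array([0.1, 1.5])))  # -> [0 1]
```

In the hybrid case discussed in the abstract, discrete, Gaussian and directional predictors would each contribute their own log-density term to the same class score.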

Relevance:

100.00%

Publisher:

Abstract:

A first study toward constructing a simple model of the mammalian retina is reported. The basic elements of this model are Optical Programmable Logic Cells (OPLCs), previously employed as functional elements for optical computing. The same type of circuit simulates the five types of neurons present in the retina; different responses are obtained by modifying either internal or external connections. Two types of behavior are reported: symmetrical and non-symmetrical with respect to the position of the light stimulus. Some higher functions, such as the ability to differentiate between symmetric and non-symmetric light images, are performed by a further simulation of the first layers of the visual cortex. The possibility of applying these models to image processing is also reported.

Relevance:

100.00%

Publisher:

Abstract:

Background: In recent years, Spain has implemented a number of air quality control measures that are expected to lead to a future reduction in fine particle concentrations and an ensuing positive impact on public health. Objectives: We aimed to assess the impact on mortality attributable to a reduction in fine particle levels in Spain in 2014 relative to the estimated level for 2007. Methods: To estimate exposure, we constructed fine particle distribution models for Spain for 2007 (reference scenario) and 2014 (projected scenario) with a spatial resolution of 16 × 16 km². In a second step, we used the concentration-response functions proposed by cohort studies carried out in Europe (European Study of Cohorts for Air Pollution Effects and the Rome longitudinal cohort) and North America (American Cancer Society cohort, Harvard Six Cities study and Canadian national cohort) to calculate the number of attributable annual deaths corresponding to all causes, all non-accidental causes, ischemic heart disease and lung cancer among persons aged over 25 years (2005-2007 mortality rate data). We examined the effect of the Spanish demographic shift on our analysis using 2007 and 2012 population figures. Results: Our model suggested a mean overall reduction in fine particle levels of 1 µg/m³ by 2014. Taking into account 2007 population data, between 8 and 15 all-cause deaths per 100,000 population could be postponed annually by the expected reduction in fine particle levels. For specific subgroups, estimates varied from 10 to 30 deaths for all non-accidental causes, from 1 to 5 for lung cancer, and from 2 to 6 for ischemic heart disease. The expected burden of preventable mortality would be even higher in the future owing to Spanish population growth: taking into account the population older than 30 years in 2012, the absolute mortality impact estimate would increase by approximately 18%. Conclusions: Effective implementation of air quality measures in Spain, in a scenario with a short-term projection, would produce an appreciable decline in fine particle concentrations, which in turn would lead to notable health-related benefits. Recent European cohort studies strengthen the evidence of an association between long-term exposure to fine particles and health effects, and could enhance health impact quantification in Europe. Air quality models can contribute to improved air pollution health impact estimates, particularly in study areas without air pollution monitoring data.
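
The attributable-mortality arithmetic behind estimates of this kind can be made concrete with a short sketch. This is a generic log-linear health impact calculation with illustrative numbers, not the paper's model or its exact relative risks; the function name and all values below are assumptions.

```python
import math

def attributable_deaths(rr_per_10, delta_c, baseline_rate, population):
    """Annual deaths postponed by a drop of `delta_c` (µg/m³) in
    fine particle levels, under a log-linear concentration-response.

    rr_per_10     -- relative risk per 10 µg/m³ from a cohort study
    baseline_rate -- annual deaths per 100,000 population
    """
    beta = math.log(rr_per_10) / 10.0      # C-R coefficient per µg/m³
    af = 1.0 - math.exp(-beta * delta_c)   # attributable fraction
    return baseline_rate * (population / 1e5) * af

# Illustrative values only (not the paper's): RR = 1.06 per 10 µg/m³
# (roughly an all-cause cohort estimate), a 1 µg/m³ reduction,
# 900 all-cause deaths per 100,000, and a population of one million.
print(round(attributable_deaths(1.06, 1.0, 900.0, 1_000_000)))
```

Running the same calculation with each cohort's relative risk is what produces a range of estimates, as in the 8-to-15-deaths interval reported above.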

Relevance:

100.00%

Publisher:

Abstract:

The structural connectivity of the brain is considered to encode species- and subject-specific patterns that will unlock large areas of understanding of the human brain. Currently, diffusion MRI of the living brain makes it possible to map the microstructure of tissue and to track the pathways of fiber bundles connecting the cortical regions across the brain. These bundles are summarized in a network representation called the connectome, which is analyzed using graph theory. Extracting the connectome from diffusion MRI requires a long processing flow including image enhancement, reconstruction, segmentation, registration, diffusion tracking, etc. Although a concerted effort has been devoted to the definition of standard pipelines for connectome extraction, it is still crucial to define quality assessment protocols for these workflows. The definition of quality control protocols is hindered by the complexity of the pipelines under test and the absolute lack of gold standards for diffusion MRI data. Here we characterize the impact on structural connectivity workflows of the geometrical deformation typically shown by diffusion MRI data due to the inhomogeneity of magnetic susceptibility across the imaged object. We propose an evaluation framework, including whole-brain realistic phantoms, to compare the existing methodologies for correcting these artifacts. Additionally, we design and implement an image segmentation and registration method that avoids the correction task altogether and enables processing in the native space of the diffusion data. We release PySDCev, an evaluation framework for the quality control of connectivity pipelines, specialized in the study of susceptibility-derived distortions. In this context, we propose Diffantom, a whole-brain phantom that provides a solution to the lack of gold-standard data. The three correction methodologies under comparison performed reasonably, and it is difficult to determine which method is most advisable. We demonstrate that susceptibility-derived correction is necessary to increase the sensitivity of connectivity pipelines, at the cost of specificity. Finally, with the registration and segmentation tool called regseg, we demonstrate how the problem of susceptibility-derived distortion can be overcome, allowing data to be used in their original coordinates. This is crucial for increasing the sensitivity of the whole pipeline without any loss of specificity.
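
The sensitivity/specificity trade-off mentioned above can be quantified once a phantom provides ground truth: each possible connection becomes a binary detection problem. Below is a minimal sketch assuming binarized adjacency matrices and hypothetical toy data; it is not part of PySDCev's API.

```python
import numpy as np

def connectome_sens_spec(recovered, truth):
    """Sensitivity and specificity of a binarized connectome against a
    ground-truth adjacency matrix (e.g., one derived from a phantom
    such as Diffantom). Both inputs are symmetric 0/1 arrays."""
    iu = np.triu_indices_from(truth, k=1)   # unique region pairs
    r, t = recovered[iu].astype(bool), truth[iu].astype(bool)
    tp = np.sum(r & t)    # connections correctly detected
    tn = np.sum(~r & ~t)  # absences correctly detected
    fp = np.sum(r & ~t)
    fn = np.sum(~r & t)
    return tp / (tp + fn), tn / (tn + fp)

# Toy example: a 4-node phantom connectome and a noisy recovery.
truth = np.array([[0,1,0,1],[1,0,1,0],[0,1,0,0],[1,0,0,0]])
recov = np.array([[0,1,1,1],[1,0,1,0],[1,1,0,0],[1,0,0,0]])
sens, spec = connectome_sens_spec(recov, truth)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```

In this toy case the recovery finds every true connection (sensitivity 1.0) but adds a spurious one, lowering specificity, which mirrors the trade-off the abstract reports for susceptibility correction.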

Relevance:

100.00%

Publisher:

Abstract:

If there were a common denominator among all the arts in what has been called postmodernity, it would have much to do with the end of the origin of the artwork. From literature and music to the fine arts and architecture, the overcoming of modernity has been characterized by replacing the concept of creation with that of artistic intervention, that is, the interpretation of what already exists. In the early twentieth century the modern concepts of creation and origin implied unlearning and forgetting everything that came before, with the firm intention of starting from scratch; even in a material sense, Mies suggested the literal construction of matter and its movement according to laws. From the second half of the century onward, historicist approaches began to emerge in reaction to the amnesia and the alleged originality of the moderns. In this context appeared the books Learning from Las Vegas (1972) and Delirious New York (1978), both indebted in many respects to Venturi's earlier book, Complexity and Contradiction in Architecture (1966). These two books on cities, breaking decisively with the historicist trends of the time, proposed using critical analysis of existing reality as a vehicle for theory and design simultaneously, indirectly becoming manifestos. If at first Venturi, Rossi and others proposed erasing the formal limits established by modernity, as well as any earlier canon, taking the entirety of built work as a reference system, as Eliot had done in literature, the books on Las Vegas and New York proposed directly erasing the boundaries of the discipline itself, coming to question what can be considered architecture at all. However, precisely because of the total absence of limits and the immensity of the proposed frame of reference ("everything could be architecture," as Hans Hollein noted in 1968), the books at the same time proposed that each author define a field of action individually. The writings on Las Vegas and New York thus represent, on the one hand, the elimination of disciplinary limits and, on the other, the delimitation of specific areas of work for their authors: those of the cities each interpreted. The first part of the thesis, Lessons, attends to the necessary process of learning and experimentation that precedes the critical action itself. Contemporary architects need to accumulate material, knowledge, documentation and experiences before venturing proposals through criticism and editing; and, unlike the moderns, the more abundant this prior baggage, the richer the interpretation. The cities of Rome, London and Berlin are therefore understood as experiences capable of providing Venturi, Scott Brown and Koolhaas, respectively, with their "personal dictionaries": the endless imageries with which they would later confront the analysis of Las Vegas and New York. The second part, Critiques, focuses on the theoretical production itself: the two city books analyzed in close relation to Complexity and Contradiction. The analogical reasoning characteristic of these books has served as a methodological guide for the research, establishing relationships not directly between the writings themselves but through works belonging to other disciplines. First, an important parallel is drawn between the methods of analysis developed in these books and those used by literary criticism, observing that if the New Criticism and the New Journalism guided Venturi and Scott Brown's writings, the nouvelle critique and its proposal of poetic identification were the clear reference for Koolhaas when addressing New York. On the other hand, the relevance gained by artistic curating and the emergence of the figure of the curator, an authority capable of using an artwork above the intentions of its own author, serves, like the figure of the editor, as a reflection of the transformative and appropriative action carried out in both Learning from Las Vegas and Delirious New York. Finally, the figures of Bergson and Baudelaire have served as theoretical support throughout the research. Through the respective uses that Venturi and Koolhaas made of their ideas, the research tries to show the proximity of the two approaches from an ideological point of view. The inclusion proposed by Venturi and the irony used by Koolhaas, contradiction and paradox, are but reflections of logics that in both cases react simultaneously against idealism and materialism, against modernity and antimodernity, in a continuous attempt to be both at once.

Relevance:

100.00%

Publisher:

Abstract:

The biogenesis of peptide hormone secretory granules involves a series of sorting, modification, and trafficking steps that initiate in the trans-Golgi and trans-Golgi network (TGN). To investigate their temporal order and interrelationships, we have developed a pulse–chase protocol that follows the synthesis and packaging of a sulfated hormone, pro-opiomelanocortin (POMC). In AtT-20 cells, sulfate is incorporated into POMC predominantly on N-linked endoglycosidase H-resistant oligosaccharides. Subcellular fractionation and pharmacological studies confirm that this sulfation occurs at the trans-Golgi/TGN. Subsequent to sulfation, POMC undergoes a number of molecular events before final storage in dense-core granules. The first step involves the transfer of POMC from the sulfation compartment to a processing compartment (immature secretory granules, ISGs): Inhibiting export of pulse-labeled POMC by brefeldin A (BFA) or a 20°C block prevents its proteolytic conversion to mature adrenocorticotropic hormone. Proteolytic cleavage products were found in vesicular fractions corresponding to ISGs, suggesting that the processing machinery is not appreciably activated until POMC exits the sulfation compartment. A large portion of the labeled hormone is secreted from ISGs as incompletely processed intermediates. This unregulated secretory process occurs only during a limited time window: Granules that have matured for 2 to 3 h exhibit very little unregulated release, as evidenced by the efficient storage of the 15-kDa N-terminal fragment that is generated by a relatively late cleavage event within the maturing granule. The second step of granule biogenesis thus involves two maturation events: proteolytic activation of POMC in ISGs and a transition of the organelle from a state of high unregulated release to one that favors intracellular storage. By using BFA, we show that the two processes occurring in ISGs may be uncoupled: although the unregulated secretion from ISGs is impaired by BFA, proteolytic processing of POMC within this organelle proceeds unaffected. The finding that BFA impairs constitutive secretion from both the TGN and ISGs also suggests that these secretory processes may be related in mechanism. Finally, our data indicate that the unusually high levels of unregulated secretion often associated with endocrine tumors may result, at least in part, from inefficient storage of secretory products at the level of ISGs.

Relevance:

100.00%

Publisher:

Abstract:

Advanced glycation endproducts (AGEs) are derivatives of nonenzymatic reactions between sugars and proteins or lipids, and together with AGE-specific receptors are involved in numerous pathogenic processes associated with aging and hyperglycemia. Two of the known AGE-binding proteins isolated from rat liver membranes, p60 and p90, have been partially sequenced. We now report that the N-terminal sequence of p60 exhibits 95% identity to OST-48, a 48-kDa member of the oligosaccharyltransferase complex found in microsomal membranes, while sequence analysis of p90 revealed 73% and 85% identity to the N-terminal and internal sequences, respectively, of human 80K-H, an 80- to 87-kDa protein substrate for protein kinase C. AGE-ligand and Western analyses of purified oligosaccharyltransferase complex, enriched rough endoplasmic reticulum, smooth endoplasmic reticulum, and plasma membranes from rat liver or RAW 264.7 macrophages yielded a single protein of approximately 50 kDa that was recognized by both anti-p60 and anti-OST-48 antibodies and also exhibited AGE-specific binding. Immunoprecipitated OST-48 from rat rough endoplasmic reticulum fractions exhibited both AGE binding and immunoreactivity to an anti-p60 antibody. Immune IgG raised to recombinant OST-48 and 80K-H inhibited binding of AGE-bovine serum albumin to cell membranes in a dose-dependent manner. Immunostaining and flow cytometry demonstrated the surface expression of OST-48 and 80K-H on numerous cell types and tissues, including mononuclear, endothelial, renal, and brain neuronal and glial cells. We conclude that the AGE receptor components p60 and p90 are identical to OST-48 and 80K-H, respectively, and that together they contribute to the processing of AGEs from extra- and intracellular compartments and to the cellular responses associated with these pathogenic substances.

Relevance:

100.00%

Publisher:

Abstract:

Nocturnal melatonin production in the pineal gland is under the control of norepinephrine, released from superior cervical ganglia afferents in a rhythmic manner, and of cyclic AMP. Cyclic AMP increases the expression of serotonin N-acetyltransferase and of the inducible cAMP early repressor, which undergo circadian oscillations crucial for the maintenance and regulation of the biological clock. In the present study, we demonstrate a circadian pattern of expression of the calcium/calmodulin-activated adenylyl cyclase type 1 (AC1) mRNA in the rat pineal gland. In situ hybridization revealed that maximal AC1 mRNA expression occurred at midday (12:00-15:00), with a very low signal at night (0:00-3:00). We established that this rhythmic pattern is controlled by the noradrenergic innervation of the pineal gland and by the environmental light conditions. Finally, we observed a circadian responsiveness of pineal AC activity to calcium/calmodulin, with a lag attributable to the processing of the protein. At midday, AC activity was inhibited by calcium (40%) either in the presence or absence of calmodulin, while at night the enzyme was markedly (3-fold) activated by the calcium-calmodulin complex. These findings suggest (i) the involvement of AC1 as the center of a gating mechanism between cyclic AMP and calcium signals, important for the fine tuning of the pineal circadian rhythm; and (ii) a possible regulation by cyclic AMP of AC1 expression in the rat pineal gland.

Relevance:

100.00%

Publisher:

Abstract:

Many applications, including object reconstruction, robot guidance, and scene mapping, require the registration of multiple views of a scene in order to generate a complete geometric and appearance model of it. In real situations, the transformations between views are unknown, and expert inference must be applied to estimate them. In recent years, the emergence of low-cost depth-sensing cameras has strengthened research on this topic, motivating a plethora of new applications. Although these cameras provide enough resolution and accuracy for many applications, some situations cannot be handled by general state-of-the-art registration methods because of the signal-to-noise ratio (SNR) and resolution of the data provided. The problem of working with low-SNR data may appear in any 3D system, so novel solutions are needed in this respect. In this paper we propose μ-MAR, a method able to perform both coarse and fine registration of sets of 3D points, provided by low-cost depth-sensing cameras although not restricted to these sensors, into a common coordinate system. The method overcomes the noisy-data problem by means of a model-based solution to multi-plane registration. Specifically, it iteratively registers 3D markers composed of multiple planes extracted from points in multiple views of the scene. As the markers and the object of interest are static in the scenario, the transformations obtained for the markers are applied to the object in order to reconstruct it. Experiments have been performed using synthetic and real data. The synthetic data allow a qualitative and quantitative evaluation by means of visual inspection and the Hausdorff distance, respectively. The real-data experiments show the performance of the proposal using data acquired by a Primesense Carmine RGB-D sensor. The method has been compared with several state-of-the-art methods. The results show that μ-MAR registers objects with high accuracy in the presence of noisy data, outperforming the existing methods.
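
The basic operation such a registration pipeline repeats, estimating a rigid transform from corresponding 3D points and then applying it to the object, can be sketched with a least-squares (Kabsch/SVD) solution. This is a generic illustration under assumed point correspondences, not the μ-MAR algorithm itself, which registers multi-plane markers iteratively; the data below are hypothetical.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src -> dst,
    both (N, 3) arrays of corresponding 3D points (Kabsch/SVD)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)      # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# Marker points seen from two views (hypothetical, with sensor noise).
rng = np.random.default_rng(0)
marker_v1 = rng.uniform(-1, 1, (50, 3))
R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
marker_v2 = marker_v1 @ R_true.T + np.array([0.5, 0.1, 0.0])
marker_v2 += rng.normal(scale=0.01, size=marker_v2.shape)

R, t = rigid_transform(marker_v1, marker_v2)
# As in the paper's setup, the transform estimated from the static
# marker is then applied to the object points of the same view:
object_v1 = rng.uniform(-1, 1, (200, 3))
object_in_v2 = object_v1 @ R.T + t
```

The Hausdorff distance mentioned in the abstract (e.g., via scipy.spatial.distance.directed_hausdorff) can then quantify how far the reconstructed object lies from a reference point set.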

Relevance:

100.00%

Publisher:

Abstract:

Three types of tephra deposits were recovered on Leg 65 of the Deep Sea Drilling Project (DSDP) from three drill sites at the mouth of the Gulf of California: (1) a series of white ash layers at Sites 483, 484, and 485; (2) a layer of plagioclase-phyric sideromelane shards at Site 483; and (3) an indurated, cross-bedded hyaloclastite in Hole 483B. The ash layers in (1) are composed of colorless, fresh rhyolitic glass shards with minor dacitic and rare basaltic shards; these are thought to be derived from explosive volcanoes on the Mexican mainland. Most of the shards in (2) are fresh, but some show marginal to complete alteration to palagonite. The composition of the glass is that of a MORB-type tholeiite, low in Fe and moderately high in Ti, possibly erupted from off-axis seamounts. Basaltic glass shards occurring in silt about 45 meters above the basement at Site 484A in the Tamayo Fracture Zone show a distinctly alkalic composition similar to that of the single basement basalt specimen drilled at this site. The hyaloclastite in (3) is made up chiefly of angular sideromelane shards altered to smectite and zeolites (mainly phillipsite), with minor admixtures of terrigenous silt. Very high K and Ba contents indicate significant uptake of at least these elements from seawater. Nevertheless, the unusual chemical composition of the underlying massive basalt flow is believed to be reflected in that of the hyaloclastite, which is a powerful argument for interpreting the massive basalt as a surface flow rather than an intrusion. Glass alteration differs between the glassy margins of flows and thicker glassy pillow rinds; it also appears to proceed faster in coarse- than in fine-grained sediments.

Relevance:

100.00%

Publisher:

Abstract:

Lake Ohrid is probably of Pliocene age and is the oldest extant lake in Europe. In this study, climatic and environmental changes during the last glacial-interglacial cycle are reconstructed using lithological, sedimentological, geochemical and physical proxy analyses of a 15-m-long sediment succession from Lake Ohrid. A chronological framework is derived from tephrochronology and radiocarbon dating, which yields a basal age of ca. 136 ka. The succession is not continuous, however, with a hiatus between ca. 97.6 and 81.7 ka. Sediment accumulation over the course of the last climatic cycle is controlled by the complex interaction of a variety of climate-controlled parameters and their impact on catchment dynamics, limnology, and hydrology of the lake. Warm interglacial and cold glacial climate conditions can be clearly distinguished from organic matter, calcite, clastic detritus and lithostratigraphic data. During interglacial periods, short-term fluctuations are recorded by abrupt variations in organic matter and calcite content, indicating climatically induced changes in lake productivity and hydrology. During glacial periods, high variability in the content of coarse silt to fine sand sized clastic matter is probably a function of climatically induced changes in catchment dynamics and wind activity. In some instances tephra layers provide potential stratigraphic markers for short-lived climate perturbations. Given the widespread distribution of these tephras at sites across the region, tephra analysis has the potential to provide insight into variations in the impact of climate and environmental change across the Mediterranean.

Relevance:

100.00%

Publisher:

Abstract:

The sand fraction of the sediment was analysed in five cores taken from 65 m water depth in the central and eastern Persian Gulf. The Holocene marls are underlain by aragonite muds, which are probably 10,000-11,000 years old.

1. The cores could be subdivided into coarse-grained and fine-grained layers. Sorting is demonstrated by the following criteria. With increasing median values of the sand fraction:
- the fine-grained fraction decreases within each core;
- the median of each biogenic component, benthonic as well as planktonic, increases;
- the median of the relict sediment, which in core 1179 was carried upward into the marl by bioturbation, increases;
- the percentages of pelecypods, gastropods, decapods and serpulid worms in the sand fraction increase, while the percentages of foraminifera and ostracods decrease;
- the ratios of pelecypods to foraminifera and of decapods to ostracods increase;
- the ratios of benthonic molluscs to planktonic molluscs (pteropods) and of benthonic foraminifera to planktonic foraminifera increase (except in cores 1056 and 1179);
- the ratio of planktonic molluscs (pteropods) to planktonic foraminifera increases;
- in core 1056, globigerinas other than orbulinas increase and orbulinas decrease.

The different settling velocities of these biogenic particles help in understanding these results: the settling velocities, and hence the equivalent hydrodynamic diameters, of orbulinas are smaller than those of other globigerinas; those of planktonic foraminifera are smaller than those of planktonic molluscs; those of planktonic molluscs are smaller than those of benthonic molluscs; and those of pelecypods are smaller than those of gastropods. Bioturbation could not entirely destroy this "grain-size stratification". Sorting has been stronger in the coarse layers than in the finer ones. Variations in the supply of terrigenous material at constant tidal-current strength are suggested as the cause: when much terrigenous material is supplied (high content of fine-grained fraction), sedimentation rates are high, and the sediment surface is soon covered and removed from the influence of tidal currents; when the supply of terrigenous material is small, more sandy material is carried away at all locations within the influence of terrigenous supply. Thus the biogenic particles in the sediment reflect not only organic production but also the influence of currents.

2. No parameter present in all cores is independent of grain size and usable for stratigraphic correlation. The two cores from the Strait of Hormus were correlated by their sequences of coarse- and fine-grained layers.

3. The sedimentation rates of terrigenous material, of total planktonic and benthonic organisms, and of molluscs, foraminifera, echinoids and ophiuroids are shown in Table 1 (total sediment 6.3-75.5 cm/1000 yr, biogenic carbonate 1.9-3.6 cm/1000 yr). The sedimentation rates of benthonic organisms are nearly the same in the cores from the Strait of Hormus, whereas near the Central Swell they are smaller. In the upper parts of the two Strait of Hormus cores, sedimentation rates are higher than in the deeper parts, where higher median values point to stronger reworking.

4. The sequence of coarse- and fine-grained intervals in the two Strait of Hormus cores, attributed to variations in climate, as well as the increase in terrigenous supply from the deeper to the upper parts of the cores, agrees with descriptions in the literature of the post-Pleistocene climate becoming more humid. The rise of sea level is not sedimentologically measurable in the marly sediments, except perhaps for the higher content of echinoids in the lower part of core 1056, which may be attributed to the influence of a migrating wave base.

5. The late Pleistocene aragonite mud is very fine grained (>50% < 2 µm) and poor in fossils (biogenic particles make up 0.5-1.8% of the total sediment). The sand fraction consists almost entirely of white clumps, ca. 0.1 mm in diameter (core 1177), composed of aragonite needles, and of detrital minerals of the same size (core 1201). The aragonite mud was probably not formed in situ, because the water depth at the time of formation was at most 35 m and at least 12 m. The sorting of the sediment (predominance of fine-grained sand), the absence of larger biogenic components and of pellets ca. 0.2-0.5 mm in diameter, which are typical of Recent and Pleistocene sites of aragonite formation, as well as the sedimentological conditions near the sampling points, rather indicate transport of the aragonite mud from an area of formation in very shallow water. Sorting, as well as the lenticular fabric in core 1201, points to sedimentation under the influence of currents. During alternating sedimentation and reworking the aragonitic matrix was separated from the silt- and sand-sized minerals. The lenses grade into irregular patches as a result of bioturbation.

6. In core 1056 D2 from Hormus Bay the percentages of organic carbon, total nitrogen and total carbonate were determined. With increasing amounts of smaller grain sizes the content of organic matter increases, whereas the amount of carbonate decreases. The amounts of organic carbon and nitrogen decrease with increasing depth, probably due to early-diagenetic decomposition processes. Most of the total nitrogen is of organic origin; only about 10% may be inorganically fixed as ammonium nitrogen. In the upper part of the core the C/N ratio increases with depth, which may be connected with stronger decomposition of nitrogen-containing organic compounds. The general decrease of the C/N ratios in the lower part of the core may be explained by the relative increase of inorganically fixed ammonium nitrogen as the content of organic matter decreases.

Relevance:

100.00%

Publisher:

Abstract:

Two experiments investigated the extent of message processing of a persuasive communication proposed by either a numerical majority or a numerical minority. Both experiments crossed source status (majority versus minority) with message quality (strong versus weak arguments) to determine which source condition is associated with systematic processing. The first experiment showed a reliable difference between strong and weak messages, indicating that systematic processing had occurred, for a minority source irrespective of message direction (pro- versus counter-attitudinal), but not for a majority source. The second experiment showed that message outcome moderates whether a majority or a minority leads to systematic processing: when the message argued for a negative personal outcome, there was systematic processing only for the majority source; when it did not, there was systematic processing only for the minority source. Thus one key moderator of whether a majority or minority source leads to message processing is whether the topic induces defensive processing motivated by self-interest. Copyright (C) 2002 John Wiley & Sons, Ltd.