945 results for Fluid and crystallized Intelligence


Relevance: 100.00%

Publisher:

Abstract:

Purely data-driven approaches for machine learning present difficulties when data are scarce relative to the complexity of the model or when the model is forced to extrapolate. On the other hand, purely mechanistic approaches need to identify and specify all the interactions in the problem at hand (which may not be feasible) and still leave the issue of how to parameterize the system. In this paper, we present a hybrid approach using Gaussian processes and differential equations to combine data-driven modeling with a physical model of the system. We show how different, physically inspired, kernel functions can be developed through sensible, simple, mechanistic assumptions about the underlying system. The versatility of our approach is illustrated with three case studies from motion capture, computational biology, and geostatistics.
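The hybrid idea described above, with a mechanistic model supplying structure and a Gaussian process absorbing what the physics misses, can be sketched minimally as GP regression on the residuals of a toy physical model. The decay model, the RBF kernel and all parameter values below are illustrative assumptions, not the physically inspired kernels derived in the paper:

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel: the data-driven GP component."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def mechanistic_mean(t, x0=2.0, gamma=0.5):
    """Toy physical model: solution of dx/dt = -gamma * x, x(0) = x0."""
    return x0 * np.exp(-gamma * t)

def gp_posterior_mean(t_train, y_train, t_test, noise=1e-2):
    """Hybrid prediction: mechanistic mean plus a GP fitted to its residuals."""
    resid = y_train - mechanistic_mean(t_train)
    K = rbf_kernel(t_train, t_train) + noise * np.eye(len(t_train))
    Ks = rbf_kernel(t_test, t_train)
    return mechanistic_mean(t_test) + Ks @ np.linalg.solve(K, resid)
```

Far from the data the GP contribution vanishes and the prediction reverts to the physical model, which is exactly the extrapolation behavior a purely data-driven fit lacks.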

Relevance: 100.00%

Publisher:

Abstract:

Technology watch has proven to be one of the most important tools for gaining competitiveness and improving the innovation activities of companies. It is based on capturing the most relevant information for a given technological field, usually patents and scientific publications, and assessing it to inform decision making. Its classical phases are search, analysis and communication of the information. The work relies both on commercial tools and on tools to be developed by the student, and its main objective is to propose a generic technology watch and competitive intelligence (VT/IC) methodology for decision making about technologies, illustrated with an application to multimedia technologies and, more specifically, to 3D TV. Grounding the VT/IC process in a methodology is essential if it is to have the importance it deserves in the production cycle of any organization, and especially of one involved in research and development (R&D). Among other things, this methodology makes it possible to integrate the VT/IC processes into an organization alongside its production, administration and management processes; to measure their performance and identify changes that improve it; and to provide those involved in VT/IC with the documentation, processes, tools and measurement elements of a defined, published, measured and analyzed working system. Finally, as an example of a VT/IC consultation process, the proposed generic search criterion "3D TV" is used.
Structure of the project (PFC): to achieve these objectives the work is divided into six stages: 1. Description of the PFC: a presentation of the project and its development. 2. Technology watch: development of the VT/IC concept, its expected effects, benefits and risks; the concept of competitive intelligence (CI); and the combined concept of technology watch and competitive intelligence (VT/IC). 3. Business analysis techniques where VT/IC is useful: to understand what VT/IC should look like, we analyze how an organization uses the different analysis techniques and what information each of them provides, and finally how VT/IC supports those techniques. 4. Management of information sources: analysis of the types of information sources and their associated search tools. 5. Proposed VT/IC methodology: development of the methodology for implementing and operating a VT/IC unit. 6. Observatory: by way of example, "3D TV".

Relevance: 100.00%

Publisher:

Abstract:

Worldwide, breast cancer is the most frequent type of cancer and one of the leading causes of death among women. Currently, the most effective method for detecting breast lesions at an early stage is mammography, which contributes decisively to the early diagnosis of a disease that, if caught in time, has a very high probability of cure. One of the main and most frequent findings in a mammogram is microcalcifications, which are considered an important indicator of breast cancer; their detection remains difficult because mammograms show poor contrast between microcalcifications and the surrounding tissue. When analyzing mammograms, factors such as visualization conditions, fatigue or the professional experience of the radiologist increase the risk of missing lesions. To reduce this risk it is important to have alternatives, such as a second opinion from another specialist or a double reading by the same one; the first raises the cost, and both lengthen the diagnosis time. This strongly motivates the development of decision-support systems. This thesis proposes, develops and justifies a system capable of detecting microcalcifications in regions of interest extracted from digitized mammograms, to contribute to the early detection of breast cancer. The system is based on digital image processing, pattern recognition and artificial intelligence techniques, and its development takes the following into account: 1. To train and test the proposed system, a database of images is created, consisting of regions of interest extracted from digitized mammograms. 2. The top-hat transform, a digital image processing technique based on mathematical morphology operations, is applied to improve the contrast between the microcalcifications and the tissue present in the image. 3. A novel algorithm called sub-segmentation is proposed, based on pattern recognition techniques and an unsupervised clustering algorithm, the PFCM (Possibilistic Fuzzy c-Means). Its goal is to find the regions corresponding to microcalcifications and distinguish them from healthy tissue. To show its advantages and disadvantages, it is compared with two algorithms of the same type, k-means and FCM (Fuzzy c-Means); notably, this work uses sub-segmentation for the first time to detect microcalcification regions in mammography images. 4. Finally, a classifier based on an artificial neural network, specifically an MLP (Multi-layer Perceptron), is used to discriminate in binary fashion between microcalcification and healthy tissue, from patterns built from the gray-level intensities of the original image.
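The top-hat transform mentioned in point 2 is standard mathematical morphology, so a minimal sketch is possible with SciPy. The synthetic image and the structuring-element size below are assumptions for illustration, not the thesis pipeline:

```python
import numpy as np
from scipy import ndimage

def enhance_microcalcifications(img, size=5):
    """White top-hat: the image minus its morphological opening.
    Bright structures narrower than `size` (e.g. microcalcifications)
    are kept, while the smooth background (tissue) is suppressed."""
    return ndimage.white_tophat(img, size=size)
```

On a synthetic region of interest, a smooth intensity ramp with one small bright spot, the transform flattens the ramp to near zero while the spot survives, which is exactly the contrast enhancement the abstract describes.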

Relevance: 100.00%

Publisher:

Abstract:

Ambient Intelligence can support innovative application domains such as the detection of motor impairments in the home environment. This research aims to prevent neurodevelopmental disorders through the natural interaction of children with everyday objects that embed intelligence, such as home furniture and toys. The designed system uses an interoperable platform to provide two interrelated intelligent home healthcare services: monitoring of children's abilities and completion of early stimulation activities. A set of sensors embedded in the rooms, toys and furniture allows private data gathering about the child's interaction with the environment. This information feeds a reasoning subsystem, which encloses an ontology of neurodevelopment items and adapts the service to the child's age and acquisition of expected abilities. The platform then proposes customized stimulation services by taking advantage of the existing facilities in the child's environment. The result integrates the Embedded Sensor Systems for Health at Mälardalen University with the UPM Smart Home for the delivery of adapted services.

Relevance: 100.00%

Publisher:

Abstract:

This paper presents a robust approach to the recognition of thermal face images based on decision-level fusion of 34 different region classifiers. The region classifiers concentrate on local variations and use singular value decomposition (SVD) for feature extraction. The decisions of the region classifiers are fused using a majority voting technique. The algorithm is tolerant to the false exclusion of thermal information caused by inconsistent temperature distributions, which generally makes the identification process difficult. It is extensively evaluated on the UGC-JU thermal face database and the Terravic facial infrared database, where the recognition performances are found to be 95.83% and 100%, respectively. A comparative study with existing works in the literature has also been made.
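The two building blocks named in the abstract, singular values as region features and majority voting for decision-level fusion, can be sketched as follows. The feature dimension `k` and the label format are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

def svd_features(block, k=4):
    """Use the top-k singular values of an image block as its feature
    vector; these capture the block's dominant intensity structure."""
    s = np.linalg.svd(block, compute_uv=False)  # sorted descending
    return s[:k]

def majority_vote(decisions):
    """Fuse the identity decisions of the region classifiers:
    the most frequently predicted label wins."""
    labels, counts = np.unique(decisions, return_counts=True)
    return labels[np.argmax(counts)]
```

With 34 region classifiers, a face is accepted as identity X whenever X collects more votes than any other enrolled identity, so a minority of regions corrupted by inconsistent temperature statistics cannot flip the decision.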

Relevance: 100.00%

Publisher:

Abstract:

A first-rate e-Health system saves lives, provides better patient care, allows complex but useful epidemiologic analyses and saves money. However, there are also concerns about the costs and complexities associated with e-Health implementation, and about the need to address the energy footprint of the demanding computing facilities involved. This paper proposes a novel, evolved computing paradigm that: (i) provides the required computing and sensing resources; (ii) allows population-wide diffusion; (iii) exploits the storage, communication and computing services provided by the Cloud; and (iv) tackles energy optimization as a first-class requirement, taking it into account throughout the whole development cycle. The novel computing concept and the multi-layer, top-down energy-optimization methodology obtain promising results in a realistic scenario for cardiovascular tracking and analysis, making Home Assisted Living a reality.

Relevance: 100.00%

Publisher:

Abstract:

Electric probes are objects immersed in the plasma, with sharp boundaries, which collect or emit charged particles. Consequently, the nearby plasma evolves under abruptly imposed and/or naturally emerging conditions: there may be localized currents, different time scales for the evolution of the plasma species, charge separation and absorbing or emitting walls. Traditional numerical schemes based on finite differences often turn these disparate boundary conditions into computational singularities. This is the case for models using advection-diffusion differential equations with source-sink terms (also called Fokker-Planck equations), which are used in both fluid and kinetic descriptions to obtain the distribution functions or the density of each plasma species close to the boundaries. We present a resolution method grounded on an integral advancing scheme that uses approximate Green's functions, also called short-time propagators. All the integrals are calculated numerically, as a path-integration process, which yields a robust, grid-free computational integral method that is unconditionally stable for any time step. Hence, sharp boundary conditions, such as current emission from a wall, can be treated during the short-time regime, providing solutions that behave at each time step as if they were known analytically. The form of the propagator (typically a multivariate Gaussian) is not unique, and it can be adjusted during the advancing scheme to preserve the conserved quantities of the problem. The effects of electric or magnetic fields can be incorporated into the iterative algorithm. The method allows smooth transitions of the evolving solutions even when abrupt discontinuities are present. In this work a procedure is proposed to incorporate, for the very first time, the boundary conditions into the numerical integral scheme.
This numerical scheme is applied to model the interaction of the plasma bulk with a charge-emitting electrode, solving fluid diffusion equations self-consistently with the Poisson equation. The stability of the computational method has been verified for any number of iterations, even when advancing in time electrons and ions with different time scales. This work establishes the basis for dealing in future work with problems related to plasma thrusters and emissive probes in electromagnetic fields.
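As a rough illustration of the integral advancing scheme, the short-time propagator of the free diffusion equation is a Gaussian, and one time step is a convolution of the current solution with it. The 1-D free-diffusion setting, the uniform quadrature grid and the parameter values below are assumptions for the sketch; the paper's propagators additionally handle drift, source-sink terms and boundary emission:

```python
import numpy as np

def gaussian_propagator(x, xp, D, dt):
    """Short-time Green's function of the free diffusion equation
    u_t = D * u_xx: G(x, t+dt | xp, t)."""
    return np.exp(-(x[:, None] - xp[None, :]) ** 2 / (4.0 * D * dt)) \
        / np.sqrt(4.0 * np.pi * D * dt)

def advance(u, x, D, dt):
    """One integral step: u(x, t+dt) = integral of G(x | xp) * u(xp, t) dxp,
    approximated on a uniform grid. No stability limit ties dt to dx."""
    dx = x[1] - x[0]
    return gaussian_propagator(x, x, D, dt) @ u * dx
```

Because the step is an integration against an exact short-time solution rather than a finite-difference stencil, it remains stable for any time step, which is the property the abstract emphasizes.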

Relevance: 100.00%

Publisher:

Abstract:

This thesis studies the representation, modeling and comparison of collections in the Semantic Web using ontologies. Collections, understood as groupings of objects or elements with their own identity, appear frequently in practically every real-world domain, so it is essential to have conceptualizations of these abstract structures, and representations of those conceptualizations in computer systems, that properly define their semantics. While in many areas of Computer Science and Artificial Intelligence, such as programming, databases or information retrieval, collections have been studied extensively and representations matching many conceptualizations have been developed, in the Semantic Web their study has been quite limited. In fact, few proposals exist to date for representing collections with ontologies, and those that do cover only some types of collections and have important limitations. This prevents an adequate representation of collections and hinders common tasks such as comparing them, something critical in everyday operations like semantic search or data linking on the Semantic Web. To solve this problem, the thesis proposes a model of collections based on a new classification according to their structural characteristics (homogeneity, uniqueness, order and cardinality). This classification defines a taxonomy of up to 16 distinct types of collections. Among other advantages, it makes it possible to exploit the semantics of the structural properties of each collection type to perform comparisons using the most appropriate similarity and dissimilarity functions. The thesis therefore also develops a new catalog of (dis)similarity functions for the different types of collections, gathering the best-known functions together with some new ones. The proposal is implemented in two parallel ontologies: the E-Collections ontology, which represents the collection types of the taxonomy and their axioms, and the SIMEON ontology (Similarity Measures Ontology), which represents the types of (dis)similarity functions for each collection type. Thanks to these ontologies, once two collections are represented as instances of the most appropriate class of the E-Collections ontology, the (dis)similarity functions of the SIMEON ontology that are suitable for their comparison are known automatically. Finally, the feasibility and usefulness of this modeling and comparison proposal is demonstrated in the field of oenology, applying the E-Collections and SIMEON ontologies to the representation and comparison of wines with the E-Baco ontology.
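As a toy illustration of why the structural type of a collection should select the comparison function, two classic similarities appropriate for unordered collections with unique elements (i.e. sets) can be written as follows. These are generic textbook functions used for illustration, not entries of the SIMEON catalog:

```python
def jaccard(a, b):
    """Similarity for unordered collections with unique elements:
    size of the intersection over size of the union."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a or b) else 1.0

def sorensen_dice(a, b):
    """An alternative set similarity that weights the overlap twice."""
    a, b = set(a), set(b)
    return 2 * len(a & b) / (len(a) + len(b)) if (a or b) else 1.0
```

For an ordered or non-unique collection (a list or a multiset) these functions would silently discard structure, which is exactly the kind of mismatch that classifying collections by homogeneity, uniqueness, order and cardinality is meant to prevent.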

Relevance: 100.00%

Publisher:

Abstract:

The application of a recently developed model of the measuring process of sonic anemometers reveals that these sensors cannot be considered absolute when measuring the spectral characteristics of turbulent wind speed, since the ratios of measured to real spectral density functions are shown to depend on the composition and temperature of the planetary atmosphere considered. The new model describes the measuring characteristics of these sensors as fluid/flow dependent (against the traditional hypothesis of fluid/flow independence) and hence dependent on the planetary atmosphere. The influence of fluid and flow characteristics (quantified via the Mach number of the flow) and of the anemometer design parameters (mainly the time delay between pulse shots and the geometry) on turbulence measurement is quantified for the atmospheres of Mars, Jupiter, and Earth. For the same averaged wind speed, important differences in the behavior of these sensors in the three atmospheres are detected, both in the characteristics of the turbulence measurement and in the optimum values of the anemometer design parameters for each atmosphere. These differences cannot be detected by traditional models of the sonic anemometer measuring process based on line averaging along the acoustic paths.

Relevance: 100.00%

Publisher:

Abstract:

The PhD thesis "The construction of transparency. 21st Century Museum of Contemporary Art in Kanazawa / SANAA" studies, from different perspectives, this work by the Japanese architects Kazuyo Sejima + Ryue Nishizawa / SANAA. The museum, designed and built between 1999 and 2004, is an architectural reference of great relevance at the turn of the century and is considered a built manifesto that echoes the culture and society of its time. The research takes the 21st Century Museum in Kanazawa as its constant and examines it through a series of transversal readings, in the manner of interconnected narratives. These multiple views, and the relations among them, combine to generate a global understanding of the building and its effects on contemporary architectural culture. The dissertation is structured in two parts: the first studies analytically the work in its cultural context, the design process and the built work. The second proposes a triple interpretative reading of the building in relation to topics of special relevance for the project and to the spatial model it constitutes, again taking the notion of transparency as a constant. The thesis departs from the interpretation offered by the museum's artistic director, Yuko Hasegawa, whose active collaboration during the design process was highly relevant to the final result. Hasegawa defines the work as the materialization of a cultural situation characteristic of the new century, based on three concepts: coexistence, consciousness and collective intelligence. From these ideas, transparency is understood not only in terms of the physical and material qualities of the space but also from a broader perspective: as a strategy for clarifying the functional organization of the building and the relations between its parts; as a political position that catalyzes public space; as the production of consciousness through the experience of art and architecture; and as a means of representing and generating a collective identity based on media visibility. The conclusions of this research, drawn from the confluences and intersections of the different transversal readings projected onto the work, reveal a clear intention by the architects to build a space where the public is understood as an assembly of different individualities. As in the design process, the multiplicity of relations between autonomous elements generates a collective consciousness of the space where, through the programmatic organization, the construction and the use of materials and light, the architecture seeks to fade away, giving prominence to its users and to the relations and encounters that occur among them and with the space they inhabit.

Relevance: 100.00%

Publisher:

Abstract:

Companies that aim to secure and improve their position in an increasingly competitive market must stay up to date and evolve constantly. In this continuous pursuit, they invest in Research & Development (R&D) projects and in their human capital to promote creativity and organizational innovation. People play a fundamental role in developing innovation, but for it to flourish consistently, commitment and creativity are needed to generate ideas. Creativity is thinking the new; innovation is making it happen. However, finding people with these qualities is not always easy, and it is often necessary to stimulate these skills and traits so that people become effectively creative. Undergraduate programs can be an important tool for working on these aspects, characteristics and skills, using teaching methods and practices that help develop creativity, since the teaching-learning environment weighs significantly on people's formation. The aim of this study is to identify which factors most influence the development of creativity in an undergraduate business administration program, analyzing the influence of the instructors' pedagogical practices and of the students' internal barriers. The theoretical framework is based mainly on the work of Alencar, Fleith, Torrance and Wechsler. The cross-sectional, quantitative survey targeted business administration students at a confessional university in Greater São Paulo, who answered 465 questionnaires composed of three scales: an adaptation of the Teaching Practices for Creativity scale, an adaptation of the Personal Creativity Barriers scale, and a scale built and validated for this study, based on the reference characteristics of a creative person, to measure the perceived development of creativity.
Descriptive statistics and exploratory factor analyses were performed in the Statistical Package for the Social Sciences (SPSS), while confirmatory factor analyses and the measurement of the influence of pedagogical practices and internal barriers on the perceived development of creativity were carried out by structural equation modeling with the Partial Least Squares (PLS) algorithm in Smart PLS 2.0. The results show that pedagogical practices and the students' internal barriers explain 40% of the perceived development of creativity, with pedagogical practices exerting the greater influence. The survey also found that neither the subject area nor the semester the student is attending influences any of the three constructs; only the instructor influences the pedagogical practices.

Resumo:

Epithelial Na+ channels are expressed widely in absorptive epithelia such as the renal collecting duct and the colon and play a critical role in fluid and electrolyte homeostasis. Recent studies have shown that these channels interact via PY motifs in the C termini of their α, β, and γ subunits with the WW domains of the ubiquitin-protein ligase Nedd4. Mutation or deletion of these PY motifs (as occurs, for example, in the heritable form of hypertension known as Liddle’s syndrome) leads to increased Na+ channel activity. Thus, binding of Nedd4 by the PY motifs would appear to be part of a physiological control system for down-regulation of Na+ channel activity. The nature of this control system is, however, unknown. In the present paper, we show that Nedd4 mediates the ubiquitin-dependent down-regulation of Na+ channel activity in response to increased intracellular Na+. We further show that Nedd4 operates downstream of Go in this feedback pathway. We find, however, that Nedd4 is not involved in the feedback control of Na+ channels by intracellular anions. Finally, we show that Nedd4 has no influence on Na+ channel activity when the Na+ and anion feedback systems are inactive. We conclude that Nedd4 normally mediates feedback control of epithelial Na+ channels by intracellular Na+, and we suggest that the increased Na+ channel activity observed in Liddle’s syndrome is attributable to the loss of this regulatory feedback system.

Resumo:

The aa3 type cytochrome c oxidase consisting of the core subunits I and II only was isolated from the soil bacterium Paracoccus denitrificans and crystallized as complex with a monoclonal antibody Fv fragment. Crystals could be grown in the presence of a number of different nonionic detergents. However, only undecyl-β-d-maltoside and cyclohexyl-hexyl-β-d-maltoside yielded well-ordered crystals suitable for high resolution x-ray crystallographic studies. The crystals belong to space group P212121 and diffract x-rays to at least 2.5 Å (1 Å = 0.1 nm) resolution using synchrotron radiation. The structure was determined to a resolution of 2.7 Å using molecular replacement and refined to a crystallographic R-factor of 20.5% (Rfree = 25.9%). The refined model includes subunits I and II and the 2 chains of the Fv fragment, 2 heme A molecules, 3 copper atoms, and 1 Mg/Mn atom, a new metal (Ca) binding site, 52 tentatively identified water molecules, and 9 detergent molecules. Only four of the water molecules are located in the cytoplasmic half of cytochrome c oxidase. Most of them are near the interface of subunits I and II. Several waters form a hydrogen-bonded cluster, including the heme propionates and the Mg/Mn binding site. The Fv fragment binds to the periplasmic polar domain of subunit II and is critically involved in the formation of the crystal lattice. The crystallization procedure is well reproducible and will allow for the analysis of the structures of mechanistically interesting mutant cytochrome c oxidases.
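The refinement quality figures quoted above (crystallographic R-factor of 20.5%, Rfree of 25.9%) follow the conventional definition R = Σ| |F_obs| − |F_calc| | / Σ|F_obs| over the measured reflections. A minimal sketch of that formula, using toy structure-factor amplitudes rather than real diffraction data:

```python
import numpy as np

def r_factor(f_obs, f_calc):
    """Conventional crystallographic R-factor:
    R = sum(| |F_obs| - |F_calc| |) / sum(|F_obs|)."""
    f_obs = np.abs(np.asarray(f_obs, dtype=float))
    f_calc = np.abs(np.asarray(f_calc, dtype=float))
    return np.sum(np.abs(f_obs - f_calc)) / np.sum(f_obs)

# Toy amplitudes, purely illustrative (not data from this structure)
f_obs = [100.0, 80.0, 50.0]
f_calc = [90.0, 85.0, 45.0]
print(f"R = {r_factor(f_obs, f_calc):.3f}")  # → R = 0.087
```

Rfree is computed with the same formula but over a test set of reflections excluded from refinement, which is why it runs a few percentage points higher than the working R-factor.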

Resumo:

The neurosteroid 3α-hydroxysteroid-5α-pregnan-20-one (allopregnanolone) acts as a positive allosteric modulator of γ-aminobutyric acid at γ-aminobutyric acid type A receptors and hence is a powerful anxiolytic, anticonvulsant, and anesthetic agent. Allopregnanolone is synthesized from progesterone by reduction to 5α-dihydroprogesterone, mediated by 5α-reductase, and by reduction to allopregnanolone, mediated by 3α-hydroxysteroid dehydrogenase (3α-HSD). Previous reports suggested that some selective serotonin reuptake inhibitors (SSRIs) could alter concentrations of allopregnanolone in human cerebral spinal fluid and in rat brain sections. We determined whether SSRIs directly altered the activities of either 5α-reductase or 3α-HSD, using an in vitro system containing purified recombinant proteins. Although rats appear to express a single 3α-HSD isoform, the human brain contains several isoforms of this enzyme, including a new isoform we cloned from human fetal brains. Our results indicate that the SSRIs fluoxetine, sertraline, and paroxetine decrease the Km of the conversion of 5α-dihydroprogesterone to allopregnanolone by human 3α-HSD type III 10- to 30-fold. Only sertraline inhibited the reverse oxidative reaction. SSRIs also affected conversions of androgens to 3α- and 3α, 17β-reduced or -oxidized androgens mediated by 3α-HSD type II (brain). Another antidepressant, imipramine, was without any effect on allopregnanolone or androstanediol production. The region-specific expression of 3α-HSD type II (brain) and 3α-HSD type III mRNAs suggests that SSRIs will affect neurosteroid production in a region-specific manner. Our results may thus help explain the rapid alleviation of the anxiety and dysphoria associated with late luteal phase dysphoric disorder and major unipolar depression by these SSRIs.
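The reported 10- to 30-fold decrease in Km can be related to reaction velocity through the Michaelis-Menten rate law, v = Vmax·[S]/(Km + [S]): at subsaturating substrate, a lower Km means faster conversion. A minimal sketch with purely illustrative parameter values (not measurements from the study), assuming a 20-fold Km reduction:

```python
def velocity(s, vmax, km):
    # Michaelis-Menten rate law: v = Vmax * [S] / (Km + [S])
    return vmax * s / (km + s)

vmax, km, s = 1.0, 10.0, 1.0  # arbitrary illustrative units

v_baseline = velocity(s, vmax, km)        # original Km
v_ssri = velocity(s, vmax, km / 20.0)     # hypothetical 20-fold lower Km
print(f"{v_ssri / v_baseline:.1f}x faster at [S] = {s}")  # → 7.3x faster
```

The speed-up is largest when [S] is well below Km and disappears at saturating substrate, where v approaches Vmax in both cases.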

Resumo:

Transgenic mice that overexpress mutant human amyloid precursor protein (APP) exhibit one hallmark of Alzheimer’s disease pathology, namely the extracellular deposition of amyloid plaques. Here, we describe significant deposition of amyloid β (Aβ) in the cerebral vasculature [cerebral amyloid angiopathy (CAA)] in aging APP23 mice that had striking similarities to that observed in human aging and Alzheimer’s disease. Amyloid deposition occurred preferentially in arterioles and capillaries and within individual vessels showed a wide heterogeneity (ranging from a thin ring of amyloid in the vessel wall to large plaque-like extrusions into the neuropil). CAA was associated with local neuron loss, synaptic abnormalities, microglial activation, and microhemorrhage. Although several factors may contribute to CAA in humans, the neuronal origin of transgenic APP, high levels of Aβ in cerebrospinal fluid, and regional localization of CAA in APP23 mice suggest transport and drainage pathways rather than local production or blood uptake of Aβ as a primary mechanism underlying cerebrovascular amyloid formation. APP23 mice on an App-null background developed a similar degree of both plaques and CAA, providing further evidence that a neuronal source of APP/Aβ is sufficient to induce cerebrovascular amyloid and associated neurodegeneration.