939 results for "methods of analysis"


Relevance:

100.00%

Publisher:

Abstract:

A Bayesian approach to estimating the intraclass correlation coefficient was used for this research project. The background of the intraclass correlation coefficient, a summary of its standard estimators, and a review of basic Bayesian terminology and methodology were presented. The conditional posterior density of the intraclass correlation coefficient was then derived, and estimation procedures related to this derivation were shown in detail. Three examples of applications of the conditional posterior density to specific data sets were also included. Two sets of simulation experiments were performed to compare the mean and mode of the conditional posterior density of the intraclass correlation coefficient with more traditional estimators. The non-Bayesian methods of estimation used were analysis of variance and maximum likelihood for balanced data, and MIVQUE (Minimum Variance Quadratic Unbiased Estimation) and maximum likelihood for unbalanced data. The overall conclusion of this research project was that Bayesian estimates of the intraclass correlation coefficient can be appropriate, useful and practical alternatives to traditional methods of estimation.
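As a point of reference for the traditional estimators mentioned above, the classical ANOVA estimator of the intraclass correlation for a balanced one-way random-effects design is icc = (MSB - MSW) / (MSB + (k - 1) * MSW), with k observations per group. A minimal sketch (not the project's code; the data are invented):

```python
def icc_anova(groups):
    """ANOVA estimator of the intraclass correlation (balanced one-way design)."""
    k = len(groups[0])                       # observations per group
    n = len(groups)                          # number of groups
    grand = sum(sum(g) for g in groups) / (n * k)
    means = [sum(g) / k for g in groups]
    # between-group and within-group mean squares
    msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    msw = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical data: three groups of three measurements each.
groups = [[10.1, 9.8, 10.3], [12.0, 11.7, 12.2], [8.9, 9.2, 9.0]]
print(round(icc_anova(groups), 3))
```

A Bayesian estimate would instead summarize (by mean or mode) the conditional posterior density of the coefficient, which this estimator does not involve.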

Abstract:

With the aim of determining the core set of publications to be considered in the collection development of the IAR Library, a bibliometric study was conducted of the production and consumption of scientific literature by the researchers of the institution to which the library belongs. From the analysis of the references in the papers published by the researchers, the obsolescence and usefulness of the literature consulted are determined. By extracting keywords and authors, the institute's research fronts and the groups of researchers working on them are also identified, applying the methods of co-word analysis, co-authorship analysis and social network analysis. The results show low obsolescence of the literature consulted and a marked preference for consulting and publishing in two or three journal titles of the discipline, and finally demonstrate the existence of two research fronts within the institution.
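The co-word analysis mentioned above starts from a keyword co-occurrence matrix built over the institution's papers. An illustrative sketch only (not the study's code; the papers and keywords are invented):

```python
from collections import Counter
from itertools import combinations

# Each paper is represented by its set of author keywords (hypothetical data).
papers = [
    {"masers", "star formation", "radio astronomy"},
    {"pulsars", "radio astronomy"},
    {"masers", "radio astronomy"},
]

cooc = Counter()
for kws in papers:
    # count each unordered keyword pair once per paper
    for a, b in combinations(sorted(kws), 2):
        cooc[(a, b)] += 1

print(cooc[("masers", "radio astronomy")])
```

High-count pairs indicate candidate research fronts; the same counting scheme applied to author names yields the co-authorship network.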

Abstract:

A technique of zooplankton net sampling at night in the Kandalaksha and Dvinskii Bays and during full tide in the Onezhskii Bay of the White Sea allowed us to obtain "clean" samples without considerable admixtures of terrigenous particulates. The absence of the indicator elements of terrigenous particulates (Al, Ti, and Zr) in the EDX spectra allows us to conclude that the ash composition of the tested samples is defined by the constitutional elements of the organic matter and integument (chitin, shells) of the plankton organisms. A quantitative assessment of the accumulation of ca. 40 chemical elements by zooplankton, based on a complex of modern physical methods of analysis, is presented. Values of the coefficient of biological accumulation of the elements (Kb), calculated for organic matter, and the enrichment factors (EF), relative to Clarke concentrations in shale, are in general determined by the mobility of the chemical elements in aqueous solution, which is confirmed by the calculated chemical speciation of the elements in the inorganic subsystem of the surface waters of Onezhskii Bay.
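The enrichment factor referred to above is conventionally computed by double normalisation to a conservative reference element: EF = (X/ref)_sample / (X/ref)_shale. A hedged sketch with invented concentrations (real Clarke values for shale should be taken from a geochemical reference table):

```python
# Assumed shale (Clarke) concentrations in ppm; illustrative only.
SHALE = {"Al": 80000.0, "Zn": 95.0, "Cd": 0.3}

def enrichment_factor(sample, element, ref="Al"):
    """EF of `element` relative to shale, normalised to a reference element."""
    return (sample[element] / sample[ref]) / (SHALE[element] / SHALE[ref])

# Hypothetical zooplankton ash composition, ppm.
sample = {"Al": 100.0, "Zn": 40.0, "Cd": 0.5}
print(round(enrichment_factor(sample, "Cd"), 1))
```

EF values far above 1 flag elements accumulated by the organisms rather than inherited from terrigenous particulates.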

Abstract:

The large amount of data recorded daily in organizations' database systems has created the need to analyze it. Organizations, however, face the complexity of processing enormous volumes of data with traditional methods of analysis. Moreover, in a globalized and competitive environment, organizations constantly seek to improve their processes, for which they require tools that allow them to make better decisions. This means being better informed and knowing their digital history in order to describe their processes and to anticipate (predict) unforeseen events. These new data-analysis requirements have driven the growing development of data-mining projects. The data-mining process seeks to obtain, from a massive data set, models that describe the data or predict new instances in the set. It involves stages of data preparation and of partially or fully automated processing to identify models in the data, producing patterns, relationships or rules as output. This output should represent new knowledge for the organization, useful and understandable to end users, and capable of being integrated into its processes to support decision-making. The main difficulty, however, is precisely that identifying models is a complex task for the data analyst who intervenes throughout this process, and it often requires the experience not only of the data analyst but also of an expert in the problem domain. One way to support the analysis of data, models and patterns is through their visual representation, exploiting the human capacity for visual perception, which can detect patterns more easily.
Under this approach, visualization has been used in data mining mostly for descriptive analysis of the data (input) and for the presentation of the patterns (output), leaving the paradigm underused for the analysis of models. This document describes the development of the doctoral thesis entitled "Nuevos Esquemas de Visualizaciones para Mejorar la Comprensibilidad de Modelos de Data Mining" ("New Visualization Schemes to Improve the Understandability of Data-Mining Models"). The research aims to contribute a visualization approach that supports the understanding of data-mining models, proposing for this purpose the metaphor of visually augmented models.
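The modelling step described above, turning raw records into a comprehensible rule, can be illustrated with the simple OneR idea (not the thesis's method; the data are invented):

```python
from collections import Counter, defaultdict

# (feature value, class label) pairs: does a customer churn given contract type?
rows = [("monthly", "churn"), ("monthly", "churn"), ("monthly", "stay"),
        ("yearly", "stay"), ("yearly", "stay"), ("yearly", "stay")]

by_value = defaultdict(Counter)
for value, label in rows:
    by_value[value][label] += 1

# One rule per feature value: predict that value's majority class.
rule = {v: c.most_common(1)[0][0] for v, c in by_value.items()}
print(rule)
```

The output is exactly the kind of model whose comprehensibility for end users the thesis addresses: a rule a domain expert can inspect directly, as opposed to an opaque numerical model.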

Abstract:

The increase in life expectancy in developed countries (over 80 years in 2013) is producing considerable growth in the incidence and prevalence of disabling diseases which, although they may appear at an early age, are more frequent in or around old age. Neurodegenerative diseases represent a major functional handicap, since some of them are associated with involuntary movements of certain parts of the body, above all the limbs. Everyday tasks such as eating, dressing, writing or interacting with a computer can become great challenges for the people who suffer from them. Early and accurate diagnosis is fundamental for prescribing the optimal therapy or treatment, bearing in mind that in many cases, unfortunately the majority, one can only act to mitigate the symptoms, not to cure them, at least for now. Even so, a correct early diagnosis gives the patient a better quality of life for much longer, so the effort is well worth it. Patients with Parkinson's disease and essential tremor account for a significant share of the clinical caseload of movement disorders that prevent a normal life, producing physical disability and a no less important social exclusion. The treatment pathways differ, which makes it critical to reach the correct diagnosis as early as possible. To date, medical professionals and experts have used qualitative scales to differentiate the pathology and its degree of severity; these scales are also used for clinical follow-up and to record the patient's history. This thesis proposes a series of methods for the analysis and identification/classification of the types of tremor associated with Parkinson's disease and essential tremor.
It employs artificial intelligence techniques based on intelligent classifiers, neural networks (MLP and LVQ) and support vector machines (SVM), building on the development and deployment of a system for the objective measurement and analysis of tremor: DIMETER. Besides being an effective aid to diagnosis, the system also provides the capabilities needed for rigorous and reliable monitoring of the evolution of each patient.
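The LVQ classifiers mentioned above ultimately reduce to nearest-prototype decisions. A hedged sketch of that decision rule (not the DIMETER system; the prototypes and the 2-D features, dominant tremor frequency in Hz and a normalised amplitude, are invented):

```python
import math

# Assumed class prototypes in (frequency_hz, amplitude) feature space.
prototypes = {
    "parkinsonian": (5.0, 0.8),   # hypothetical rest-tremor prototype
    "essential":    (8.0, 0.4),   # hypothetical action-tremor prototype
}

def classify(features):
    """Assign the label of the nearest prototype (Euclidean distance)."""
    return min(prototypes, key=lambda lbl: math.dist(features, prototypes[lbl]))

print(classify((5.3, 0.7)))
```

In actual LVQ the prototype positions are learned from labelled recordings rather than fixed by hand; only the classification step is shown here.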

Abstract:

If there were a common denominator among all the arts in what has come to be called postmodernity, it would have much to do with the end of the origin of the work. From literature and music to the plastic arts and architecture, the overcoming of modernity has been characterized by the replacement of the concept of creation with that of artistic intervention, in other words, the interpretation of what already exists. At the beginning of the twentieth century, the modern concepts of creation and origin implied unlearning and forgetting everything that came before in order to start from scratch; even in a material sense, Mies suggested the literal construction of matter and its movement according to laws. From the second half of the century, historicist approaches began to emerge in reaction to the amnesia and supposed originality of the moderns. In this context appeared the books Learning from Las Vegas (1972) and Delirious New York (1978), both indebted in many respects to Venturi's earlier book, Complexity and Contradiction in Architecture (1966). These two books on cities, moving decisively away from the historicist tendencies of the period, proposed using critical analysis of existing reality as a vehicle for theory and project simultaneously, indirectly becoming manifestos. If at first Venturi, Rossi and others proposed abolishing the formal limits established by modernity, as well as by any earlier canon, taking the totality of the built work as a reference system, as Eliot had done in literature, the books on Las Vegas and New York suggested directly erasing the limits of the discipline itself, going so far as to ask: what can be considered architecture?
However, precisely because of the total absence of limits and the immensity of the proposed referential system ("everything can be architecture", as Hans Hollein noted in 1968), the books at the same time propose that each author define a field of action individually. The writings on Las Vegas and New York thus imply, on the one hand, the elimination of disciplinary limits and, on the other, the delimitation of concrete fields of work for their authors: those of each of the cities interpreted. The first part of the thesis, Lessons, deals with the necessary process of learning and experimentation prior to the critical action itself. Contemporary architects need to accumulate material, knowledge, documentation and experiences before setting out to propose through criticism and editing; and, contrary to what happened with the moderns, the more abundant this prior baggage, the richer the interpretation. The cities of Rome, London and Berlin are therefore understood as experiences capable of providing Venturi, Scott Brown and Koolhaas, respectively, with their "personal dictionaries", endless imageries with which they would later confront the analyses of Las Vegas and New York. The second part, Critiques, focuses on the theoretical production itself: the two books on cities, analysed in close relation to Complexity and Contradiction. The analogical reasoning characteristic of these books has served as a methodological guide for the research, establishing relationships not directly between the writings themselves but through works belonging to other disciplines.
First, an important parallel is drawn between the methods of analysis developed in these books and those used by literary criticism, observing that if the New Criticism and New Journalism guided the writings of Venturi and Scott Brown, the nouvelle critique and its proposal of poetic identification were the clear reference for Koolhaas in addressing New York. Second, the relevance gained by the activity of artistic curating and the emergence of the figure of the curator, as an authority able to use the work of art over and above the intentions of its own author, serves, like the figure of the editor, as a reflection of the transformative act of appropriation carried out both in Learning from Las Vegas and in Delirious New York. Finally, throughout the research, the figures of Bergson and Baudelaire have served as theoretical support. Through the use that Venturi and Koolhaas respectively made of their ideas, the thesis seeks to show the proximity of the two approaches from an ideological point of view. The inclusion proposed by Venturi and the irony employed by Koolhaas, contradiction and paradox, are nothing but the reflection of logics that in both cases react simultaneously against idealism and materialism, against modernity and antimodernity, in a continuous attempt to be both at once.

Abstract:

Vein grafts are still the most commonly used graft material in cardiovascular surgery, and much effort has been spent in recent years on investigating the optimal harvesting technique. One related topic of similar importance has remained more or less incidental: the storage solutions used for vein grafts between procurement and implantation are, despite their assumed impact, a relatively neglected subject. There is no doubt that the endothelium plays a key role in the long-term patency of vein grafts, but the effects of the different storage solutions on the endothelium remain unclear. In a review of the literature, we found 20 papers that specifically addressed the question of which of the currently available preservation solutions are superior, harmless, damaging or ineffective. The focus lies on saline and autologous whole blood. Besides these two storage media, novel or alternative solutions have been investigated, with surprising findings. In addition, a few words are devoted to potential alternatives and novel solutions on the market. As no randomized clinical trial comparing saline with autologous whole blood is currently available, this review compares all previous studies and methods of analysis to provide a certain level of evidence on this topic. In summary, saline has negative effects on the endothelial layers and may therefore compromise graft patency. Related factors, such as distension pressure, may outweigh the initial benefit of autologous whole blood or storage solutions and intensify the harmful effects of warm saline. In addition, there is no uniform consensus on the superiority of autologous whole blood for vein graft storage. This may open the door to alternatives such as the University of Wisconsin solution or one of the specifically designed storage solutions such as TiProtec™ or Somaluthion™. Whether these preservation solutions are superior or advantageous remains the subject of further studies.

Abstract:

Vaccinium myrtillus, or bilberry, fruit is a commonly used herbal product. The usual method of determining the anthocyanin content is a single-wavelength spectrophotometric assay. Using this method, the anthocyanin levels of two extracts were found to be 25%, as claimed by the manufacturers. When high-performance liquid chromatography (HPLC) was used, however, one extract was found to contain 9% anthocyanins, probably derived not from V. myrtillus but from an adulterant. This adulterant was subsequently identified, using HPLC, mass spectrometry, and nuclear magnetic resonance, as amaranth, that is, 3-hydroxy-4-[(4-sulfo-1-naphthalenyl)azo]-2,7-naphthalenedisulfonic acid trisodium salt, a synthetic dark red sulfonic acid based naphthylazo dye. As described in this study, if deliberate adulteration occurs in an extract, a single-wavelength spectrophotometric assay is inadequate to accurately determine the levels of compounds such as anthocyanins. Detecting deliberate adulteration in commercial samples therefore requires alternative, more sophisticated methods of analysis, such as HPLC with photodiode-array detection as a minimum.
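The failure mode of the single-wavelength assay follows from the additivity of absorbance (Beer-Lambert): any co-absorbing dye is attributed to anthocyanins. A back-of-the-envelope sketch with invented specific-absorbance values (not the study's figures):

```python
# Assumed specific absorbance (A 1%, 1 cm) of anthocyanins at the assay wavelength.
ANTHO_A1PCT = 700.0

def apparent_anthocyanin_pct(true_antho_pct, dye_pct, dye_a1pct):
    """Apparent anthocyanin % when a dye also absorbs at the assay wavelength."""
    # Total absorbance is additive over the absorbing species.
    absorbance = (true_antho_pct * ANTHO_A1PCT + dye_pct * dye_a1pct) / 100.0
    # The single-wavelength assay attributes everything to anthocyanins.
    return 100.0 * absorbance / ANTHO_A1PCT

# 9% real anthocyanins plus 8% of a dye twice as absorptive reads as 25%.
print(apparent_anthocyanin_pct(9.0, 8.0, 1400.0))
```

A multi-wavelength or chromatographic method separates the two contributions, which is why HPLC exposed the discrepancy.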

Abstract:

We would like to reply to the comments made by Paparazzo on our recent paper [1] on the “effect of curve fitting parameters on quantitative analysis of Fe0.94O and Fe2O3 using XPS”. There have been many studies characterising the properties of iron oxide surfaces. The main purpose of the paper was to demonstrate the extent to which the selection of input parameters for curve fitting can affect the results of quantitative analysis, and to use this analysis to develop more consistent, reproducible and quantitative methods for analysing these data.

Abstract:

This thesis presents an investigation of synchronisation and causality, motivated by problems in computational neuroscience. It addresses both theoretical and practical signal-processing issues regarding the estimation of interdependence from a set of multivariate data generated by a complex underlying dynamical system. The topic is driven by a series of problems in neuroscience, which form the principal background to this work. The underlying system is the human brain, and the data are generated by modern electromagnetic neuroimaging methods. The underlying functional mechanisms of the brain are described using the recent mathematical formalism of dynamical systems on complex networks. This is justified principally by the complex hierarchical and multiscale nature of the brain, and it offers new methods of analysis for modelling its emergent phenomena. A fundamental approach to studying neural activity is to investigate the connectivity pattern developed by the brain's complex network. Three types of connectivity are important: 1) anatomical connectivity, referring to the physical links forming the topology of the brain network; 2) effective connectivity, concerned with the way neural elements communicate with each other through the brain's anatomical structure, via phenomena of synchronisation and information transfer; 3) functional connectivity, an epistemic concept referring to the interdependence between data measured from the brain network. The main contribution of this thesis is to present, apply and discuss novel algorithms for functional connectivity, designed to extract different specific aspects of the interaction between the underlying generators of the data. Firstly, a univariate statistic is developed to allow indirect assessment of synchronisation in the local network from a single time series.
This approach is useful for inferring coupling within a local cortical area as observed by a single measurement electrode. Secondly, different existing methods of phase synchronisation are considered from the perspective of experimental data analysis and the inference of coupling from observed data. These methods address the estimation of medium- to long-range connectivity, and their differences are particularly relevant in the context of volume conduction, which is known to produce spurious detections of connectivity. Finally, an asymmetric temporal metric is introduced in order to detect the direction of the coupling between different regions of the brain. The method developed in this thesis is based on a machine-learning extension of the well-known concept of Granger causality. The discussion is developed alongside examples of synthetic and real experimental data. The synthetic data are simulations of complex dynamical systems intended to mimic the behaviour of simple cortical neural assemblies; they are helpful for testing the techniques developed in this thesis. The real datasets illustrate the problem of brain connectivity in important neurological disorders such as epilepsy and Parkinson's disease. The methods of functional connectivity in this thesis are applied to intracranial EEG recordings in order to extract features that characterize the underlying spatiotemporal dynamics before, during and after an epileptic seizure, and to predict seizure location and onset prior to conventional electrographic signs. The methodology is also applied to an MEG dataset containing healthy, Parkinson's and dementia subjects, with the aim of distinguishing pathological from physiological patterns of connectivity.
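The Granger-causality idea behind the asymmetric metric described above can be sketched in a toy form (this is not the thesis's machine-learning extension): x "Granger-causes" y if x's past improves the prediction of y beyond y's own past. We measure the relative drop in squared error; a real analysis would use a significance test, since an extra regressor never increases in-sample error. All data are simulated.

```python
import random

def _sse1(y, z):
    """SSE of y[t] ~ a*z[t-1] (closed-form least squares)."""
    a = sum(y[t] * z[t-1] for t in range(1, len(y))) / sum(z[t-1] ** 2 for t in range(1, len(y)))
    return sum((y[t] - a * z[t-1]) ** 2 for t in range(1, len(y)))

def _sse2(y, x):
    """SSE of y[t] ~ a*y[t-1] + b*x[t-1] via the 2x2 normal equations."""
    T = len(y)
    yy = sum(y[t-1] ** 2 for t in range(1, T))
    xx = sum(x[t-1] ** 2 for t in range(1, T))
    xy = sum(y[t-1] * x[t-1] for t in range(1, T))
    cy = sum(y[t] * y[t-1] for t in range(1, T))
    cx = sum(y[t] * x[t-1] for t in range(1, T))
    det = yy * xx - xy * xy
    a, b = (cy * xx - cx * xy) / det, (cx * yy - cy * xy) / det
    return sum((y[t] - a * y[t-1] - b * x[t-1]) ** 2 for t in range(1, T))

def gc_strength(y, x):
    """Relative error reduction from adding x's past to y's own past."""
    return 1.0 - _sse2(y, x) / _sse1(y, y)

random.seed(0)
x = [random.gauss(0, 1) for _ in range(500)]
y = [0.0] * 500
for t in range(1, 500):                  # y is driven by x with a one-step lag
    y[t] = 0.5 * y[t-1] + 0.8 * x[t-1] + 0.1 * random.gauss(0, 1)

print(gc_strength(y, x) > gc_strength(x, y))   # asymmetry: x -> y, not y -> x
```

The asymmetry of this score in the two directions is what makes it a directed, rather than symmetric, connectivity measure.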

Abstract:

Purpose. The prevalence of myopia is known to vary with age, ethnicity, level of education, and socioeconomic status, with a high prevalence reported in university students and in people from East Asian countries. This study determines the prevalence of ametropia in a mixed-ethnicity U.K. university student population and compares associated ocular biometric measures. Methods. Refractive error and related ocular component data were collected on 373 first-year U.K. undergraduate students (mean age = 19.55 ± 2.99 years, range = 17-30 years) at the start of the academic year at Aston University, Birmingham, and the University of Bradford, West Yorkshire. The ethnic composition of the students was as follows: white 38.9%, British Asian 58.2%, Chinese 2.1%, and black 0.8%. Noncycloplegic refractive error was measured with an infrared open-field autorefractor, the Shin-Nippon NVision-K 5001 (Shin Nippon, Ryusyo Industrial Co. Ltd, Osaka, Japan). Myopia was defined as a mean spherical equivalent (MSE) less than or equal to -0.50 D. Hyperopia was defined as an MSE greater than or equal to +0.50 D. Axial length, corneal curvature, and anterior chamber depth were measured using the Zeiss IOLMaster (Carl Zeiss, Jena, GmbH). Results. The analysis was carried out only for the white and British Asian groups. The overall distribution of refractive error exhibited leptokurtosis, and prevalence levels were similar for white and British Asian (the predominant ethnic group) students across each ametropic group: myopia (50% vs. 53.4%), hyperopia (18.8% vs. 17.3%), and emmetropia (31.2% vs. 29.3%). There were no significant differences in the distribution of ametropia and biometric components between the white and British Asian samples. Conclusion. The absence of a significant difference in refractive error and ocular components between white and British Asian students exposed to the same educational system is of interest.
However, it is clear that a further study incorporating formal epidemiologic methods of analysis is required to adequately address the recent proposal that juvenile myopia develops principally from myopiagenic environments and is relatively independent of ethnicity.
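The classification rule quoted in the abstract (myopia: MSE ≤ -0.50 D; hyperopia: MSE ≥ +0.50 D; emmetropia otherwise) can be written down directly. The short sketch below also includes the standard optometric definition of mean spherical equivalent, MSE = sphere + cylinder/2; the function names are illustrative, not from the study.

```python
def mean_spherical_equivalent(sphere, cylinder):
    """Standard definition: MSE = sphere + cylinder / 2 (dioptres)."""
    return sphere + cylinder / 2.0

def classify_refraction(mse_dioptres):
    """Apply the study's cut-offs to an MSE value in dioptres:
    myopia for MSE <= -0.50 D, hyperopia for MSE >= +0.50 D,
    emmetropia otherwise."""
    if mse_dioptres <= -0.50:
        return "myopia"
    if mse_dioptres >= 0.50:
        return "hyperopia"
    return "emmetropia"
```

For example, a refraction of -1.00 DS / -0.50 DC gives MSE = -1.25 D and is classified as myopia under these cut-offs.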

Relevância:

100.00% 100.00%

Publicador:

Resumo:

The wear rates of sliding surfaces are significantly reduced if mild oxidational wear can be encouraged. It is hence of prime importance, in the interest of component life and material conservation, to understand the factors necessary to promote mild oxidational wear. The present work investigates the fundamental mechanism of the running-in wear of BS EN 31/EN 8 steel couples under various conditions of load, speed and test duration. Unidirectional sliding experiments were carried out on a pin-on-disc wear machine, where frictional force, wear rate, temperature and contact resistance were continuously monitored during each test. Physical methods of analysis (X-ray, scanning electron microscopy, etc.) were used to examine the wear debris and worn samples. The wear rate versus load curves revealed mild wear transitions which, under long durations of running, categorised mild wear into four distinct regions. α-Fe₂O₃, Fe₃O₄, FeO and an oxide mixture were the predominant oxides in the four regions of oxidational wear identified above the Welsh T2 transition. The wear curves were strongly affected by the speed and test duration. A surface model was used to calculate the surface parameters, and the results were found to be comparable with the experimentally observed parameters. Oxidation was responsible for the transition from severe to mild wear at a load corresponding to the Welsh T2 transition. In the running-in period, sufficient energy input and surface hardness enabled the oxide growth rate to increase and eventually exceed the rate of removal, whereupon mild wear ensued. A model was developed to predict the wear volume up to the transition; remarkable agreement was found between the theoretical prediction and the experimentally measured values.
The oxidational mechanism responsible for the transition to mild wear under equilibrium conditions was related to the formation of thick homogeneous oxide plateaux on subsurface hardened layers. FeO was the oxide formed initially at the onset of mild wear, but the oxide type changed during the total running period to give an equilibrium oxide whose nature depended on the loads applied.

Relevância:

100.00% 100.00%

Publicador:

Resumo:

Biochemical changes brought about by the influence of the contact lens on the tear film are conveniently split into two categories. Firstly, the lens can remove or reduce the levels of specific components in the tear film, and secondly, the lens can augment the tear film, by stimulating the influx of new components or increasing the level of existing components. The most obvious tear film components for study in this context are lipids, proteins, mucins and electrolytes. The interactions are affected by the properties of the lens, the characteristics of the individual wearer and the wear schedule. An additional complicating factor is the fact that the lens is many times thicker than the tear film and any immobilised tear components will be more extensively exposed to oxygen and UV radiation than is the case in the absence of a lens. It is arguably the lipoidal components that are most markedly affected by lens wear, since their immobilisation on the lens surface markedly increases their susceptibility to autoxidative degradation. The limited information that is available highlights the importance of subject specificity and suggests that lipid oxidation phenomena are potentially important in contributing to the 'end of day' discomfort of symptomatic contact lens patients. It is clear that tear lipids, although regarded as relatively inert for many years, are now seen as a reactive and potentially important family of compounds in the search for understanding of contact lens-induced discomfort. The influence of the lens on tear proteins shows the greatest range of complexity. Deposition and denaturation can stimulate immune response, lower molecular weight proteins can be extensively absorbed into the lens matrix and the lens can stimulate cascade or upregulation processes leading either to the generation of additional proteins and peptides or an increase in concentration of existing components. 
Added to this is the stimulating influence of the lens on vascular leakage, leading to the influx of plasma proteins such as albumin. The evidence from studies of mucin expression in tears is neither consistent nor conclusive. This is in part because sample sources, lens materials and methods of analysis vary considerably, and in some cases the study population numbers are low. Expression levels show mucin and material specificity, but clear patterns of behaviour are elusive. The electrolyte composition of tears is significantly different from that of other body fluids. Sodium and potassium dominate, but potassium ion concentrations in tears are much higher than those in serum. Calcium and magnesium concentrations in tears are lower than in serum but closer to those of interstitial fluids. The contact lens provides the potential for increased osmolarity through enhanced evaporation and differential electrolyte concentrations between the anterior and posterior tear films. Since the changes in ocular biochemistry consequent upon contact lens wear are known to be subject-dependent - as indeed is wearer response to the lens - pre-characterisation of individual participant tear chemistry in clinical studies would enhance understanding of these complex effects. © 2013 Elsevier Ltd.