972 results for Maximum entropy statistical estimate
Abstract:
In recent decades, increased interest has been evident in research on multi-scale hierarchical modelling, both in the field of mechanics and in the field of wood products and timber engineering. One of the main motivations for hierarchical modelling is to understand how properties, composition and structure at lower scale levels may influence, and be used to predict, the material properties at the macroscopic and structural engineering scale. This chapter presents the applicability of statistical and probabilistic methods, such as the Maximum Likelihood method and Bayesian methods, to the representation of timber's mechanical properties and to their inference accounting for prior information obtained at different scales of importance. These methods allow distinct timber reference properties, such as density, bending stiffness and strength, to be analysed, and hierarchically consider information obtained through different non-destructive, semi-destructive or destructive tests. The basis and fundamentals of the methods are described, and recommendations and limitations are also discussed. The methods may be used in several contexts, but they require expert knowledge to assess the correct statistical fit and to define the correlation structure between properties.
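As a rough illustration of the Bayesian ingredient described above, the sketch below combines a prior on a timber reference property (here a mean bending strength, as might be suggested by non-destructive indicating tests) with a handful of destructive test results using a normal-normal conjugate update. The model choice, the Python implementation and every number are assumptions made for illustration; they are not taken from the chapter.

```python
import numpy as np

# Illustrative normal-normal conjugate update: a prior on the mean bending
# strength (e.g., informed by non-destructive indicating properties) is
# combined with a few destructive bending tests. All numbers are assumed.
prior_mean, prior_sd = 40.0, 6.0            # prior for mean strength (MPa)
test_sd = 5.0                               # assumed known test scatter (MPa)
tests = np.array([35.2, 41.8, 38.5, 44.1])  # destructive test results (MPa)

n = len(tests)
prior_prec = 1.0 / prior_sd**2              # precision of the prior
data_prec = n / test_sd**2                  # precision contributed by the data
post_var = 1.0 / (prior_prec + data_prec)
post_mean = post_var * (prior_prec * prior_mean + data_prec * tests.mean())

print(f"posterior mean = {post_mean:.1f} MPa, sd = {post_var**0.5:.1f} MPa")
```

The same update structure can be applied hierarchically when the prior itself is built from information obtained at a lower material scale.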
Abstract:
OBJECTIVE: To assess, in myocardium specimens obtained from necropsies, the correlation between the concentration of hydroxyproline, measured with the photocolorimetric method, and the intensity of fibrosis, determined with the morphometric method. METHODS: Left ventricle myocardium samples were obtained from 45 patients who had undergone necropsy, some of them with a variety of cardiopathies and others without any heart disease. The concentrations of hydroxyproline were determined with the photocolorimetric method. In the histologic sections from each heart, the myocardial fibrosis was quantified using a light microscope with an integrating ocular lens. RESULTS: Medians of 4.5 and 4.3 µg of hydroxyproline/mg of dry weight were found in fixed and nonfixed left ventricle myocardium fragments, respectively. A positive correlation occurred between the hydroxyproline concentrations and the intensity of fibrosis, both in the fixed (Sr=+0.25; p=0.099) and in the nonfixed (Sr=+0.32; p=0.03) specimens. CONCLUSION: The biochemical methodology was proven to be adequate, and manual morphometry was shown to have limitations that may interfere with the statistical significance of correlations for the estimate of fibrosis intensity in the human myocardium.
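For readers unfamiliar with the rank correlation coefficient (Sr) reported above, the snippet below shows how such a coefficient is computed with SciPy; the hydroxyproline and fibrosis values are placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

# Placeholder values: hydroxyproline (µg/mg dry weight) and fibrosis (%)
# in a handful of specimens; the study's actual measurements are not shown.
hyp = np.array([3.9, 4.2, 4.5, 5.1, 4.8, 6.0])
fibrosis = np.array([2.1, 3.0, 2.8, 4.5, 3.9, 6.2])

rho, p_value = stats.spearmanr(hyp, fibrosis)  # Spearman rank correlation
print(f"Sr = {rho:+.2f}, p = {p_value:.3f}")
```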
Abstract:
New biotechnologies, such as DNA molecular markers, make it possible to characterize the plant genome. Using the genomic information produced for hundreds or thousands of chromosomal positions allows superior genotypes to be identified in less time than traditional phenotypic selection requires. Most traits of agronomic and economic importance in cultivated plant species are controlled by polygenes that produce phenotypes with continuous variation, highly affected by the environment. Their inheritance is complex, since it results from interactions among genes, on the same or different chromosomes, and from genotype-by-environment interaction, which complicates selection. These biotechnologies generate databases with large amounts of information and complex correlation structures that require specific biometric methods and models for their processing. Statistical models aimed at explaining the phenotype from massive genomic information require the estimation of a large number of parameters, and no methods within parametric statistics can address this problem efficiently. In addition, the models must accommodate non-additivities (interactions) among gene effects and between these and the environment, which are also difficult to handle from a parametric standpoint. We hypothesize that the association between phenotypic traits and molecular genotypes characterized by abundant genomic information could be analysed efficiently in the context of semiparametric mixed models and/or non-parametric methods based on machine-learning techniques. The objective of this project is to develop new data-analysis methods that allow the efficient use of massive genomic information in genetic evaluations of agro-biotechnological interest. The specific objectives include comparing parametric analytical strategies with semiparametric and non-parametric ones with respect to their statistical and computational properties. Regression approaches to quantitative trait locus analysis will be studied under different strategies and scenarios (real and simulated) with different volumes of molecular-marker data. In the parametric area, special emphasis will be placed on mixed models, while in the non-parametric area neural networks, support vector machines, multivariate filters, LOESS-type smoothers and recently introduced kernel-based methods will be evaluated. The semiparametric proposal will be based on a two-stage analysis strategy aimed at: 1) reducing the dimensionality of the genomic data and 2) modelling the phenotype by introducing only the most significant molecular signals. This work is expected to provide researchers in our community with new analysis tools and procedures that maximize the efficiency of the resources assigned to massive genomic data capture and their application in agro-biotechnological developments.
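A minimal sketch of the two-stage semiparametric idea described above (reduce the dimensionality of the marker data, then model the phenotype with only the retained signals) is given below, using scikit-learn PCA followed by a linear regression on simulated markers. The simulated data, the choice of PCA and the number of components are illustrative assumptions, not the methods the project will actually develop.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Simulated data: 200 genotypes scored at 5,000 biallelic markers (0/1/2)
# and a continuous phenotype; purely illustrative of the two-stage idea.
X = rng.integers(0, 3, size=(200, 5000)).astype(float)
y = X[:, :10] @ rng.normal(size=10) + rng.normal(scale=2.0, size=200)

# Stage 1: reduce marker dimensionality to a small number of components.
pca = PCA(n_components=20)
scores = pca.fit_transform(X)

# Stage 2: model the phenotype using only the retained molecular signals.
model = LinearRegression().fit(scores, y)
print("In-sample R^2:", round(model.score(scores, y), 3))
```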
Abstract:
The objective of this project, framed within the area of bioengineering/biotechnology analysis methodologies applied to the study of cancer, is the analysis and characterization, through statistical models with mixed effects and machine-learning techniques, of the protein and gene expression profiles of the metabolic pathways associated with tumour progression. The study will be carried out using high-throughput technologies, which make it possible to evaluate thousands of genes/proteins simultaneously, thereby generating a large amount of expression data. We hypothesize that the analysis and interpretation of the underlying information, characterized by its abundance and complexity, could be carried out with efficient statistical-computational techniques in the context of mixed models and machine-learning techniques. For the analysis to be effective, it is necessary to account for the effects of the experimental factors unrelated to the biological phenomenon under study; these effects can mask the underlying information and cause relevant information on tumour progression to be lost. Identifying these effects will make it possible to obtain, efficiently, the molecular expression profiles on which diagnostic methods could be based. This work is expected to provide researchers in our community with analysis tools and procedures that maximize the efficiency of the resources assigned to massive genomic/proteomic data capture, allowing relevant biological information to be extracted for cancer analysis, classification or prediction, for the design of specific treatments and therapies, and for the improvement of detection methods, as well as contributing, through intensive computational analysis, to the understanding of tumour progression.
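As a rough sketch of how nuisance experimental effects could be separated from the biological signal in the mixed-model framework mentioned above, the snippet below fits a linear mixed model with a fixed effect for tumour-progression stage and a random intercept per experimental batch, using statsmodels. The data, variable names and the single-gene setting are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Simulated expression of one gene in 60 samples: an effect of the stage of
# tumour progression plus a nuisance batch effect (hypothetical data).
df = pd.DataFrame({
    "batch": np.repeat(["b1", "b2", "b3"], 20),
    "stage": np.tile(["early", "late"], 30),
})
df["expr"] = (1.0 * (df["stage"] == "late")
              + df["batch"].map({"b1": 0.0, "b2": 0.8, "b3": -0.6})
              + rng.normal(scale=0.5, size=60))

# Mixed model: fixed effect for stage, random intercept for batch.
fit = smf.mixedlm("expr ~ stage", data=df, groups=df["batch"]).fit()
print(fit.summary())
```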
Abstract:
The purpose of the project is to characterize the variability of atmospheric paleocirculation at the mid-latitudes of South America, its effect on regional hydroclimatic fluctuation, and human vulnerability to the changes that have occurred since the Last Glacial Maximum/Holocene. The inter- and multidisciplinary approach proposed here to analyse past hydroclimatic variability, its causes and its consequences, is unprecedented for this region of the country. It includes: a) analysis of sedimentary climate archives with a multi-proxy approach (sedimentology, geochemistry, stable and radiogenic isotopes, mineralogy, ostracods and molluscs); b) determination of the present and past dynamics of atmospheric dust (AD) by combining in situ measurements and sedimentary records; and c) analysis of human bone and malacological remains at archaeological sites. The project will: a) perform multi-proxy analyses of natural climate records stored in lake systems of the Pampean region (S. Ambargasta, Mar Chiquita, Pocho, Melincué, Lagunas Encadenadas del Oeste de Buenos Aires) and in loess sequences to infer the variability of atmospheric circulation since the LGM; b) extend the temporal resolution of the climate reconstructions for selected time windows; c) analyse the geochemical signal of the sedimentary record of contrasting climate phases; d) identify the temporal variability of provenance and of the processes involved through mineralogical and geochemical analyses; e) analyse the present-day environment to calibrate environmental indicators or proxies (isotopes, sediment flux, geochemistry, molluscs and ostracods) against the contemporary climate scenario; f) jointly analyse the climate archives to infer regional atmospheric paleocirculation patterns; and g) elucidate the adaptive strategies and the biological history of human populations in central Argentina during diverse climatic phases. This project addresses one of the least known aspects of paleoenvironmental reconstructions, the role of aeolian material derived from the Southern Hemisphere and its impact on the regional carbon cycle. Although southern South America is one of the key areas for understanding this aspect, neither the effect of environmental changes on the AD flux nor the effect of future climate and/or land-use changes is fully known. The proposed work has direct implications for multiple disciplines, including the atmospheric sciences, geochemistry, sedimentology, paleoclimatology and bioarchaeology. Our results will improve the understanding of regional climate change, of dust dynamics and their role as a forcing of the climate system, of present and past hydrological variability, and of the response of human populations. Deepening the study of paleoclimatic and bioarchaeological changes in the region will make it possible to analyse hydroclimatic variability and determine its relationship with situations of crisis and vulnerability of human settlement. Likewise, inferring changes for periods with minimal or no human influence is a key tool for improving knowledge of the climatic fluctuations of the extratropical South American area.
These results will make it possible to analyse not only the mechanisms operating in the past climate system but also the factors that would explain the major hydroclimatic change recorded since 1970, whose effects have clearly impacted socio-economic activities in central Argentina.
Abstract:
Over the last few decades, the development and use of Geographic Information Systems (GIS) and satellite positioning systems (GPS) have been promoted to improve the productive efficiency of extensive cropping systems in agronomic, economic and environmental terms. These technologies allow the spatial variability of site properties, such as apparent electrical conductivity and other terrain attributes, to be measured, along with its effect on the spatial distribution of yields. Site-specific management can then be applied within fields to improve the efficiency of agrochemical use, the protection of the environment, and the sustainability of rural life. A wide range of precision-agriculture technologies is currently available to capture spatial variation across sites within a field; however, the optimal use of the large volumes of data generated by precision-agriculture machinery depends strongly on the capability to explore the information on the complex interactions that underlie productive outcomes. The spatial covariation of site properties and crop yields has traditionally been examined graphically or with classical geostatistical models based on the theory of regionalized variables. New developments in contemporary statistical modelling, notably linear mixed models, are promising tools for handling spatially correlated data such as those produced by precision agriculture. Moreover, given the multivariate nature of the many variables recorded at each site, multivariate analysis techniques could provide valuable information for the visualization and exploitation of georeferenced data. Understanding the agronomic basis of the complex interactions that occur at the scale of production fields is now within reach with these new techniques. The objectives of this project are: (1) to develop methodological strategies, based on the complementarity of multivariate and geostatistical techniques, for classifying within-field sites in grain crops and studying the interdependencies between site and yield variables; and (2) to propose mixed linear models, based on spatial correlation functions for the error terms, to explore spatial correlation patterns of within-field yields and soil properties in the delimited sites, predict yield from spatial soil variability, and build contour maps that promote a more sustainable agriculture.
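One ingredient of the proposed mixed models, a spatial correlation function for the error terms, is sketched below: an exponential correlation is built from the distances between georeferenced sites and used in a generalized-least-squares estimate of the field mean yield. The coordinates, yields and range parameter are invented for illustration; they are not project data.

```python
import numpy as np
from scipy.spatial.distance import cdist

# Hypothetical within-field sites (x, y in metres) with observed yield.
coords = np.array([[0, 0], [30, 0], [0, 30], [30, 30], [60, 30]], dtype=float)
yield_t_ha = np.array([7.1, 7.4, 6.8, 7.9, 8.2])

# Exponential spatial correlation of the error terms: corr = exp(-d / range).
range_m = 50.0                       # assumed correlation range
D = cdist(coords, coords)            # pairwise distances between sites
R = np.exp(-D / range_m)             # spatial correlation matrix

# Generalized-least-squares estimate of the mean yield under this correlation.
ones = np.ones(len(yield_t_ha))
Rinv = np.linalg.inv(R)
gls_mean = (ones @ Rinv @ yield_t_ha) / (ones @ Rinv @ ones)
print(f"GLS field mean: {gls_mean:.2f} t/ha")
```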
Abstract:
The vulnerability to pollution and the hydrochemical variation of groundwater in the mid-west karstic lowlands of Ireland were investigated from October 1992 to September 1993, as part of an EU STRIDE project at Sligo Regional Technical College. Eleven springs were studied in the three local authority areas of Co. Galway, Co. Mayo, and Co. Roscommon. Nine of the springs drain locally or regionally important karstic aquifers and two drain locally important sand and gravel aquifers. The maximum average daily discharge of any of the springs was 16,000 m³/day. Determining the vulnerability of groundwater to pollution relies heavily on an examination of the subsoil deposits in an area, since they can act as a protecting or filtering layer over groundwater. Within aquifers and spring catchments, chemical reactions such as adsorption, solution-precipitation or acid-base reactions occur and modify the hydrochemistry of groundwater (Lloyd and Heathcote, 1985). The hydrochemical processes that predominate depend on the mineralogy of the aquifer, the hydrogeological environment, the overlying subsoils, and the history of groundwater movement. The aim of this MSc research thesis was to investigate the hydrochemical variation of spring outflow and to assess the relationship between these variations and the intrinsic vulnerability of the springs and their catchments. If such a relationship can be quantified, it is hoped that the hydrochemical variation of a spring may indicate the vulnerability of a spring catchment without the need to determine it by field mapping. Such a method would be invaluable to any of the three local authorities, since they would be able to prioritise the sources most at risk from pollution using simple techniques of chemical sampling and statistical analysis. For each spring a detailed geological, hydrogeological and hydrochemical study was carried out. Individual catchment areas were determined with a water balance/budget and groundwater tracing. The subsoil geology of each spring catchment was mapped at the 1:10,560 scale and digitised to the 1:25,000 scale with AutoCAD™ and ArcInfo™. The vulnerability of each spring was determined using the Geological Survey's vulnerability guidelines. Field measurements and laboratory-based chemistry analyses of the springs were undertaken by personnel from both the EPA Regional Laboratory in Castlebar, Co. Mayo, and the Environment Section of Roscommon Co. Council. Electrical conductivity and temperature (°C) were measured fortnightly in the field using a WTW microprocessor conductivity meter. A percentage (%) vulnerability was applied to each spring to indicate the areal extent of the four main vulnerability classes (Extreme, High, Moderate, and Low) occurring within each spring catchment. Hydrochemical variation for the springs was expressed as the coefficient of variation of electrical conductivity. The results of this study show that a clear relationship exists between the degree of vulnerability of each catchment area, as defined by the subsoil cover, and the coefficient of variation of EC, with the coefficient of variation increasing as the vulnerability increases. The coefficient of variation of electrical conductivity is considered to be a parameter that gives a good general reflection of the degree of vulnerability occurring in a spring catchment in Ireland's karstic lowlands.
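The variation statistic used in the study is the coefficient of variation of electrical conductivity; a minimal computation is sketched below with placeholder fortnightly readings for a single spring (the values are not from the monitored springs).

```python
import numpy as np

# Placeholder fortnightly electrical conductivity readings (µS/cm) for one spring.
ec = np.array([610, 585, 640, 720, 555, 690, 605, 570, 660, 700])

cv_percent = 100.0 * ec.std(ddof=1) / ec.mean()  # coefficient of variation of EC
print(f"CV of EC = {cv_percent:.1f}%")
```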
Abstract:
The main objective of this thesis was to produce a detailed report on flooding, with specific reference to the Clare River catchment. Past flooding in the Clare River catchment was assessed, focusing on the November 2009 flood event. A Geographic Information System was used to produce a graphical representation of the spatial distribution of the November 2009 flood. Flood risk is prominent within the Clare River catchment, especially in the region of Claregalway. The flooding events of November 2009 produced significant fluvial flooding from the Clare River, which resulted in considerable flood damage to property. There were also hidden costs, such as the economic impact of the closure of the N17 until floodwater subsided. Land use and channel conditions are traditional factors that have long been recognised for their effect on flooding processes. These factors were examined in the context of the Clare River catchment to determine whether they had any significant effect on flood flows. Climate change has become recognised as a factor that may produce more significant and frequent flood events in the future. Many experts expect climate change to result in an increase in the intensity and duration of rainfall in western Ireland. This would have significant implications for the Clare River catchment, which is already vulnerable to flooding. Flood estimation techniques are a key aspect of understanding and preparing for flood events. This study uses methods based on the statistical analysis of recorded data, and methods based on a design rainstorm and a rainfall-runoff model, to estimate flood flows. These provide a mathematical basis for evaluating the impacts of various factors on flooding and for generating practical design floods, which can be used in the design of flood relief measures. The final element of the thesis comprises the author's recommendations on how flood risk management techniques can reduce existing flood risk in the Clare River catchment. Future implications for flood risk arising from factors such as climate change and poor planning practices are also considered.
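As a sketch of the statistical flood estimation mentioned above, the snippet below fits a Gumbel (EV1) distribution to a series of annual maximum flows and reads off a 100-year design flood with SciPy. The annual maxima are hypothetical rather than Clare River gauge records, and the thesis does not necessarily use this particular distribution.

```python
import numpy as np
from scipy import stats

# Hypothetical annual maximum flows (m^3/s); not actual Clare River records.
amax = np.array([95, 120, 88, 140, 105, 160, 132, 99, 175, 150, 118, 210])

# Fit a Gumbel (EV1) distribution and estimate the 100-year design flood.
loc, scale = stats.gumbel_r.fit(amax)
q100 = stats.gumbel_r.ppf(1 - 1 / 100, loc=loc, scale=scale)
print(f"Estimated 100-year flood: {q100:.0f} m^3/s")
```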
Abstract:
Univariate statistical control charts, such as the Shewhart chart, do not satisfy the requirements for process monitoring on a high-volume automated fuel cell manufacturing line because of the number of variables that require monitoring. The risk of elevated false alarm rates, given the high-volume nature of the process, can present problems if univariate methods are used. Multivariate statistical methods are discussed as an alternative for process monitoring and control. The research presented was conducted on a manufacturing line which evaluates the performance of a fuel cell. It has three stages of production assembly that contribute to the final end-product performance. Product performance is assessed by power and energy measurements taken at various time points throughout the discharge testing of the fuel cell. The multivariate techniques identified in the literature review are evaluated using both individual and batch observations. Modern techniques using multivariate control charts based on Hotelling's T² are compared to other multivariate methods, such as Principal Components Analysis (PCA). The latter, PCA, was identified as the most suitable method. Control charts, such as score, T² and DModX charts, are constructed from the PCA model. Diagnostic procedures using contribution plots, applied to the out-of-control points detected by these control charts, are also discussed; these plots enable the investigator to perform root cause analysis. Multivariate batch techniques are compared to the individual observations typically seen on continuous processes. Recommendations for the introduction of multivariate techniques appropriate for most high-volume processes are also covered.
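A minimal sketch of the PCA-based monitoring scheme described above is given below: a PCA model is fitted to in-control reference measurements, Hotelling's T² is computed from the scores, and a new observation is compared against an F-distribution-based upper control limit. The data, the number of components and the choice of limit formula are illustrative assumptions, not the production line's actual model.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy import stats

rng = np.random.default_rng(7)

# Illustrative reference data: 100 in-control cells x 6 correlated
# power/energy measurements (not actual production data).
ref = rng.multivariate_normal(np.zeros(6), np.eye(6) + 0.5, size=100)

pca = PCA(n_components=2).fit(ref)
scores = pca.transform(ref)
t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)  # Hotelling's T^2

# Upper control limit for a future observation (common textbook form).
n, a = ref.shape[0], 2
ucl = a * (n - 1) * (n + 1) / (n * (n - a)) * stats.f.ppf(0.99, a, n - a)

# Score a new (shifted) observation against the reference model.
new_cell = rng.multivariate_normal(np.zeros(6) + 2.0, np.eye(6), size=1)
t2_new = np.sum(pca.transform(new_cell)**2 / pca.explained_variance_, axis=1)
print("UCL:", round(ucl, 2), " new point T^2:", round(float(t2_new[0]), 2))
```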
Abstract:
Results are presented from the analysis of observational data on flash floods in Georgia over a period of 45 years, from 1961 to 2005, provided by the Hydrometeorology Service of Georgia.
Abstract:
The statistical structure of visibility range in Tbilisi is studied for the period from 1980 to 2008. Data from the Hydrometeorological Department of Georgia on the number of days per year falling in different visibility-range categories are used, for observations at 9, 12 and 15 o'clock.
Abstract:
Magdeburg, Univ., Faculty of Mathematics, Diss., 2011
Abstract:
Background: Cardiovascular diseases (CVD) are the leading cause of death in Brazil. Objective: To estimate total CVD, cerebrovascular disease (CBVD), and ischemic heart disease (IHD) mortality rates in adults in the counties of the state of Rio de Janeiro (SRJ) from 1979 to 2010. Methods: The counties of the SRJ were analysed according to the denominations established by the geopolitical structure of 1950; each new county created since then, by splitting from an original county, was grouped with the county from which it originated. Population data were obtained from the Brazilian Institute of Geography and Statistics (IBGE), and data on deaths were obtained from DataSus/MS. Mean CVD, CBVD, and IHD mortality rates were estimated, compensated for deaths from ill-defined causes, and adjusted for age and sex using the direct method for three periods: 1979–1989, 1990–1999, and 2000–2010. These results were represented spatially in maps, and tables were constructed showing the mortality rates for each disease and period. Results: There was a significant reduction in mortality rates across the three disease groups over the three defined periods in all the county clusters analysed. Despite an initial variation in mortality rates among the counties, a homogenization of these rates was observed in the final period (2000–2010). The drop in CBVD mortality was greater than that in IHD mortality. Conclusion: Mortality due to CVD has steadily decreased in the SRJ over the last three decades. This reduction cannot be explained by greater access to high-technology procedures or by better control of cardiovascular risk factors, since these changes either did not occur or occurred in a low proportion of cases, with the exception of smoking, which has decreased significantly. It is therefore necessary to seek other explanations for this decrease, which may be related to improvements in the socioeconomic conditions of the population.
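A minimal sketch of the direct standardization step mentioned in the methods (age-specific rates weighted by a standard population) is shown below; the age bands, rates and weights are made-up figures, not the study's data.

```python
import numpy as np

# Hypothetical age-specific CVD mortality rates (per 100,000) for one county
# cluster and a standard population used as weights (illustrative figures).
age_rates = np.array([12.0, 45.0, 180.0, 620.0])       # rate per age band
std_pop = np.array([400_000, 300_000, 200_000, 100_000])

# Direct method: standardized rate = sum(rate_i * weight_i) / sum(weight_i).
adjusted = np.sum(age_rates * std_pop) / std_pop.sum()
print(f"Age-adjusted rate: {adjusted:.1f} per 100,000")
```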
Abstract:
Background: The kinetics of high-sensitivity troponin T (hscTnT) release should be studied in different situations, including functional tests with transient ischemic abnormalities. Objective: To evaluate the release of hscTnT by serial measurements after exercise testing (ET), and to correlate hscTnT elevations with abnormalities suggestive of ischemia. Methods: Patients with acute ST-segment elevation myocardial infarction (STEMI) undergoing primary angioplasty were referred for ET 3 months after infarction. Blood samples were collected to measure hscTnT immediately before (TnT0h) and 2 (TnT2h), 5 (TnT5h), and 8 hours (TnT8h) after ET. The outcomes were peak hscTnT, the TnT5h/TnT0h ratio, and the area under the blood concentration-time curve (AUC) for hscTnT levels. hscTnT values were log-transformed, and comparisons were assessed with geometric mean ratios, along with their 95% confidence intervals. Statistical significance was assessed by analysis of covariance, first unadjusted and then adjusted for TnT0h, age and sex, followed by additional variables (metabolic equivalents, maximum heart rate achieved, anterior wall STEMI, and creatinine clearance). Results: This study included 95 patients. The highest geometric means were observed at 5 hours (TnT5h). After adjustments, peak hscTnT, TnT5h/TnT0h and AUC were 59% (p = 0.002), 59% (p = 0.003) and 45% (p = 0.003) higher, respectively, in patients with an abnormal ET than in those with normal tests. Conclusion: Higher elevations of hscTnT may occur after an abnormal ET than after a normal ET in patients with STEMI.
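Two of the outcomes above, the trapezoidal area under the concentration-time curve and the geometric mean after log-transformation, are sketched below for a single hypothetical patient; the time points follow the sampling scheme in the abstract, but the hscTnT values are invented.

```python
import numpy as np

# Hypothetical serial hs-cTnT values (ng/L) at 0, 2, 5 and 8 hours after ET.
times_h = np.array([0.0, 2.0, 5.0, 8.0])
hstnt = np.array([9.0, 11.5, 16.0, 13.0])

# Trapezoidal area under the concentration-time curve.
auc = np.sum(np.diff(times_h) * (hstnt[:-1] + hstnt[1:]) / 2.0)
geo_mean = np.exp(np.mean(np.log(hstnt)))   # geometric mean after log-transform
ratio_5h_0h = hstnt[2] / hstnt[0]           # TnT5h / TnT0h ratio

print(f"AUC = {auc:.1f} ng*h/L, geometric mean = {geo_mean:.1f} ng/L, "
      f"TnT5h/TnT0h = {ratio_5h_0h:.2f}")
```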