856 results for Population set-based methods


Relevance:

100.00%

Publisher:

Abstract:

A new information-theoretic approach is presented for finding the pose of an object in an image. The technique does not require information about the surface properties of the object, besides its shape, and is robust with respect to variations of illumination. In our derivation, few assumptions are made about the nature of the imaging process. As a result the algorithms are quite general and can foreseeably be used in a wide variety of imaging situations. Experiments are presented that demonstrate the approach registering magnetic resonance (MR) images with computed tomography (CT) images, aligning a complex 3D object model to real scenes including clutter and occlusion, tracking a human head in a video sequence and aligning a view-based 2D object model to real images. The method is based on a formulation of the mutual information between the model and the image called EMMA. As applied here the technique is intensity-based, rather than feature-based. It works well in domains where edge or gradient-magnitude based methods have difficulty, yet it is more robust than traditional correlation. Additionally, it has an efficient implementation that is based on stochastic approximation. Finally, we will describe a number of additional real-world applications that can be solved efficiently and reliably using EMMA. EMMA can be used in machine learning to find maximally informative projections of high-dimensional data. EMMA can also be used to detect and correct corruption in magnetic resonance images (MRI).
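The core quantity maximised here, the mutual information between model and image intensities, can be sketched with a histogram estimate. This is an illustrative simplification only: the function names are invented, and the exhaustive search over integer shifts stands in for the stochastic approximation scheme the abstract refers to.

```python
# Sketch: intensity-based alignment by maximising mutual information (MI).
# Hypothetical names; brute-force shift search replaces stochastic approximation.
import numpy as np

def mutual_information(a, b, bins=32):
    """MI between two equally shaped intensity arrays, via a joint histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of a
    py = pxy.sum(axis=0, keepdims=True)   # marginal of b
    nz = pxy > 0                          # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def best_shift(fixed, moving, max_shift=3):
    """Search integer translations for the MI-maximising pose."""
    best, best_mi = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            mi = mutual_information(fixed, np.roll(moving, (dy, dx), axis=(0, 1)))
            if mi > best_mi:
                best, best_mi = (dy, dx), mi
    return best
```

Because MI compares intensity distributions rather than raw differences, the same routine applies unchanged across modalities (e.g. MR against CT), which is the property the abstract exploits.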


In this session we look at how to think systematically about a problem and create a solution. We look at the definition and characteristics of an algorithm, and see how, through modularisation and decomposition, we can then choose a set of methods to create. We also compare this somewhat procedural approach with the way that design works in Object Oriented Systems.
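The decomposition idea can be made concrete with a small sketch. The task and all function names are invented for illustration, not taken from the session: a problem ("report the average of the valid exam scores") is broken into single-purpose methods that the top-level algorithm composes.

```python
# Hypothetical decomposition example: each step is its own small method.
def read_scores(raw):
    """Parse step: turn raw strings into numbers, skipping blanks."""
    return [float(s) for s in raw if s.strip()]

def valid(scores, lo=0, hi=100):
    """Filter step: keep only scores inside the allowed range."""
    return [s for s in scores if lo <= s <= hi]

def average(scores):
    """Compute step: mean of the remaining scores."""
    return sum(scores) / len(scores) if scores else 0.0

def report(raw):
    """Top-level algorithm, composed from the modules above."""
    return average(valid(read_scores(raw)))
```

In an Object Oriented design the same responsibilities would typically become methods on a class owning the score data, rather than free functions threaded together.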


Objective: to carry out a comparative analysis between a control group and patients with traumatic brain injury (TBI), to determine whether neuropsychological differences exist six months after the event and thus to guide intervention programmes suited to the needs of this population. Materials and methods: a total of seventy-nine patients with a history of TBI of at least six months' evolution and seventy-nine control subjects were assessed. The control group averaged eleven years of schooling versus nine years for the TBI group; both groups had a mean age of thirty-four years and no neurological or psychiatric antecedents. The mean Glasgow score in the TBI group was in the moderate range, at eleven. The brief neuropsychological assessment in Spanish (Neuropsi) was administered to both groups. Results: the groups show significant differences (p ≤ 0.05) in tasks of orientation, attention, memory, language, reading and writing. Conclusions: TBI leaves significant neuropsychological sequelae even six months after the traumatic event. These findings suggest that patients with TBI require treatment after the initial stage has passed.


Through a market analysis, a review of the current worldwide situation of the product to be exported, and a study of both the Colombian and the Japanese agricultural sectors, and taking the main theories of international trade as a reference, a viable business plan is established that defines a correct and feasible way to carry out the export process and thereby position an exotic Colombian fruit abroad. From the analysis of the Colombian and Japanese agricultural sectors, and considering the current worldwide situation of the yellow pitahaya, it was possible to identify the behaviour of supply and demand for this fruit and the frequency with which it is exported, in order to establish the needs of the Japanese market and exploit the comparative advantage Colombia holds over Japan in supplying it. Japan was chosen as the export destination because of the clear scarcity there of exotic fruits such as the yellow pitahaya, given the country's geographic conditions, and because it is one of the most sought-after products in Japanese society. In addition, the geographic location of both countries and their international rules and regulations make the present study feasible and viable.


Objective: to determine the prevalence of low back pain and its relationship with biomechanical risk factors among the nursing staff of a fourth-level health institution in Bogotá between 2014 and 2015. The study was motivated by the high impact low back pain has on the quality of life of nursing staff. Materials and methods: a cross-sectional study with analytical exploration. The study population comprised 866 nursing workers, from whom a sample of 265 individuals was drawn; they completed an online questionnaire covering individual and occupational characteristics and biomechanical risk. The instrument was built from items selected from the Ergopar and Nordic questionnaires. The sample was random and stratified, with proportional allocation by care service and work shift. Results: the prevalence of low back pain in the study population was 61.1% (n = 162), with a 95% confidence interval of 55.1-67.2. The biomechanical risk factors associated with low back pain were postures involving twisting and/or bending the back or trunk (p = 0.05) and time spent moving patients (p = 0.01). The occupational factors related to low back pain were type of contract (p = 0.002), the physical demands of the job (p = 0.001) and inability to perform the job because of low back pain (p = 0.000). Neither care service nor work shift showed a significant association. Conclusions: the prevalence of low back pain among nursing staff is high and is consistent with national and international studies. Exposure to the biomechanical risk factors related to low back pain affects nursing staff's quality of life. Furthermore, no significant differences were found according to the duties of the posts held.


Electric power quality covers both the quality of supply and the quality of customer service. Quality of supply, in turn, comprises two parts: waveform quality and continuity. This thesis addresses continuity of supply through fault location. The problem is relatively well solved in transmission systems, where the homogeneous characteristics of the line, measurements at both terminals and the availability of diverse equipment allow the fault site to be located with relatively high precision. In distribution systems, however, fault location is a complex and still unsolved problem. The complexity is mainly due to non-homogeneous conductors, intermediate loads, lateral branches, and imbalances in the system and the load. Moreover, such systems normally have measurements only at the substation, and only a simplified model of the circuit. The main efforts in fault location have been directed at methods that use the fundamental component of voltage and current at the substation to estimate the reactance up to the fault. Since the reactance yields the distance to the fault site through the circuit model, such a method is termed a Model-Based Method (MBM). Its drawbacks, however, include the need for a good model of the system and the possibility of identifying several sites where the fault may have occurred, that is, multiple estimation of the fault site. As a contribution, this thesis presents a comparative analysis and test of several frequently cited MBMs.
In addition, the solution is complemented with methods that use other kinds of information, such as historical fault databases of voltage and current records measured at the substation (not limited to the fundamental component). Two classification techniques (LAMDA and SVM) are used and tested as tools for extracting information from these records. They relate features obtained from the signal to the zone under fault and are referred to in this document as Knowledge-Based Classification Methods (MCBC). The information used by the MCBCs is obtained from the voltage and current records measured at the distribution substation before, during and after the fault. The records are processed to obtain the following descriptors: a) the magnitude of the voltage variation (dV), b) the variation in current magnitude (dI), c) the power variation (dS), d) the fault reactance (Xf), e) the frequency of the transient (f), and f) the largest eigenvalue of the current correlation matrix (Sv), each selected because it aids fault location. From these descriptors, different training and validation sets for the MCBCs are proposed, and a methodology that reveals relationships between these sets and the faulted zones is used to select the best-performing ones. The application results show that combining MCBCs with MBMs can reduce the multiple-estimation problem: the MCBC determines the fault zone, while the MBM finds the distance from the measurement point to the fault, and integrating them in a hybrid scheme takes the best characteristics of each method. In this document, "hybrid" refers to this complementary combination of MBMs and MCBCs.
Finally, to validate the contributions of this thesis, a hybrid integration scheme for fault location is proposed and tested on two different distribution systems. Both the methods that use system parameters and rely on impedance estimation (MBM) and those that use the descriptors and rely on classification techniques (MCBC) prove valid for solving the fault location problem. Both proposed methodologies have advantages and disadvantages, but according to the method-integration theory presented, they are highly complementary, allowing hybrids to be formulated that improve the results and reduce or avoid the multiple estimation of the fault site.
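The complementary roles of the two method families can be sketched as follows. This is a deliberately toy illustration: a nearest-centroid rule stands in for the LAMDA/SVM classifiers (MCBC), the candidate list stands in for the distance estimates of an impedance-based MBM, and all numbers and names are invented.

```python
# Toy hybrid fault-location sketch: classifier picks the zone, the
# model-based distance estimates are then filtered to that zone.
import math

def train_centroids(history):
    """MCBC stand-in: average descriptor vector per faulted zone."""
    sums, counts = {}, {}
    for zone, desc in history:
        counts[zone] = counts.get(zone, 0) + 1
        sums[zone] = [a + b for a, b in zip(sums.get(zone, [0] * len(desc)), desc)]
    return {z: [v / counts[z] for v in s] for z, s in sums.items()}

def classify_zone(centroids, desc):
    """Assign the fault to the zone with the closest descriptor centroid."""
    return min(centroids, key=lambda z: math.dist(centroids[z], desc))

def hybrid_locate(centroids, desc, mbm_candidates):
    """Keep only the MBM distance estimates lying in the classified zone,
    reducing the multiple-estimation problem."""
    zone = classify_zone(centroids, desc)
    return zone, [d for z, d in mbm_candidates if z == zone]
```

Here `mbm_candidates` models the several (zone, distance) answers an impedance-based method can return; intersecting them with the classifier's zone is the hybrid step the thesis argues for.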


This research consists of an empirical study of a new science, Bibliotherapy, and of one specific application: its use in a prison context. To this end we formulated the following hypothesis: "the book exerts a therapeutic function on people in general and, more particularly, on individuals held in institutional detention". The methodology adopted rests essentially on textual and digital documentary sources by specialist authors, who are identified and cited precisely in the literature review chapter. We began by defining the concepts of reading, therapeutic reading, phenomenology of language, therapy and dialogue, and examined the philosophical foundations and the bibliotherapeutic components, the target audiences and the areas with the greatest potential for application, and the benefits and limitations that guide the application of Bibliotherapy, notably for personal development and behavioural change. We also analysed the evolution of the concept from Aristotle to the present day; only with an effective knowledge of this background could we pursue our object of study. We further considered it important to understand the state of reading in Portugal and the policies developed in this area by the State to promote reading among students and the general population. Only on the basis of these two intersecting realities, in the reader/book process, is it possible to identify and relate the problem under study. We note the existence of two types of Bibliotherapy, Bibliotherapy-as-art and Bibliotherapy-as-science, identifying some of their characteristics and, above all, what unites and what separates them.
We present an example of a field in which non-clinical Bibliotherapy has been applied with recognised success, prison institutions, addressing the most pertinent questions in this area: institutional aspects, psychosocial aspects and social reintegration. We also considered the IFLA recommendations for this population, the Guidelines for Library Services to Prisoners. Throughout the work we highlight the bibliotherapeutic interrelations between patients, librarians and other participants in the bibliotherapeutic process, such as doctors and reading guides. We conclude that the practice of reading on specific themes, in any field of action and under the guidance of a professional with deep knowledge both of the personalities and problems of the readers and of the materials at their disposal (the book, in the broad sense) can indeed produce therapeutic benefits in readers.


Particle size distribution (psd) is one of the most important features of the soil because it affects many of its other properties, and it determines how soil should be managed. To understand the properties of chalk soil, psd analyses should be based on the original material (including carbonates), and not just the acid-resistant fraction. Laser-based methods rather than traditional sedimentation methods are being used increasingly to determine particle size to reduce the cost of analysis. We give an overview of both approaches and the problems associated with them for analyzing the psd of chalk soil. In particular, we show that it is not appropriate to use the widely adopted 8 µm boundary between the clay and silt size fractions for samples determined by laser to estimate proportions of these size fractions that are equivalent to those based on sedimentation. We present data from field and national-scale surveys of soil derived from chalk in England. Results from both types of survey showed that laser methods tend to over-estimate the clay-size fraction compared to sedimentation for the 8 µm clay/silt boundary, and we suggest reasons for this. For soil derived from chalk, either the sedimentation methods need to be modified or it would be more appropriate to use a 4 µm threshold as an interim solution for laser methods. Correlations between the proportions of sand- and clay-sized fractions, and other properties such as organic matter and volumetric water content, were the opposite of what one would expect for soil dominated by silicate minerals. For water content, this appeared to be due to the predominance of porous chalk fragments in the sand-sized fraction rather than quartz grains, and the abundance of fine (<2 µm) calcite crystals rather than phyllosilicates in the clay-sized fraction. This was confirmed by scanning electron microscope (SEM) analyses.
"Of all the rocks with which I am acquainted, there is none whose formation seems to tax the ingenuity of theorists so severely, as the chalk, in whatever respect we may think fit to consider it". Thomas Allan, FRS Edinburgh 1823, Transactions of the Royal Society of Edinburgh.
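The effect of moving the clay/silt boundary from 8 µm to 4 µm can be illustrated with a hypothetical laser psd; the bin diameters and volume percentages below are invented for the sketch, not survey data.

```python
# Hypothetical illustration: the clay-sized proportion inferred from a
# laser psd depends strongly on where the clay/silt boundary is set.
import numpy as np

def clay_fraction(diameters_um, volume_percent, boundary_um):
    """Cumulative volume percentage finer than the chosen boundary."""
    d = np.asarray(diameters_um)
    v = np.asarray(volume_percent)
    return float(v[d < boundary_um].sum())

# Invented psd: bin mid-point diameters (µm) and volume % per bin.
diameters = [1, 3, 6, 12, 30, 80, 200]
volumes = [10, 12, 15, 18, 20, 15, 10]
```

With these numbers the clay-sized fraction is 37% at an 8 µm boundary but only 22% at 4 µm, the direction of change the abstract proposes as an interim correction for laser methods.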


There are now considerable expectations that semi-distributed models are useful tools for supporting catchment water quality management. However, insufficient attention has been given to evaluating the uncertainties inherent to this type of model, especially those associated with the spatial disaggregation of the catchment. The Integrated Nitrogen in Catchments model (INCA) is subjected to an extensive regionalised sensitivity analysis in application to the River Kennet, part of the groundwater-dominated upper Thames catchment, UK. The main results are: (1) model output was generally insensitive to land-phase parameters, very sensitive to groundwater parameters, including initial conditions, and significantly sensitive to in-river parameters; (2) INCA was able to produce good fits simultaneously to the available flow, nitrate and ammonium in-river data sets; (3) representing parameters as heterogeneous over the catchment (206 calibrated parameters) rather than homogeneous (24 calibrated parameters) produced a significant improvement in fit to nitrate but no significant improvement to flow and caused a deterioration in ammonium performance; (4) the analysis indicated that calibrating the flow-related parameters first, then calibrating the remaining parameters (as opposed to calibrating all parameters together) was not a sensible strategy in this case; (5) even the parameters to which the model output was most sensitive suffered from high uncertainty due to spatial inconsistencies in the estimated optimum values, parameter equifinality and the sampling error associated with the calibration method; (6) soil and groundwater nutrient and flow data are needed to reduce uncertainty in initial conditions, residence times and nitrogen transformation parameters, and long-term historic data are needed so that key responses to changes in land-use management can be assimilated.
The results indicate the general difficulty of reconciling the questions which catchment nutrient models are expected to answer with typically limited data sets and limited knowledge about suitable model structures. The results demonstrate the importance of analysing semi-distributed model uncertainties prior to model application, and illustrate the value and limitations of using Monte Carlo-based methods for doing so.
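A regionalised sensitivity analysis of the kind described can be sketched in a few lines on a stand-in model: sample parameters by Monte Carlo, split the runs into behavioural and non-behavioural by an output threshold, and measure, per parameter, the separation between the two parameter samples with a Kolmogorov-Smirnov-style distance. The three-parameter toy model and the threshold below are assumptions for illustration, not INCA.

```python
# Regionalised sensitivity analysis sketch on an invented toy model whose
# output is dominated by its second parameter.
import numpy as np

def toy_model(params):
    """Stand-in 'catchment model': output dominated by params[1]."""
    return 0.1 * params[0] + 2.0 * params[1] + 0.01 * params[2]

def rsa(n=2000, threshold=1.0, seed=0):
    rng = np.random.default_rng(seed)
    samples = rng.uniform(0.0, 1.0, size=(n, 3))   # Monte Carlo sampling
    out = np.array([toy_model(p) for p in samples])
    behavioural = samples[out > threshold]          # acceptable simulations
    rest = samples[out <= threshold]
    # KS distance between the two per-parameter distributions: a large
    # distance means the output is sensitive to that parameter.
    ks = []
    grid = np.linspace(0.0, 1.0, 101)
    for j in range(3):
        cdf_b = np.searchsorted(np.sort(behavioural[:, j]), grid) / len(behavioural)
        cdf_r = np.searchsorted(np.sort(rest[:, j]), grid) / len(rest)
        ks.append(float(np.abs(cdf_b - cdf_r).max()))
    return ks
```

Running `rsa()` yields a much larger KS distance for the dominant parameter than for the other two, which is exactly the kind of ranking the INCA analysis extracts at far larger scale.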


The origins of the various procurement methods in construction are considered, followed by a discussion of the circumstances for which each procurement system is best suited. The likely future developments in management-based methods are discussed in the context of current research, showing that co-ordination of trade and specialist contractors, more flexible contracting, constructive conflict management and a potential polarization of views about project management in the industry are likely.


Research in construction management is diverse in content and in quality. There is much to be learned from more fundamental disciplines. Construction is a sub-set of human experience rather than a completely separate phenomenon. Therefore, it is likely that there are few problems in construction requiring the invention of a completely new theory. If construction researchers base their work only on that of other construction researchers, our academic community will become less relevant to the world at large. The theories that we develop or test must be of wider applicability to be of any real interest. In undertaking research, researchers learn a lot about themselves. Perhaps the only difference between research and education is that if we are learning about something which no-one else knows, then it is research, otherwise it is education. Self-awareness of this will help to reduce the chances of publishing work which only reveals a researcher’s own learning curve. Scientific method is not as simplistic as non-scientists claim and is the only real way of overcoming methodological weaknesses in our work. The reporting of research may convey the false impression that it is undertaken in the sequence in which it is written. Construction is not so unique and special as to require a completely different set of methods from other fields of enquiry. Until our research is reported in mainstream journals and conferences, there is little chance that we will influence the wider academic community and a concomitant danger that it will become irrelevant. The most useful insights will come from research which challenges the current orthodoxy rather than research which merely reports it.



The development of effective methods for predicting the quality of three-dimensional (3D) models is fundamentally important for the success of tertiary structure (TS) prediction strategies. Since CASP7, the Quality Assessment (QA) category has existed to gauge the ability of various model quality assessment programs (MQAPs) at predicting the relative quality of individual 3D models. For the CASP8 experiment, automated predictions were submitted in the QA category using two methods from the ModFOLD server: ModFOLD version 1.1 and ModFOLDclust. ModFOLD version 1.1 is a single-model machine learning based method, which was used for automated predictions of global model quality (QMODE1). ModFOLDclust is a simple clustering based method, which was used for automated predictions of both global and local quality (QMODE2). In addition, manual predictions of model quality were made using ModFOLD version 2.0, an experimental method that combines the scores from ModFOLDclust and ModFOLD v1.1. Predictions from the ModFOLDclust method were the most successful of the three in terms of the global model quality, whilst the ModFOLD v1.1 method was comparable in performance to other single-model based methods. In addition, the ModFOLDclust method performed well at predicting the per-residue, or local, model quality scores. Predictions of the per-residue errors in our own 3D models, selected using the ModFOLD v2.0 method, were also the most accurate compared with those from other methods. All of the MQAPs described are publicly accessible via the ModFOLD server at: http://www.reading.ac.uk/bioinf/ModFOLD/. The methods are also freely available to download from: http://www.reading.ac.uk/bioinf/downloads/.


Background: Selecting the highest quality 3D model of a protein structure from a number of alternatives remains an important challenge in the field of structural bioinformatics. Many Model Quality Assessment Programs (MQAPs) have been developed which adopt various strategies in order to tackle this problem, ranging from the so-called "true" MQAPs capable of producing a single energy score based on a single model, to methods which rely on structural comparisons of multiple models or additional information from meta-servers. However, it is clear that no current method can separate the highest accuracy models from the lowest consistently. In this paper, a number of the top performing MQAP methods are benchmarked in the context of the potential value that they add to protein fold recognition. Two novel methods are also described: ModSSEA, which is based on the alignment of predicted secondary structure elements, and ModFOLD, which combines several true MQAP methods using an artificial neural network. Results: The ModSSEA method is found to be an effective model quality assessment program for ranking multiple models from many servers, however further accuracy can be gained by using the consensus approach of ModFOLD. The ModFOLD method is shown to significantly outperform the true MQAPs tested and is competitive with methods which make use of clustering or additional information from multiple servers. Several of the true MQAPs are also shown to add value to most individual fold recognition servers by improving model selection, when applied as a post filter in order to re-rank models. Conclusion: MQAPs should be benchmarked appropriately for the practical context in which they are intended to be used. Clustering based methods are the top performing MQAPs where many models are available from many servers; however, they often do not add value to individual fold recognition servers when limited models are available.
Conversely, the true MQAP methods tested can often be used as effective post filters for re-ranking few models from individual fold recognition servers and further improvements can be achieved using a consensus of these methods.
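The clustering/consensus idea behind methods such as ModFOLDclust can be sketched by scoring each candidate model by its mean pairwise similarity to the others. Here a simple inverse-RMSD on coordinate arrays is an illustrative stand-in for the structural similarity measures (e.g. TM-score or GDT) that real MQAPs use, and the data are invented.

```python
# Consensus model-quality sketch: models near the densest cluster of
# candidates score highest; outliers score lowest.
import numpy as np

def similarity(a, b):
    """1 / (1 + RMSD) between two equally sized coordinate arrays."""
    rmsd = np.sqrt(((a - b) ** 2).sum(axis=1).mean())
    return 1.0 / (1.0 + rmsd)

def consensus_scores(models):
    """Mean pairwise similarity of each model to all the others."""
    n = len(models)
    return [sum(similarity(models[i], models[j])
                for j in range(n) if j != i) / (n - 1)
            for i in range(n)]
```

This also makes the paper's caveat concrete: with only a handful of models the mean pairwise similarity is dominated by chance agreement, which is why clustering methods add little value when few models are available.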


1. Demographic models are assuming an important role in management decisions for endangered species. Elasticity analysis and scope for management analysis are two such applications. Elasticity analysis determines the vital rates that have the greatest impact on population growth. Scope for management analysis examines the effects that feasible management might have on vital rates and population growth. Both methods target management in an attempt to maximize population growth. 2. The Seychelles magpie robin Copsychus sechellarum is a critically endangered island endemic, the population of which underwent significant growth in the early 1990s following the implementation of a recovery programme. We examined how the formal use of elasticity and scope for management analyses might have shaped management in the recovery programme, and assessed their effectiveness by comparison with the actual population growth achieved. 3. The magpie robin population doubled from about 25 birds in 1990 to more than 50 by 1995. A simple two-stage demographic model showed that this growth was driven primarily by a significant increase in the annual survival probability of first-year birds and an increase in the birth rate. Neither the annual survival probability of adults nor the probability of a female breeding at age 1 changed significantly over time. 4. Elasticity analysis showed that the annual survival probability of adults had the greatest impact on population growth. There was some scope to use management to increase survival, but because survival rates were already high (> 0.9) this had a negligible effect on population growth. Scope for management analysis showed that significant population growth could have been achieved by targeting management measures at the birth rate and survival probability of first-year birds, although predicted growth rates were lower than those achieved by the recovery programme when all management measures were in place (i.e. 1992-95).
5. Synthesis and applications. We argue that scope for management analysis can provide a useful basis for management but will inevitably be limited to some extent by a lack of data, as our study shows. This means that identifying perceived ecological problems and designing management to alleviate them must be an important component of endangered species management. The corollary of this is that it will not be possible or wise to consider only management options for which there is a demonstrable ecological benefit. Given these constraints, we see little role for elasticity analysis because, when data are available, a scope for management analysis will always be of greater practical value and, when data are lacking, precautionary management demands that as many perceived ecological problems as possible are tackled.
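Elasticity analysis for a two-stage model of this kind can be sketched as follows. The vital rates in the matrix are invented round numbers, not the magpie robin estimates, but the qualitative outcome, that adult survival carries the largest elasticity, mirrors point 4 above.

```python
# Elasticity analysis of a two-stage (first-year / adult) matrix model.
import numpy as np

def elasticities(A):
    """Dominant eigenvalue lam and elasticities e_ij = (a_ij/lam) * dlam/da_ij."""
    vals, W = np.linalg.eig(A)
    k = int(np.argmax(vals.real))
    lam = vals.real[k]
    w = np.abs(W[:, k].real)                             # stable stage distribution
    vals_l, V = np.linalg.eig(A.T)
    v = np.abs(V[:, int(np.argmax(vals_l.real))].real)   # reproductive values
    S = np.outer(v, w) / (v @ w)                         # sensitivities dlam/da_ij
    return lam, A * S / lam                              # elasticities sum to 1

# Invented vital rates; row/column order is (first-year, adult).
A = np.array([[0.0, 0.9],    # birth rate into the first-year class
              [0.4, 0.92]])  # first-year survival, adult survival
lam, E = elasticities(A)
```

With these numbers the growth rate is about 1.22 and the adult-survival entry holds roughly 60% of the total elasticity, illustrating why a high-but-hard-to-improve adult survival rate dominates an elasticity ranking even when the practical scope for managing it is small.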