820 results for Graph-based approach


Relevance: 80.00%

Abstract:

• Aim: The present study aimed to evaluate the effect of trainees' interpersonal behavior on work involvement (WI) and to compare their social behavior within professional and private relationships as well as between different psychotherapeutic orientations.
• Methods: The interpersonal scales of the Intrex short-form questionnaire and the Work Involvement Scale (WIS) were used to evaluate two samples of German psychotherapy trainees in psychoanalytic, psychodynamic, and cognitive behavioral therapy training. Trainees from Sample 1 (N = 184) were asked to describe their interpersonal behavior in relation to their patients when filling out the Intrex, whereas trainees from Sample 2 (N = 135) were asked to describe the private relationship with a significant other.
• Results: Interpersonal affiliation in professional relationships significantly predicted the level of healing involvement, while stress involvement was predicted by interpersonal affiliation and interdependence in trainees' relationships with their patients. Social behavior within professional relationships showed higher correlations with WI than private interpersonal behavior. Significant differences were found between private and professional relationship settings in trainees' interpersonal behavior, with higher levels of affiliation and interdependence with significant others. Differences between therapeutic orientation and social behavior were found only when comparing trainees' level of interdependence across the particular relationship settings.
• Conclusion: Trainees' interpersonal level of affiliation in professional relationships is a predictor of successful psychotherapeutic development. Conversely, controlling behavior in professional settings can be understood as a risk factor for impaired psychotherapeutic growth. Both results strengthen an evidence-based approach to competence development during psychotherapy training.

Relevance: 80.00%

Abstract:

As exploration of our solar system and outer space moves into the future, spacecraft are being developed to venture on increasingly challenging missions with bold objectives. The spacecraft tasked with completing these missions are becoming progressively more complex. This increases the potential for mission failure due to hardware malfunctions and unexpected spacecraft behavior. A solution to this problem lies in the development of an advanced fault management system. Fault management enables a spacecraft to respond to failures and take repair actions so that it may continue its mission. The two main approaches developed for spacecraft fault management have been rule-based and model-based systems. Rules map sensor information to system behaviors, thus achieving fast response times and making the actions of the fault management system explicit. These rules are developed by having a human reason through the interactions between spacecraft components. This process is limited by the number of interactions a human can reason about correctly. In the model-based approach, the human provides component models, and the fault management system reasons automatically about system-wide interactions and complex fault combinations. This approach improves correctness and makes the underlying system models explicit, whereas these are implicit in the rule-based approach. We propose a fault detection engine, Compiled Mode Estimation (CME), that unifies the strengths of the rule-based and model-based approaches. CME uses a compiled model to determine spacecraft behavior more accurately. Reasoning related to fault detection is compiled in an off-line process into a set of concurrent, localized diagnostic rules. These are then combined on-line with sensor information to reconstruct the diagnosis of the system. These rules enable a human to inspect the diagnostic consequences of CME.
Additionally, CME is capable of reasoning through component interactions automatically while still providing fast and correct responses. The implementation of this engine has been tested against the NEAR spacecraft's advanced rule-based system, resulting in detection of failures beyond those caught by the rules. This evolution in fault detection will enable future missions to explore the furthest reaches of the solar system without the burden of human intervention to repair failed components.
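The structure described above — reasoning compiled off-line into localized diagnostic rules that are combined on-line with sensor data — can be sketched as follows. This is a minimal illustration with hypothetical component and sensor names, not the CME implementation:

```python
# Minimal sketch of compiled diagnostic rules: each rule maps a local
# sensor reading to a diagnosis of one component; the on-line step
# combines whichever rules fire. All names and thresholds are hypothetical.

def compile_rules():
    """Off-line product: localized rules as (sensor, predicate, diagnosis)."""
    return [
        ("thruster_temp", lambda v: v > 400.0,   "thruster overheating"),
        ("bus_voltage",   lambda v: v < 24.0,    "power bus undervoltage"),
        ("gyro_drift",    lambda v: abs(v) > 0.5, "gyro degraded"),
    ]

def diagnose(readings, rules):
    """On-line step: combine rule firings into a system-level diagnosis."""
    return [diag for sensor, pred, diag in rules
            if sensor in readings and pred(readings[sensor])]

faults = diagnose({"thruster_temp": 455.0, "bus_voltage": 27.9},
                  compile_rules())
```

Because each rule is local and explicit, a human can inspect exactly which sensor condition produces which diagnosis — the property the abstract highlights.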

Relevance: 80.00%

Abstract:

In this report, a face recognition system capable of detecting and recognizing frontal and rotated faces was developed. Two face recognition methods focusing on the aspect of pose invariance are presented and evaluated: the whole-face approach and the component-based approach. The main challenge of this project is to develop a system that is able to identify faces under different viewing angles in real time. The development of such a system will enhance the capability and robustness of current face recognition technology. The whole-face approach recognizes faces by classifying a single feature vector consisting of the gray values of the whole face image. The component-based approach first locates the facial components and extracts them. These components are normalized and combined into a single feature vector for classification. The Support Vector Machine (SVM) is used as the classifier for both approaches. Extensive tests with respect to robustness against pose changes are performed on a database that includes faces rotated up to about 40 degrees in depth. The component-based approach clearly outperforms the whole-face approach on all tests. Although this approach is proven to be more reliable, it is still too slow for real-time applications. That is the reason why a real-time face recognition system using the whole-face approach is implemented to recognize people in color video sequences.
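The component-based pipeline described above — locate components, normalize each, concatenate into one feature vector for the SVM — can be sketched as below. The patch layout and pixel values are hypothetical, and the SVM classifier itself is omitted:

```python
# Sketch of the component-based feature pipeline: normalize each facial
# component patch and concatenate them into a single feature vector that
# would be fed to an SVM. Component names and pixel indices are hypothetical.

def normalize(patch):
    """Scale a patch of gray values to zero mean, unit range."""
    lo, hi = min(patch), max(patch)
    span = (hi - lo) or 1.0          # avoid division by zero on flat patches
    mean = sum(patch) / len(patch)
    return [(v - mean) / span for v in patch]

def component_feature_vector(image, components):
    """image: flat list of gray values; components: {name: pixel indices}.
    Returns one concatenated feature vector in a fixed component order."""
    vec = []
    for name in sorted(components):  # fixed order so the SVM sees stable dims
        patch = [image[i] for i in components[name]]
        vec.extend(normalize(patch))
    return vec

img = [10, 200, 30, 40, 50, 60]
comps = {"left_eye": [0, 1], "mouth": [4, 5]}
fv = component_feature_vector(img, comps)
```

Normalizing per component, rather than over the whole face, is what gives the approach its robustness: a pose change that darkens one component does not perturb the features of the others.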

Relevance: 80.00%

Abstract:

This paper describes a general, trainable architecture for object detection that has previously been applied to face and people detection, with a new application to car detection in static images. Our technique is a learning-based approach that uses a set of labeled training data from which an implicit model of an object class — here, cars — is learned. Instead of pixel representations, which may be noisy and therefore not provide a compact representation for learning, our training images are transformed from pixel space to that of Haar wavelets, which respond to local, oriented, multiscale intensity differences. These feature vectors are then used to train a support vector machine classifier. The detection of cars in images is an important step in applications such as traffic monitoring, driver assistance systems, and surveillance, among others. We show several examples of car detection on out-of-sample images and show an ROC curve that highlights the performance of our system.
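The idea of responding to "local, oriented intensity differences" can be illustrated with the simplest possible Haar-like features: horizontal and vertical neighbor differences over a tiny grayscale grid. This toy single-scale version stands in for the multiscale Haar wavelet transform the paper uses:

```python
# Toy Haar-like features: oriented local intensity differences over a
# small 2D grayscale image. A stand-in for the multiscale Haar wavelet
# transform described in the abstract, not the paper's implementation.

def haar_features(img):
    """img: 2D list of gray values. Returns (horizontal, vertical)
    difference maps, each responding to edges of one orientation."""
    h, w = len(img), len(img[0])
    horiz = [[img[y][x + 1] - img[y][x] for x in range(w - 1)]
             for y in range(h)]                     # vertical-edge response
    vert = [[img[y + 1][x] - img[y][x] for x in range(w)]
            for y in range(h - 1)]                  # horizontal-edge response
    return horiz, vert

image = [[0, 0, 255],
         [0, 0, 255],
         [255, 255, 255]]
horiz, vert = haar_features(image)
```

The responses are large only where intensity changes locally, which is why such features are far more compact for learning than raw pixels: uniform regions contribute nothing to the feature vector.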

Relevance: 80.00%

Abstract:

In this paper we present a component-based person detection system that is capable of detecting frontal, rear, and near-side views of people, as well as partially occluded persons, in cluttered scenes. The framework described here for people is easily applied to other objects as well. The motivation for developing a component-based approach is twofold: first, to enhance the performance of person detection systems on frontal and rear views of people, and second, to develop a framework that directly addresses the problem of detecting people who are partially occluded or whose body parts blend in with the background. The data classification is handled by several support vector machine classifiers arranged in two layers. This architecture is known as Adaptive Combination of Classifiers (ACC). The system performs very well and is capable of detecting people even when not all components of a person are found. The performance of the system is significantly better than that of a full-body person detector designed along similar lines. This suggests that the improved performance is due to the component-based approach and the ACC data classification structure.
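The key robustness property — a decision can still be made when some components are missing — can be sketched as a two-layer combination: first-layer component detectors emit confidence scores, and a second-layer combiner decides on whatever scores are available. The scores, threshold, and averaging rule below are hypothetical illustrations, not the trained SVM layers of the paper:

```python
# Sketch of a two-layer classifier combination in the spirit of ACC:
# first-layer component detectors produce scores (or None when a
# component is occluded); the second layer decides from available scores.
# Scores, threshold, and the averaging rule are hypothetical.

def combine(component_scores, threshold=0.5):
    """component_scores: {component: score in [0, 1], or None if not found}.
    Returns True if the available evidence supports a person detection."""
    found = [s for s in component_scores.values() if s is not None]
    if not found:
        return False          # nothing detected at all
    return sum(found) / len(found) >= threshold

detections = {"head": 0.9, "legs": 0.7, "left_arm": None, "right_arm": 0.4}
is_person = combine(detections)
```

Because missing components are simply excluded rather than scored as zero, partial occlusion degrades the decision gracefully instead of vetoing it — the behavior the abstract attributes to the component-based design.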

Relevance: 80.00%

Abstract:

We present a type-based approach to statically derive symbolic closed-form formulae that characterize the bounds of heap memory usage of programs written in object-oriented languages. Given a program with size and alias annotations, our inference system will compute the amount of memory required by the methods to execute successfully as well as the amount of memory released when methods return. The obtained analysis results are useful for networked devices with limited computational resources as well as for embedded software.
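The bookkeeping behind such bounds — each method carrying an "amount required" and an "amount released", composed over a sequence of calls — can be sketched with linear formulae in one size variable. The representation and numbers below are a hypothetical illustration, not the paper's inference system:

```python
# Sketch of symbolic memory-bound composition: each method has a
# (required, released) pair of linear formulae in a size variable n,
# represented as (constant, coefficient). The componentwise max is a
# sound over-approximation of the peak for n >= 0 when all terms are
# nonnegative. All method bounds here are hypothetical.

def seq(methods):
    """Compose (required, released) bounds of methods run in sequence.
    Returns (peak requirement, net memory still held)."""
    held = (0, 0)            # memory currently held across the sequence
    peak = (0, 0)            # worst-case total requirement so far
    for req, rel in methods:
        need = (held[0] + req[0], held[1] + req[1])
        peak = (max(peak[0], need[0]), max(peak[1], need[1]))
        held = (held[0] + req[0] - rel[0], held[1] + req[1] - rel[1])
    return peak, held

# m1 requires 3 + 2n cells and releases 1 + n; m2 requires n, releases n.
bounds = seq([((3, 2), (1, 1)), ((0, 1), (0, 1))])
```

The closed-form result — here a peak of 3 + 2n with 2 + n still held — is exactly the kind of formula a memory-constrained embedded target can check against its budget before running the code.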

Relevance: 80.00%

Abstract:

Considering the difficulty of insulin dosage selection and the problem of hyper- and hypoglycaemic episodes in type 1 diabetes, dosage-aid systems appear tremendously helpful for these patients. A model-based approach to this problem must unavoidably consider uncertainty sources such as large intra-patient variability and food intake. This work addresses the prediction of glycaemia for a given insulin therapy in the face of parametric and input uncertainty, by means of modal interval analysis. As a result, a band containing all possible glucose excursions suffered by the patient under the given uncertainty is obtained. From it, a safer prediction of possible hyper- and hypoglycaemic episodes can be calculated.
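The core mechanism — propagating parameter and input intervals through a glucose model so that the output is a band guaranteed to contain every excursion — can be sketched with plain interval arithmetic. The one-step toy model and every number below are hypothetical; this is not the modal interval model of the paper:

```python
# Sketch of interval propagation for glycaemia prediction: evaluate a
# toy one-step update g' = g + a*meal - b*insulin in interval arithmetic,
# where parameters (a, b) and inputs are uncertain intervals.
# Model and numbers are hypothetical illustrations.

def imul(a, b):
    """Interval multiplication: (lo, hi) * (lo, hi)."""
    prods = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(prods), max(prods))

def iadd(a, b):
    """Interval addition."""
    return (a[0] + b[0], a[1] + b[1])

def glucose_step(g, a, meal, b, insulin):
    """All arguments are (lo, hi) intervals; returns the output band."""
    rise = imul(a, meal)                    # uncertain meal absorption
    drop = imul(b, insulin)                 # uncertain insulin effect
    return iadd(iadd(g, rise), (-drop[1], -drop[0]))  # subtract the interval

band = glucose_step(g=(90, 110), a=(0.8, 1.2), meal=(30, 50),
                    b=(2.0, 3.0), insulin=(4, 6))
```

If the band's lower edge falls below the hypoglycaemic threshold for some candidate dose, that dose can be flagged as unsafe for every realization of the uncertainty at once — the safety argument the abstract makes.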

Relevance: 80.00%

Abstract:

The scientific community has been suffering from peer review for decades. This process (also called refereeing) subjects an author's scientific work or ideas to the scrutiny of one or more experts in the field. Publishers use it to select and screen manuscript submissions, and funding agencies use it to award research funds. The goal is to get authors to meet their discipline's standards and thus achieve scientific objectivity. Publications and awards that haven't undergone peer review are often regarded with suspicion by scholars and professionals in many fields. However, peer review, although universally used, has many drawbacks. We propose replacing peer review with an auction-based approach: the better the submitted paper, the more scientific currency its author is likely to bid to have it published. If the bid correctly reflects the paper's quality, the author is rewarded in this new scientific currency; otherwise, the author loses this currency. We argue that citations are an appropriate currency for all scientists. We believe that citation auctions encourage scientists to better control their submissions' quality. It would also inspire them to prepare more exciting talks for accepted papers and to invite discussion of their results at congresses and conferences and among their colleagues. In the long run, citation auctions could have the power to greatly improve scientific research.
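The incentive mechanism described above can be sketched as a settlement rule: an author bids citation currency on a submission, is rewarded if the paper's eventual citation count covers the bid, and forfeits the bid otherwise. The exact payout rule below is a hypothetical illustration, not one specified in the paper:

```python
# Sketch of a citation-auction settlement rule: authors bid citations on
# a submission; the bid is rewarded if realized citations cover it and
# forfeited otherwise. The payout magnitudes are hypothetical.

def settle(bid, realized_citations):
    """Return the author's change in citation currency for one paper."""
    if realized_citations >= bid:
        return bid           # bid reflected quality: author is rewarded
    return -bid              # overconfident bid: currency is lost

balance = 100
balance += settle(bid=20, realized_citations=35)   # well-calibrated bid
balance += settle(bid=30, realized_citations=10)   # overconfident bid
```

Under any rule of this shape, consistently overbidding drains an author's currency, which is the self-regulating pressure on submission quality the abstract argues for.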

Relevance: 80.00%

Abstract:

The UK Professional Standards Framework (UK PSF) for teaching and supporting learning, launched in February 2006, is a flexible framework which uses a descriptor-based approach to professional standards. There are three standard descriptors, each of which is applicable to a number of staff roles and to different career stages of those engaged in teaching and supporting learning. The standard descriptors are underpinned by areas of professional activity, core knowledge, and professional values. The framework provides a reference point for institutions and individuals, as well as supporting ongoing development within any one standard descriptor.

Relevance: 80.00%

Abstract:

Telmex is an organization that began operating as a Mexican state-owned company with low levels of efficiency. After its privatization it took a new direction in its internal processes, leading it to generate differentiated business models focused on providing state-of-the-art telecommunication services. In its search for new horizons, the company achieved a presence in most countries of the American continent, operating under the "Telmex Internacional" brand and, in turn, through its respective subsidiaries in each of the different countries, with the objective of meeting local needs more effectively. Drawing on different theoretical approaches, this work seeks to identify the patterns that allowed Telmex to position itself as a leading company in the telecommunications sector in Latin America, and specifically in Colombia, holding its ground in a competitive market by offering bundled services tailored to customers' needs.

Relevance: 80.00%

Abstract:

Dynamic capabilities constitute an important contribution to business strategy. This document proceeds from that premise, recognizing that the generation of competences has become consolidated as the theoretical basis for achieving sustainability in the face of change events that may affect organizations' stability and decision making. Given the lack of empirical application of the concept, this paper identifies and demonstrates the tools that empirical application can give organizations and the instruments it provides for value generation. Through the ASOS.COM case study, we illustrate the need to detect and exploit opportunities and threats, as well as the reconfiguration, renewal, and generation of second-order competences to confront change. In this way, through capabilities created within companies with a focus on learning and innovation, an understanding of the business and the securing of better future scenarios are achieved.

Relevance: 80.00%

Abstract:

The main objective of this work is to carry out a theoretical review of studies that have analyzed emotional intelligence in relation to the capacity to cope with stress-generating situations. The various studies show that high levels of emotional intelligence are related to coping strategies based on the analysis and resolution of conflicts, whereas low levels of emotional intelligence are related to coping strategies based on avoidance, superstition, and resistance to change. The evidence from these studies indicates that emotional intelligence is fundamental to emotional self-control and to individuals' adaptive ability to cope with stress-generating situations.

Relevance: 80.00%

Abstract:

International development cooperation has been characterized by constant evolution over the last three decades. The bases on which such cooperation is practiced have been reformulated, affecting the way the various agents involved interact. The first part of this work seeks to characterize the nature of the interaction between agents within the cooperation process; to do so we turn to game theory, in particular to cooperative games in their agreement form, introducing the concept of Pareto optimality and Coase's efficiency postulate. The second part of this work is devoted to the concept of development. We describe its evolution, characterized by the breaking of paradigms, and present two approaches: one based on "how" and "for whom," and a temporal one referring to the short and long term, highlighting that the current approach is the one centered on human elements. We also analyze the role of Official Development Assistance (ODA) from a political point of view, allowing a glimpse of its implicit interests in recipient states. Finally, we describe the critical elements of the evolution of relations and development cooperation between Latin America and the European Union, as well as Colombia's relationship with the latter. Additionally, we detail the important role that Non-Governmental Organizations (NGOs) have played in the development of projects generated within the framework of cooperation relations between Latin America and the European Union.

Relevance: 80.00%

Abstract:

Infections associated with mechanical ventilation (MV) are frequent in the intensive care unit (ICU). There are two such infections: ventilator-associated pneumonia (VAP) and ventilator-associated tracheobronchitis (VAT). VAP has a negative impact on patient outcomes, increasing morbidity, mortality, and time in the ICU and on MV, but the impact of VAT is unknown. The objective of this study was to identify whether there are differences between VAP and VAT. Materials and methods: A cohort study was conducted between 2009 and 2013 in the ICU of the Fundación Neumológica Colombiana. Demographic, epidemiological, and microbiological data, as well as outcomes such as length of stay in the ICU, time on MV, length of hospitalization, and mortality, were obtained for patients with VAP and VAT. Groups were compared statistically using Student's t test and the chi-squared test for normally distributed data and the Mann-Whitney test for non-normal data. Results: Patients with VAP and VAT were similar in their condition on ICU admission. At diagnosis of the infection there were significant differences between groups in oxygenation and in length of hospital, ICU, and MV stay. The microbiology was predominantly Gram-negative, with multidrug resistance in 52.5% of cases and no significant differences between groups. Among outcomes, differences were observed in total ICU, hospital, and MV times, but not in the times after diagnosis. There were no significant differences in mortality. Conclusions: VAP and VAT are similar in their impact on patient evolution in terms of morbidity, length of stay, and mortality.

Relevance: 80.00%

Abstract:

This work measures market risk for the TES portfolio of a given Colombian bank, addressing the forecasting of value at risk (VaR) through different multivariate volatility models — EWMA, orthogonal GARCH, and robust GARCH — as well as different VaR models with normal and Student's t distributions. Their efficiency is evaluated with the backtesting methodologies proposed by Candelon et al. (2011), based on the generalized method of moments, together with the independence and conditional coverage tests proposed by Christoffersen and Pelletier (2004) and by Berkowitz, Christoffersen, and Pelletier (2010). The results show that the best VaR specification for measuring the market risk of Colombian banks' TES portfolios is the one built from EWMA volatilities and based on the normal distribution, since it satisfies the unconditional coverage, independence, and conditional coverage hypotheses, as well as the requirements stipulated in Basel II and in current Colombian regulation.
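The winning specification — EWMA volatility with a normal-distribution quantile — can be sketched in a few lines for a single return series. The decay factor 0.94 (the common RiskMetrics choice) and the sample returns are hypothetical, and this univariate toy omits the multivariate structure of the study:

```python
# Sketch of EWMA-volatility VaR under normality: an exponentially
# weighted recursion updates the return variance, and one-day 95% VaR
# is the normal quantile (z = 1.645) times that volatility.
# Decay factor and return series are hypothetical.

import math

def ewma_vol(returns, lam=0.94):
    """Exponentially weighted volatility of a daily return series."""
    var = returns[0] ** 2                    # seed with the first squared return
    for r in returns[1:]:
        var = lam * var + (1 - lam) * r ** 2  # recent returns weigh more
    return math.sqrt(var)

def var_normal(vol, z=1.645):
    """One-day VaR at 95% under normality, expressed as a positive loss."""
    return z * vol

rets = [0.001, -0.004, 0.002, -0.010, 0.003]
vol = ewma_vol(rets)
one_day_var = var_normal(vol)
```

A backtest of the kind cited in the abstract then counts days on which the realized loss exceeds `one_day_var` and checks that the exceedance frequency and independence are consistent with the 5% coverage level.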