816 results for parallel kinematics mechanisms
Abstract:
Professional sufficiency work (trabajo de suficiencia profesional)
Abstract:
Although one of the goals of epidemiology is to identify causal relationships between a risk factor and a health problem, the research methodology of this discipline often sacrifices internal validity in favour of the ability to detect association. Graphical and statistical methods exist that can help untangle possible causal mechanisms and thus shed some light on the so-called "black box". This note presents causal diagrams, one of the most useful tools for assessing, before the analysis, whether a possible association is causal or simply due to bias. To illustrate their usefulness, several occupational health scenarios are proposed, showing how association can arise along non-causal paths as a consequence of bias. In conclusion, the use of causal diagrams is recommended as part of routine practice in epidemiological research.
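As a purely illustrative aside on how an association can appear along a non-causal path, the following minimal Python sketch simulates collider (selection) bias: exposure and disease are generated independently, yet conditioning on a common effect induces a spurious association. The variable names and probabilities are hypothetical and are not taken from the note above.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Exposure and disease are generated independently: no causal relation.
exposure = rng.binomial(1, 0.3, n)
disease = rng.binomial(1, 0.2, n)

# "selected" is a collider: both exposure and disease raise the
# probability of being included in the study sample.
p_sel = 0.1 + 0.4 * exposure + 0.4 * disease
selected = rng.binomial(1, p_sel) == 1

def odds_ratio(e, d):
    a = np.sum((e == 1) & (d == 1)); b = np.sum((e == 1) & (d == 0))
    c = np.sum((e == 0) & (d == 1)); t = np.sum((e == 0) & (d == 0))
    return (a * t) / (b * c)

print("OR, full population:", round(odds_ratio(exposure, disease), 2))              # ~1.0
print("OR, selected only:  ", round(odds_ratio(exposure[selected], disease[selected]), 2))  # biased away from 1
```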
Abstract:
The growing demand for large-scale virtualization environments, such as those used in cloud computing, has led to a need for efficient management of computing resources. RAM is one of the most heavily demanded resources in these environments and is usually the main factor limiting the number of virtual machines that can run on a physical host. Recently, hypervisors have introduced mechanisms for transparent memory sharing between virtual machines in order to reduce the total demand for system memory. These mechanisms "merge" similar pages detected in multiple virtual machines onto the same physical memory, using a copy-on-write mechanism in a manner that is transparent to the guest systems. The objective of this study is to present an overview of these mechanisms and to evaluate their performance and effectiveness. The results for two popular hypervisors (VMware and KVM) using different guest operating systems (Linux and Windows) and different workloads (synthetic and real) are presented herein. The results show significant performance differences between hypervisors depending on the guest system workload and execution time.
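The page-merging idea described above can be sketched in a few lines of Python. This is a toy model of content-based page deduplication with copy-on-write semantics, not the actual VMware or KVM implementation; the class and method names are invented for the example.

```python
import hashlib

class PageStore:
    """Toy model of transparent page sharing: identical page contents
    are stored once and shared; a write triggers copy-on-write."""

    def __init__(self):
        self._shared = {}   # content hash -> page bytes
        self._refs = {}     # content hash -> reference count

    def map_page(self, content: bytes) -> str:
        """Map a guest page; identical contents share one physical copy."""
        key = hashlib.sha256(content).hexdigest()
        if key not in self._shared:
            self._shared[key] = content
        self._refs[key] = self._refs.get(key, 0) + 1
        return key

    def write_page(self, key: str, new_content: bytes) -> str:
        """Copy-on-write: writing to a shared page gives the writer a private copy."""
        self._refs[key] -= 1
        if self._refs[key] == 0:
            del self._shared[key], self._refs[key]
        return self.map_page(new_content)

# Two "virtual machines" mapping the same zeroed page share one copy.
store = PageStore()
p1 = store.map_page(b"\x00" * 4096)
p2 = store.map_page(b"\x00" * 4096)
assert p1 == p2                             # one physical page backs both mappings
p2 = store.write_page(p2, b"\x01" * 4096)   # a write breaks the sharing
assert p1 != p2
```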
Abstract:
The last couple of decades have seen the introduction of new telecommunication networks. It is expected that in the future all types of vehicles, such as cars, buses and trucks, will be able to intercommunicate and form a vehicular network. Vehicular networks display particularities when compared to other networks due to their continuous node mobility and their wide geographical dispersion, leading to permanent network fragmentation. Therefore, the main challenges of this type of network relate to intermittent connectivity and the long and variable delay in information delivery. To address the problems related to intermittent connectivity, a new concept was introduced: the Delay Tolerant Network (DTN). This architecture is built on a Store-Carry-and-Forward (SCF) mechanism in order to ensure the delivery of information when there is no end-to-end path. Vehicular networks support a multiplicity of services, including the transport of non-urgent information, so the use of a DTN for the dissemination of non-urgent information can overcome the aforementioned challenges. The work developed here focused on the use of DTNs for the dissemination of non-urgent information. This information originates at the network service provider and should be available on mobile network terminals for a limited period of time. To this end, four different strategies were deployed: Random, Least Number of Hops First (LNHF), Local Rarest Bundle First (LRBF) and Local Rarest Generation First (LRGF). All of these strategies share a common goal: to disseminate content through the network in the shortest period of time while minimizing network congestion. This work also covers the analysis and implementation of techniques that reduce network congestion. The design, implementation and validation of the proposed strategies were divided into three stages. The first stage focused on creating a Matlab emulator for fast implementation and strategy validation. This stage resulted in the four strategies that were afterwards implemented in the DTN software Helix, developed in a partnership between Instituto de Telecomunicações (IT) and Veniam, which operate the largest vehicular network in operation worldwide, located in the city of Porto. The strategies were later evaluated on an emulator built for large-scale testing of DTNs. Both emulators account for vehicular mobility based on information previously collected from the real platform. Finally, the strategy with the best overall performance was tested on the real platform, in a lab environment, as a demonstration of concept and operability. It can be concluded that two of the implemented strategies (LRBF and LRGF) can be deployed in the real network and guarantee a significant delivery rate. The LRBF strategy has the best delivery performance; however, it adds significant overhead to the network in order to work. In the future, scalability tests should be conducted in a real environment to confirm the emulator results. The real deployment of the strategies should be accompanied by the introduction of new types of content-distribution services.
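As a hedged illustration of the Local Rarest Bundle First idea, the sketch below orders locally stored bundles so that the bundle seen least often among the currently reachable neighbours is forwarded first. The data structures and function names are illustrative assumptions and are not taken from the Helix code base.

```python
from collections import Counter
from typing import Iterable, List, Set

def lrbf_order(local_bundles: Set[str],
               neighbour_buffers: Iterable[Set[str]]) -> List[str]:
    """Order the locally stored bundles so that the bundle that is rarest
    among the currently reachable neighbours is forwarded first."""
    counts = Counter()
    for buf in neighbour_buffers:
        counts.update(buf)
    # Bundles never seen at any neighbour have count 0 and therefore go first.
    return sorted(local_bundles, key=lambda b: counts[b])

# Example contact: bundle "c" is held by no neighbour, so it is transmitted first.
mine = {"a", "b", "c"}
neighbours = [{"a", "b"}, {"a"}]
print(lrbf_order(mine, neighbours))   # ['c', 'b', 'a']
```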
Abstract:
Doctoral thesis—Universidade de Brasília, Instituto de Química, Programa de Pós-Graduação em Química, 2015.
Abstract:
This research studies and analyses one of the financing strategies used by companies in the real sector, financial system institutions and Latin American governments to obtain foreign savings to invest in the development of the region. The basic strategy involves the issuance of ADRs. Our interest focuses on Colombia, since few companies there have used this financing mechanism, and we therefore consider it necessary to publicize this strategy as a highly favourable alternative.
Abstract:
Nowadays it is still difficult to perform an early and accurate diagnosis of dementia; therefore, much research focuses on finding new dementia biomarkers that can aid in that purpose. Scientists thus try to find noninvasive, rapid and relatively inexpensive procedures for early diagnosis. Several studies have demonstrated that spectroscopic techniques, such as Fourier Transform Infrared (FTIR) spectroscopy and Raman spectroscopy, can provide a useful and accurate procedure to diagnose dementia. As several biochemical mechanisms related to neurodegeneration and dementia can lead to changes in plasma components and other peripheral body fluids, blood-based samples and spectroscopic analyses can be used as a simpler and less invasive technique. This work is intended to confirm some of the hypotheses of previous studies in which FTIR was used to study plasma samples of possible AD patients and respective controls, and to verify the reproducibility of this spectroscopic technique in the analysis of such samples. Through spectroscopic analysis combined with multivariate analysis, it is possible to discriminate control and demented samples and to identify key spectroscopic differences between these two groups, which allows the identification of metabolites altered in this disease. It can be concluded that there are three spectral regions, 3500-2700 cm-1, 1800-1400 cm-1 and 1200-900 cm-1, from which relevant spectroscopic information can be extracted. In the first region, the main conclusion is that there is an imbalance between the content of saturated and unsaturated lipids. In the 1800-1400 cm-1 region it is possible to see the presence of protein aggregates and a change in protein conformation towards highly stable parallel β-sheets. The last region showed the presence of lipid peroxidation products related to membrane impairment, as well as oxidative damage to nucleic acids. The FTIR technique and the information gathered in this work can be used to build classification models for the diagnosis of cognitive dysfunction.
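As a hedged illustration of how spectra combined with multivariate analysis can separate two groups, the sketch below applies PCA followed by a simple classifier to synthetic "spectra". The restriction to an assumed 1400-1800 cm-1 window, the synthetic band shape and all numbers are assumptions for the example, not data or parameters from this work.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
wavenumbers = np.arange(900, 3501)            # synthetic 900-3500 cm-1 grid

def fake_spectrum(shift: float) -> np.ndarray:
    """Synthetic absorbance spectrum with a small group-dependent change
    around the amide-I region (~1650 cm-1)."""
    base = np.exp(-((wavenumbers - 1650) / 40.0) ** 2)
    return base * (1.0 + shift) + rng.normal(0, 0.02, wavenumbers.size)

controls = np.array([fake_spectrum(0.00) for _ in range(30)])
dementia = np.array([fake_spectrum(0.10) for _ in range(30)])
X = np.vstack([controls, dementia])
y = np.array([0] * 30 + [1] * 30)

# Restrict to one informative window (assumed 1400-1800 cm-1), reduce with PCA,
# then classify with LDA and report cross-validated accuracy.
window = (wavenumbers >= 1400) & (wavenumbers <= 1800)
scores = PCA(n_components=5).fit_transform(X[:, window])
print(cross_val_score(LinearDiscriminantAnalysis(), scores, y, cv=5).mean())
```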
Abstract:
Lecture followed by a debate on the interaction between the International Criminal Court and universal jurisdiction as complementary mechanisms in the fight against impunity and in the prosecution and punishment of the most serious international crimes.
Abstract:
This work comprises a study of the internal control mechanisms of the State Security Forces and Corps, specifically of the two main corps, the Cuerpo Nacional de Policía and the Guardia Civil. The first section analyses what the State Security Forces and Corps are, describing their functions, their basic principles of action and the limits the State imposes on their actions. To this end, we examine Organic Law 2/1986, of 13 March, on the State Security Forces and Corps, as well as the monographs cited in the bibliography. The first section also analyses Organic Law 4/2015, of 30 March, on the Protection of Citizen Security, studying its objectives, scope of application, purposes and principles, as well as the various constitutional challenges brought against it. In the second section, we focus on the strategies and control mechanisms of the State Security Forces and Corps, detailing both the disciplinary regime of the Guardia Civil and that of the Cuerpo Nacional de Policía, drawing on the relevant codes. Finally, we address the most common forms of police abuse in Spain and how such abuses are prosecuted and punished. This analysis is accompanied by various rulings handed down on administrative sanctions imposed on members of the State Security Forces and Corps.
Abstract:
In this report, a preprocessing stage has been implemented to serve as the first phase of the video coding process. This stage integrates two variants of the median filter (3×3 and 5×5) and an operator. This operator computes the gradient of the pixels of an image or frame in order to subsequently filter those that fall below a certain value (threshold). The threshold is determined empirically through two different processes. In the first, luminance and chrominance values of edge pixels are obtained in order to find the minimum value, while in the second the rate of pixels belonging to edges is computed. Once this calculation has been performed, different threshold values, different median filter variants and different QP (quality) values were used in order to parameterize the encodings that make use of this new stage. After these encodings, the sizes of the output bitstreams were obtained and the quality of the decoded or reconstructed videos was evaluated using two objective metrics: PSNR and SSIM. Encodings that do not use the preprocessing stage were also evaluated with these metrics and compared with those that do. The results clearly show the trade-off between bitstream size and quality, with the SSIM metric being the more representative, as it is more closely related to how the HVS (human visual system) perceives the image. As a result, for this metric, compression rates higher than those achieved without preprocessing are obtained, with practically imperceptible quality losses.
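A minimal sketch of a preprocessing stage of this kind is shown below, assuming the gradient-threshold step smooths non-edge pixels while leaving edges untouched. The use of a Sobel gradient, the OpenCV calls and the threshold value are assumptions standing in for the parameters actually used in the report.

```python
import cv2
import numpy as np

def preprocess_frame(frame_y: np.ndarray, ksize: int = 3,
                     threshold: float = 30.0) -> np.ndarray:
    """Preprocess a luminance (Y) frame before encoding:
    1) median filter (3x3 or 5x5) to remove impulsive noise,
    2) per-pixel gradient magnitude,
    3) keep the smoothed value where the gradient is below the threshold
       (non-edge areas), and the original value on edges."""
    filtered = cv2.medianBlur(frame_y, ksize)
    gx = cv2.Sobel(filtered, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(filtered, cv2.CV_32F, 0, 1, ksize=3)
    grad = cv2.magnitude(gx, gy)
    # Smoother non-edge regions are cheaper to encode; edges keep their detail.
    return np.where(grad < threshold, filtered, frame_y).astype(frame_y.dtype)

# Usage on a synthetic 8-bit frame.
frame = np.random.randint(0, 256, (720, 1280), dtype=np.uint8)
out = preprocess_frame(frame, ksize=5, threshold=30.0)
```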
Abstract:
This research aimed at relating coordination and control forms to organizational performance. A multi-case study was carried out in two public high schools: Centro Federal de Educação Tecnológica do Rio Grande do Norte and Floriano Cavalcanti. To accomplish these objectives, a qualitative analysis was developed, drawing on the coordination and control forms described by several authors and on Sander's (1984) model of organizational performance. This model considers two sets of criteria for analysing organizational performance: an instrumental one (efficiency and efficacy) and a substantive one (effectiveness and relevance). The research attempts to show the importance of balancing these criteria so that effectiveness and relevance become more important in schools. It was shown that the use of bureaucratic coordination forms influences evaluation according to the instrumental criteria. At the same time, it was observed that the use of mechanisms based on the autonomy of the school is related to efficiency and efficacy. The objective of this research can therefore be considered to have been achieved.
Abstract:
Doctoral thesis in Behavioural Biology presented to ISPA - Instituto Universitário.