Abstract:
Econometrics is a young science. It emerged during the twentieth century, in the mid-1930s, and developed primarily after the Second World War. Econometrics is the unification of statistical analysis, economic theory, and mathematics. Its history can be traced to the use of statistical and mathematical analysis in economics. The most prominent contributions of the initial period can be seen in the works of Tinbergen and Frisch, and of Haavelmo from the 1940s through the mid-1950s. From the rudimentary application of statistics to economic data, such as the use of the laws of error through the development of least squares by Legendre, Laplace, and Gauss, the discipline went on to witness the applied work of Edgeworth and Mitchell. A significant milestone in its evolution was the work of Tinbergen, Frisch, and Haavelmo on multiple regression and correlation analysis, techniques they used to test economic theories against time-series data. Even though some predictions based on econometric methodology have gone wrong, the sound scientific nature of the discipline cannot be ignored; it is reflected in the economic rationale underlying any econometric model and in the statistical and mathematical reasoning behind the inferences drawn from it. In this context, the relevance of econometrics as an academic discipline assumes particular significance. Because of its interdisciplinary nature (a unification of Economics, Statistics, and Mathematics), the subject can be taught in all of these broad areas, notwithstanding the fact that it is most often offered only to Economics students, since students of other disciplines may lack the Economics background needed to follow it. In fact, econometrics is quite relevant even for technical courses (such as Engineering), business management courses (such as the MBA), and professional accountancy programmes, and all the more so for research students in the social sciences, commerce, and management. In the ongoing scenario of globalization and economic deregulation, there is a need to give added thrust to econometrics in higher education across the various social science streams, commerce, management, and professional accountancy. This would sharpen students' analytical ability, improve their capacity to examine socio-economic problems with a mathematical approach, and enable them to derive scientific inferences and solutions to such problems. The utmost significance of hands-on practical training in the use of computer-based econometric packages, especially at the postgraduate and research levels, must also be pointed out: merely learning econometric methodology or the underlying theories would have little practical utility for students in their future careers, whether in academia, industry, or practice. This paper seeks to trace the historical development of econometrics and to study its current status as an academic discipline in higher education. It also looks into the problems faced by teachers in teaching econometrics and by students in learning the subject, including the effective application of the methodology in real-life situations, and accordingly offers some suggestions for the effective teaching of econometrics in higher education.
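As an illustration of the kind of hands-on, computer-based exercise the abstract advocates, the following is a minimal sketch (not drawn from the paper itself) of fitting an ordinary least squares regression in Python with NumPy; the data are invented purely for demonstration.

    import numpy as np

    # Hypothetical data: consumption explained by income (invented values).
    income = np.array([10.0, 12.0, 15.0, 18.0, 22.0, 25.0, 30.0])
    consumption = np.array([8.5, 9.9, 12.1, 14.0, 16.8, 18.9, 22.4])

    # Design matrix with an intercept column.
    X = np.column_stack([np.ones_like(income), income])

    # Ordinary least squares: solve min ||X b - y||^2.
    beta, residuals, rank, _ = np.linalg.lstsq(X, consumption, rcond=None)
    print("intercept = %.3f, slope = %.3f" % (beta[0], beta[1]))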
Abstract:
With this document, we provide a compilation of in-depth discussions of some of the most current security issues in distributed systems. The six contributions were collected and presented at the 1st Kassel Student Workshop on Security in Distributed Systems (KaSWoSDS’08). We are pleased to present a collection of papers that not only shed light on the theoretical aspects of their topics but are also accompanied by elaborate practical examples. In Chapter 1, Stephan Opfer discusses viruses, one of the oldest threats to system security; for years there has been an arms race between virus producers and anti-virus software providers, with no end in sight. In Chapter 2, Stefan Triller demonstrates how malicious code can be injected into a target process using a buffer overflow. Websites usually store their data and user information in databases, and, as with buffer overflows, unwary programmers leave open the possibility of SQL injection attacks against those databases; Stephan Scheuermann gives us a deeper insight into the mechanisms behind such attacks in Chapter 3. Cross-site scripting (XSS) is a method of inserting malicious code into websites viewed by other users; Michael Blumenstein explains this issue in Chapter 4. Via XSS attacks, code can be injected into other websites in order to spy on the data of internet users. Spoofing subsumes all methods that directly involve taking on a false identity; in Chapter 5, Till Amma shows different ways in which this can be done and how it can be prevented. Last but not least, cryptographic methods are used to encode confidential data so that, even if they fall into the wrong hands, the culprits cannot decode them. Over the centuries, many different ciphers have been developed, applied, and finally broken. Ilhan Glogic sketches this history in Chapter 6.
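To make the SQL injection discussion concrete, here is a minimal sketch (not taken from the workshop papers) contrasting a vulnerable, string-concatenated query with a parameterized one, using Python's standard sqlite3 module; the table, column, and input values are invented for illustration.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 's3cr3t')")

    user_input = "' OR '1'='1"  # attacker-controlled value

    # Vulnerable: the input is spliced into the SQL text, so the injected
    # OR clause is interpreted as SQL and every row is returned.
    unsafe = "SELECT secret FROM users WHERE name = '%s'" % user_input
    print(conn.execute(unsafe).fetchall())

    # Safer: a parameterized query treats the input purely as data.
    safe = "SELECT secret FROM users WHERE name = ?"
    print(conn.execute(safe, (user_input,)).fetchall())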
Abstract:
Humans can effortlessly manipulate objects in their hands, dexterously sliding and twisting them within their grasp. Robots, however, have none of these capabilities; they simply grasp objects rigidly in their end effectors. To investigate this common form of human manipulation, an analysis of controlled slipping of a grasped object within a robot hand was performed. The Salisbury robot hand demonstrated many of these controlled slipping techniques, illustrating many results of this analysis. First, the possible slipping motions were found as a function of the location, orientation, and types of contact between the hand and the object. Second, for a given grasp, the contact types were determined as a function of the grasping force and the external forces on the object. Finally, by changing the grasping force, the robot modified the constraints on the object and effected controlled slipping motions.
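The dependence of contact type on grasping and external forces can be pictured with a Coulomb friction check: a point contact sticks while the tangential load stays inside the friction cone and slips once it leaves it. The sketch below is a simplified illustration of that idea, not the paper's actual analysis; the friction coefficient and force values are invented.

    # Classify a point contact as sticking or slipping under Coulomb friction.
    def contact_type(normal_force, tangential_force, mu):
        """Sticking if |f_t| <= mu * f_n (inside the friction cone), else slipping."""
        if normal_force <= 0.0:
            return "separated"     # no contact force, the contact is broken
        if abs(tangential_force) <= mu * normal_force:
            return "sticking"      # friction can balance the tangential load
        return "slipping"          # tangential load exceeds the friction limit

    mu = 0.4  # assumed friction coefficient
    for f_n, f_t in [(10.0, 2.0), (10.0, 6.0), (0.0, 1.0)]:
        print(f_n, f_t, contact_type(f_n, f_t, mu))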
Abstract:
The flexibility of the robot is the key to its success as a viable aid to production. The flexibility of a robot can be understood along two directions. The first is to increase the physical generality of the robot so that it can be easily reconfigured to handle a wide variety of tasks. The second is to increase the ability of the robot to interact with its environment so that tasks can still be completed successfully in the presence of uncertainties. Articulated hands are capable of adapting to a wide variety of grasp shapes, hence reducing the need for special tooling. The availability of low-mass, high-bandwidth points close to the manipulated object also offers significant improvements in the control of fine motions. This thesis provides a framework for using articulated hands to perform local manipulation of objects. In particular, it addresses the issues involved in effecting compliant motions of objects in Cartesian space. The Stanford/JPL hand is used as an example to illustrate a number of concepts. The examples provide a unified methodology for controlling articulated hands grasping with point contacts. We also present a high-level hand programming system based on the methodologies developed in this thesis. Compliant motions of grasped objects and dexterous manipulations can be easily described in the LISP-based hand programming language.
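A standard building block for this kind of point-contact grasp analysis is the grasp map, which sums each fingertip's contact force and its moment about the object frame into a net object wrench. The following planar sketch is a generic illustration under that assumption, not code from the thesis; the contact positions and forces are invented.

    import numpy as np

    def object_wrench(contact_positions, contact_forces):
        """Net planar wrench (fx, fy, torque) produced by point-contact forces."""
        wrench = np.zeros(3)
        for p, f in zip(contact_positions, contact_forces):
            wrench[:2] += f                          # force contribution
            wrench[2] += p[0] * f[1] - p[1] * f[0]   # planar moment r x f
        return wrench

    # Two fingertips squeezing an object from opposite sides (invented numbers).
    positions = [np.array([0.05, 0.0]), np.array([-0.05, 0.0])]
    forces = [np.array([-1.0, 0.2]), np.array([1.0, 0.2])]
    print(object_wrench(positions, forces))   # -> [0.  0.4 0. ]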
Abstract:
This thesis addresses the problem of developing automatic grasping capabilities for robotic hands. Using a 2-jointed and a 4-jointed model of the hand, we establish the geometric conditions necessary for achieving form-closure grasps of cylindrical objects. We then define and show how to construct the grasping pre-image for quasi-static (friction-dominated) and zero-G (inertia-dominated) motions, for sensorless and sensor-driven grasps, with and without arm motions. While the approach does not rely on detailed modeling, it is computationally inexpensive, reliable, and easy to implement. Example behaviors were successfully implemented on the Salisbury hand and on a planar 2-fingered, 4-degree-of-freedom hand.
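Form closure itself has a standard computational test: the frictionless contact wrenches must positively span the object's wrench space, which can be checked with a small linear program. The sketch below applies that generic test to an invented planar example with four point contacts; it is illustrative only and is not the geometric construction used in the thesis.

    import numpy as np
    from scipy.optimize import linprog

    def is_form_closure(wrenches, tol=1e-9):
        """First-order form closure: contact wrenches positively span wrench space.
        Solve: maximize d subject to W k = 0, k_i >= d, sum(k_i) <= n, k_i >= 0.
        Form closure holds iff the optimum d is strictly positive."""
        W = np.asarray(wrenches, dtype=float).T          # 3 x n for planar wrenches
        m, n = W.shape
        if np.linalg.matrix_rank(W) < m:
            return False
        c = np.zeros(n + 1); c[-1] = -1.0                # maximize d
        A_eq = np.hstack([W, np.zeros((m, 1))])          # W k = 0
        b_eq = np.zeros(m)
        A_ub = np.hstack([-np.eye(n), np.ones((n, 1))])  # d - k_i <= 0
        A_ub = np.vstack([A_ub, np.append(np.ones(n), 0.0)])  # sum k_i <= n
        b_ub = np.append(np.zeros(n), float(n))
        bounds = [(0, None)] * n + [(None, None)]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
        return bool(res.success and -res.fun > tol)

    # Four frictionless contacts on a square, placed so their moments oppose.
    # Each planar wrench is (n_x, n_y, p_x * n_y - p_y * n_x).
    contacts = [((1.0, 0.5), (-1.0, 0.0)), ((-1.0, -0.5), (1.0, 0.0)),
                ((0.5, 1.0), (0.0, -1.0)), ((-0.5, -1.0), (0.0, 1.0))]
    wrenches = [(nx, ny, px * ny - py * nx) for (px, py), (nx, ny) in contacts]
    print(is_form_closure(wrenches))   # expected True for this arrangement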
Abstract:
This report addresses the problem of acquiring objects using articulated robotic hands. Standard grasps are used to make the problem tractable, and a technique is developed for generalizing these standard grasps to increase their flexibility to variations in the problem geometry. A generalized grasp description is applied to a new problem situation using a parallel search through hand configuration space, and the result of this operation is a global overview of the space of good solutions. The techniques presented in this report have been implemented, and the results are verified using the Salisbury three-finger robotic hand.
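One simple way to picture the "global overview of the space of good solutions" is to evaluate a grasp-quality score over a discretized grid of hand configurations and keep every configuration above a threshold. The sketch below does exactly that for an invented two-parameter configuration space and a made-up quality function; it is a schematic illustration, not the report's actual search procedure.

    import itertools
    import numpy as np

    def grasp_quality(opening, approach_angle):
        """Invented stand-in for evaluating a configuration against a standard grasp."""
        return np.exp(-((opening - 0.06) ** 2) / 0.0005) * np.cos(approach_angle) ** 2

    # Discretize the (opening, approach angle) configuration space.
    openings = np.linspace(0.02, 0.10, 17)
    angles = np.linspace(-0.6, 0.6, 13)

    # Score every configuration; the result is a map of all acceptable
    # solutions rather than a single best grasp.
    good = [(o, a, grasp_quality(o, a))
            for o, a in itertools.product(openings, angles)
            if grasp_quality(o, a) > 0.5]
    print(len(good), "acceptable configurations out of", openings.size * angles.size)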
Abstract:
Each player in the financial industry, whether a bank, stock exchange, government agency, or insurance company, operates its own financial information system or systems. By its very nature, financial information, like the money it represents, changes hands. The interoperation of financial information systems is therefore the cornerstone of the financial services they support. E-services frameworks such as web services offer an unprecedented opportunity for the flexible interoperation of financial systems. Naturally, the critical economic role and the complexity of financial information have led to the development of various standards. Yet standards alone are not a panacea: different groups of players use different standards, or different interpretations of the same standard. We believe that the solution lies in the convergence of flexible e-services, such as web services, with the semantically rich metadata promised by the Semantic Web; a mediation architecture can then be used for the documentation, identification, and resolution of semantic conflicts arising from the interoperation of heterogeneous financial services. In this paper we illustrate the nature of the problem in the Electronic Bill Presentment and Payment (EBPP) industry and the viability of the solution we propose. We describe and analyze the integration of services using four different formats: the IFX, OFX, and SWIFT standards, and an example proprietary format. To accomplish this integration we use the COntext INterchange (COIN) framework. The COIN architecture leverages a model of the sources' and receivers' contexts, in reference to a rich domain model or ontology, for the description and resolution of semantic heterogeneity.
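A typical semantic conflict that such a mediation layer must resolve is that two systems report the "same" amount under different contexts, for instance different currencies and scale factors. The following toy sketch illustrates context-based conversion in that spirit; the context fields, example contexts, and exchange rate are invented and are not part of the COIN framework's actual API.

    # Toy context mediation: re-express an amount from a source context in a
    # receiver context (scale factor and currency differ between the two).
    RATES_TO_USD = {"USD": 1.0, "EUR": 1.08}   # invented, fixed exchange rates

    def mediate_amount(value, source_ctx, receiver_ctx):
        """Convert `value` from the source context to the receiver context."""
        usd = value * source_ctx["scale"] * RATES_TO_USD[source_ctx["currency"]]
        converted = usd / RATES_TO_USD[receiver_ctx["currency"]]
        return converted / receiver_ctx["scale"]

    # A biller reports balances in thousands of euros; the payment service
    # expects plain U.S. dollars.
    biller_ctx = {"currency": "EUR", "scale": 1000}
    payment_ctx = {"currency": "USD", "scale": 1}
    print(mediate_amount(12.5, biller_ctx, payment_ctx))   # 12.5 kEUR -> 13500.0 USD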
Abstract:
In this paper a precorrected-FFT Fast Multipole Tree (pFFT-FMT) method for solving the potential flow around arbitrary three-dimensional bodies is presented. The method takes advantage of the efficiency of the pFFT and FMT algorithms to facilitate more demanding computations, such as automatic wake generation and hands-off steady and unsteady aerodynamic simulations. The velocity potential on the body surfaces and in the domain is determined using a pFFT Boundary Element Method (BEM) approach based on the Green's theorem boundary integral equation. The vorticity trailing all lifting surfaces in the domain is represented using a Fast Multipole Tree, time-advected, vortex particle method. Some simple steady-state flow solutions are presented to demonstrate the basic capabilities of the solver. Although this paper focuses primarily on steady-state solutions, it should be noted that the approach is designed to be a robust and efficient unsteady potential flow simulation tool, useful for rapid computational prototyping.
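For context, the Green's theorem boundary integral equation referred to here is, in its standard exterior-flow form, a representation of the velocity potential in terms of its boundary values and normal derivatives; a minimal statement is sketched below, with sign conventions and free-stream or wake contributions omitted, since these vary between formulations.

    \phi(\mathbf{x}) \;=\; \oint_{S} \left[ \phi(\mathbf{y})\,
        \frac{\partial G(\mathbf{x},\mathbf{y})}{\partial n_{\mathbf{y}}}
        \;-\; G(\mathbf{x},\mathbf{y})\,
        \frac{\partial \phi(\mathbf{y})}{\partial n_{\mathbf{y}}} \right]
        \mathrm{d}S_{\mathbf{y}},
    \qquad
    G(\mathbf{x},\mathbf{y}) \;=\; \frac{1}{4\pi\,\lvert \mathbf{x}-\mathbf{y} \rvert}

Taking the limit as the evaluation point approaches the body surface yields the boundary integral equation that a BEM of this kind discretizes with panels on the body.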
Abstract:
"compositions" is a new R package for the analysis of compositional and positive data. It contains four classes corresponding to the four different types of compositional and positive geometry (including the Aitchison geometry). It provides means for computation, plotting, and high-level multivariate statistical analysis in all four geometries. These geometries are treated in a fully analogous way, based on the principle of working in coordinates and on the object-oriented programming paradigm of R. In this way, the called functions automatically select the most appropriate type of analysis as a function of the geometry. The graphical capabilities include ternary diagrams and tetrahedra, various compositional plots (boxplots, barplots, pie charts), and extensive graphical tools for principal components. In addition, proportion lines, straight lines, and ellipses in all geometries can be added to plots. The package is accompanied by a hands-on introduction, documentation for every function, demos of the graphical capabilities, and plenty of usage examples. It allows direct and parallel computation in all four vector spaces and provides the beginner with a copy-and-paste style of data analysis, while letting advanced users keep the functionality and customizability they demand of R, as well as all the tools necessary to add their own analysis routines. A complete example is included in the appendix.
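The "working in coordinates" principle for the Aitchison geometry amounts to mapping a composition into real coordinates (for example via the centered log-ratio transform), doing ordinary multivariate statistics there, and mapping back. The package itself is written in R; the snippet below is only a language-neutral illustration of that transform, written in Python with NumPy and using made-up data.

    import numpy as np

    def closure(x):
        """Rescale positive parts so they sum to 1 (a composition)."""
        x = np.asarray(x, dtype=float)
        return x / x.sum()

    def clr(x):
        """Centered log-ratio transform: log of each part over the geometric mean."""
        x = np.asarray(x, dtype=float)
        return np.log(x) - np.log(x).mean()

    sample = closure([12.0, 5.0, 83.0])    # a three-part composition
    coords = clr(sample)                   # ordinary Euclidean coordinates
    print(coords, coords.sum())            # clr coordinates sum to zero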
Abstract:
Blogging has become one of the key ingredients of the so-called social networks, and this phenomenon has indeed invaded the world of education. Connections between people, comments on each other's posts, and the assessment of innovation are usually interesting characteristics of blogs kept by students and scholars. Blogs have become a new kind of authority, bringing about (divergent) discussions which lead to the creation of knowledge. The use of blogs as an innovative educational tool is not at all new; however, their use in universities is not yet very widespread. Blogging for personal affairs is rather commonplace, but blogging for professional affairs (teaching, research, and service) is scarce, despite the availability of ready-to-use, free tools. Unfortunately, the Information Society has not yet sufficiently reached some universities: not only are (student) blogs scarcely used as an educational tool, but it is quite rare to find a blog written by university professors. The Institute of Computational Chemistry of the University of Girona and the Department of Chemistry of the Universitat Autònoma de Barcelona have joined forces to create "InnoCiència", a new group on Digital Science Communication. This group, formed by about ten researchers, has promoted the use of blogs, Twitter, wikis, and other Web 2.0 tools in activities in Catalonia concerning the dissemination of science, such as Science Week, Open Day, and Researchers' Night. Likewise, its members promote the use of social networking tools in chemistry- and communication-related courses. This communication explains the outcome of social-network experiences in teaching undergraduate students and in organizing research communication events. We provide live, hands-on examples and interactive ground to show how blogs and Twitter can be used to enhance the yield of teaching and research. The impact of blogging and other social networking tools on the outcome of the learning process depends strongly on the target audience and the environmental conditions. A few examples are provided, and some proposals for using these techniques efficiently to help students are suggested.
Abstract:
The following work is a compilation of the methodologies proposed for implementing Strategic Planning through strategic direction in companies. As has become evident in recent years, it is increasingly important for companies to be able to draw up and execute their strategic plans in order to face the market and its changing conditions, all the more so given the accelerating advance of globalization. Implementing and putting Strategic Plans into operation gives companies better tools for confronting market uncertainties and provides a clear horizon shared by all employees: they know where the company is going and how it will get there, through its mission, vision, policies, and corporate objectives, making companies durable and reducing their high mortality rate. Strategic plans go hand in hand with the study and analysis of the environment; the environment changes constantly and therefore needs to be studied continuously, which is why I also present different methodologies for studying the competition.
Abstract:
OBJECTIVE: To quantify gram-positive bacteria, gram-negative bacteria, and fungi through microbiological monitoring of the air, surfaces, and hands of healthcare personnel in five health institutions of the department of Meta in 2007. METHODS: The study was carried out in five institutions, where a microbiological diagnosis of the hospital environment was performed by taking samples in each care area from: horizontal surfaces before and after applying each institution's cleaning and disinfection protocol, the hands of healthcare personnel before and after routine handwashing, and the air of critical and non-critical areas. RESULTS: The air of the critical zone of IPS 3 showed the highest bacterial contamination and that of IPS 4 the highest fungal contamination. The hospitalization, emergency, and diagnostic support areas showed the highest microbial concentrations. Statistically significant differences were found between the microbial load before and after handwashing (p<0.05), and before and after application of the surface cleaning and disinfection protocol. CONCLUSIONS: This study demonstrated that training, supervision, and monitoring of handwashing and of surface cleaning and disinfection processes can help ensure a reduction of the bacterial and fungal biomass present in health institutions.
Abstract:
Objective: To determine the six-month prevalence of neck and upper-limb symptoms and their associated factors among workers of a financial call center between April and October 2009. Methods: A descriptive cross-sectional analysis was carried out by applying a perceived-morbidity questionnaire covering demographic aspects, personal history, and occupational history. The presence of symptoms was recorded in a table cross-referencing musculoskeletal symptoms with the body segments affected over the previous six months. Subjects were also asked to identify, by means of a diagram, the posture most frequently adopted during their work. Results: The most prevalent symptoms were pain in the right wrist (0.44; 95% CI 0.37-0.51), neck pain (0.43; 95% CI 0.36-0.50), neck stiffness (0.33; 95% CI 0.26-0.40), and pain in the right hand (0.36; 95% CI 0.29-0.43). Statistically significant differences by gender were found for pain in the right wrist (26.1% men versus 73.9% women; p=0.005), pain in the right hand (25% men versus 75% women; p=0.008), neurological symptoms in the right hand (19.4% versus 80.6%; p=0.001), and pain in the right shoulder (26.3% men versus 73.7% women; p=0.048). A statistically significant difference was also found in the prevalence of right-wrist pain according to self-reported higher performance demands (85.2% among those perceiving higher demands versus 14.8% among those who did not; p=0.020), as well as a greater presence of wrist and hand symptoms among subjects working with the wrists in dorsiflexion (right wrist 72.8%, p=0.001; left wrist 43.5%, p=0.020; right hand 62%, p=0.003). Conclusion: The main symptom found was pain, located in the right wrist, the neck, the right hand, and the right shoulder, with larger differences for the female gender and according to wrist posture, which is consistent with the working conditions and the physiological response to those conditions.
Abstract:
This work questions, examines, and condemns the ancient, biblical conception with which legal conventions have been claimed to be refined under the stigma of objectivity, especially in the context of the non-contractual civil liability of the public administration, contrasted with the duty that arises once an organic judgment is present, namely the failure of the public service (falla del servicio), a premise drawn from conduct of the public authorities that runs counter to a primary obligation of due care, that is, a legal norm pre-established at the moment of the material action or administrative act. If such objectivity is assumed in the failure of the service, a dichotomy arises against it: on the one hand, the elasticity with which this theory seeks to amalgamate, under the same never-changing elements of legal imputation in the hands of the judge, a single form of liability of the public authorities; and on the other, the incidence of the theory in practice, a practice that keeps suggesting the opposite, namely irreconcilable disagreements in the application of these supposedly objective premises. The result of this schism is indeterminacy in the very notion of the failure of the service, a concept with no defined contours, which exhibits, as a symptom of its abstruse constitution, a spiritual and subjective application, and raises an unavoidable question: do consistent lines of case law exist in matters of non-contractual liability, and in particular regarding the failure of the service?
Abstract:
Objective: To report the experience gained with partial nephrectomy for the treatment of renal masses at the Fundación Santa Fe de Bogotá and the Fundación Cardioinfantil. Materials and Methods: The surgical procedure records of both institutions between January 2005 and March 2011 were reviewed. Patients who underwent partial nephrectomy were included and their clinical records were reviewed; patients operated on for non-tumoral pathology were excluded. Preoperative, intraoperative, and postoperative variables were recorded. A database was created in Excel and a descriptive analysis of the variables was performed using the Stata 10.0 statistical package. Results: A total of 63 partial nephrectomies were performed, of which the 59 carried out for suspected tumoral pathology (complex cysts or cancer) were analyzed. The mean age was 60.7 years. In patients with a suspected renal tumor, the main indication for the nephron-sparing procedure was the size of the mass (82.53%). Mean preoperative creatinine was 1.01 mg/dl. The most frequently used approach was lumbotomy (89.8%). Cold ischemia was used in 79.6% of cases. Mean blood loss was 354 cc. In 6.77% of patients it was necessary to widen the margin. The most frequent definitive diagnosis was clear cell carcinoma, in 72.8% of cases. Mean postoperative creatinine was 1.14 mg/dl. At a mean follow-up of 27 months, 98.3% of patients remain free of relapse. Conclusions: Open partial nephrectomy is an oncologically effective procedure with low morbidity and mortality that, in experienced hands, preserves renal function. It is considered the gold standard for the treatment of T1a renal masses and bilateral tumors, and for patients with renal failure or with diseases that could potentially affect renal function.