267 results for Speeding


Relevance:

10.00%

Publisher:

Abstract:

In many cases, it is not possible to call motorists to account for substantially exceeding the speed limit, because they deny being the driver on the speed-check photograph. An anthropological comparison of facial features using a photo-to-photo comparison can be very difficult, depending on the quality of the photographs. One difficulty of that analysis method is that the comparison photographs of the presumed driver are taken with a different camera or lens, and from a different angle, than the speed-check photo. Taking a comparison photograph with exactly the same camera setup is almost impossible, so only an imprecise comparison of individual facial features is possible. The geometry and position of each facial feature, for example the distance between the eyes or the position of the ears, cannot be taken into consideration. We applied a new method using 3D laser scanning, optical surface digitization, and photogrammetric calculation of the speed-check photo, which enables a geometric comparison. The influence of the focal length and the distortion of the lens are thus eliminated, and the precise position and viewing direction of the speed-check camera are calculated. Even with low-quality images, or when the face of the driver is partly hidden, this method delivers good results. This new method, geometric comparison, is evaluated and validated in a study described in this article.
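The core step of such a geometric comparison is re-projecting 3D facial landmarks from the surface scan of the presumed driver into the image plane of the reconstructed speed-check camera. A minimal sketch of that step, assuming a plain pinhole model with hypothetical intrinsics, pose, and landmark values (the method described above additionally models lens distortion):

```python
import numpy as np

# Hypothetical intrinsics of the speed-check camera (focal length and principal
# point in pixels), standing in for the photogrammetrically recovered values
K = np.array([[1200.0,    0.0, 640.0],
              [   0.0, 1200.0, 480.0],
              [   0.0,    0.0,   1.0]])

def project(points_3d, R, t):
    """Pinhole projection of Nx3 scanned landmarks into pixel coordinates."""
    cam = R @ points_3d.T + t.reshape(3, 1)   # world frame -> camera frame
    uvw = K @ cam                             # camera frame -> image plane
    return (uvw[:2] / uvw[2]).T               # perspective divide -> Nx2 pixels

# Geometric comparison: how far do the re-projected scan landmarks fall from
# the landmarks measured directly in the speed-check photo?
R, t = np.eye(3), np.array([0.0, 0.0, 3.0])                 # hypothetical pose
scan = np.array([[0.03, 0.02, 0.0], [-0.03, 0.02, 0.0]])    # e.g. eye corners (m)
photo = np.array([[651.0, 489.0], [629.0, 487.0]])          # measured pixels
err = np.sqrt(np.mean(np.sum((project(scan, R, t) - photo) ** 2, axis=1)))
print(f"RMS landmark error: {err:.1f} px")
```

A small residual supports the hypothesis that the scanned person is the driver; since the camera geometry is reconstructed rather than re-created, no second photo session with the original camera is needed.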

Relevance:

10.00%

Publisher:

Abstract:

We introduce a class of distance-dependent interactions in an accelerated exclusion process inspired by the observation of transcribing RNA polymerase speeding up when “pushed” by a trailing one. On a ring, the accelerated exclusion process steady state displays a discontinuous transition, from being homogeneous (with augmented currents) to phase segregated. In the latter state, the holes appear loosely bound and move together, much like a train. Surprisingly, the current-density relation is simply J=1-ρ, signifying that the “hole train” travels with unit velocity.
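The claim J = 1 − ρ can be contrasted with the familiar TASEP relation J = ρ(1 − ρ) in a toy simulation. The sketch below uses a crude "push" rule, not the paper's distance-dependent interaction, and is only a frame for measuring the current at a fixed density:

```python
import random

# Toy exclusion process on a ring: a hop attempt succeeds at a higher rate
# when another particle sits right behind the hopper ("pushed"). This is NOT
# the paper's distance-dependent rule, just a minimal measurement harness.
L_SITES, RHO, STEPS = 200, 0.7, 400_000
rng = random.Random(7)
ring = [1] * int(RHO * L_SITES) + [0] * (L_SITES - int(RHO * L_SITES))
rng.shuffle(ring)

hops = 0
for _ in range(STEPS):
    i = rng.randrange(L_SITES)
    if ring[i] == 1 and ring[(i + 1) % L_SITES] == 0:
        rate = 2.0 if ring[i - 1] == 1 else 1.0   # "pushed" hops are faster
        if rng.random() < rate / 2.0:             # normalise max rate to 1
            ring[i], ring[(i + 1) % L_SITES] = 0, 1
            hops += 1

J = hops / STEPS   # hops per unit time per site (random sequential updates)
print(f"measured J = {J:.3f};  TASEP: {RHO*(1-RHO):.3f};  hole train: {1-RHO:.3f}")
```

In the phase-segregated state described above, the measured current would sit at the hole-train value 1 − ρ (0.3 at ρ = 0.7) rather than at the bare TASEP value 0.21.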

Relevance:

10.00%

Publisher:

Abstract:

A 19-year-old man speeding recklessly along a highway caused a left-frontal crash with another car. After his vehicle came to a standstill, he climbed out of the wreck and crawled across the tarmac to the other side of the road, where he died several minutes after the accident and before the arrival of an ambulance. Postmortem multislice computed tomography (MSCT) demonstrated fractures of the first, second, and third ribs and scapula on the left, an extrapleural hemorrhage in the apical region of the left thorax, as well as a large amount of blood in the left thoracic cavity. These radiologic findings were indicative of a delayed rupture of a traumatic extrapleural hematoma into the pleural space. A traditional autopsy confirmed the very rare diagnosis of a traumatic extrapleural hemorrhage with a delayed rupture.

Relevance:

10.00%

Publisher:

Abstract:

Given the insatiable curiosity of human beings to explore the universe and our solar system, it is essential to benefit from larger propulsion capabilities to execute efficient transfers and carry more scientific equipment. In the field of space trajectory optimization, fundamental advances in using low-thrust propulsion and exploiting multi-body dynamics have played a pivotal role in designing efficient space mission trajectories. The former provides a larger cumulative momentum change than conventional chemical propulsion, whereas the latter yields almost ballistic trajectories requiring a negligible amount of propellant. However, space trajectory design translates into an optimal control problem which is, in general, time-consuming and very difficult to solve. The goal of this thesis is therefore to address this problem by developing a methodology that simplifies and facilitates the process of finding initial low-thrust trajectories in both two-body and multi-body environments. This initial solution not only provides mission designers with a better understanding of the problem and solution, but also serves as a good initial guess for high-fidelity optimal control solvers and increases their convergence rate. Almost all high-fidelity solvers benefit from an initial guess that already satisfies the equations of motion and some of the most important constraints. Despite the nonlinear nature of the problem, we seek a robust technique, with reduced computational cost, for a wide range of typical low-thrust transfers. Another important aspect of the developed methodology is the representation of low-thrust trajectories by Fourier series, which significantly reduces the number of design variables. Emphasis is placed on simplifying the equations of motion as far as possible and on avoiding approximation of the controls; both choices contribute to speeding up the solution procedure. Several example applications of two- and three-dimensional two-body low-thrust transfers are considered. In addition, several Earth-to-Moon low-thrust transfers are investigated in multi-body dynamics, in particular the restricted three-body problem.
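The Fourier-series idea can be made concrete with a small sketch: a trajectory coordinate is written as a truncated Fourier series, so its whole time history is captured by a handful of coefficients, which become the design variables. The function and coefficient values below are hypothetical, not the thesis's actual parameterization:

```python
import numpy as np

def fourier_series(t, c0, a, b, T):
    """Truncated series: c0 + sum_k [a_k cos(2 pi k t / T) + b_k sin(2 pi k t / T)]."""
    k = np.arange(1, len(a) + 1)
    w = 2.0 * np.pi * np.outer(k, t) / T
    return c0 + a @ np.cos(w) + b @ np.sin(w)

# Hypothetical shape for a planar spiral transfer: radius grows from ~1.0 to
# ~1.52 (e.g. Earth to Mars distance, in AU) plus a low-order Fourier ripple.
t = np.linspace(0.0, 1.0, 500)                    # normalised time of flight
r = 1.0 + 0.52 * t + fourier_series(t, 0.0,
                                    a=np.array([0.03, -0.01]),
                                    b=np.array([0.02, 0.005]), T=1.0)
# Thrust is then recovered by inverse dynamics: differentiate r(t) analytically
# (term by term) and substitute into the equations of motion, rather than
# integrating the controls forward, which keeps the search space small.
```

With only four harmonic coefficients plus boundary terms as unknowns, a generic optimizer can search this space far faster than a full optimal control transcription, which is what makes the result useful as an initial guess.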

Relevance:

10.00%

Publisher:

Abstract:

In this thesis, I study skin lesion detection and its applications to skin cancer diagnosis. A skin lesion detection algorithm based on color information and thresholding is proposed. Several color spaces are studied for the proposed algorithm and the detection results are compared; experimental results show that the YUV color space achieves the best performance. I also develop a distance-histogram-based threshold selection method, which is shown to outperform other adaptive threshold selection methods for color detection. Beyond the detection algorithms, I investigate GPU speed-up techniques for skin lesion extraction, and the results show that GPUs have potential for speeding up skin lesion extraction. Based on the proposed skin lesion detection algorithms, I developed a mobile skin cancer diagnosis application: a user with an iPhone running the application can use the phone as a diagnostic tool to find potential skin lesions on a person's skin and compare the detected lesions with those stored in a database on a remote server.
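A minimal sketch of the color-space-plus-threshold pipeline described above: convert RGB to YUV and flag pixels whose chrominance deviates from the skin background. The fixed threshold here is a toy stand-in for the thesis's distance-histogram selection method, and the conversion matrix assumes BT.601 primaries:

```python
import numpy as np

def rgb_to_yuv(img):
    """Convert an HxWx3 float RGB image (values in 0..1) to YUV (BT.601)."""
    m = np.array([[ 0.299,  0.587,  0.114],
                  [-0.147, -0.289,  0.436],
                  [ 0.615, -0.515, -0.100]])
    return img @ m.T

def lesion_mask(img, u_thresh=0.02):
    """Toy detector: mark pixels whose U chrominance deviates strongly from
    the image median (a stand-in for the distance-histogram threshold)."""
    u = rgb_to_yuv(img)[..., 1]
    return np.abs(u - np.median(u)) > u_thresh

# Usage: mask = lesion_mask(np.asarray(pil_image, dtype=float) / 255.0)
```

Because the same conversion and comparison run independently per pixel, the computation maps naturally onto a GPU, which is the speed-up avenue the thesis investigates.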

Relevance:

10.00%

Publisher:

Abstract:

Time is one of the scarcest resources in modern parliaments. In parliamentary systems of government, the control of time in the chamber is a significant power resource enjoyed – to varying degrees – by parliamentary majorities and the governments they support. Minorities may not be able to muster enough votes to stop bills, but they may have – varying degrees of – delaying powers enabling them to extract concessions from majorities attempting to get on with their overall legislative programme. This paper provides a comparative analysis of the dynamics of the legislative process in 17 West European parliaments, from the formal initiation of bills to their promulgation. The 'biographies' of a sample of bills are examined using techniques of event-history analysis, (a) charting the dynamics of the legislative process both across the lifetimes of individual bills and across different political systems, and (b) examining whether, and to what extent, parliamentary rules and some general regime attributes influence the dynamics of this process, speeding up or delaying the passage of legislation. Using a veto-points framework and transaction cost politics as a theoretical framework, the quantitative analyses suggest a number of counter-intuitive findings (e.g., the efficiency of powerful committees) and cast doubt on some of the claims made by Tsebelis in his veto-player model.
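Event-history (survival) analysis of bill 'biographies' of this kind is commonly run as a Cox proportional hazards regression, where the "event" is promulgation and covariates capture rules and regime attributes. A minimal sketch with entirely hypothetical data and covariate names, using the lifelines library:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical bill biographies: days from formal initiation to promulgation,
# an event flag (bills still pending at the end of observation are censored),
# and two illustrative covariates. Real data would span the 17 parliaments.
bills = pd.DataFrame({
    "duration_days":     [120, 340, 90, 500, 210, 60, 410, 150],
    "promulgated":       [1,   1,   0,  1,   1,   1,  0,   1],
    "strong_committees": [1,   0,   1,  0,   1,   1,  0,   0],
    "government_bill":   [1,   1,   0,  1,   0,   1,  0,   1],
})

cph = CoxPHFitter()
cph.fit(bills, duration_col="duration_days", event_col="promulgated")
# Hazard ratios > 1 mean the covariate speeds passage; < 1 means it delays it
cph.print_summary()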

Relevance:

10.00%

Publisher:

Abstract:

Tobacco use is a major health hazard, and the onset of tobacco use occurs almost entirely in the teenage years. For this reason, schools are an ideal site for tobacco prevention programs. Although studies have shown that effective school-based tobacco prevention programs exist, all too frequently these programs are not used. For effective programs to achieve their potential impact, strategies for speeding the diffusion of these programs to school districts, and for ensuring that adopted programs are implemented as intended, must be developed and tested.

This study (SC2) set out to replicate the findings of an earlier quasi-experimental study (the Smart Choices Diffusion Study, or SC1) in which strategies based on diffusion theory and social learning theory were found to be effective in encouraging adoption and implementation of an effective tobacco prevention program in schools. To increase awareness and encourage adoption, intervention strategies in both studies utilized opinion leaders, messages highlighting positive aspects of the program, and modeling of benefits and effective use through videotape and newsletters. To encourage accurate implementation of the curriculum, teacher training for the two studies utilized videotaped modeling and practice of activities by teachers. SC2 subjects were 38 school districts that make up one of Texas' 20 education service regions. These districts had served as the comparison group in SC1, and findings for the SC1 comparison and intervention groups were utilized as historic controls.

SC2 achieved a 76.3% adoption rate and found that an average of 84% of the curriculum was taught, with 82% fidelity to the methods utilized by the curriculum. These rates, and the rates of implementation of the dissemination strategies, were equal to or greater than the corresponding rates for SC1. The proportion of teachers implementing the curriculum in SC2 was equal to that in SC1's video-trained districts but lower than in the SC1 workshop-trained group.

SC2's findings corroborate the findings of the earlier study and increase our confidence in them. Taken together, the findings from SC2 and SC1 point to the effectiveness of their theory-based intervention strategies in encouraging adoption and accurate implementation of the tobacco prevention curriculum.

Relevance:

10.00%

Publisher:

Abstract:

In spite of the increasing presence of Semantic Web facilities, only a limited number of the resources available on the Internet provide semantic access. Recent initiatives such as the emerging Linked Data Web provide semantic access to available data by porting existing resources to the Semantic Web using different technologies, such as database-to-semantic mapping and scraping. Nevertheless, existing scraping solutions are ad hoc, complemented with graphical interfaces to speed up scraper development. This article proposes a generic framework for web scraping based on semantic technologies. The framework is structured in three levels: scraping services, the semantic scraping model, and syntactic scraping. The first level provides an interface through which generic applications or intelligent agents can gather information from the web at a high level. The second level defines a semantic RDF model of the scraping process, providing a declarative approach to the scraping task. Finally, the third level provides an implementation of the RDF scraping model for specific technologies. The work has been validated in a scenario that illustrates its application to mashup technologies.
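The declarative flavor of the middle level can be illustrated with a small sketch: a scraper is described as RDF triples (what to select, what it maps to, where it runs) rather than as imperative extraction code. The vocabulary and URIs below are hypothetical placeholders, not the article's actual model, built with the rdflib library:

```python
from rdflib import Graph, Namespace, Literal, URIRef

# Hypothetical vocabulary standing in for the article's semantic scraping model
SC = Namespace("http://example.org/scraping#")

g = Graph()
extractor = URIRef("http://example.org/scrapers/news-headlines")
g.add((extractor, SC.source,   URIRef("http://example.org/news")))
g.add((extractor, SC.selector, Literal("div.headline > a")))   # syntactic level
g.add((extractor, SC.mapsTo,   SC.Headline))                   # semantic level

# The syntactic level would read this graph and execute the actual extraction
print(g.serialize(format="turtle"))
```

Keeping the scraping task as data means agents at the services level can discover, compose, and reuse extractors without touching extraction code.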

Relevance:

10.00%

Publisher:

Abstract:

This paper introduces a novel technique for identifying logically related sections of the heap, such as recursive data structures, objects that are part of the same multi-component structure, and related groups of objects stored in the same collection/array. When combined with the lifetime properties of these structures, this information can be used to drive a range of program optimizations including pool allocation, object co-location, static deallocation, and region-based garbage collection. The technique outlined in this paper also improves the efficiency of the static analysis by providing a normal form for the abstract models (speeding the convergence of the static analysis). We focus on two techniques for grouping parts of the heap. The first precisely identifies recursive data structures in object-oriented programs based on the types declared in the program. The second is a novel method for grouping objects that make up the same composite structure, which also allows us to partition the objects stored in a collection/array into groups based on a similarity relation. The similarity relation has a parametric component in order to support specific analysis applications (such as a numeric analysis, which would need to partition the objects based on numeric properties of their fields). Using the Barnes-Hut benchmark from the JOlden suite, we show how these grouping methods can be used to identify various types of logical structures, allowing the application of many region-based program optimizations.
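The first grouping technique, identifying recursive structures from declared types, amounts to finding cycles in the type reference graph. A minimal sketch of that idea, with a hypothetical field map rather than the paper's actual analysis representation:

```python
# Hypothetical declared field types: class name -> classes its fields reference
fields = {
    "TreeNode": ["TreeNode", "Payload"],   # self-referential -> recursive
    "Payload":  [],
    "ListCell": ["ListCell"],              # linked-list cell  -> recursive
}

def recursive_types(fields):
    """A type participates in a recursive structure if it can reach itself
    through declared field references, i.e. it lies on a type-graph cycle."""
    def reaches(src, dst, seen):
        for ref in fields.get(src, []):
            if ref == dst or (ref not in seen and reaches(ref, dst, seen | {ref})):
                return True
        return False
    return {t for t in fields if reaches(t, t, {t})}

print(recursive_types(fields))   # {'TreeNode', 'ListCell'}
```

Objects of types found this way can then be summarized as one logical region, which is what gives the abstract heap model its normal form and speeds convergence.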

Relevance:

10.00%

Publisher:

Abstract:

The adaptation to the European Higher Education Area is an opportunity to incorporate online assessment methods into the teaching of Structural Analysis. This paper promotes the intensive use of the MOODLE platform of the Technical University of Madrid to implement an Information and Communication Technology program in this area. The article summarizes the educational experience gained over recent courses through the use of a new software tool for continuous self-testing by students. Individualized tests are generated from randomized variable data for each student. Several educational benefits have emerged: first, a considerable speeding up of the evaluation process, and, more importantly, massive participation by the students. The tool can thus be very useful for the learning process and therefore for improving students' skills and abilities.
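The individualization scheme described above, randomized variable data per student with an automatically computed reference answer, can be sketched as follows. The function, parameter ranges, and the beam example are hypothetical illustrations, not the paper's actual question bank:

```python
import random

def student_variant(student_id, exam_seed=2024):
    """Deterministically derive per-student numeric data for one test item,
    so regrading reproduces exactly the same variant for the same student."""
    rng = random.Random(f"{exam_seed}-{student_id}")   # reproducible per student
    span = round(rng.uniform(4.0, 8.0), 1)             # beam span L in metres
    load = rng.randrange(10, 31)                       # uniform load q in kN/m
    # Reference answer from the same data: max moment of a simply supported
    # beam under uniform load, M_max = q * L**2 / 8
    return {"span_m": span, "load_kN_m": load, "M_max_kNm": load * span**2 / 8}

print(student_variant("a.garcia"))
```

Because the data and the expected answer come from the same seeded generator, grading is automatic, which is where the speed-up of the evaluation process comes from.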

Relevance:

10.00%

Publisher:

Abstract:

Adaptive Rejection Metropolis Sampling (ARMS) is a well-known MCMC scheme for generating samples from one-dimensional target distributions. ARMS is widely used within Gibbs sampling, where automatic and fast samplers are often needed to draw from univariate full-conditional densities. In this work, we propose an alternative adaptive algorithm (IA2RMS) that overcomes the main drawback of ARMS (an incomplete adaptation of the proposal in some cases), speeding up the convergence of the chain to the target. Numerical results show that IA2RMS outperforms standard ARMS, providing a correlation among samples close to zero.
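The adaptation idea, refining a cheap proposal wherever it disagrees with the target, can be sketched with a deliberately simplified sampler: a piecewise-constant independence proposal whose support set grows when proposals are rejected. This is a toy illustration under those assumptions, not the actual ARMS/IA2RMS construction (which uses piecewise proposals built from the target's log-density and a more careful adaptation test):

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    """Toy bimodal target (unnormalised): mixture of two unit Gaussians."""
    return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

def build_proposal(support, lo=-10.0, hi=10.0):
    """Piecewise-constant proposal matching the target at interval midpoints."""
    s = np.unique(np.concatenate(([lo], support, [hi])))
    heights = np.exp(log_target(0.5 * (s[:-1] + s[1:])))
    weights = heights * np.diff(s)
    return s, heights, weights / weights.sum()

def q_at(x, s, heights):
    return heights[np.clip(np.searchsorted(s, x) - 1, 0, len(heights) - 1)]

support, x, chain = [-6.0, -1.0, 0.0, 1.0, 6.0], 0.0, []
for _ in range(20_000):
    s, h, probs = build_proposal(np.array(support))
    i = rng.choice(len(probs), p=probs)        # pick an interval...
    xp = rng.uniform(s[i], s[i + 1])           # ...and a point inside it
    # Metropolis-Hastings correction for the independence proposal
    log_a = log_target(xp) - log_target(x) + np.log(q_at(x, s, h)) - np.log(h[i])
    if np.log(rng.uniform()) < log_a:
        x = xp
    elif len(support) < 50:
        support.append(xp)   # a rejection flags a proposal/target mismatch
    chain.append(x)

print(np.mean(chain), np.std(chain))   # roughly 0 and ~2.2 for this target
```

As the proposal approaches the target, the acceptance rate approaches one and successive samples become nearly independent, which is the near-zero correlation the abstract reports for IA2RMS.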

Relevance:

10.00%

Publisher:

Abstract:

The system for the management, control, and monitoring of institutional meetings is a software application for keeping the documents produced by institutional meetings and storing them electronically, speeding up document searches and the organization of meetings. The application can schedule meetings by selecting the date and place where each meeting will take place. These actions are carried out by people registered in the software: the administrator assigns permissions to each user, so users can schedule their own meetings, avoiding conflicts and proceeding in a timely manner. Each meeting follows a process that records, among other things, its type, status, and agreements.
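The conflict avoidance mentioned above reduces to checking that a new booking does not overlap an existing one in the same place. A minimal sketch of that check, with a hypothetical data model rather than the system's actual schema:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Meeting:
    room: str
    start: datetime
    end: datetime

def conflicts(new, scheduled):
    """A new meeting clashes with an existing one when it books the same room
    and the two time intervals overlap."""
    return [m for m in scheduled
            if m.room == new.room and new.start < m.end and m.start < new.end]

booked = [Meeting("Sala 1", datetime(2024, 3, 5, 9), datetime(2024, 3, 5, 11))]
req = Meeting("Sala 1", datetime(2024, 3, 5, 10), datetime(2024, 3, 5, 12))
print(conflicts(req, booked))   # non-empty list -> scheduling is refused
```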

Relevance:

10.00%

Publisher:

Abstract:

One of the main problems relief teams face after a natural or man-made disaster is how to plan rural road repair work to take maximum advantage of limited financial and human resources. Previous research focused on speeding up repair work or on selecting the locations of health centers to minimize transport times for injured citizens. In spite of the good results, this research does not take into account another key factor: survivor accessibility to resources. In this paper we account for the accessibility issue; that is, we maximize the number of survivors that reach the nearest regional center (cities where economic and social activity is concentrated) in a minimum time by planning which rural roads should be repaired given the available financial and human resources. This is a combinatorial problem, since the number of connections between cities and regional centers grows exponentially with the problem size and exact methods are impractical for finding an optimal solution. In order to solve the problem, we propose an adaptation of the Ant Colony System, which is based on ants' foraging behavior. Ants stochastically build minimal paths to regional centers and decide whether damaged roads are repaired on the basis of pheromone levels, accessibility heuristic information, and the available budget. The proposed algorithm is illustrated by means of an example concerning the 2010 Haiti earthquake, and its performance is compared with another metaheuristic, GRASP.
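The Ant Colony System mechanics, stochastic edge choice weighted by pheromone and heuristic information, with evaporation and reinforcement of the best path, can be sketched on a tiny hypothetical road network. Budget and repair decisions from the paper's full model are omitted; this only shows the path-building core:

```python
import random

# Hypothetical rural road network: node -> [(neighbour, travel_time_hours)]
roads = {
    "village": [("A", 4.0), ("B", 2.0)],
    "A": [("center", 3.0)],
    "B": [("A", 1.0), ("center", 7.0)],
}

ALPHA, BETA, RHO = 1.0, 2.0, 0.1   # pheromone weight, heuristic weight, evaporation
pheromone = {(u, v): 1.0 for u, nbrs in roads.items() for v, _ in nbrs}

def walk(rng):
    """One ant builds a village-to-centre path, edge by edge."""
    node, path, total = "village", [], 0.0
    while node != "center":
        nbrs = roads[node]
        # Desirability: pheromone^alpha * (1 / travel_time)^beta
        scores = [pheromone[(node, v)] ** ALPHA * (1.0 / t) ** BETA for v, t in nbrs]
        v, t = rng.choices(nbrs, weights=scores, k=1)[0]
        path.append((node, v)); total += t; node = v
    return path, total

rng = random.Random(1)
best_path, best_time = None, float("inf")
for _ in range(200):
    path, total = walk(rng)
    if total < best_time:
        best_path, best_time = path, total
    # Global update: evaporate everywhere, deposit on the best-so-far path
    for e in pheromone:
        pheromone[e] *= (1.0 - RHO)
    for e in best_path:
        pheromone[e] += RHO * (1.0 / best_time)

print(best_path, best_time)   # expect village -> B -> A -> center (6.0 h)
```

In the paper's setting, the ants would additionally decide whether each damaged edge is worth repairing against the remaining budget, guided by the accessibility heuristic.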

Relevance:

10.00%

Publisher:

Abstract:

New cloud-based technologies, the Internet of Things, and "as a service" trends are based on storing and processing data on remote servers. To guarantee the security of that data, both in transit to the remote server and while it is handled there, different cryptographic schemes are used. Traditionally, these cryptographic schemes focus on encrypting the data whenever it does not need to be processed (that is, during communication and storage). However, once the encrypted data has to be processed (on the remote server), it must first be decrypted, at which point an intruder on that server could access users' sensitive data. Moreover, this traditional approach requires the server itself to be able to decrypt the data, so its integrity must be trusted not to compromise them.

Fully homomorphic encryption (FHE) schemes arise as a possible solution to these problems. A fully homomorphic scheme does not require decrypting the data in order to operate on it; instead, it performs the operations over the ciphertext ring, maintaining a homomorphism between the ciphertext and the plaintext. As a result, an intruder in the system could steal nothing but ciphertexts, making it impossible to obtain the sensitive data without also stealing the encryption keys.

However, homomorphic encryption (HE) schemes are currently drastically slower than classical encryption schemes: a single operation in the plaintext ring can entail numerous operations in the ciphertext ring. For this reason, different approaches are emerging for accelerating these schemes to make them practical. One of these approaches is the use of High-Performance Computing (HPC) with FPGAs (Field Programmable Gate Arrays). An FPGA is a semiconductor device containing logic blocks whose interconnection and functionality can be reconfigured after manufacturing, hence "field-programmable". Compiling for an FPGA generates a hardware circuit specific to the given algorithm, instead of issuing a set of machine instructions to a universal machine, which is a great advantage over CPUs. FPGAs thus have clear differences with respect to CPUs:
- Pipelined architecture, which allows successive outputs to be obtained in constant time.
- The possibility of having multiple pipes for concurrent/parallel computation.

In this project:
- We present different implementations of FHE schemes on FPGA-based systems.
- We analyse and study the advantages and drawbacks of these cryptographic schemes on FPGA-based systems.
- We compare the implementations with related work.
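The kind of computation FHE enables can be illustrated with a much simpler, additively homomorphic scheme: in Paillier encryption, multiplying two ciphertexts yields an encryption of the sum of the plaintexts, so a server can add encrypted values it cannot read. FHE generalizes this to arbitrary circuits. A toy sketch with deliberately tiny primes (real deployments use primes of 1024 bits or more, and random blinding factors):

```python
from math import gcd

# Toy Paillier keypair with small fixed primes, for illustration only
p, q = 293, 433
n = p * q
n2 = n * n
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)
g = n + 1
mu = pow(lam, -1, n)   # with g = n+1, L(g^lam mod n^2) = lam mod n

def encrypt(m, r):
    """c = g^m * r^n mod n^2, with r coprime to n (fixed here; random in practice)."""
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    """m = L(c^lam mod n^2) * mu mod n, where L(u) = (u - 1) / n."""
    u = pow(c, lam, n2)
    return ((u - 1) // n * mu) % n

c1, c2 = encrypt(20, 17), encrypt(22, 31)
c_sum = (c1 * c2) % n2        # multiplying ciphertexts adds the plaintexts
assert decrypt(c_sum) == 42   # the server never saw 20 or 22 in the clear
```

The cost asymmetry is already visible here: one plaintext addition becomes a modular multiplication of numbers modulo n squared, and in FHE the blow-up is far larger, which is exactly the overhead the FPGA implementations in this project target.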