80 results for Telephone, Automatic
Abstract:
Color segmentation of images usually requires manual selection and classification of samples to train the system. This paper presents an automatic system that performs these tasks without the need for lengthy training, providing a useful tool to detect and identify figures. In real situations, the training process must be repeated whenever lighting conditions change, or when the colors of the figures and the background change within the same scenario, so a fast training method is valuable. A direct application of this method is the detection and identification of football players.
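The abstract does not detail the training procedure; purely as an illustration of label-free colour training, a clustering-based sketch in Python (OpenCV and scikit-learn assumed, cluster count chosen arbitrarily) might look like this:

```python
# Illustrative sketch: unsupervised colour segmentation without manual
# sample labelling, in the spirit of the abstract (not the authors' code).
import cv2
import numpy as np
from sklearn.cluster import KMeans

def segment_colours(image_bgr, n_clusters=4):
    """Cluster pixels in HSV space and return a label map plus cluster centres."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    pixels = hsv.reshape(-1, 3).astype(np.float32)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
    labels = km.fit_predict(pixels)
    return labels.reshape(hsv.shape[:2]), km.cluster_centers_

# Usage: re-running this on a new frame retrains in seconds, which is the
# kind of fast re-training the abstract argues for when lighting changes.
# label_map, centres = segment_colours(cv2.imread("frame.png"))
```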
Abstract:
Realising high-performance image and signal processing applications on modern FPGAs presents a challenging implementation problem due to the large data frames streaming through these systems. Specifically, to meet the high bandwidth and data storage demands of these applications, complex hierarchical memory architectures must be manually specified at the Register Transfer Level (RTL). Automated approaches which convert high-level operation descriptions, for instance in the form of C programs, to an FPGA architecture are unable to automatically realise such architectures. This paper presents a solution to this problem: a compiler that automatically derives such memory architectures from a C program. By transforming the input C program to a unique dataflow modelling dialect, known as Valved Dataflow (VDF), a mapping and synthesis approach developed for this dialect can be exploited to automatically create high-performance image and video processing architectures. Memory-intensive C kernels for Motion Estimation (CIF frames at 30 fps), Matrix Multiplication (128x128 at 500 iterations/sec) and Sobel Edge Detection (720p at 30 fps), which are unrealisable by current state-of-the-art C-based synthesis tools, are automatically derived from a C description of each algorithm.
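VDF itself is not specified in the abstract; as a loose software model of the line-buffer memory structure a streaming Sobel kernel needs on an FPGA, the following Python sketch (entirely hypothetical, not the compiler's output) may help fix ideas:

```python
# Hypothetical software model of the line-buffered streaming computation
# whose on-chip memory structure such a compiler would derive automatically.
from collections import deque

def sobel_stream(pixels, width):
    """Consume a row-major pixel stream; yield |Gx| for each interior pixel.

    The three-row window models the on-chip line buffers that RTL designers
    would otherwise hand-code for this kind of kernel.
    """
    rows = deque(maxlen=3)   # last three rows: two line buffers + current row
    current = []
    for p in pixels:
        current.append(p)
        if len(current) == width:
            rows.append(current)
            current = []
            if len(rows) == 3:
                r0, r1, r2 = rows
                for x in range(1, width - 1):
                    # Horizontal Sobel kernel [[-1,0,1],[-2,0,2],[-1,0,1]]
                    gx = (r0[x + 1] + 2 * r1[x + 1] + r2[x + 1]
                          - r0[x - 1] - 2 * r1[x - 1] - r2[x - 1])
                    yield abs(gx)

# e.g. list(sobel_stream(pixel_iterator, width=720)) emits results row by row
```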
Abstract:
The efficient development of multi-threaded software has, for many years, been an unsolved problem in computer science. Finding a solution to this problem has become urgent with the advent of multi-core processors. Furthermore, the problem has become more complicated because multi-cores are everywhere (desktops, laptops, embedded systems). As such, they execute generic programs which exhibit very different characteristics from the scientific applications that have been the focus of parallel computing in the past.
Implicitly parallel programming is an approach to parallel programming that promises high productivity and efficiency and rules out synchronization errors and race conditions by design. There are two main ingredients to implicitly parallel programming: (i) a conventional sequential programming language that is extended with annotations that describe the semantics of the program and (ii) an automatic parallelizing compiler that uses the annotations to increase the degree of parallelization.
It is extremely important that the annotations and the automatic parallelizing compiler are designed with the target application domain in mind. In this paper, we discuss the Paralax approach to implicitly parallel programming and we review how the annotations and the compiler design help to successfully parallelize generic programs. We evaluate Paralax on SPECint benchmarks, which are a model for such programs, and demonstrate scalable speedups, up to a factor of 6 on 8 cores.
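Paralax annotates C programs and relies on a parallelizing compiler; as a loose analogy only, the division of labour between programmer annotations and an automatic tool can be sketched in Python (the decorator and names below are invented for illustration, not the Paralax API):

```python
# Loose Python analogy to annotation-driven parallelisation: the programmer
# supplies semantics the tool cannot infer, and the tool exploits them.
from concurrent.futures import ProcessPoolExecutor

def pure(fn):
    """Annotation asserting fn has no side effects, so its calls may be
    reordered or run in parallel without races."""
    fn.is_pure = True
    return fn

@pure
def score(item):
    return sum(ord(c) for c in item)   # stand-in for real work

def parallel_map(fn, data):
    # A "compiler" stand-in: parallelise only functions annotated as pure.
    if getattr(fn, "is_pure", False):
        with ProcessPoolExecutor() as pool:
            return list(pool.map(fn, data))
    return [fn(x) for x in data]       # conservative sequential fallback

if __name__ == "__main__":
    print(parallel_map(score, ["alpha", "beta", "gamma"]))
```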
Abstract:
In this paper, a novel approach to automatically sub-divide a complex geometry and apply an efficient mesh is presented. Following the identification and removal of thin-sheet regions from an arbitrary solid using the thick/thin decomposition approach developed by Robinson et al. [1], the technique here employs shape metrics generated using local sizing measures to identify long-slender regions within the thick body. A series of algorithms automatically partition the thick region into a non-manifold assembly of long-slender and complex sub-regions. A structured anisotropic mesh is applied to the thin-sheet and long-slender bodies, and the remaining complex bodies are filled with unstructured isotropic tetrahedra. The resulting semi-structured mesh possesses significantly fewer degrees of freedom than the equivalent unstructured mesh, demonstrating the effectiveness of the approach. The accuracy of the efficient meshes generated for a complex geometry is verified via a study that compares the results of a modal analysis with the results of an equivalent analysis on a dense tetrahedral mesh.
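The paper's shape metrics are built from local sizing measures; as a simpler stand-in, a principal-extent aspect ratio can flag candidate long-slender regions (NumPy assumed; the threshold is illustrative, not from the paper):

```python
# Simple illustration of a long-slender shape metric; an assumption-laden
# proxy for the local-sizing-based metrics the abstract describes.
import numpy as np

def slenderness(points):
    """Ratio of longest to shortest principal extent of a point cloud.

    Values well above 1 flag candidate long-slender regions that could take
    a structured anisotropic mesh instead of isotropic tetrahedra.
    """
    pts = np.asarray(points, dtype=float)
    centred = pts - pts.mean(axis=0)
    # Singular values give orientation-independent extents along principal axes.
    _, s, _ = np.linalg.svd(centred, full_matrices=False)
    return s.max() / max(s.min(), 1e-12)

# e.g. slenderness(vertices_of_region) > 10.0  ->  treat as long-slender
```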
Abstract:
Composite damage modelling with cohesive elements was initially limited to the analysis of interface damage or delamination. However, their use is also being extended to the analysis of in-plane tensile failure arising from matrix or fibre fracture. These interface elements are typically placed at locations where failure is likely to occur, which implies a certain a priori knowledge of the crack propagation path(s). In the case of a crack jump, for example, the location of the jump is usually not obvious, and the simulation would require the placement of cohesive elements at all element faces. A better option, presented here, is to determine the potential location of cohesive elements and insert them during the analysis. The aim of this work is to enable the determination of the crack path as part of the solution process. A subroutine has been developed and implemented in the commercial finite element package ABAQUS/Standard [1] in order to automatically insert cohesive elements within a pristine model, on the basis of the analysis of the current stress field. Results for the prediction of delamination are presented in this paper.
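The actual implementation is an ABAQUS/Standard user subroutine; the face-screening step it implies could be sketched as follows (the mesh and stress accessors and the traction criterion are assumptions for illustration, not the authors' code):

```python
# Schematic of the face-screening step only: find interior faces whose
# normal traction approaches the strength limit, as insertion candidates.
def faces_to_split(mesh, stress, strength, tol=0.95):
    """Return interior faces nearing the (assumed) normal-traction criterion.

    mesh.interior_faces() -> iterable of (face_id, unit_normal, (elem1, elem2))
    stress[elem]          -> 3x3 Cauchy stress tensor (hypothetical accessors)
    """
    candidates = []
    for face_id, n, (e1, e2) in mesh.interior_faces():
        # Average the stress of the two adjacent elements onto the face.
        s = [[0.5 * (stress[e1][i][j] + stress[e2][i][j]) for j in range(3)]
             for i in range(3)]
        # Normal traction t_n = n . (sigma n)
        sn = [sum(s[i][j] * n[j] for j in range(3)) for i in range(3)]
        t_n = sum(n[i] * sn[i] for i in range(3))
        if t_n >= tol * strength:
            candidates.append(face_id)  # duplicate nodes, insert cohesive elem
    return candidates
```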
Abstract:
The world is changing. Advances in telecommunications have meant that the world is shrinking – data can be moved across continents in the time it takes to send an email or access the cloud. Although developments such as these highlight the extent of scientific and technological evolution, in terms of legal liability, questions must be asked as to the capacity of our legal structures to evolve accordingly.
This article looks at how emergency telephone provision and any shift to VoIP systems might fit with existing tort liability and associated duty implications. It does so by analysing the technology through the principles that underpin UK tort law. This article recognises that, as an emerging area, the legal liability implications have not yet been discussed in any great detail. The aim of this article therefore is to introduce the area, encourage debate and consider the issues that may become increasingly relevant as these types of technologies become industry standards.
Abstract:
The creation of idealised, dimensionally reduced meshes for preliminary design and optimisation remains a time-consuming, manual task. A dimensionally reduced model is ideal for assessing design changes through modification of element properties without the need to create a new geometry or mesh. In this paper, a novel approach for automating the creation of mixed dimensional meshes is presented. The input to the process is a solid model which has been decomposed into a non-manifold assembly of smaller volumes with different meshing significance. Associativity between the original solid model and the dimensionally reduced equivalent is maintained. The approach is validated by means of a free-free modal analysis on an output mesh of a gas turbine engine component of industrial complexity. Extensions and enhancements to this work are also discussed.
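The decomposition and meshing-significance rules are not given in the abstract; a toy classification of sub-volumes by principal extents (thresholds invented for illustration) conveys the idea of choosing a reduced dimension per region:

```python
# Toy classification of decomposed sub-volumes by meshing significance;
# thresholds and the interface are assumptions, not the paper's tool.
def reduced_dimension(extents, thin_ratio=0.1, slender_ratio=10.0):
    """Map a sub-volume's principal extents to a reduced element type."""
    a, b, c = sorted(extents, reverse=True)   # a >= b >= c
    if c <= thin_ratio * b:
        return "shell"   # thin-sheet: 2D elements on the mid-surface
    if a >= slender_ratio * b:
        return "beam"    # long-slender: 1D elements along the axis
    return "solid"       # complex region: 3D tetrahedra

# e.g. reduced_dimension((120.0, 8.0, 7.5)) -> "beam"
```

Element properties (thickness, cross-section) then carry the removed dimensions, which is what makes such models cheap to modify during preliminary design.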
Abstract:
Background: The self-reported use of natural health products (NHPs) (herbal products and vitamin and mineral supplements) has increased over the past decade in Canada. Because the elderly population might have comorbidities and concurrently administered medications, there is a need to explore the perceptions and behaviors associated with NHPs in this age group. Objective: The goal of this study was to assess the use of NHPs in a cohort of older Canadian residents and the characteristics, perceptions, and behaviors associated with NHP use. Methods: Survey participants aged ≥60 years were randomly selected from telephone listings in the area of greater Hamilton, Ontario, Canada. Data were collected using a standardized computer-assisted telephone interview system. Self-reported data covering 7 domains were collected: (1) demographics; (2) self-reported 12-month NHP use; (3) reasons for NHP use; (4) self-reported 12-month prescription medication use; (5) expenditures on NHPs; (6) patient-reported adverse events and drug-NHP interactions; and (7) perceptions of physicians' attitudes regarding NHPs. Descriptive statistics were used to compare the characteristics of NHP users with those of nonusers and to assess the characteristics of NHP users across these 7 domains. Multivariate regression analysis was conducted to determine the demographic variables that might be associated with NHP user status. Results: Of 2528 persons identified as age ≥60 years, 1206 (48%) completed the telephone interview. Six hundred sixteen of these respondents (51%) reported the use of ≥1 NHP during the previous 12 months. On the initial univariate analysis, younger age and higher income were significantly associated with reporting NHP use (mean age, users vs nonusers, 71.1 vs 72.7 years, respectively; 95% CI, 1.02-1.06; P <0.001; income more than Can $26,000 was 28% and 22% in users and nonusers, respectively; P = 0.028). One hundred seventy of 616 users (28%) used an NHP to treat the same condition for which they were concurrently receiving a prescription medication, and 43 (25%) had not informed their physicians about their NHP use. Patients' characteristics such as sex, education, smoking status, and self-reported health status did not differ significantly between users and nonusers. In individuals who regularly spent money to purchase NHPs (n = 394), the mean cost was $20.38/mo. NHP expenditure was not significantly associated with age, sex, or income. Conclusion: Based on these findings, a substantial proportion of those Ontarians aged ≥60 years reported NHP use, and there is a need for greater communication with physicians to avoid potential drug-NHP interactions. © 2009 Excerpta Medica Inc. All rights reserved.
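The abstract names multivariate regression without giving the model; an analogous logistic regression in Python (file and column names are placeholders, not the study's actual variables) would look like:

```python
# Sketch of the kind of multivariate model the abstract describes:
# predicting NHP-user status from demographics.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("nhp_survey.csv")  # hypothetical survey extract

# Logistic regression: odds of reporting any NHP use in the past 12 months.
model = smf.logit("nhp_user ~ age + income_over_26k + female + education",
                  data=df).fit()
print(model.summary())
# Odds ratios: a value below 1 for `age` would echo the finding that
# younger respondents were more likely to report NHP use.
print(np.exp(model.params).round(3))
```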
Abstract:
Unmanned surface vehicles are becoming increasingly vital tools in a variety of maritime applications. Unfortunately, their usability is severely constrained by the lack of a reliable obstacle detection and avoidance system. In this article, an experimental platform is proposed which performs obstacle detection, risk assessment and path planning (avoidance) tasks autonomously in an integrated manner. The detection system is based on a vision-LIDAR (light detection and ranging) system, whereas a heuristic path planner is utilised. A unique property of the path planner is its compliance with the marine collision regulations. It is demonstrated through hardware-in-the-loop simulations that the proposed system can be useful for both uninhabited and manned vessels.
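The planner's COLREG logic is not described in detail here; a toy heuristic capturing one such rule (head-on encounters resolved to starboard, per COLREG Rule 14) might be sketched as follows, with the angular thresholds invented for illustration:

```python
# Toy heuristic illustrating one COLREG-inspired rule; the article's
# planner is far more complete than this sketch.
def avoidance_heading(own_heading_deg, bearing_to_obstacle_deg, offset_deg=30.0):
    """Choose an avoidance heading; head-on encounters resolve to starboard."""
    rel = (bearing_to_obstacle_deg - own_heading_deg) % 360.0
    if rel < 15.0 or rel > 345.0:
        # Rule 14 (head-on): alter course to starboard.
        return (own_heading_deg + offset_deg) % 360.0
    # Otherwise steer away from the obstacle's side of the bow.
    side = 1.0 if rel > 180.0 else -1.0  # obstacle to port -> turn starboard
    return (own_heading_deg + side * offset_deg) % 360.0

# e.g. avoidance_heading(90.0, 92.0) -> 120.0 (starboard turn, head-on case)
```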