958 results for implementation method


Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Higher visual functions can be defined as cognitive processes responsible for object recognition, color and shape perception, and motion detection. People with impaired higher visual functions after a unilateral brain lesion are often tested with paper-and-pencil tests, but such tests do not assess the degree of interaction between the healthy brain hemisphere and the impaired one. Hence, visual functions are not tested separately in the contralesional and ipsilesional visual hemifields. METHODS: A new measurement setup has been developed that involves real-time comparisons of the shape and size of objects, the orientation of lines, and the speed and direction of moving patterns in the right or left visual hemifield. The setup was implemented in an immersive, hemispherical environment to take into account the effects of peripheral and central vision as well as possible visual field losses. Because of the non-flat screen of the hemisphere, a distortion algorithm was needed to adapt the projected images to the surface. Several approaches were studied and, based on a comparison between the projected images and the original ones, the best one was used for the implementation of the test. Fifty-seven healthy volunteers were then tested in a pilot study. A Satisfaction Questionnaire was used to assess the usability of the new measurement setup. RESULTS: The distortion algorithm achieved a structural similarity between the warped images and the original ones higher than 97%. The pilot study showed an accuracy in comparing images in the two visual hemifields of 0.18 and 0.19 visual degrees for size and shape discrimination, respectively, 2.56° for line orientation, 0.33 visual degrees/s for speed perception, and 7.41° for recognition of motion direction. The outcome of the Satisfaction Questionnaire showed a high acceptance of the battery by the participants. CONCLUSIONS: A new method to measure higher visual functions in an immersive environment was presented. The study focused on the usability of the developed battery rather than on performance in the visual tasks. A battery of five subtasks to study the perception of size, shape, orientation, speed and motion direction was developed. The test setup is now ready to be tested in neurological patients.
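As a rough illustration of how the warped-versus-original comparison behind the reported 97% figure could be quantified, the sketch below computes the structural similarity index with scikit-image; the file names are hypothetical and the original study may have used a different implementation of the metric.

# Minimal sketch: quantifying distortion-correction quality with SSIM (assumed metric).
from skimage.io import imread
from skimage.metrics import structural_similarity as ssim

original = imread("stimulus_original.png", as_gray=True)   # image before warping (hypothetical file)
warped = imread("stimulus_warped.png", as_gray=True)       # image after the distortion algorithm (hypothetical file)

# Values close to 1.0 mean the warped image preserves the structure of the original;
# the abstract reports values above 0.97 for the selected algorithm.
score = ssim(original, warped, data_range=1.0)              # as_gray images are scaled to [0, 1]
print(f"structural similarity: {score:.3f}")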

Relevance:

30.00%

Publisher:

Abstract:

Changes of porosity, permeability, and tortuosity due to physical and geochemical processes are of vital importance for a variety of hydrogeological systems, including passive treatment facilities for contaminated groundwater, engineered barrier systems (EBS), and host rocks for high-level nuclear waste (HLW) repositories. Due to the nonlinear nature and chemical complexity of the problem, in most cases, it is impossible to verify reactive transport codes analytically, and code intercomparisons are the most suitable method to assess code capabilities and model performance. This paper summarizes model intercomparisons for six hypothetical scenarios with generally increasing geochemical or physical complexity using the reactive transport codes CrunchFlow, HP1, MIN3P, PFlotran, and TOUGHREACT. Benchmark problems include the enhancement of porosity and permeability through mineral dissolution, as well as near complete clogging due to localized mineral precipitation, leading to reduction of permeability and tortuosity. Processes considered in the benchmark simulations are advective-dispersive transport in saturated media, kinetically controlled mineral dissolution-precipitation, and aqueous complexation. Porosity changes are induced by mineral dissolution-precipitation reactions, and the Carman-Kozeny relationship is used to describe changes in permeability as a function of porosity. Archie’s law is used to update the tortuosity and the pore diffusion coefficient as a function of porosity. Results demonstrate that, generally, good agreement is reached amongst the computer models despite significant differences in model formulations. Some differences are observed, in particular for the more complex scenarios involving clogging; however, these differences do not affect the interpretation of system behavior and evolution.
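For reference, a common normalized form of the two constitutive relationships named above is the following, where $\phi_0$ and $k_0$ are the initial porosity and permeability and $m$ is a cementation exponent; the exact coefficients and exponents used in the benchmark scenarios may differ:

$$\frac{k}{k_0} = \left(\frac{\phi}{\phi_0}\right)^{3} \left(\frac{1-\phi_0}{1-\phi}\right)^{2} \quad \text{(Carman-Kozeny)}, \qquad \tau = \phi^{\,m-1}, \quad D_p = \tau D_0 = \phi^{\,m-1} D_0 \quad \text{(Archie's law)}$$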

Relevance:

30.00%

Publisher:

Abstract:

Clinical text understanding (CTU) is of interest to health informatics because critical clinical information, frequently represented as unconstrained text in electronic health records, is extensively used by human experts to guide clinical practice and decision making and to document the delivery of care, but is largely unusable by information systems for queries and computations. Recent initiatives advocating for translational research call for technologies that can integrate structured clinical data with unstructured data, provide a unified interface to all data, and contextualize clinical information for reuse in the multidisciplinary and collaborative environment envisioned by the CTSA program. This implies that technologies for the processing and interpretation of clinical text should be evaluated not only in terms of their validity and reliability in their intended environment, but also in light of their interoperability and their ability to support information integration and contextualization in a distributed and dynamic environment. This vision adds a new layer of information representation requirements that need to be accounted for when conceptualizing the implementation or acquisition of clinical text processing tools and technologies for multidisciplinary research. On the other hand, electronic health records frequently contain unconstrained clinical text with high variability in the use of terms and documentation practices, and without commitment to the grammatical or syntactic structure of the language (e.g., triage notes, physician and nurse notes, chief complaints, etc.). This hinders the performance of natural language processing technologies, which typically rely heavily on the syntax of the language and the grammatical structure of the text. This document introduces our method to transform unconstrained clinical text found in electronic health information systems into a formal (computationally understandable) representation that is suitable for querying, integration, contextualization and reuse, and is resilient to the grammatical and syntactic irregularities of clinical text. We present our design rationale, method, and results of an evaluation in processing chief complaints and triage notes from 8 different emergency departments in Houston, Texas. Finally, we discuss the significance of our contribution in enabling the use of clinical text in a practical bio-surveillance setting.

Relevance:

30.00%

Publisher:

Abstract:

Public participation is an integral part of Environmental Impact Assessment (EIA) and, as such, has been incorporated into regulatory norms. Assessment of the effectiveness of public participation has remained elusive, however. This is partly due to the difficulty of identifying appropriate effectiveness criteria. This research uses Q methodology to discover and analyze stakeholders' social perspectives on the effectiveness of EIAs in the Western Cape, South Africa. It considers two case studies (the Main Road and Saldanha Bay EIAs) to obtain contextual participant perspectives on effectiveness based on their experience. It further considers the more general opinion of provincial consent regulator staff at the Department of Environmental Affairs and the Department of Planning (DEA&DP). Two main themes of investigation are drawn from the South African National Environmental Management Act (NEMA) imperative for effectiveness: firstly, the participation procedure, and secondly, the stakeholder capabilities necessary for effective participation. Four theoretical frameworks drawn from planning, politics and EIA theory are adapted to public participation and used to triangulate the analysis and discussion of the revealed social perspectives. They consider citizen power in deliberation, Habermas' preconditions for the Ideal Speech Situation (ISS), a Foucauldian perspective on knowledge, power and politics, and a Capabilities Approach to public participation effectiveness. The empirical evidence from this research shows that the capacity and contextual constraints faced by participants demand the legislative imperatives for effective participation set out in the NEMA. The implementation of effective public participation has been shown to be a complex, dynamic and sometimes nebulous practice. The functional level of participant understanding of the process was found to vary widely, with the consequence of unequal and dissatisfied stakeholder engagements. Furthermore, the considerable variance of stakeholder capabilities in the South African social context resulted in inequalities in deliberation. The social perspectives revealed significant differences in participant experience in terms of citizen power in deliberation. The ISS preconditions are highly contested in both the Saldanha Bay EIA case study and the DEA&DP social perspectives. Only one Main Road EIA case study social perspective considered Foucault's notion of governmentality to be a reality in EIA public participation. The freedom to control one's environment, based on a Capabilities Approach, is a highly contested notion. Although agreed with in principle, all of the social perspectives indicate that contextual and capacity realities constrain its realisation. This research has shown that Q methodology can be applied to EIA public participation in South Africa and, with appropriate research or monitoring applications, it could serve as a useful feedback tool to inform best-practice public participation.

Relevance:

30.00%

Publisher:

Abstract:

Tabled evaluation has been proved to be an effective method for improving several aspects of goal-oriented query evaluation, including termination and complexity. Several "native" implementations of tabled evaluation have been developed which offer good performance, but many of them require significant changes to the underlying Prolog implementation. More portable approaches, generally based on program transformation, have been proposed, but they often result in lower efficiency. We explore some techniques aimed at combining the best of these worlds, i.e., developing a portable and extensible implementation, with minimal modifications at the abstract machine level, and with reasonably good performance. Our preliminary results are promising.
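The abstract does not spell out the transformation itself; as a language-neutral illustration of what tabling provides, the sketch below memoizes the answers of a recursive reachability query so that a cyclic graph, which would send a naive top-down search into an infinite loop, still terminates. The graph and function names are hypothetical, and real tabled Prolog additionally performs a completion (fixpoint) phase for mutually recursive subgoals that this toy omits.

# Illustration only (not the authors' Prolog machinery): tabling as answer memoization.
from collections import defaultdict

edges = defaultdict(set, {"a": {"b"}, "b": {"a", "c"}, "c": set()})   # hypothetical cyclic graph

def tabled_reachable(start):
    """Return every node reachable from `start`, expanding each subgoal at most once."""
    table = {}                        # subgoal -> answers found so far
    def solve(node):
        if node in table:             # call variant already tabled: reuse its answers
            return table[node]
        table[node] = set()           # register the call before expanding it (breaks loops)
        for nxt in edges[node]:
            table[node].add(nxt)
            table[node] |= solve(nxt)
        return table[node]
    return solve(start)

print(sorted(tabled_reachable("a")))  # ['a', 'b', 'c'] despite the a -> b -> a cycle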

Relevance:

30.00%

Publisher:

Abstract:

While negation has been a very active area of research in logic programming, comparatively few papers have been devoted to implementation issues. Furthermore, the negation-related capabilities of current Prolog systems are limited. We recently presented a novel method for incorporating negation in a Prolog compiler which takes a number of existing methods (some modified and improved) and uses them in a combined fashion. The method makes use of information provided by a global analysis of the source code. Our previous work focused on the systematic description of the techniques and on reasoning about the correctness and completeness of the method, but provided no experimental evidence to evaluate the proposal. In this paper, after proposing some extensions to the method, we provide experimental data which indicate that the method is not only feasible but also quite promising from the efficiency point of view. In addition, the tests have provided new insight into how to improve the proposal further. Abstract interpretation techniques (in particular those included in the Ciao Prolog system preprocessor) have had a significant role in the success of the technique.

Relevance:

30.00%

Publisher:

Abstract:

The Direct Boundary Element Method (DBEM) is presented to solve the elastodynamic field equations in 2D, and a complete, comprehensive implementation is given. The DBEM is a useful approach for obtaining reliable numerical estimates of site effects on seismic ground motion due to irregular geological configurations, both of layering and of topography. The method is based on the discretization of the classical Somigliana elastodynamic representation equation, which stems from the reciprocity theorem. This equation is given in terms of the Green's function, which is the full-space harmonic steady-state fundamental solution. The formulation permits the treatment of viscoelastic media, so site models with intrinsic attenuation can be examined. By means of this approach, the 2D scattering of seismic waves due to the incidence of P and SV waves on irregular topographical profiles is calculated. Sites such as canyons, mountains and valleys in irregular multilayered media are computed to test the technique. The obtained transfer functions show excellent agreement with previously published results.
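For reference, the discretized equation is the boundary form of the time-harmonic Somigliana identity, which at a boundary collocation point $\boldsymbol{\xi}$ can be written (with standard BEM notation, assumed here) as

$$c_{ij}(\boldsymbol{\xi})\,u_j(\boldsymbol{\xi}) + \int_{\Gamma} T^{*}_{ij}(\mathbf{x},\boldsymbol{\xi};\omega)\,u_j(\mathbf{x})\,d\Gamma(\mathbf{x}) = \int_{\Gamma} U^{*}_{ij}(\mathbf{x},\boldsymbol{\xi};\omega)\,t_j(\mathbf{x})\,d\Gamma(\mathbf{x}),$$

where $U^{*}_{ij}$ and $T^{*}_{ij}$ are the full-space harmonic displacement and traction fundamental solutions, $u_j$ and $t_j$ are the boundary displacements and tractions, and $c_{ij}$ is the free term ($\tfrac{1}{2}\delta_{ij}$ at smooth boundary points); collocating this equation over the discretized boundary yields the linear system solved by the DBEM.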

Relevance:

30.00%

Publisher:

Abstract:

From the educational point of view, the most widespread method in developing countries is on-site education. Technical and economic resources cannot support conventional distance-learning infrastructures, and the situation is even worse for university courses, which usually suffer from a lack of qualified faculty, especially in technical degrees. The literature suggests that e-learning is a suitable solution to this problem, but its methods have been developed to meet the educational needs of the First World and cannot be applied directly to other contexts. The proposed methodology is a variant of traditional e-learning adapted to the needs of developing countries. E-learning for Cooperation and Development (c&d-learning) is oriented towards use by educational institutions without adequate technical or human resources. In this paper we describe the c&d-learning implementation architecture based on three main phases: hardware, communication and software; e.g., computer and technical equipment, internet access, and e-learning platform adaptation. Proper adaptation of educational contents to c&d-learning is discussed, and a real case of application in which the authors are involved is described: Ngozi University in Burundi.

Relevance:

30.00%

Publisher:

Abstract:

The boundary element method (BEM) has been applied successfully to many engineering problems during the last decades. Compared with domain-type methods like the finite element method (FEM) or the finite difference method (FDM), the BEM can handle problems where the medium extends to infinity much more easily, as there is no need to develop special boundary conditions (quiet or absorbing boundaries) or infinite elements at the boundaries introduced to limit the domain studied. The determination of the dynamic stiffness of arbitrarily shaped footings is just one of the fields where the BEM has been the method of choice, especially in the 1980s. With the continuous development of computer technology and the available hardware, the size of the problems under study grew and, as the flop count for solving the resulting linear system of equations grows with the third power of the number of equations, there was a need for the development of iterative methods with better performance. The GMRES algorithm presented in [1] is now widely used in implementations of the collocation BEM. While the FEM results in sparsely populated coefficient matrices, the BEM leads, in general, to fully or densely populated ones, depending on the number of subregions, posing a serious memory problem even for today's computers. If the geometry of the problem permits the surface of the domain to be meshed with equally shaped elements, many of the resulting coefficients will be calculated and stored repeatedly. The present paper shows how these unnecessary operations can be avoided, reducing both the calculation time and the storage requirement. To this end, a similar coefficient identification algorithm (SCIA) has been developed and implemented in a program written in Fortran 90. The vertical dynamic stiffness of a single pile in layered soil has been chosen to test the performance of the implementation. The results obtained with the 3D model may be compared with those obtained with an axisymmetric formulation, which are considered to be the reference values as the mesh quality is much better. The entire 3D model comprises more than 35,000 degrees of freedom, the biggest single region being a soil region with 21,168 degrees of freedom. Note that the memory necessary to store all coefficients of this single region is about 6.8 GB, an amount which is usually not available on personal computers. In the problem under study, the interface zone between the two adjacent soil regions as well as the surface of the top layer may be meshed with equally sized elements. In this case, the application of the SCIA leads to an important reduction in memory requirements: the maximum memory used during the calculation has been reduced to 1.2 GB. The application of the SCIA thus permits problems to be solved on personal computers which would otherwise require much more powerful hardware.
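The SCIA itself is not detailed in the abstract; the sketch below only illustrates the underlying idea, namely caching influence coefficients keyed by the element geometry expressed relative to the collocation point, so that identical configurations are integrated once. The kernel, key construction and tolerance are hypothetical simplifications (a real implementation must also handle element orientation and the actual elastodynamic kernels).

# Sketch (not the published SCIA): reuse BEM influence coefficients for
# identical source-point/element configurations instead of re-integrating them.
import numpy as np

def relative_key(source, element_nodes, tol=1e-9):
    """Cache key from the element geometry expressed relative to the source point."""
    rel = np.asarray(element_nodes) - np.asarray(source)
    return tuple(np.round(rel / tol).astype(np.int64).ravel())

def integrate_kernel(source, element_nodes):
    # Placeholder for the actual Green's-function quadrature over the element.
    r = np.linalg.norm(np.mean(element_nodes, axis=0) - np.asarray(source))
    return 1.0 / (4.0 * np.pi * r)   # e.g. a 1/r static kernel, for illustration only

def influence_coefficients(source, element_nodes, cache):
    """Return (and memoize) the element's contribution to the coefficient matrix."""
    key = relative_key(source, element_nodes)
    if key not in cache:
        cache[key] = integrate_kernel(source, element_nodes)
    return cache[key]

cache = {}
# Two collocation-point/element pairs with the same relative geometry hit the cache once.
c1 = influence_coefficients((0.0, 0.0, 0.0), [(1.0, 0.0, 0.0), (1.0, 1.0, 0.0), (1.0, 0.0, 1.0)], cache)
c2 = influence_coefficients((5.0, 0.0, 0.0), [(6.0, 0.0, 0.0), (6.0, 1.0, 0.0), (6.0, 0.0, 1.0)], cache)
assert len(cache) == 1 and c1 == c2   # the second pair reuses the stored coefficient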

Relevance:

30.00%

Publisher:

Abstract:

In this chapter we develop some aspects of the implementation of the boundary element method (BEM) on microcomputers. The BEM is now established as a powerful problem-solving tool, and several codes have been developed and maintained on an industrial basis for large computers. It is also well known that one of the most attractive features of the BEM is the reduction of the discretization to the boundary of the domain under study. As drawbacks, we find the non-bandedness of the final matrix, which is fully populated and asymmetric, and the computational difficulties of obtaining the integrals which appear in the influence coefficients. The reduction in dimensionality is crucial from the point of view of microcomputers, and we believe that it can be used to obtain results that are competitive with other domain methods. We shall discuss two applications in this chapter. The first is related to plane linear elastostatic situations, and the second refers to plane potential problems. In the first case we present the classical isoparametric BEM approach, using linear elements to represent both the geometry and the variables. The second case shows how to implement a p-adaptive procedure using the BEM. This latter case has not been studied until recently, and we think that the future of the BEM will be related to its development and to the judicious exploitation of the graphics capabilities of modern micros. Some examples are included to demonstrate the kind of results that can be expected, and sections of printouts show useful details of the implementation. In order to broaden their applicability, these printouts have been prepared in Basic, although other languages may no doubt be more appropriate for an effective implementation.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a computer program developed to run on an IBM PC microcomputer which incorporates several features intended to optimize the number of operations needed to compute the solution of plane potential problems governed by Laplace's equation using the Boundary Integral Equation Method (BIEM). Also incorporated is a routine to plot isolines inside the domain under study.

Relevance:

30.00%

Publisher:

Abstract:

Due to the sensitive international situation caused by still-recent terrorist attacks, there is a common need to protect the safety of large spaces such as government buildings, airports and power stations. To address this problem, developments in several research fields, such as video and cognitive audio, decision support systems, human interfaces, computer architecture, communications networks and communications security, should be integrated with the goal of achieving advanced security systems capable of checking all of the specified requirements and bridging the gap that currently exists in the market. This paper describes the implementation of a decision system for crisis management in infrastructural building security, specifically the management of building intrusions. The positions of the unidentified persons are reported with the help of a Wireless Sensor Network (WSN). The goal is to achieve an intelligent system capable of making the best decision in real time in order to quickly neutralise one or more intruders who threaten strategic installations. It is assumed that the intruders' behaviour is inferred through sequences of sensor activations and their fusion. This article presents a general approach to selecting the optimal operation from the available neutralisation strategies based on a Minimax algorithm. The distances among different scenario elements are used to measure the risk of the scene, and a path-planning technique is integrated in order to attain good performance. Different actions to be executed on the elements of the scene, such as moving a guard, blocking a door or turning on an alarm, are used to neutralise the crisis. This set of actions executed to stop the crisis is known as the neutralisation strategy. Finally, the system has been tested in simulations of real situations, and the results have been evaluated according to the final state of the intruders. In 86.5% of the cases, the system achieved the capture of the intruders, and in 59.25% of the cases, they were intercepted before they reached their objective.
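As a minimal sketch of the minimax selection step, the code below chooses the action whose worst-case (over the intruder's candidate moves) distance-based risk is smallest; the scenario data, the action set and the risk function are hypothetical placeholders, and the published system additionally fuses WSN activations and plans guard paths.

# Sketch (hypothetical data): choose the neutralisation action that minimises
# the worst-case risk over the intruder's possible next positions.
import math

def risk(intruder_pos, asset_pos, guard_pos):
    """A short intruder-asset distance raises risk; a nearby guard lowers it."""
    d_asset = math.dist(intruder_pos, asset_pos)
    d_guard = math.dist(intruder_pos, guard_pos)
    return d_guard / (d_asset + 1e-6)

# Each action maps to the guard position it produces; intruder moves are the
# sensor-predicted candidate positions for the next time step.
asset = (0.0, 0.0)
actions = {"move_guard_to_door": (1.0, 0.0), "move_guard_to_corridor": (3.0, 3.0)}
intruder_moves = [(2.0, 0.0), (4.0, 4.0)]

def minimax_action(actions, intruder_moves, asset):
    # min over our actions of the maximum risk the intruder can still achieve
    return min(actions, key=lambda a: max(risk(m, asset, actions[a]) for m in intruder_moves))

print(minimax_action(actions, intruder_moves, asset))   # -> move_guard_to_door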

Relevance:

30.00%

Publisher:

Abstract:

This thesis proposes a hand-geometry biometric system oriented to unconstrained and contactless scenarios, together with a stress detection method able to elucidate to what extent an individual is under stress based on physiological signals. Concerning the biometric system, this thesis contributes the design and implementation of a hand-based biometric system where the acquisition is carried out without contact and the user template is created using information from that single individual only. In addition, this thesis proposes a segmentation algorithm based on multiscale aggregation in order to tackle the problems involved in acquiring hand images in real, unconstrained environments. Furthermore, feature extraction and matching are also specific contributions of this thesis, which provides adequate schemes to carry out both tasks with low computational cost and high recognition accuracy. Finally, this system is evaluated according to the international standard ISO/IEC 19795 considering six public databases. In relation to the stress detection method, this thesis proposes a system based on two physiological signals, namely heart rate and galvanic skin response, together with the creation of an innovative stress template which gathers the behaviour of both signals under stressing and non-stressing situations. This system is based on fuzzy logic to decide the level of stress of an individual. Overall, the system is able to detect stress accurately and in real time, providing an adequate solution for current biometric systems, where a stress detection system can be applied directly to avoid situations in which individuals are forced to provide their biometric data. Finally, this thesis includes a user acceptability study in which the acceptance of the proposed biometric technique is assessed by a total of 250 individuals, as well as a prototype implemented on a mobile device and its evaluation.
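The fuzzy decision itself is not specified in the abstract; the following is a minimal Mamdani-style sketch with two rules over normalized heart rate and galvanic skin response, where all membership functions and thresholds are hypothetical.

# Sketch (hypothetical membership functions): fuzzy stress level from
# normalized heart rate (HR) and galvanic skin response (GSR) signals.
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def stress_level(hr_norm, gsr_norm):
    """Return a stress score in [0, 1] from signals normalized to [0, 1]."""
    hr_high, gsr_high = tri(hr_norm, 0.4, 1.0, 1.6), tri(gsr_norm, 0.4, 1.0, 1.6)
    hr_low, gsr_low = tri(hr_norm, -0.6, 0.0, 0.6), tri(gsr_norm, -0.6, 0.0, 0.6)
    # Rule 1: HR high AND GSR high -> stressed; Rule 2: HR low AND GSR low -> relaxed
    stressed = min(hr_high, gsr_high)
    relaxed = min(hr_low, gsr_low)
    # Weighted average of the two singleton consequents (stressed = 1.0, relaxed = 0.0)
    total = stressed + relaxed
    return 0.5 if total == 0 else stressed / total

print(stress_level(0.9, 0.8))   # both signals elevated -> 1.0 (stressed)
print(stress_level(0.1, 0.2))   # both signals low      -> 0.0 (relaxed)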

Relevance:

30.00%

Publisher:

Abstract:

Stochastic model updating must be considered for quantifying the uncertainties inherently present in real-world engineering structures. In this way the statistical properties of structural parameters, rather than deterministic values, can be sought, indicating the parameter variability. However, the implementation of stochastic model updating is much more complicated than that of deterministic methods, particularly because of its theoretical complexity and low computational efficiency. This study proposes a simple and cost-efficient method that decomposes the stochastic updating process into a series of deterministic ones with the aid of response surface models and Monte Carlo simulation. The response surface models are used as surrogates for the original FE models in the interest of programming simplicity, fast response computation and easy inverse optimization. Monte Carlo simulation is adopted to generate samples from the assumed or measured probability distributions of the responses. Each sample corresponds to an individual deterministic inverse process predicting deterministic values of the parameters. The parameter means and variances can then be statistically estimated from all the parameter predictions obtained by running all the samples. Meanwhile, the analysis-of-variance approach is employed to evaluate the significance of the parameter variability. The proposed method has been demonstrated first on a numerical beam and then on a set of nominally identical steel plates tested in the laboratory. It is found that, compared with existing stochastic model updating methods, the proposed method offers similar accuracy, while its primary merits are its simple implementation and its cost efficiency in response computation and inverse optimization.
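A minimal numerical sketch of this decomposition is given below, assuming a one-parameter quadratic response surface has already been fitted to the FE model and that the measured response is Gaussian; the surrogate coefficients, the response statistics and the optimizer are placeholders.

# Sketch (hypothetical numbers): stochastic updating decomposed into many
# deterministic inverse problems via a response-surface surrogate + Monte Carlo.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

def response_surface(k):
    """Surrogate for the FE model: e.g. first natural frequency vs. stiffness parameter k."""
    return 2.0 + 1.5 * k - 0.1 * k**2          # placeholder polynomial fitted offline

# Monte Carlo samples drawn from the measured (assumed Gaussian) response distribution
measured_mean, measured_std = 6.0, 0.2
samples = rng.normal(measured_mean, measured_std, size=500)

# Each sample defines one deterministic inverse problem: find the k reproducing it
def invert(target):
    res = minimize_scalar(lambda k: (response_surface(k) - target) ** 2,
                          bounds=(0.0, 7.0), method="bounded")
    return res.x

k_estimates = np.array([invert(f) for f in samples])
print("mean(k) =", k_estimates.mean(), " var(k) =", k_estimates.var(ddof=1))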

Relevance:

30.00%

Publisher:

Abstract:

The Software Engineering (SE) community has historically focused on working with models that represent functionality and persistence, pushing interaction modelling into the background; this area has instead been covered by the Human-Computer Interaction (HCI) community. Recently, adequately modelling interaction, and specifically usability, has come to be considered a key factor for success in user acceptance, making the integration of the SE and HCI communities more necessary. If we focus on the Model-Driven Development (MDD) paradigm, we notice a lack of proposals that deal with usability features from the very first steps of the software development process. In general, usability features are implemented manually once the code has been generated from models. This contradicts the MDD paradigm, which claims that all of the analysts' effort must be focused on building models, with code generation relegated to model-to-code transformations. Moreover, usability features related to functionality may involve important changes to the system architecture if they are not considered from the early steps. We state that these usability features related to functionality can be represented abstractly in a conceptual model and that their implementation can be carried out automatically.