990 results for "Dark objects method"


Relevance: 20.00%

Abstract:

Diagnosis of several neurological disorders is based on the detection of typical pathological patterns in the electroencephalogram (EEG). This is a time-consuming task requiring significant training and experience. Automatic detection of these EEG patterns would greatly assist quantitative analysis and interpretation. We present a method that allows automatic detection of epileptiform events and discrimination of them from eye blinks, based on features derived using a novel application of independent component analysis. The algorithm was trained and cross-validated using seven EEGs with epileptiform activity. For epileptiform events with compensation for eye blinks, the sensitivity was 65 +/- 22% at a specificity of 86 +/- 7% (mean +/- SD). With feature extraction by PCA or classification of raw data, specificity was reduced to 76% and 74%, respectively, at the same sensitivity. On exactly the same data, the commercially available software Reveal had a maximum sensitivity of 30% and a concurrent specificity of 77%. Our algorithm performed well at detecting epileptiform events in this preliminary test and offers a flexible tool that is intended to be generalized to the simultaneous classification of many waveforms in the EEG.
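The abstract does not give implementation details, so the following is only a rough Python sketch of the general pipeline it describes (ICA-derived features feeding a cross-validated classifier). The channel count, window length, feature choices and classifier are assumptions for illustration, not the authors' configuration, and random data stands in for real EEG segments.

import numpy as np
from sklearn.decomposition import FastICA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in data: 200 one-second segments of 19-channel EEG.
n_segments, n_channels, n_samples = 200, 19, 256
X_raw = rng.standard_normal((n_segments, n_channels, n_samples))
y = rng.integers(0, 2, n_segments)            # 1 = epileptiform, 0 = background

# Unmix each segment into independent components and summarise each
# component with simple amplitude and sharpness features.
ica = FastICA(n_components=5, random_state=0)
features = []
for seg in X_raw:
    sources = ica.fit_transform(seg.T)        # (n_samples, n_components)
    peak = np.abs(sources).max(axis=0)
    sharpness = np.abs(np.diff(sources, 2, axis=0)).max(axis=0)
    features.append(np.concatenate([peak, sharpness]))
X = np.asarray(features)

# Cross-validated detection, analogous in spirit to the reported
# training/cross-validation on seven EEGs.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())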

Relevance: 20.00%

Abstract:

Terrestrial laser scanning (TLS) is one of the most promising surveying techniques for rock slope characterization and monitoring. Landslide and rockfall movements can be detected by means of comparison of sequential scans. One of the most pressing challenges of natural hazards is combined temporal and spatial prediction of rockfall. An outdoor experiment was performed to ascertain whether the TLS instrumental error is small enough to enable detection of precursory displacements of millimetric magnitude. It consisted of a known displacement of three objects relative to a stable surface. Results show that millimetric changes cannot be detected by analysis of the unprocessed datasets. Displacement measurements are improved considerably by applying Nearest Neighbour (NN) averaging, which reduces the error (1σ) by up to a factor of 6. This technique was applied to displacements prior to the April 2007 rockfall event at Castellfollit de la Roca, Spain. The maximum precursory displacement measured was 45 mm, approximately 2.5 times the standard deviation of the model comparison, hampering the distinction between actual displacement and instrumental error using conventional methodologies. Encouragingly, the precursory displacement was clearly detected by applying the NN averaging method. These results show that millimetric displacements prior to failure can be detected using TLS.
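As an illustration of the kind of nearest-neighbour averaging described above, the following Python sketch compares two synthetic scans of a surface. The point density, noise level and k = 36 (chosen so the roughly sqrt(k) error reduction matches the reported factor of 6) are assumptions for the example, not the study's actual parameters.

import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)

# Synthetic stand-in for two TLS scans of the same slope: the second scan
# is displaced by 2 mm along z, and each scan carries ~5 mm random error.
n_points = 20000
xy = rng.uniform(0.0, 10.0, size=(n_points, 2))            # metres, plan view
surface = 0.05 * np.sin(xy[:, 0])
scan_ref = surface + rng.normal(0.0, 0.005, n_points)
scan_new = surface + 0.002 + rng.normal(0.0, 0.005, n_points)

# Raw point-wise differences: the 2 mm displacement is buried in the noise.
diff = scan_new - scan_ref
print("raw std (m):", diff.std())

# Nearest Neighbour averaging: replace each difference by the mean over its
# k nearest neighbours in plan view; the random error shrinks roughly by
# sqrt(k) (k = 36 gives a ~factor-6 reduction, as quoted in the abstract).
k = 36
tree = cKDTree(xy)
_, idx = tree.query(xy, k=k)
diff_nn = diff[idx].mean(axis=1)
print("NN-averaged std (m):", diff_nn.std())
print("estimated displacement (m):", diff_nn.mean())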

Relevance: 20.00%

Abstract:

Platelet-rich plasma (PRP) is a volume of the plasma fraction of autologous blood with a platelet concentration above baseline whole-blood values, obtained by processing and concentration. PRP is used in various surgical fields to enhance soft-tissue and bone healing by delivering supra-physiological concentrations of autologous platelets to the site of tissue damage. These preparations may provide a good cellular source of various growth factors and cytokines and modulate the tissue response to injury. Commonly available clinical materials for blood preparation, combined with a two-step centrifugation protocol at 280 g each to preserve cellular component integrity, provided platelet preparations concentrated 2-3-fold over whole-blood values. Costs were shown to be lower than those of other methods, which require specific equipment and high-cost disposables, while safety and traceability can be increased. PRP can be used for the treatment of wounds of all types, including burns, as well as split-thickness skin graft donor sites, which are frequently used in burn management. The procedure can be standardized and is easy to adapt in clinical settings with minimal infrastructure, thus enabling large numbers of patients to benefit from a form of cellular therapy.

Relevance: 20.00%

Abstract:

An adaptation technique based on the synoptic atmospheric circulation to forecast local precipitation, namely the analogue method, has been implemented for the western Swiss Alps. During the calibration procedure, relevance maps were established for the geopotential height data. These maps highlight the locations where the synoptic circulation was found to be informative for precipitation forecasting at two rain gauge stations (Binn and Les Marécottes), both located in the alpine Rhône catchment, at a distance of about 100 km from each other. These two stations are sensitive to different atmospheric circulations. We have observed that the most relevant data for the analogue method are found where specific atmospheric circulation patterns appear concomitantly with heavy precipitation events. Those skilled regions are coherent with the atmospheric flows illustrated, for example, by means of the back trajectories of air masses. Indeed, the circulation recurrently diverges from the climatology during days with strong precipitation on the southern part of the alpine Rhône catchment. We have found that, of the 152 days with precipitation above 50 mm at the Binn station, only 3 did not show a trajectory of a southerly flow, meaning that such a circulation was present for 98% of the events. The time evolution of the relevance maps confirms that the atmospheric circulation variables have significantly better forecasting skill close to the precipitation period, and that it seems pointless for the analogue method to consider circulation information days before a precipitation event as a primary predictor. Even though the occurrence of some critical circulation patterns leading to heavy precipitation events can be detected by precursors at remote locations and one week ahead (Grazzini, 2007; Martius et al., 2008), time extrapolation by the analogue method seems to be rather poor. This suggests, in accordance with previous studies (Obled et al., 2002; Bontron and Obled, 2005), that time extrapolation should be done by the Global Circulation Model, which can provide atmospheric variables that can then be used by the adaptation method.
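For readers unfamiliar with the analogue method, the following minimal Python sketch shows its core step: rank archive days by the similarity of their circulation fields to the target day and use the precipitation observed on the best analogues as the forecast distribution. The similarity criterion (plain RMSE), grid size and synthetic data are assumptions for illustration, not the calibrated setup of the study.

import numpy as np

rng = np.random.default_rng(2)

# Synthetic archive: 5000 days of geopotential height anomalies on a small
# grid, with matching daily precipitation at one station.
n_days, ny, nx = 5000, 8, 12
z500 = rng.standard_normal((n_days, ny, nx))
precip = rng.gamma(shape=0.4, scale=8.0, size=n_days)      # mm/day

def analogue_forecast(target_field, archive_fields, archive_precip, n_analogues=30):
    """Rank archive days by RMSE similarity of the circulation field and
    return the precipitation observed on the closest analogues."""
    rmse = np.sqrt(((archive_fields - target_field) ** 2).mean(axis=(1, 2)))
    best = np.argsort(rmse)[:n_analogues]
    return archive_precip[best]

# 'Forecast' for a new day: the empirical distribution of precipitation
# observed on its closest circulation analogues.
target = rng.standard_normal((ny, nx))
analogue_precip = analogue_forecast(target, z500, precip)
print("quantiles (mm):", np.percentile(analogue_precip, [20, 50, 90]))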

Relevance: 20.00%

Abstract:

BACKGROUND: Laparoscopic techniques have been proposed as an alternative to open surgery for therapy of peptic ulcer perforation. They provide better postoperative comfort and absence of parietal complications, but leakage occurs in 5% of cases. We describe a new method combining laparoscopy and endoluminal endoscopy, designed to ensure complete closure of the perforation. METHODS: Six patients with anterior ulcer perforations (4 duodenal, 2 gastric) underwent a concomitant laparoscopy and endoluminal endoscopy with closure of the orifice by an omental plug attracted into the digestive tract. RESULTS: All perforations were sealed. The mean operating time was 72 minutes. The mean hospital stay was 5.5 days. There was no morbidity and no mortality. At the 30-day evaluation all ulcers but one (due to Helicobacter pylori persistence) were healed. CONCLUSIONS: This method is safe and effective. Its advantages compared with open surgery or laparoscopic patching as well as its cost-effectiveness should be studied in prospective randomized trials.

Relevance: 20.00%

Abstract:

Saffaj et al. recently criticized our method of monitoring carbon dioxide in human postmortem cardiac gas samples using headspace gas chromatography-mass spectrometry. According to the authors, their demonstration, based on the latest SFSTP guidelines (established after 2007 [1,2]) and suited to the validation of bioanalytical drug-monitoring methods, revealed potential errors. However, our validation approach was built on the SFSTP guidelines established before 2007 [3-6]. We justify the use of these guidelines by the postmortem context of the study (rather than clinical) and the gaseous state of the sample (rather than solid or liquid). Under these guidelines, our validation remains correct.

Relevance: 20.00%

Abstract:

This paper introduces how artificial intelligence technologies can be integrated into a well-known computer-aided control system design (CACSD) framework, Matlab/Simulink, using an object-oriented approach. The aim is to build a framework to aid supervisory system analysis, design and implementation. The idea is to take advantage of an existing CACSD framework, Matlab/Simulink, so that engineers can first design a control system and then design a straightforward supervisory system for that control system within the same framework. Thus, expert systems and qualitative reasoning tools are incorporated into this popular CACSD framework to develop a computer-aided supervisory system design (CASSD) framework. Object-variables are introduced into Matlab/Simulink for sharing information between tools.
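The abstract does not describe the object-variables in detail; the sketch below is only a language-neutral illustration (in Python rather than Matlab) of the idea of wrapping a signal so that a numerical tool and a rule-based supervisory tool can share it. The class name, qualitative mapping and supervisory rule are invented for the example and are not part of the framework.

from dataclasses import dataclass, field

@dataclass
class ObjectVariable:
    """A signal wrapped with the extra views that different tools need:
    raw numeric samples for control analysis and a qualitative label for
    rule-based supervision."""
    name: str
    samples: list = field(default_factory=list)

    def add_sample(self, value: float) -> None:
        self.samples.append(value)

    def qualitative(self, low: float, high: float) -> str:
        # Crude qualitative abstraction used by the supervisory rules.
        current = self.samples[-1]
        if current < low:
            return "low"
        if current > high:
            return "high"
        return "normal"

# A numerical tool writes samples; a rule-based supervisor reads the
# qualitative view of the same object-variable.
level = ObjectVariable("tank_level")
for v in (0.2, 0.5, 0.95):
    level.add_sample(v)

if level.qualitative(low=0.3, high=0.9) == "high":
    print("supervisor: open relief valve")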

Relevance: 20.00%

Abstract:

Given a set of images of scenes containing different object categories (e.g. grass, roads), our objective is to discover these objects in each image and to use these object occurrences to perform scene classification (e.g. beach scene, mountain scene). We achieve this by using a supervised learning algorithm that is able to learn from few images, which eases the user's task. We use a probabilistic model to recognise the objects and then classify the scene based on their occurrences. Experimental results are shown and evaluated to prove the validity of our proposal. Object recognition performance is compared with the approaches of He et al. (2004) and Marti et al. (2001) using their own datasets. Furthermore, an unsupervised method is implemented in order to evaluate the advantages and disadvantages of our supervised classification approach versus an unsupervised one.
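As a rough illustration of the second stage described above (classifying a scene from the occurrences of the objects discovered in it), the following Python sketch feeds object-occurrence counts to a simple probabilistic classifier. The object categories, scene classes and counts are invented, and the classifier is not necessarily the probabilistic model used in the paper.

import numpy as np
from sklearn.naive_bayes import MultinomialNB

# Each row counts how often an object category was detected in one image;
# columns = [grass, road, sand, water] (labels invented for illustration).
X_train = np.array([
    [8, 1, 0, 2],   # countryside
    [7, 2, 1, 1],   # countryside
    [1, 9, 0, 0],   # urban
    [0, 8, 1, 0],   # urban
    [1, 0, 7, 9],   # beach
    [0, 1, 8, 8],   # beach
])
y_train = ["countryside", "countryside", "urban", "urban", "beach", "beach"]

# A simple probabilistic model over object occurrences, in the same spirit
# as classifying the scene from the discovered objects.
model = MultinomialNB().fit(X_train, y_train)

new_image_counts = np.array([[0, 1, 6, 7]])
print(model.predict(new_image_counts))          # -> ['beach']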

Relevance: 20.00%

Abstract:

An analytic method to evaluate nuclear contributions to the electrical properties of polyatomic molecules is presented. Such contributions control the changes induced by an electric field in the equilibrium geometry (nuclear relaxation contribution) and vibrational motion (vibrational contribution) of a molecular system. Expressions to compute the nuclear contributions have been derived from a power series expansion of the potential energy. These contributions to the electrical properties are given in terms of energy derivatives with respect to normal coordinates, electric field intensity, or both. Only one calculation of such derivatives at the field-free equilibrium geometry is required. To demonstrate the efficiency of the analytical evaluation of electrical properties (the so-called AEEP method), results for calculations on water and pyridine at the SCF/TZ2P and MP2/TZ2P levels of theory are reported. The results obtained are compared with previous theoretical calculations and with experimental values.
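To make the idea of a nuclear contribution computed from derivatives at the field-free geometry concrete, the Python sketch below evaluates the textbook double-harmonic term of the nuclear relaxation polarizability, alpha_ab = sum_i (dmu_a/dQ_i)(dmu_b/dQ_i)/omega_i^2, from dipole derivatives and harmonic frequencies. The numerical values are invented, and this single term only illustrates the kind of expression a power-series treatment yields; it is not the full AEEP formulation.

import numpy as np

# Illustrative numbers only (atomic units); they are not the paper's data.
# omega[i]     : harmonic vibrational frequencies of the normal modes
# dmu_dQ[i, a] : dipole derivatives d(mu_a)/dQ_i at the field-free geometry
omega = np.array([0.0075, 0.0170, 0.0185])
dmu_dQ = np.array([
    [0.00, 0.00, 0.15],
    [0.10, 0.00, 0.00],
    [0.00, 0.12, 0.05],
])

# Lowest-order (double-harmonic) term of the nuclear relaxation
# polarizability: alpha_ab += (dmu_a/dQ_i)(dmu_b/dQ_i) / omega_i**2,
# i.e. a single set of derivatives at the equilibrium geometry suffices.
alpha_nr = np.einsum("ia,ib,i->ab", dmu_dQ, dmu_dQ, 1.0 / omega**2)
print(alpha_nr)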

Relevance: 20.00%

Abstract:

Learning object economies are marketplaces for the sharing and reuse of learning objects (LO). There are many motivations for stimulating the development of the LO economy. The main reason is the possibility of providing the right content, at the right time, to the right learner, according to adequate quality standards, in the context of a lifelong learning process; this is, in fact, also the main objective of education. However, some barriers to the development of a LO economy, such as the granularity and editability of LO, must be overcome, and some enablers, such as learning design generation and standards usage, must be promoted. In this article, we introduce the integration of distributed learning object repositories (DLOR) as sources of LO that can be placed in adaptive learning designs to assist teachers' design work. Two main issues arise as a result: how to access distributed LO, and where to place the LO in the learning design. To address these issues, we introduce two processes: LORSE, a distributed LO searching process, and LOOK, a micro-context-based positioning process. Using these processes, teachers were able to reuse LO from different sources to semi-automatically generate an adaptive learning design without leaving their virtual environment. A layered evaluation yielded good results for the process of placing learning objects from controlled learning object repositories into a learning design, and allowed educators to identify the open issues that must be covered when uncontrolled learning object repositories are used for this purpose. We also verified user satisfaction with our solution.
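The article's LORSE process is not specified in this abstract; the Python sketch below is a purely illustrative picture of federated searching over several repositories, with invented in-memory repositories standing in for remote DLOR interfaces.

from dataclasses import dataclass

@dataclass
class LearningObject:
    title: str
    repository: str
    granularity: str          # e.g. "exercise", "lesson", "course"

# Stand-ins for distributed repositories: in a real DLOR setting these
# would be remote queries; here they are in-memory lists for illustration.
REPOSITORIES = {
    "repo_a": [LearningObject("Intro to fractions", "repo_a", "lesson"),
               LearningObject("Fraction drill", "repo_a", "exercise")],
    "repo_b": [LearningObject("Fractions in daily life", "repo_b", "lesson")],
}

def search_distributed(keyword, granularity=None):
    """Query every repository, keep the objects whose title matches the
    keyword (and, optionally, the requested granularity), and merge the
    results into a single candidate list."""
    hits = []
    for objects in REPOSITORIES.values():
        for lo in objects:
            if keyword.lower() in lo.title.lower():
                if granularity is None or lo.granularity == granularity:
                    hits.append(lo)
    return hits

for lo in search_distributed("fraction", granularity="lesson"):
    print(lo.repository, "-", lo.title)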

Relevance: 20.00%

Abstract:

The Lunney Scoring Method for Rating Accuracy of Nursing Diagnoses (LSM) is a semantic differential scale developed by Lunney to estimate the accuracy of nursing diagnoses. The aim of this study was to adapt the LSM to Portuguese and to evaluate its psychometric properties. The original scale was translated into Portuguese, back-translated into English, and the two English versions were compared in order to adjust the Portuguese version, which was named Escala de Acurácia de Diagnóstico de Enfermagem de Lunney (EADE). Four nurses were trained in the use of the EADE and applied it to 159 diagnoses formulated for 26 patients from three primary studies, based on the interview and physical examination records of each patient. Cohen's Kappa coefficients showed an absence of agreement among the raters, indicating that the adapted instrument does not have satisfactory reliability. In view of this result, no validity estimation was carried out.
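For reference, the inter-rater agreement statistic used in the study, Cohen's Kappa, can be computed as in the short Python sketch below. The ratings are invented and merely illustrate the kind of near-zero agreement the study reports; they are not the study's data.

from sklearn.metrics import cohen_kappa_score

# Hypothetical accuracy categories that two raters assigned to the same
# ten diagnoses (values invented for illustration).
rater_1 = [3, 2, 3, 1, 0, 2, 3, 1, 2, 0]
rater_2 = [1, 2, 0, 1, 3, 2, 1, 0, 2, 3]

# Kappa near 0 indicates agreement no better than chance, the situation
# that led the study to judge the adapted scale unreliable.
print(cohen_kappa_score(rater_1, rater_2))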

Relevance: 20.00%

Abstract:

A select-divide-and-conquer variational method to approximate configuration interaction (CI) is presented. Given an orthonormal set made up of occupied orbitals (Hartree-Fock or similar) and suitable correlation orbitals (natural or localized orbitals), a large N-electron target space S is split into subspaces S_0, S_1, S_2, ..., S_R. S_0, of dimension d_0, contains all configurations K with attributes (energy contributions, etc.) above thresholds T_0 = {T_0^egy, T_0^etc.}; the CI coefficients in S_0 remain always free to vary. S_1 accommodates the K's with attributes above T_1 ≤ T_0. An eigenproblem of dimension d_0 + d_1 for S_0 + S_1 is solved first, after which the last d_1 rows and columns are contracted into a single row and column, thus freezing the last d_1 CI coefficients hereinafter. The process is repeated with successive S_j (j ≥ 2) chosen so that the corresponding CI matrices fit in random access memory (RAM). Davidson's eigensolver is used R times. The final energy eigenvalue (lowest or excited one) is always above the corresponding exact eigenvalue in S. Threshold values {T_j; j = 0, 1, 2, ..., R} regulate accuracy; for large-dimensional S, high accuracy requires S_0 + S_1 to be solved outside RAM. From there on, however, usually a few Davidson iterations in RAM are needed for each step, so that Hamiltonian matrix-element evaluation becomes rate determining. One microhartree accuracy is achieved for an eigenproblem of order 24 × 10^6, involving 1.2 × 10^12 nonzero matrix elements and 8.4 × 10^9 Slater determinants.
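Since Davidson's eigensolver is central to the procedure, the following minimal Python sketch shows a textbook Davidson iteration for the lowest eigenvalue of a symmetric matrix. It is a generic illustration on a small dense matrix, not the paper's out-of-RAM, contracted implementation.

import numpy as np

def davidson_lowest(H, n_iter=60, tol=1e-8, guess_dim=4):
    """Minimal Davidson iteration for the lowest eigenvalue of a real
    symmetric matrix (dense here; CI codes access H only through
    matrix-vector products and keep the small subspace in RAM)."""
    n = H.shape[0]
    diag = np.diag(H)
    V = np.eye(n, guess_dim)                  # initial orthonormal subspace
    for _ in range(n_iter):
        T = V.T @ H @ V                       # Rayleigh-Ritz in the subspace
        vals, vecs = np.linalg.eigh(T)
        theta, s = vals[0], vecs[:, 0]
        x = V @ s                             # Ritz vector in the full space
        r = H @ x - theta * x                 # residual
        if np.linalg.norm(r) < tol:
            break
        # Diagonal (Jacobi) preconditioner, re-orthogonalise, expand basis.
        denom = diag - theta
        denom[np.abs(denom) < 1e-12] = 1e-12
        t = r / denom
        t -= V @ (V.T @ t)
        t /= np.linalg.norm(t)
        V = np.column_stack([V, t])
    return theta, x

# Small artificial, diagonally dominant 'CI-like' matrix for demonstration.
rng = np.random.default_rng(3)
n = 400
H = np.diag(np.arange(1.0, n + 1.0)) + 1e-3 * rng.standard_normal((n, n))
H = 0.5 * (H + H.T)

print(davidson_lowest(H)[0], np.linalg.eigvalsh(H)[0])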

Relevance: 20.00%

Abstract:

A simple extended finite field nuclear relaxation procedure for calculating vibrational contributions to degenerate four-wave mixing (also known as the intensity-dependent refractive index) is presented. As a by-product one also obtains the static vibrationally averaged linear polarizability, as well as the first and second hyperpolarizabilities. The methodology is validated by illustrative calculations on the water molecule. Further possible extensions are suggested.
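To illustrate the finite-field idea in general terms (not the paper's specific extension to degenerate four-wave mixing), the Python sketch below recovers the dipole, polarizability and hyperpolarizabilities from field-dependent energies by central differences. The model energy function and field step are invented for the example; in practice the energies at the five field points would come from electronic structure calculations.

import numpy as np

# Model 1-D 'energy vs. field' function standing in for the ab initio
# energies one would compute at zero field and at +/-F and +/-2F.
mu_true, alpha_true, beta_true, gamma_true = 0.7, 9.5, -8.0, 1200.0
def energy(F):
    return (-mu_true * F - alpha_true * F**2 / 2
            - beta_true * F**3 / 6 - gamma_true * F**4 / 24)

F = 0.002   # field step in atomic units, a typical finite-field choice
E = {k: energy(k * F) for k in (-2, -1, 0, 1, 2)}

# Central-difference estimates from the power-series expansion
# E(F) = E0 - mu F - (1/2) alpha F^2 - (1/6) beta F^3 - (1/24) gamma F^4.
mu    = -(E[1] - E[-1]) / (2 * F)
alpha = -(E[1] - 2 * E[0] + E[-1]) / F**2
beta  = -(E[2] - 2 * E[1] + 2 * E[-1] - E[-2]) / (2 * F**3)
gamma = -(E[2] - 4 * E[1] + 6 * E[0] - 4 * E[-1] + E[-2]) / F**4

print(mu, alpha, beta, gamma)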