919 results for Diagnostic Algorithm Development


Relevance: 40.00%

Abstract:

Development of a Sensorimotor Algorithm Able to Deal with Unforeseen Pushes and Its Implementation Based on VHDL is the title of the thesis that concludes my Bachelor's degree at the Escuela Técnica Superior de Ingeniería y Sistemas de Telecomunicación of the Universidad Politécnica de Madrid. It covers the work I carried out in the Neurorobotics Research Laboratory of the Beuth Hochschule für Technik Berlin during my ERASMUS year in 2015. The thesis is focused on the field of robotics, specifically on an electronic circuit called the Cognitive Sensorimotor Loop (CSL) and its control algorithm, described in the VHDL hardware description language. What makes the CSL special is its ability to operate a motor both as a sensor and as an actuator. In this way, any of the robot joints can reach a balanced position (e.g. the robot manages to stand) without needing any conventional sensor. In other words, the back electromotive force (EMF) induced in the motor coils is measured and the control algorithm responds according to its magnitude. The CSL circuit mainly contains an analog-to-digital converter (ADC) and a driver. The ADC is a delta-sigma modulator that generates a stream of bits whose proportion of 1s and 0s is proportional to the back EMF. The control algorithm, running on an FPGA, processes the bit frame and outputs a signal for the driver. The driver, which has an H-bridge topology, supplies the motor with the power it needs and allows it to rotate in both directions. The objective of this thesis is to document the experiments and overall work done on push-ignoring contractive sensorimotor algorithms, that is, sensorimotor algorithms that ignore large-magnitude forces (compared to gravity) applied over a short time interval to a pendulum system. This main objective is divided into two sub-objectives: (1) developing a system based on parameterized thresholds and (2) developing a system based on a push-bypassing filter. System (1) contains a module that outputs a signal blocking the main sensorimotor algorithm when a push is detected. This module takes several parameters as inputs, e.g. the back-EMF increment required to consider a force a push, or the time interval between samples. System (2) consists of a low-pass infinite impulse response (IIR) digital filter that cuts any frequency faster than a typical push oscillation. This filter required an intensive study of how to implement some functions and data types (fixed- or floating-point) not supported by the standard VHDL packages. Once this was achieved, the next challenge was to simplify the solution as much as possible without using unofficial user-made packages. Both systems showed a series of advantages and disadvantages of interest for this document: stability, reaction time, simplicity and computational load are among the many factors studied in the designed systems. Finally, some additions to the systems are also documented: a VGA visual interface, a module that compensates for the ADC offset, and the implementation of a bank of MIDI faders, among others.
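
The VHDL sources themselves are not part of this abstract. As a rough illustration of the two approaches, the following Python sketch models system (1) as a threshold-based push detector and system (2) as a first-order low-pass IIR filter, both operating on a back-EMF estimate taken as the fraction of 1s in each delta-sigma bit frame; all names, default values and the first-order filter structure are assumptions for illustration, not the thesis implementation.

# Illustrative sketch only: the thesis implements these ideas in VHDL on an FPGA.
# Names, default values and the first-order filter form are assumptions.

def back_emf_from_bitframe(bits):
    """Estimate the back EMF as the fraction of 1s in a delta-sigma bit frame."""
    return sum(bits) / len(bits)

class PushDetector:
    """System (1): block the sensorimotor loop when the back EMF jumps too quickly."""
    def __init__(self, emf_increment=0.2, sample_interval=1):
        self.emf_increment = emf_increment      # increment treated as a push (parameter)
        self.sample_interval = sample_interval  # samples between the compared readings
        self.history = []

    def is_push(self, emf):
        self.history.append(emf)
        if len(self.history) <= self.sample_interval:
            return False
        delta = emf - self.history[-1 - self.sample_interval]
        return abs(delta) > self.emf_increment

class LowPassIIR:
    """System (2): first-order low-pass IIR filter that attenuates fast push transients."""
    def __init__(self, alpha=0.05):
        self.alpha = alpha   # smaller alpha -> lower cut-off frequency
        self.state = 0.0

    def step(self, emf):
        self.state += self.alpha * (emf - self.state)
        return self.state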

Relevance: 40.00%

Abstract:

Macromolecular transport systems in bacteria currently are classified by function and sequence comparisons into five basic types. In this classification system, type II and type IV secretion systems both possess members of a superfamily of genes for putative NTP hydrolase (NTPase) proteins that are strikingly similar in structure, function, and sequence. These include VirB11, TrbB, TraG, GspE, PilB, PilT, and ComG1. The predicted protein product of tadA, a recently discovered gene required for tenacious adherence of Actinobacillus actinomycetemcomitans, also has significant sequence similarity to members of this superfamily and to several unclassified and uncharacterized gene products of both Archaea and Bacteria. To understand the relationship of tadA and tadA-like genes to those encoding the putative NTPases of type II/IV secretion, we used a phylogenetic approach to obtain a genealogy of 148 NTPase genes and reconstruct a scenario of gene superfamily evolution. In this phylogeny, clear distinctions can be made between type II and type IV families and their constituent subfamilies. In addition, the subgroup containing tadA constitutes a novel and extremely widespread subfamily of the family encompassing all putative NTPases of type IV secretion systems. We report diagnostic amino acid residue positions for each major monophyletic family and subfamily in the phylogenetic tree, and we propose an easy method for precisely classifying and naming putative NTPase genes based on phylogeny. This molecular key-based method can be applied to other gene superfamilies and represents a valuable tool for genome analysis.
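
The paper itself reports the diagnostic residue positions; they are not reproduced here. The sketch below only illustrates, with invented placeholder positions and residues, how a molecular key of family-diagnostic residues could be turned into a simple classifier for an aligned NTPase sequence.

# Sketch of a molecular-key classifier in the spirit of the proposed method:
# each family/subfamily is keyed by diagnostic residues at fixed alignment positions.
# The positions and residues below are hypothetical placeholders, NOT the
# diagnostic positions reported in the paper.

MOLECULAR_KEY = {
    "type II (GspE-like)":   {100: "K", 205: "E"},
    "type IV (VirB11-like)": {100: "K", 230: "H"},
    "TadA subfamily":        {100: "K", 230: "H", 310: "C"},
}

def classify(aligned_seq, key=MOLECULAR_KEY):
    """Return the most specific family whose diagnostic residues all match.

    aligned_seq: protein sequence already aligned to the reference coordinate
    system (1-based positions); '-' marks gaps.
    """
    best_name, best_len = "unclassified", 0
    for name, residues in key.items():
        ok = all(
            pos <= len(aligned_seq) and aligned_seq[pos - 1] == aa
            for pos, aa in residues.items()
        )
        if ok and len(residues) > best_len:
            best_name, best_len = name, len(residues)
    return best_name

# Toy usage: place the placeholder residues into a dummy sequence.
seq = list("A" * 400)
seq[99], seq[229], seq[309] = "K", "H", "C"
print(classify("".join(seq)))   # -> 'TadA subfamily'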

Relevance: 40.00%

Abstract:

An emerging public health phenomenon is the increasing incidence of methicillin-resistant Staphylococcus aureus (MRSA) infections that are acquired outside of health care facilities. One lineage of community-acquired MRSA (CA-MRSA) is known as the Western Samoan phage pattern (WSPP) clone. The central aim of this study was to develop an efficient genotyping procedure for the identification of WSPP isolates. The approach taken was to make use of the highly variable region downstream of mecA in combination with a single nucleotide polymorphism (SNP) defined by the S. aureus multilocus sequence typing (MLST) database. The premise was that a combinatorial genotyping method interrogating both a highly variable region and the genomic backbone would deliver a high degree of informative power relative to the number of genetic polymorphisms interrogated. Thirty-five MRSA isolates were used for this study, and their gene content and order downstream of mecA were determined. The CA-MRSA isolates were found to contain a truncated mecA downstream region consisting of mecA-HVR-IS431mec-dcs-Ins117, and a PCR-based method for identifying this structure was developed. The hospital-acquired isolates were found to contain eight different mecA downstream regions, three of which were novel. The Minimum SNPs software program was used to mine the S. aureus MLST database, and the arcC 2726 polymorphism was identified as 82% discriminatory for ST-30. A real-time PCR assay was developed to interrogate this SNP. We found that the assay for the truncated mecA downstream region, in combination with the interrogation of arcC position 272, provided an unambiguous identification of WSPP isolates.
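
As a rough sketch of the combinatorial call described above (not the laboratory protocol), an isolate could be classified as WSPP when both markers agree; the field names, allele value and example below are assumptions for illustration only.

# Illustrative sketch of the combinatorial genotyping logic: an isolate is called
# WSPP when the truncated mecA downstream structure is detected by PCR AND the
# arcC SNP allele associated with ST-30 is observed. Names and the placeholder
# allele are assumptions, not values taken from the paper.

from dataclasses import dataclass

@dataclass
class IsolateResult:
    truncated_mec_region_pcr: bool  # PCR positive for the truncated mecA downstream structure
    arcC_snp_allele: str            # base called at the interrogated arcC position

ST30_ARCC_ALLELE = "T"  # placeholder; the discriminatory base is defined by the MLST data

def is_wspp(result: IsolateResult) -> bool:
    """Combinatorial call: both the variable-region marker and the backbone SNP must match."""
    return result.truncated_mec_region_pcr and result.arcC_snp_allele == ST30_ARCC_ALLELE

print(is_wspp(IsolateResult(truncated_mec_region_pcr=True, arcC_snp_allele="T")))  # True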

Relevance: 40.00%

Abstract:

Aims: This paper describes the background to the establishment of the Substance Use Disorders Workgroup, which was charged with developing the research agenda for the next edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM). It summarizes 18 articles that were commissioned to inform that process. Methods: A preliminary list of research topics, developed at the DSM-V Launch Conference in 2004, led to the identification of topics that were the subject of formal presentations and detailed discussion at the Substance Use Disorders Conference in February 2005. Results: The 18 articles presented in this supplement examine: (1) categorical versus dimensional diagnoses; (2) the neurobiological basis of substance use disorders; (3) social and cultural perspectives; (4) the crosswalk between DSM-IV and the International Classification of Diseases, Tenth Revision (ICD-10); (5) comorbidity of substance use disorders and mental health disorders; (6) subtypes of disorders; (7) issues in adolescence; (8) substance-specific criteria; (9) the place of non-substance addictive disorders; and (10) the available research resources. Conclusions: The final paper presents a broadly based research agenda for the development of diagnostic concepts and criteria for substance use disorders.

Relevance: 40.00%

Abstract:

This paper considers the problem of concept generalization in decision-making systems, taking into account such features of real-world databases as large size, incompleteness and inconsistency of the stored information. Methods from rough set theory (lower and upper approximations, positive regions and reducts) are used to solve this problem. A new discretization algorithm for continuous attributes is proposed. It substantially increases the overall performance of the generalization algorithms and can be applied to the processing of real-valued attributes in large data tables. A search algorithm for significant attributes, combined with the discretization stage, is also developed. It avoids splitting the continuous domains of insignificant attributes into intervals.
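
As a reminder of the rough-set machinery the paper builds on, the sketch below computes the lower and upper approximations of a target concept from an indiscernibility partition; the data layout and the toy table are assumptions for illustration, and the paper's discretization and attribute-search algorithms are not reproduced.

# Minimal sketch of the rough-set notions used above: lower and upper
# approximations of a concept with respect to a set of condition attributes.

from collections import defaultdict

def partition(objects, attrs):
    """Group objects into indiscernibility classes by their values on attrs."""
    classes = defaultdict(set)
    for obj_id, row in objects.items():
        classes[tuple(row[a] for a in attrs)].add(obj_id)
    return classes.values()

def approximations(objects, attrs, concept):
    """Return (lower, upper) approximations of the concept (a set of object ids)."""
    lower, upper = set(), set()
    for cls in partition(objects, attrs):
        if cls <= concept:          # class entirely inside the concept
            lower |= cls
        if cls & concept:           # class overlaps the concept
            upper |= cls
    return lower, upper

# Toy, inconsistent table: objects 2 and 3 agree on all attributes but only 2
# belongs to the concept, so the concept is only roughly definable.
table = {
    1: {"a": 0, "b": 1},
    2: {"a": 1, "b": 1},
    3: {"a": 1, "b": 1},
    4: {"a": 0, "b": 0},
}
concept = {1, 2}
print(approximations(table, ["a", "b"], concept))   # lower = {1}, upper = {1, 2, 3}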

Relevance: 40.00%

Abstract:

Knowledge-based radiation treatment is an emerging concept in radiotherapy. It mainly refers to techniques that can guide or automate treatment planning in the clinic by learning from prior knowledge. Different models have been developed to realize it, one of which was proposed by Yuan et al. at Duke for lung IMRT planning. This model can automatically determine both the beam configuration and the optimization objectives for plans with non-coplanar beams, based on patient-specific anatomical information. Although plans automatically generated by this model demonstrate dosimetric quality equivalent to or better than clinically approved plans, its validity and generality are limited by the empirical assignment of a coefficient called the angle spread constraint, defined in the beam efficiency index used for beam ranking. To eliminate these limitations, a systematic study of this coefficient is needed to provide evidence for its optimal value.

To this end, eleven lung cancer patients with complex tumor shapes, whose clinically approved plans used non-coplanar beams, were studied retrospectively within the framework of the automatic lung IMRT treatment-planning algorithm. The primary and boost plans used in three patients were treated as different cases because of their different target sizes and shapes. A total of 14 lung cases were therefore re-planned using the knowledge-based automatic lung IMRT planning algorithm, varying the angle spread constraint from 0 to 1 in increments of 0.2. A modified beam angle efficiency index was adopted to guide the beam selection, and great effort was made to keep the quality of the plans associated with each angle spread constraint as good as possible. Important dosimetric parameters for the PTV and OARs, quantitatively reflecting plan quality, were extracted from the DVHs and analyzed as a function of the angle spread constraint for each case. These parameters were compared between the clinical plans and the model-based plans using two-sample Student's t-tests, and a regression analysis was performed on a composite index, built from the percentage errors between the dosimetric parameters of the model-based plans and those of the clinical plans, as a function of the angle spread constraint.

The results show that the model-based plans generally have quality equivalent to or better than the clinically approved plans, both qualitatively and quantitatively. All dosimetric parameters in the automatically generated plans, except those for the lungs, are statistically better than or comparable to those in the clinical plans. On average, reductions of more than 15% in the conformity and homogeneity indices for the PTV and in V40 and V60 for the heart are observed, along with increases of 8% and 3% in V5 and V20 for the lungs, respectively. The intra-plan comparison among the model-based plans demonstrates that plan quality does not change much for angle spread constraints larger than 0.4. Further examination of the variation of the composite index as a function of the angle spread constraint shows that 0.6 is the optimal value, yielding the statistically best achievable plans.
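
The abstract does not give the form of the composite index; as a rough illustration of the parameter sweep it describes, the sketch below conceptually re-plans for each angle spread constraint value and aggregates the percentage errors of selected dosimetric parameters relative to the clinical plan into a single score. The parameter list, error definition and aggregation are assumptions, not the thesis's definitions.

# Illustrative sketch of the angle-spread-constraint sweep and a composite index.
# The dosimetric parameter set, sign conventions and aggregation are assumptions.

import numpy as np

DOSIMETRIC_PARAMS = ["PTV_CI", "PTV_HI", "heart_V40", "heart_V60", "lung_V5", "lung_V20"]

def composite_index(model_plan, clinical_plan, params=DOSIMETRIC_PARAMS):
    """Mean percentage error of the model-based plan relative to the clinical plan."""
    errors = [
        100.0 * (model_plan[p] - clinical_plan[p]) / clinical_plan[p]
        for p in params
    ]
    return float(np.mean(errors))

def sweep_angle_spread_constraint(replan, clinical_plan, values=np.arange(0.0, 1.2, 0.2)):
    """Re-plan for each angle spread constraint value and score the resulting plan.

    replan(asc) is assumed to return a dict of dosimetric parameters for the
    plan generated with that angle spread constraint.
    """
    return {round(float(asc), 1): composite_index(replan(asc), clinical_plan)
            for asc in values}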

Relevance: 40.00%

Abstract:

The effectiveness of an optimization algorithm can be reduced to its ability to navigate an objective function’s topology. Hybrid optimization algorithms combine various optimization algorithms using a single meta-heuristic so that the hybrid algorithm is more robust, computationally efficient, and/or accurate than the individual algorithms it is made of. This thesis proposes a novel meta-heuristic that uses search vectors to select the constituent algorithm that is appropriate for a given objective function. The hybrid is shown to perform competitively against several existing hybrid and non-hybrid optimization algorithms over a set of three hundred test cases. This thesis also proposes a general framework for evaluating the effectiveness of hybrid optimization algorithms. Finally, this thesis presents an improved Method of Characteristics Code with novel boundary conditions, which better characterizes pipelines than previous codes. This code is coupled with the hybrid optimization algorithm in order to optimize the operation of real-world piston pumps.
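
The abstract does not specify the search-vector selection rule, so the sketch below shows only a generic hybrid skeleton: two constituent local-search moves and a meta-heuristic that keeps allocating iterations to whichever move has recently made the most progress. All names and the greedy credit rule are assumptions for illustration, not the thesis's method.

# Minimal sketch of a hybrid optimizer: a meta-heuristic that alternates between
# constituent moves and rewards whichever one improves the objective the most.

import random

def random_restart_step(x, scale=1.0):
    """Constituent 1: propose a random point near x."""
    return [xi + random.gauss(0.0, scale) for xi in x]

def coordinate_step(x, scale=0.1):
    """Constituent 2: perturb a single coordinate of x."""
    y = list(x)
    i = random.randrange(len(x))
    y[i] += random.gauss(0.0, scale)
    return y

def hybrid_minimize(f, x0, iters=1000, constituents=(random_restart_step, coordinate_step)):
    x, fx = list(x0), f(x0)
    credit = {c: 1.0 for c in constituents}                 # running improvement credit
    for _ in range(iters):
        step = max(constituents, key=lambda c: credit[c])   # greedy constituent selection
        y = step(x)
        fy = f(y)
        credit[step] = 0.9 * credit[step] + max(fx - fy, 0.0)
        if fy < fx:
            x, fx = y, fy
    return x, fx

# Example: minimize a simple quadratic bowl.
print(hybrid_minimize(lambda v: sum(vi * vi for vi in v), [3.0, -2.0])[1])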

Relevance: 40.00%

Abstract:

Unmanned aerial vehicles (UAVs) frequently operate in partially or entirely unknown environments. As the vehicle traverses the environment and detects new obstacles, rapid path replanning is essential to avoid collisions. This thesis presents a new algorithm called Hierarchical D* Lite (HD*), which combines the incremental algorithm D* Lite with a novel hierarchical path planning approach to replan paths sufficiently fast for real-time operation. Unlike current hierarchical planning algorithms, HD* does not require map corrections before planning a new path. Directional cost scale factors, path smoothing, and Catmull-Rom splines are used to ensure the resulting paths are feasible. HD* sacrifices optimality for real-time performance. Its computation time and path quality are dependent on the map size, obstacle density, sensor range, and any restrictions on planning time. For the most complex scenarios tested, HD* found paths within 10% of optimal in under 35 milliseconds.
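
Of the feasibility ingredients listed above, the Catmull-Rom spline step is standard enough to sketch. The snippet below is a generic uniform Catmull-Rom smoother for a coarse 2-D waypoint path, not the HD* code; the sampling density and the endpoint handling are assumptions.

# Generic uniform Catmull-Rom smoothing of a coarse planner path (illustrative only).

def catmull_rom_point(p0, p1, p2, p3, t):
    """Evaluate a uniform Catmull-Rom spline segment between p1 and p2 at t in [0, 1]."""
    t2, t3 = t * t, t * t * t
    return tuple(
        0.5 * (2 * b + (-a + c) * t
               + (2 * a - 5 * b + 4 * c - d) * t2
               + (-a + 3 * b - 3 * c + d) * t3)
        for a, b, c, d in zip(p0, p1, p2, p3)
    )

def smooth_path(waypoints, samples_per_segment=10):
    """Interpolate a polyline of waypoints into a smooth path through the same points."""
    if len(waypoints) < 2:
        return list(waypoints)
    # Duplicate the endpoints so the spline passes through the first and last waypoints.
    pts = [waypoints[0]] + list(waypoints) + [waypoints[-1]]
    path = []
    for i in range(1, len(pts) - 2):
        for s in range(samples_per_segment):
            path.append(catmull_rom_point(pts[i - 1], pts[i], pts[i + 1], pts[i + 2],
                                          s / samples_per_segment))
    path.append(waypoints[-1])
    return path

# Example: smooth a coarse 2-D path produced by a grid planner.
print(smooth_path([(0, 0), (1, 2), (3, 3), (5, 1)], samples_per_segment=4)[:3])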

Relevance: 40.00%

Abstract:

Rainflow counting methods convert a complex load time history into a set of load reversals for use in fatigue damage modeling. They were originally developed to assess fatigue damage associated with mechanical cycling where creep of the material under load was not considered a significant contributor to failure. However, creep is a significant factor in some cyclic loading cases, such as solder interconnects under temperature cycling, where fatigue life models require the dwell time to account for stress relaxation and creep. This study develops a new version of the multi-parameter rainflow counting algorithm that provides a range-based dwell time estimate for use with time-dependent fatigue damage models. To show its applicability, the method is used to calculate the life of solder joints under a complex thermal cycling regime and is verified by experimental testing. An additional algorithm is developed in this study to reduce the data produced by the rainflow counting. It uses a damage model and a statistical test to determine which of the resulting cycles are statistically insignificant at a given confidence level. This makes the resulting data file smaller and allows a simplified load history to be reconstructed.
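
As background for what the multi-parameter extension builds on, the sketch below is a simplified three-point rainflow counter in the spirit of ASTM E1049, counting closed pairs as full cycles and the residue as half cycles; the range-based dwell time estimation developed in the thesis is not reproduced, and the simplified start-point handling is an assumption.

# Simplified three-point rainflow counter (illustrative only).

def reversals(history):
    """Keep only the peaks and valleys of the load time history."""
    pts = [history[0]]
    for x in history[1:]:
        if x == pts[-1]:
            continue
        if len(pts) >= 2 and (pts[-1] - pts[-2]) * (x - pts[-1]) > 0:
            pts[-1] = x              # still rising/falling: extend the current reversal
        else:
            pts.append(x)
    return pts

def rainflow(history):
    """Return (range, mean, count) tuples; count is 1.0 for full and 0.5 for half cycles."""
    stack, cycles = [], []
    for point in reversals(history):
        stack.append(point)
        while len(stack) >= 3:
            x = abs(stack[-1] - stack[-2])
            y = abs(stack[-2] - stack[-3])
            if x < y:
                break
            # The inner pair forms a closed cycle; remove it and keep checking.
            cycles.append((y, 0.5 * (stack[-2] + stack[-3]), 1.0))
            del stack[-3:-1]
    # Remaining reversals form the residue, counted here as half cycles.
    for a, b in zip(stack, stack[1:]):
        cycles.append((abs(b - a), 0.5 * (a + b), 0.5))
    return cycles

print(rainflow([0, 5, 1, 4, 2, 6, 0]))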

Relevance: 30.00%

Abstract:

The lack of a satisfactory consensus for characterizing system intelligence, and of structured analytical decision models, has inhibited developers and practitioners from understanding and configuring optimum intelligent building systems in a fully informed manner. So far, little research has been conducted in this area. This research is designed to identify the key intelligence indicators and to develop analytical models for computing the system intelligence score of a smart building system in an intelligent building. The integrated building management system (IBMS) was used as an illustrative example to present the framework. The models presented in this study apply system intelligence theory and the conceptual analytical framework. A total of 16 key intelligence indicators were first identified from a general survey. Two multi-criteria decision making (MCDM) approaches, the analytic hierarchy process (AHP) and the analytic network process (ANP), were then employed to develop the system intelligence analytical models. The top intelligence indicators of the IBMS include self-diagnosis of operation deviations, an adaptive limiting control algorithm, and year-round time schedule performance. The conceptual framework was then transformed into a practical model, whose effectiveness was evaluated by means of expert validation. The main contribution of this research is to promote understanding of the intelligence indicators and to lay the foundation for a systemic framework that provides developers and building stakeholders with a consolidated, inclusive tool for evaluating the system intelligence of proposed component design configurations.
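
As a minimal sketch of the AHP step used in such analytical models, the snippet below derives indicator weights from a pairwise comparison matrix via the principal eigenvector and checks consistency. The 3x3 example matrix is invented for illustration; it does not reproduce the survey data or the 16 indicators identified in the study.

# AHP priority weights from a reciprocal pairwise comparison matrix (illustrative).

import numpy as np

RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41}

def ahp_weights(A):
    """Return (weights, consistency_ratio) for a reciprocal pairwise comparison matrix A."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                 # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                             # normalized priority weights
    ci = (eigvals[k].real - n) / (n - 1)        # consistency index
    cr = ci / RANDOM_INDEX[n] if RANDOM_INDEX.get(n) else 0.0
    return w, cr

# Example with invented judgments: indicator 1 judged 3x as important as indicator 2
# and 5x as important as indicator 3.
A = [[1, 3, 5],
     [1 / 3, 1, 2],
     [1 / 5, 1 / 2, 1]]
w, cr = ahp_weights(A)
print(np.round(w, 3), round(cr, 3))   # weights sum to 1; CR < 0.1 indicates acceptable consistency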