956 results for: diagnostic and prognostic algorithms development
Abstract:
Using the MIT Serial Link Direct Drive Arm as the main experimental device, various issues in trajectory and force control of manipulators were studied in this thesis. Since accurate modeling is important for any controller, issues of estimating the dynamic model of a manipulator and its load were addressed first. Practical and effective algorithms were developed from the Newton-Euler equations to estimate the inertial parameters of manipulator rigid-body loads and links. Load estimation was implemented both on a PUMA 600 robot and on the MIT Serial Link Direct Drive Arm. With the link estimation algorithm, the inertial parameters of the direct drive arm were obtained. For both load and link estimation, the estimated parameters are good models of the actual system for control purposes, since torques and forces can be predicted accurately from them. The estimated model of the direct drive arm was then used to evaluate trajectory-following performance under feedforward and computed torque control algorithms. The experimental evaluations showed that dynamic compensation can greatly improve trajectory-following accuracy. Various stability issues of force control were studied next. It was determined that there are two types of instability in force control. Dynamic instability, present in all of the previous force control algorithms discussed in this thesis, is caused by the interaction of a manipulator with a stiff environment. Kinematic instability is present only in the hybrid control algorithm of Raibert and Craig, and is caused by the interaction of the inertia matrix with the Jacobian inverse coordinate transformation in the feedback path. Several methods were suggested and demonstrated experimentally to solve these stability problems. The results of the stability analyses were then incorporated in implementing a stable force/position controller on the direct drive arm by the modified resolved acceleration method, using both joint torque and wrist force sensor feedback.
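The estimation algorithms above rest on a classical fact: the Newton-Euler equations are linear in the inertial parameters, so stacking samples along a trajectory turns estimation into ordinary least squares. A minimal sketch of that idea, with a purely synthetic regressor standing in for the real Newton-Euler regressor (all numbers are illustrative, not from the thesis):

```python
import numpy as np

# Joint torques satisfy tau = Phi(q, qd, qdd) @ theta, where theta collects the
# inertial parameters. Stacking many trajectory samples into Phi makes the
# estimation a linear least-squares problem.
rng = np.random.default_rng(0)
theta_true = np.array([1.2, 0.3, 0.05])  # e.g. mass, first moment, inertia (illustrative)
Phi = rng.standard_normal((200, 3))      # synthetic stand-in for the sampled regressor
tau = Phi @ theta_true + 0.01 * rng.standard_normal(200)  # noisy torque measurements
theta_hat, *_ = np.linalg.lstsq(Phi, tau, rcond=None)
```

Predicting torques from `theta_hat` and comparing against measurements is then the "good model for control purposes" check that the abstract describes.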
Abstract:
In this paper a precorrected FFT-Fast Multipole Tree (pFFT-FMT) method for solving the potential flow around arbitrary three-dimensional bodies is presented. The method takes advantage of the efficiency of the pFFT and FMT algorithms to facilitate more demanding computations such as automatic wake generation and hands-off steady and unsteady aerodynamic simulations. The velocity potential on the body surfaces and in the domain is determined using a pFFT Boundary Element Method (BEM) approach based on the Green’s Theorem Boundary Integral Equation. The vorticity trailing all lifting surfaces in the domain is represented using a Fast Multipole Tree, time-advected, vortex particle method. Some simple steady-state flow solutions are performed to demonstrate the basic capabilities of the solver. Although this paper focuses primarily on steady-state solutions, it should be noted that this approach is designed to be a robust and efficient unsteady potential flow simulation tool, useful for rapid computational prototyping.
Abstract:
We compare a broad range of optimal product line design methods. The comparisons take advantage of recent advances that make it possible to identify the optimal solution to problems that are too large for complete enumeration. Several of the methods perform surprisingly well, including Simulated Annealing, Product-Swapping and Genetic Algorithms. The Product-Swapping heuristic is remarkable for its simplicity. The performance of this heuristic suggests that the optimal product line design problem may be far easier to solve in practice than indicated by complexity theory.
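To illustrate why the Product-Swapping heuristic is so simple, here is a minimal hill-climbing version against a share-of-choices objective (the data, line size, and objective here are invented for illustration; the paper's exact formulation may differ):

```python
import itertools

def share_of_choices(line, utilities, outside=0.0):
    """Count customers whose best in-line product beats the outside option."""
    return sum(1 for u in utilities if max(u[p] for p in line) > outside)

def product_swap(utilities, candidates, k, outside=0.0):
    """Product-swapping heuristic: start from an arbitrary line of k products
    and keep applying an improving single-product swap until none exists."""
    line = set(candidates[:k])
    improved = True
    while improved:
        improved = False
        best = share_of_choices(line, utilities, outside)
        for inside, candidate in itertools.product(sorted(line), candidates):
            if candidate in line:
                continue
            trial = (line - {inside}) | {candidate}
            if share_of_choices(trial, utilities, outside) > best:
                line, improved = trial, True
                break  # restart the swap scan from the improved line
    return line

# Toy data: 3 customers with utilities over 4 candidate products
utilities = [
    {"A": 1.0, "B": -1.0, "C": 0.5, "D": -0.5},
    {"A": -1.0, "B": 2.0, "C": -0.5, "D": 0.1},
    {"A": -0.2, "B": -0.3, "C": -0.1, "D": 1.5},
]
line = product_swap(utilities, ["A", "B", "C", "D"], k=2)
```

Each pass only evaluates in-line/out-of-line swaps, which is what makes the heuristic cheap relative to enumeration.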
Abstract:
The cerebrospinal fluid (CSF) lactic acid concentration in patients with suspected postoperative meningitis following clipping of a cerebral aneurysm and spontaneous subarachnoid hemorrhage was measured prospectively over a three-year period. A total of 32 CSF samples were analyzed; the lactic acid concentration was measured and compared with the CSF culture. Cultures were positive in five patients, for an infection prevalence of 15%. Using a lactic acid threshold of 4 mmol/L, we found a sensitivity of 80%, specificity of 52%, PPV of 23%, NPV of 93%, and a positive likelihood ratio (LHR) of 1.66 with a post-test probability of 15% for the lactic acid concentration in the diagnosis of postoperative meningitis in patients with aneurysmal subarachnoid hemorrhage. The CSF lactic acid concentration has limited performance in the diagnosis of postoperative meningitis in patients with aneurysmal subarachnoid hemorrhage.
Abstract:
A review chapter on the studies and pathologies most frequently assessed with neuroimaging, illustrated with images from the neuroradiology archive of the Fundación Cardio Infantil and accompanied by interpretation algorithms.
Abstract:
Non-specific Occupational Low Back Pain (NOLBP) is a health condition that generates high absenteeism and disability. Because its causes are multifactorial, accurate diagnosis and prognosis are difficult to establish. Clinical prediction of NOLBP refers to a series of models that integrate multivariate analysis to determine the early diagnosis, course, and occupational impact of this health condition. Objective: to identify predictor factors of NOLBP and the type of material referred to in the scientific evidence, and to establish the scope of the prediction. Materials and methods: the title search was conducted in the PubMed, Science Direct, Ebsco, and Springer databases between 1985 and 2012. The selected articles were classified through a bibliometric analysis, allowing the most relevant ones to be identified. Results: 101 titles met the established criteria, but only 43 met the purpose of the review. The NOLBP prediction studies varied with respect to the factors considered, for example: diagnosis, transition of lumbar pain from acute to chronic, absenteeism from work, disability, and return to work. Conclusion: clinical prediction is considered a strategy to determine the course and prognosis of NOLBP and to identify the characteristics that increase the risk of chronicity in workers with this condition. Likewise, clinical prediction rules are tools that aim to facilitate decision making about the evaluation, diagnosis, prognosis, and intervention for low back pain, and they should incorporate physical, psychological, and social risk factors.
Abstract:
Introduction: Colorectal cancer is the third most frequently diagnosed cancer in men and the second in women worldwide. Up to 1,000 new cases are diagnosed in Colombia each year, so it is important to report the experience with this disease at a recently created specialist center, the “Méderi, Hospital Universitario Mayor”. Materials and methods: A cross-sectional study was conducted of the population diagnosed with colorectal cancer and treated between August 2012 and December 2014, the period during which the Coloproctology service has been in operation. Results: A total of 152 patients with colorectal cancer were treated at the institution, and 91% of them underwent surgery. The most frequent stage was IV. Only 4.9% presented anastomotic dehiscence, consistent with the literature when management is in expert hands. The most frequent histological subtype was moderately differentiated adenocarcinoma, and perioperative mortality was 2.63%. Discussion: Colorectal cancer is an entity with high morbidity and mortality, which can change if screening tests are performed so that early and timely management can be provided. The surgeon's experience and the discussion of patients in multidisciplinary boards also play an important role. Keywords: colon cancer, rectal cancer, epidemiology, staging
Abstract:
Essential reading for adult-learning professionals, students, and human resources practitioners, this book provides a theoretical framework for understanding the problems of adult learning, both in educational settings and in the workplace. It is divided into three parts. The first part, 'The roots of andragogy', traces the development of this theory and the characteristics of adult learners. The second part, 'Advances in adult learning', explains its future prospects in research and practice. The last part, 'Andragogy in practice', presents selected readings that develop specific aspects of andragogy in practice, including strategies for implementing the basic assumptions, for adapting learning to individual differences, and for implementing adult education in organizations. Of special interest are two self-assessment instruments, 'the Core Competency Diagnostic and Planning Guide' and 'the Personal Adult Learning Style Inventory', which set the reader on the path of personal development in adult learning.
Abstract:
Applications such as neuroscience, telecommunication, online social networking, transport and retail trading give rise to connectivity patterns that change over time. In this work, we address the resulting need for network models and computational algorithms that deal with dynamic links. We introduce a new class of evolving range-dependent random graphs that gives a tractable framework for modelling and simulation. We develop a spectral algorithm for calibrating a set of edge ranges from a sequence of network snapshots and give a proof of principle illustration on some neuroscience data. We also show how the model can be used computationally and analytically to investigate the scenario where an evolutionary process, such as an epidemic, takes place on an evolving network. This allows us to study the cumulative effect of two distinct types of dynamics.
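Range-dependent random graphs are straightforward to simulate. The sketch below uses a Grindrod-style edge probability, alpha * lambda**(range - 1), together with an invented birth/death dynamic, purely to illustrate the model class; all parameter values are made up, not taken from the paper:

```python
import random

def evolve_rdrg(n, steps, alpha=0.9, lam=0.6, death=0.2, seed=0):
    """Evolving range-dependent random graph on nodes 0..n-1 (a sketch):
    an absent edge {i, j} is born with probability alpha * lam**(|i-j| - 1),
    and an existing edge dies with a fixed probability per step.
    Returns the sequence of edge sets (network snapshots)."""
    rng = random.Random(seed)
    edges = set()
    snapshots = []
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    for _ in range(steps):
        for (i, j) in pairs:
            if (i, j) in edges:
                if rng.random() < death:
                    edges.discard((i, j))
            else:
                if rng.random() < alpha * lam ** (j - i - 1):
                    edges.add((i, j))
        snapshots.append(set(edges))
    return snapshots
```

Because the birth probability decays geometrically with range, short-range links dominate every snapshot, which is the qualitative signature the spectral calibration in the abstract would have to recover.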
Abstract:
The combination of radar and lidar in space offers the unique potential to retrieve vertical profiles of ice water content and particle size globally, and two algorithms developed recently claim to have overcome the principal difficulty with this approach, that of correcting the lidar signal for extinction. In this paper "blind tests" of these algorithms are carried out, using realistic 94-GHz radar and 355-nm lidar backscatter profiles simulated from aircraft-measured size spectra, and including the effects of molecular scattering, multiple scattering, and instrument noise. Radiation calculations are performed on the true and retrieved microphysical profiles to estimate the accuracy with which radiative flux profiles could be inferred remotely. It is found that the visible extinction profile can be retrieved independent of assumptions on the nature of the size distribution, the habit of the particles, the mean extinction-to-backscatter ratio, or errors in instrument calibration. Local errors in retrieved extinction can occur in proportion to local fluctuations in the extinction-to-backscatter ratio, but down to 400 m above the height of the lowest lidar return, optical depth is typically retrieved to better than 0.2. Retrieval uncertainties are greater at the far end of the profile, and errors in total optical depth can exceed 1, which changes the shortwave radiative effect of the cloud by around 20%. Longwave fluxes are much less sensitive to errors in total optical depth, and may generally be calculated to better than 2 W m(-2) throughout the profile. It is important for retrieval algorithms to account for the effects of lidar multiple scattering, because if this is neglected, then optical depth is underestimated by approximately 35%, resulting in cloud radiative effects being underestimated by around 30% in the shortwave and 15% in the longwave.
Unlike the extinction coefficient, the inferred ice water content and particle size can vary by 30%, depending on the assumed mass-size relationship (a problem common to all remote retrieval algorithms). However, radiative fluxes are almost completely determined by the extinction profile, and if this is correct, then errors in these other parameters have only a small effect in the shortwave (around 6%, compared to that of clear sky) and a negligible effect in the longwave.
Abstract:
Midlatitude cyclones are important contributors to boundary layer ventilation. However, it is uncertain how efficient such systems are at transporting pollutants out of the boundary layer, and variations between cyclones are unexplained. In this study 15 idealized baroclinic life cycles, with a passive tracer included, are simulated to identify the relative importance of two transport processes: horizontal divergence and convergence within the boundary layer and large-scale advection by the warm conveyor belt. Results show that the amount of ventilation is insensitive to surface drag over a realistic range of values. This indicates that although boundary layer processes are necessary for ventilation they do not control the magnitude of ventilation. A diagnostic for the mass flux out of the boundary layer has been developed to identify the synoptic-scale variables controlling the strength of ascent in the warm conveyor belt. A very high level of correlation (R-2 values exceeding 0.98) is found between the diagnostic and the actual mass flux computed from the simulations. This demonstrates that the large-scale dynamics control the amount of ventilation, and the efficiency of midlatitude cyclones to ventilate the boundary layer can be estimated using the new mass flux diagnostic. We conclude that meteorological analyses, such as ERA-40, are sufficient to quantify boundary layer ventilation by the large-scale dynamics.
Abstract:
With the latest advances in advanced computer architectures, we are already seeing large-scale machines at the petascale level, and exascale computing is under discussion. All of these require efficient, scalable algorithms in order to bridge the performance gap. In this paper, examples of various approaches to designing scalable algorithms for such advanced architectures are given, and the corresponding properties of these algorithms are outlined and discussed. The examples cover scalable algorithms applied to large-scale problems in areas such as Computational Biology and Environmental Modelling, and the key properties of such advanced and scalable algorithms are highlighted.
Abstract:
This paper formally derives a new path-based neural branch prediction algorithm (FPP) into blocks of size two for a lower-cost hardware solution while maintaining input-output characteristics similar to those of the original algorithm. The blocked solution, here referred to as the B2P algorithm, is obtained using graph theory and retiming methods. Verification approaches were exercised to show that the prediction performances of the FPP and B2P algorithms differ by less than one misprediction per thousand instructions, using a known framework for branch prediction evaluation. For a chosen FPGA device, circuits generated from the B2P algorithm showed average area savings of over 25% against circuits for the FPP algorithm with similar timing performance, making the proposed blocked predictor superior from a practical viewpoint.
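The FPP algorithm itself is not reproduced in the abstract. For readers unfamiliar with neural branch prediction, a minimal (non-path-based) perceptron predictor in the style of Jimenez and Lin conveys the underlying idea that FPP and its blocked B2P variant refine; table size, history length, and threshold below are conventional illustrative choices:

```python
class PerceptronPredictor:
    """Minimal perceptron branch predictor: each branch hashes to a weight
    vector, the prediction is the sign of a dot product with the global
    history (encoded as +/-1), and training nudges weights on mispredictions
    or low-confidence predictions."""
    def __init__(self, n_entries=64, hist_len=8):
        self.hist_len = hist_len
        self.theta = int(1.93 * hist_len + 14)  # common training threshold
        self.weights = [[0] * (hist_len + 1) for _ in range(n_entries)]
        self.history = [1] * hist_len  # +1 = taken, -1 = not taken

    def predict(self, pc):
        w = self.weights[pc % len(self.weights)]
        y = w[0] + sum(wi * hi for wi, hi in zip(w[1:], self.history))
        return y >= 0, y

    def update(self, pc, taken):
        pred, y = self.predict(pc)
        t = 1 if taken else -1
        w = self.weights[pc % len(self.weights)]
        if pred != taken or abs(y) <= self.theta:
            w[0] += t
            for k in range(self.hist_len):
                w[k + 1] += t * self.history[k]
        self.history = self.history[1:] + [t]
```

The dot-product datapath is exactly the part that blocking reorganizes for hardware: grouping weights into blocks shortens the adder chains while preserving the input-output behavior.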
Abstract:
An important goal in computational neuroanatomy is the complete and accurate simulation of neuronal morphology. We are developing computational tools to model three-dimensional dendritic structures based on sets of stochastic rules. This paper reports an extensive, quantitative anatomical characterization of simulated motoneurons and Purkinje cells. We used several local and global algorithms implemented in the L-Neuron and ArborVitae programs to generate sets of virtual neurons. Parameter statistics for all algorithms were measured from experimental data, thus providing a compact and consistent description of these morphological classes. We compared the emergent anatomical features of each group of virtual neurons with those of the experimental database in order to gain insight into the plausibility of the model assumptions, potential improvements to the algorithms, and non-trivial relations among morphological parameters. Algorithms based mainly on local constraints (e.g., branch diameter) were successful in reproducing many morphological properties of both motoneurons and Purkinje cells (e.g., total length, asymmetry, number of bifurcations). The addition of global constraints (e.g., trophic factors) improved the angle-dependent emergent characteristics (average Euclidean distance from the soma to the dendritic terminations, dendritic spread). Virtual neurons systematically displayed greater anatomical variability than real cells, suggesting the need for additional constraints in the models. For several emergent anatomical properties, a specific algorithm reproduced the experimental statistics better than the others did. However, relative performances were often reversed for different anatomical properties and/or morphological classes. Thus, combining the strengths of alternative generative models could lead to comprehensive algorithms for the complete and accurate simulation of dendritic morphology.
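A toy version of the local-constraint approach shows the flavor of diameter-dependent stochastic rules like those in L-Neuron; the rule structure and every parameter below are invented for illustration, not taken from the paper:

```python
import random

def grow_dendrite(diameter, rng, min_diam=0.2, taper=0.9, branch_p=0.35):
    """Toy local-rule dendrite generator: each segment tapers, then either
    bifurcates into two thinner daughters or extends; growth terminates when
    the diameter falls below min_diam. Returns the segment count of the tree."""
    if diameter < min_diam:
        return 0  # terminal tip
    d = diameter * taper
    if rng.random() < branch_p:
        child = d * 0.7  # simplified Rall-like split into thinner daughters
        return 1 + grow_dendrite(child, rng, min_diam, taper, branch_p) \
                 + grow_dendrite(child, rng, min_diam, taper, branch_p)
    return 1 + grow_dendrite(d, rng, min_diam, taper, branch_p)

rng = random.Random(1)
sizes = [grow_dendrite(2.0, rng) for _ in range(100)]
```

Running the same rules many times yields a population of virtual trees whose emergent statistics (here just size) can be compared against measured morphologies, which is the validation strategy the abstract describes.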
Abstract:
Two so-called “integrated” polarimetric rate estimation techniques, ZPHI (Testud et al., 2000) and ZZDR (Illingworth and Thompson, 2005), are evaluated using 12 episodes of the year 2005 observed by the French C-band operational Trappes radar, located near Paris. The term “integrated” means that the concentration parameter of the drop size distribution is assumed to be constant over some area and the algorithms retrieve it using the polarimetric variables in that area. The evaluation is carried out in ideal conditions (no partial beam blocking, no ground-clutter contamination, no bright-band contamination, a posteriori calibration of the radar variables ZH and ZDR) using hourly rain gauges located at distances less than 60 km from the radar. Also included in the comparison, for the sake of benchmarking, is a conventional Z = 282R^1.66 estimator, with and without attenuation correction and with and without adjustment by rain gauges as currently done operationally at Météo France. Under those ideal conditions, the two polarimetric algorithms, which rely solely on radar data, appear to perform as well if not better, depending on the measurement conditions (attenuation, rain rates, …), than the conventional algorithms, even when the latter take rain gauges into account through the adjustment scheme. ZZDR with attenuation correction is the best estimator for hourly rain gauge accumulations lower than 5 mm h−1, and ZPHI is the best one above that threshold. A perturbation analysis has been conducted to assess the sensitivity of the various estimators with respect to biases on ZH and ZDR, taking into account the typical accuracy and stability that can reasonably be achieved with modern operational radars these days (1 dB on ZH and 0.2 dB on ZDR). A +1 dB positive bias on ZH (radar too hot) results in a +14% overestimation of the rain rate with the conventional estimator used in this study (Z = 282R^1.66), a −19% underestimation with ZPHI, and a +23% overestimation with ZZDR. Additionally, a +0.2 dB positive bias on ZDR results in a typical rain rate underestimation of 15% by ZZDR.
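The sensitivity of the conventional estimator follows directly from its form: since R ∝ Z^(1/b), a +1 dB bias on ZH scales R by 10**(0.1/1.66) ≈ 1.15, of the same order as the +14% quoted above. A small sketch of inverting the relation:

```python
def rain_rate_from_dbz(dbz, a=282.0, b=1.66):
    """Invert the conventional Z = a * R**b relation (a = 282, b = 1.66 as in
    the abstract) to obtain the rain rate R in mm/h from reflectivity in dBZ."""
    z = 10.0 ** (dbz / 10.0)  # dBZ -> linear reflectivity Z (mm^6 m^-3)
    return (z / a) ** (1.0 / b)

# A +1 dB calibration bias scales R by 10**(0.1 / 1.66), roughly +15%,
# comparable to the +14% overestimation reported in the abstract.
bias_factor = rain_rate_from_dbz(31.0) / rain_rate_from_dbz(30.0)
```

Note that the polarimetric estimators do not share this simple power-law sensitivity, which is why their bias responses (−19% for ZPHI, +23% for ZZDR) differ in sign and magnitude.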