Abstract:
Acquired brain injury (ABI) is one of the leading causes of death and disability in the world and is associated with high health care costs as a result of the acute treatment and long-term rehabilitation involved. Different algorithms and methods have been proposed to predict the effectiveness of rehabilitation programs. In general, research has focused on predicting the overall improvement of patients with ABI. The purpose of this study is the novel application of data mining (DM) techniques to predict the outcomes of cognitive rehabilitation in patients with ABI. We generate three predictive models that allow us to obtain new knowledge to evaluate and improve the effectiveness of the cognitive rehabilitation process. Decision tree (DT), multilayer perceptron (MLP) and general regression neural network (GRNN) algorithms have been used to construct the prediction models. 10-fold cross-validation was carried out to test the algorithms, using the Institut Guttmann Neurorehabilitation Hospital (IG) patient database. Performance of the models was assessed through specificity, sensitivity and accuracy analysis and confusion matrix analysis. The experimental results obtained by the DT are clearly superior, with an average prediction accuracy of 90.38%, while the MLP and GRNN obtained 78.7% and 75.96%, respectively. This study increases our knowledge of the factors contributing to ABI patient recovery and makes it possible to estimate treatment efficacy in individual patients.
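The evaluation protocol described above (10-fold cross-validation with accuracy, sensitivity and specificity read off a pooled confusion matrix) can be sketched on synthetic data. The single feature, binary outcome and one-level decision tree below are illustrative stand-ins, not the IG database or the authors' models:

```python
import random

# Hypothetical stand-in for the patient database: 200 synthetic cases with one
# numeric feature (e.g. a baseline cognitive score) and a binary outcome.
random.seed(0)
data = [(random.gauss(1.0 if y else 0.0, 0.7), y)
        for y in (random.randint(0, 1) for _ in range(200))]

def fit_stump(train):
    """A one-level decision tree: pick the threshold that best splits the outcome."""
    best = None
    for t in sorted(set(x for x, _ in train)):
        acc = sum((x >= t) == bool(y) for x, y in train) / len(train)
        if best is None or acc > best[1]:
            best = (t, acc)
    return best[0]

def cross_validate(data, k=10):
    """Plain k-fold cross-validation; returns pooled TP, TN, FP, FN counts."""
    tp = tn = fp = fn = 0
    for i in range(k):
        test = data[i::k]                                 # every k-th case is one fold
        train = [d for j, d in enumerate(data) if j % k != i]
        t = fit_stump(train)
        for x, y in test:
            pred = x >= t
            tp += pred and y
            tn += (not pred) and (not y)
            fp += pred and (not y)
            fn += (not pred) and y
    return tp, tn, fp, fn

tp, tn, fp, fn = cross_validate(data)
accuracy = (tp + tn) / (tp + tn + fp + fn)
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(accuracy, sensitivity, specificity)
```

The same pooled-confusion-matrix bookkeeping extends directly to richer models such as an MLP or GRNN.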
Abstract:
The understanding of the circulation of ocean currents, the exchange of CO2 between atmosphere and oceans, and the influence of the oceans on the distribution of heat on a global scale is key to our ability to predict and assess the future evolution of climate.
Abstract:
A global Lagrangian descriptor applied to the Kuroshio current
Abstract:
Airbus has designed and industrialized aircraft using Concurrent Engineering techniques for decades. The introduction of new PLM methods, procedures and tools, and the need to reduce time-to-market, led Airbus Military to pursue new working methods. Traditional Engineering works sequentially. Concurrent Engineering basically overlaps tasks between teams. Collaborative Engineering promotes teamwork to develop the product, processes and resources from the conceptual phase to the start of serial production. The CALIPSO-neo pilot project was launched to support the industrialization process of a medium-sized aerostructure. The aim is to implement the industrial Digital Mock-Up (iDMU) concept and exploit it to create shop floor documentation. Within the framework of a collaborative engineering strategy, the project is part of the efforts to deploy Digital Manufacturing as a key technology for the industrialization of aircraft assembly lines. This paper presents the context, the conceptual approach and the methodology adopted.
Abstract:
Several attempts have been made to manufacture intermediate band solar cells (IBSCs) by means of quantum dot (QD) superlattices. This novel photovoltaic concept allows the collection of a wider range of the sunlight spectrum, providing a higher cell photocurrent while maintaining the open-circuit voltage (VOC) of the cell. In this work, we analyze InAs/GaAsN QD-IBSCs. In these cells, the dilute nitrogen in the barrier plays an important role in strain-balancing (SB) the QD layer region, which would otherwise develop dislocations under the effect of the accumulated strain. The introduction of GaAsN SB layers makes it possible to increase light absorption in the QD region by stacking more than 100 QD layers. The photo-generated current density (JL) versus VOC was measured under varied concentrated light intensity and temperature. We found that the VOC of the cell at 20 K is limited by the bandgap of the GaAsN barriers, which has important consequences for IBSC bandgap engineering that are also discussed in this work.
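The logarithmic growth of VOC with light concentration that underlies such JL-VOC measurements can be illustrated with an ideal single-diode model. The J0 and JL1 values below are assumed for illustration only, and the sketch does not model the low-temperature GaAsN barrier limitation reported in the paper:

```python
import math

k_B = 8.617e-5   # Boltzmann constant, eV/K
T = 300.0        # cell temperature, K

# Illustrative (not measured) parameters for an ideal single-diode cell:
J0 = 1e-10       # dark saturation current density, A/cm^2 (assumed)
JL1 = 0.03       # photocurrent density at one sun, A/cm^2 (assumed)

def voc(concentration, T=T):
    """Ideal-diode open-circuit voltage: Voc = (kT/q) * ln(JL/J0 + 1)."""
    return k_B * T * math.log(concentration * JL1 / J0 + 1.0)

# Voc grows logarithmically with light concentration: each factor of 10 in
# intensity adds about (kT/q) * ln(10), i.e. roughly 60 mV at 300 K.
delta = voc(10) - voc(1)
print(voc(1), delta)
```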
Abstract:
An important issue related to future nuclear fusion reactors fueled with deuterium and tritium is the creation of large amounts of dust through several mechanisms (disruptions, ELMs and VDEs). The dust size expected in nuclear fusion experiments (such as ITER) is on the order of microns (between 0.1 and 1000 μm). Almost all of this dust remains in the vacuum vessel (VV). This radiological dust can re-suspend in the event of a LOVA (loss of vacuum accident), a phenomenon that can cause explosions and serious damage to the health of operators and to the integrity of the device. The authors have developed a facility, STARDUST, to reproduce thermo-fluid-dynamic conditions comparable to those expected inside the VV of the next generation of experiments, such as ITER, in case of a LOVA. The dust used inside the STARDUST facility has particle sizes and physical characteristics comparable with those of the dust created inside the VV of nuclear fusion experiments. In this facility an experimental campaign was conducted with the purpose of tracking dust re-suspended at low pressurization rates (comparable to those expected in case of a LOVA in ITER and suggested by the General Safety and Security Report, ITER-GSSR) using a fast camera with a frame rate from 1000 to 10,000 images per second. The velocity fields of the mobilized dust are derived from the imaging of a two-dimensional slice of the flow illuminated by an optically adapted laser beam. The aim of this work is to demonstrate the feasibility of dust tracking by means of image processing, with the objective of determining the velocity field of dust re-suspended during a LOVA.
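The core of such velocity-field extraction is locating the displacement of particle patterns between consecutive frames. A minimal correlation-based sketch on synthetic images (not the STARDUST data or the authors' actual pipeline) is:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two synthetic frames: a random speckle pattern (standing in for laser-lit
# dust) shifted by a known displacement between exposures.
frame1 = rng.random((64, 64))
true_shift = (3, 5)                      # rows, cols
frame2 = np.roll(frame1, true_shift, axis=(0, 1))

def piv_displacement(a, b):
    """Displacement of b relative to a via the peak of the FFT cross-correlation."""
    a = a - a.mean()
    b = b - b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peak indices to signed shifts (circular correlation wraps around).
    return tuple(p - s if p > s // 2 else p for p, s in zip(peak, corr.shape))

print(piv_displacement(frame1, frame2))  # recovers (3, 5)
```

Dividing the recovered per-window displacement by the inter-frame time (0.1-1 ms at the quoted 1000-10,000 fps) yields the local velocity vector.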
Abstract:
The development of magnetic resonance imaging techniques has allowed the in vivo study and quantification of the changes in brain morphology linked to processes such as neurodevelopment, aging, learning or disease. A large number of morphometry methods have been developed to extract the information contained in these images and translate it into shape or size indicators, such as volume or cortical thickness; these markers are subsequently used to find statistical differences between populations of subjects or to correlate brain morphology with, for example, age or the severity of a given disease. Despite the wide variety of biomarkers and morphometry methodologies, many studies bias their hypotheses, and with them their experimental results, by using a reduced number of biomarkers or a single processing methodology. The present work aims to demonstrate the importance of using several morphometry methods to better characterize the process under study. Shape analysis is used to detect both global and local differences in thalamic morphology between adolescent patients with early-onset psychosis and healthy adolescents. The results show that the difference in thalamic volume between the two populations, previously described in the literature, is due to a reduction in the volume of the anterior-mediodorsal region and the pulvinar nucleus of the thalamus in patients relative to healthy subjects. In addition, a longitudinal study in healthy subjects is described that simultaneously employs different biomarkers to characterize and quantify the changes in cerebral cortex morphology during adolescence.
This study reveals that the "flattening" the cerebral cortex undergoes during adolescence is the consequence of a decrease in the depth, linked to an increase in the width, of the cortical sulci. Finally, this methodology is applied, in a cross-sectional design, to study the causes of the decrease in both cortical thickness and gyrification index in adolescents with early-onset psychosis.
ABSTRACT: The ever-evolving sophistication of magnetic resonance imaging techniques continues to provide new tools to characterize and quantify, in vivo, brain morphological changes related to neurodevelopment, senescence, learning or disease. The majority of morphometric methods extract shape or size descriptors such as volume, surface area, and cortical thickness from the MRI image. These morphological measurements are commonly entered into statistical analytic approaches for testing between-group differences or for correlations between the morphological measurement and other variables such as age, sex, or disease severity. A wide variety of morphological biomarkers are reported in the literature. Despite this wide range of potentially useful biomarkers and available morphometric methods, the hypotheses and findings of the great majority of morphological studies are biased because reports assess only one morphometric feature and usually use only one image processing method. Throughout this dissertation, biomarkers and image processing strategies are combined to provide innovative and useful morphometric tools for examining brain changes during neurodevelopment. Specifically, a shape analysis technique allowing for a fine-grained assessment of regional thalamic volume in early-onset psychosis patients and healthy comparison subjects is implemented.
Results show that disease-related reductions in global thalamic volume, previously described by other authors, may be particularly driven by a deficit in the anterior-mediodorsal and pulvinar thalamic regions in patients relative to healthy subjects. Furthermore, in healthy adolescents, different cortical features are extracted and combined, and their interdependency is assessed over time. This study attempts to extend current knowledge of normal brain development, specifically the largely unexplored relationship between changes in distinct cortical morphological measurements during adolescence. It demonstrates that the cortical flattening seen during adolescence is produced by a combination of an age-related increase in sulcal width and a decrease in sulcal depth. Finally, this methodology is applied in a cross-sectional study investigating the mechanisms underlying the decrease in cortical thickness and gyrification observed in psychotic patients with disease onset during adolescence.
Abstract:
Chemical process accidents still occur, costing billions of dollars and, worse, many human lives. This means that traditional hazard analysis techniques are no longer sufficient, mainly owing to the increasing complexity and size of chemical plants. In recent years, a new hazard analysis technique has been developed that shifts the focus from reliability to system theory and has shown promising results in other industries such as aeronautics and nuclear power. In this paper, we present an approach for applying the STAMP model and STPA analysis, developed by Leveson in 2011, to the process industry.
Abstract:
The optimal design of a vertical cantilever beam is presented in this paper. The beam is assumed to be immersed in an elastic Winkler soil and subjected to several loads: a point force at the tip section, its self-weight and a uniformly distributed load along its length. The optimal design problem is to find the beam of a given length and minimum volume such that the resulting compressive stresses are admissible. This problem is analyzed according to linear elasticity theory and within different alternative structural models: column, Navier-Bernoulli beam-column, and Timoshenko beam-column (i.e., with shear strain) under conservative loads, typically constant-direction loads. Results obtained in each case are compared in order to evaluate the sensitivity of the numerical results to the choice of model. The beam optimal design is described by the section distribution layout (area, second moment of area, shear area, etc.) along the beam span and the corresponding beam total volume. Other situations, some of them very interesting from a theoretical point of view, with follower loads (the Beck and Leipholz problems) are also discussed, leaving numerical details and results for future work.
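For the simplest of these models, the axially loaded column, the fully stressed (minimum-volume) solution can be written in closed form and checked numerically. The sketch below ignores the Winkler soil, the lateral load and bending, and uses assumed values for the tip load, admissible stress and unit weight:

```python
import math

# Illustrative data (assumed, not from the paper): tip load P (N), admissible
# compressive stress sigma (Pa), material unit weight gamma (N/m^3), length L (m).
P, sigma, gamma, L = 100e3, 10e6, 24e3, 30.0

# Fully stressed column under tip load plus self-weight: requiring N(x) = sigma*A(x)
# together with dN/dx = gamma*A(x) gives A(x) = (P/sigma)*exp(gamma*x/sigma),
# with x measured downward from the tip.
def area(x):
    return (P / sigma) * math.exp(gamma * x / sigma)

# Minimum volume by midpoint-rule integration of A(x) over the span:
n = 10000
h = L / n
volume = h * sum(area((i + 0.5) * h) for i in range(n))

# Closed form: V = (P/gamma) * (exp(gamma*L/sigma) - 1)
v_exact = (P / gamma) * math.expm1(gamma * L / sigma)
print(volume, v_exact)
```

The numerical and closed-form volumes agree, confirming the exponential area law for this special case.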
Abstract:
Microarrays can measure the expression of thousands of genes to identify changes in expression between different biological states. Methods are needed to determine the significance of these changes while accounting for the enormous number of genes. We describe a method, Significance Analysis of Microarrays (SAM), that assigns a score to each gene on the basis of change in gene expression relative to the standard deviation of repeated measurements. For genes with scores greater than an adjustable threshold, SAM uses permutations of the repeated measurements to estimate the percentage of genes identified by chance, the false discovery rate (FDR). When the transcriptional response of human cells to ionizing radiation was measured by microarrays, SAM identified 34 genes that changed at least 1.5-fold with an estimated FDR of 12%, compared with FDRs of 60 and 84% by using conventional methods of analysis. Of the 34 genes, 19 were involved in cell cycle regulation and 3 in apoptosis. Surprisingly, four nucleotide excision repair genes were induced, suggesting that this repair pathway for UV-damaged DNA might play a previously unrecognized role in repairing DNA damaged by ionizing radiation.
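A minimal sketch of the SAM-style statistic (mean change relative to a pooled standard deviation plus a small exchangeability constant s0, with label permutations used to estimate the FDR) on synthetic data; the gene counts, shift size, s0 and threshold below are chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy expression matrix: 1000 genes x 4 control and 4 treated replicates;
# the first 50 genes (assumed) receive a real shift, the rest are noise.
n_genes, shift = 1000, 3.0
ctrl = rng.normal(0, 1, (n_genes, 4))
trt = rng.normal(0, 1, (n_genes, 4))
trt[:50] += shift

def sam_scores(a, b, s0=0.5):
    """SAM-style relative difference: mean change over (pooled SD + fudge s0)."""
    r = b.mean(axis=1) - a.mean(axis=1)
    n1, n2 = a.shape[1], b.shape[1]
    pooled = np.sqrt(((a.var(axis=1, ddof=1) * (n1 - 1) +
                       b.var(axis=1, ddof=1) * (n2 - 1)) / (n1 + n2 - 2))
                     * (1 / n1 + 1 / n2))
    return r / (pooled + s0)

d = sam_scores(ctrl, trt)
called = np.abs(d) > 2.0          # adjustable threshold

# Permutation null: shuffle sample labels, recompute scores, and estimate the
# FDR from the average number of permuted genes exceeding the threshold.
both = np.hstack([ctrl, trt])
false_calls = []
for _ in range(20):
    perm = rng.permutation(both.shape[1])
    pa, pb = both[:, perm[:4]], both[:, perm[4:]]
    false_calls.append((np.abs(sam_scores(pa, pb)) > 2.0).sum())
fdr = np.mean(false_calls) / max(called.sum(), 1)
print(called.sum(), fdr)
```

Raising the threshold trades the number of genes called against the estimated FDR, which is the tuning step the abstract describes.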
Abstract:
The Answer Validation Exercise (AVE) is a pilot track within the Cross-Language Evaluation Forum (CLEF) 2006. The AVE competition provides an evaluation framework for answer validation in Question Answering (QA). In our participation in AVE, we propose a system initially developed for another task, Recognising Textual Entailment (RTE). The aim of our participation is to evaluate the improvement our system brings to QA. Moreover, because these two tasks (AVE and RTE) share the same core idea, which is to find semantic implications between two fragments of text, our system could be applied directly to the AVE competition. Our system is based on the representation of the texts by means of logic forms and the computation of semantic comparison between them. This comparison is carried out using two different approaches: the first relies on a deeper study of the WordNet relations, and the second uses the measure defined by Lin to compute the semantic similarity between the logic form predicates. Moreover, we have also designed a voting strategy between our system and the MLEnt system, also presented by the University of Alicante, with the aim of obtaining a joint execution of the two systems developed at the University of Alicante. Although the results obtained are not very high, we consider them quite promising, which supports the view that there is still much research to be done on textual entailment.
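Lin's measure defines similarity as 2*IC(lcs) / (IC(c1) + IC(c2)), where IC is information content and lcs the least common subsumer of the two concepts. A self-contained sketch on a toy IS-A taxonomy (illustrative only; the system described above uses WordNet and logic-form predicates):

```python
import math

# Toy IS-A taxonomy with corpus frequency counts (all values assumed):
parent = {"dog": "canine", "cat": "feline", "canine": "carnivore",
          "feline": "carnivore", "carnivore": "animal", "animal": None}
counts = {"dog": 30, "cat": 30, "canine": 40, "feline": 40,
          "carnivore": 100, "animal": 200}
TOTAL = counts["animal"]

def ic(concept):
    """Information content IC(c) = -log p(c), with p(c) from corpus counts."""
    return -math.log(counts[concept] / TOTAL)

def ancestors(concept):
    """Path from a concept up to the taxonomy root, the concept included."""
    path = []
    while concept is not None:
        path.append(concept)
        concept = parent[concept]
    return path

def lin_similarity(c1, c2):
    """Lin (1998): sim(c1, c2) = 2*IC(lcs) / (IC(c1) + IC(c2))."""
    anc2 = set(ancestors(c2))
    lcs = next(c for c in ancestors(c1) if c in anc2)  # least common subsumer
    return 2 * ic(lcs) / (ic(c1) + ic(c2))

print(lin_similarity("dog", "cat"))   # shared information: "carnivore"
```

Identical concepts score 1.0, while concepts whose only shared subsumer is the root score 0, which is what makes the measure usable as a graded semantic comparison between predicates.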
Abstract:
Various studies indicate that most of the slope instabilities affecting Flysch heterogeneous rock masses are related to differential weathering of the lithologies that make up the slope. Therefore, the weathering characteristics of the intact rock are of great importance for the study of these types of slopes and their associated instability processes. The main aim of this study is to characterise the weathering properties of the different lithologies outcropping in the carbonatic Flysch of Alicante (Spain), in order to understand the effects of environmental weathering on them, following slope excavation. To this end, 151 strata samples obtained from 11 different slopes, 5–40 years old, were studied. The lithologies were identified and their mechanical characteristics obtained using field and laboratory tests. Additionally, the slaking properties of intact rocks were determined, and a classification system proposed based on the first and fifth slake cycles (Id1 and Id5 respectively) and an Index of Weathering (IW5), defined in the study. Information obtained from the laboratory and the field was used to characterise the weathering behaviour of the rocks. Furthermore, the slaking properties determined from laboratory tests were related to the in-situ weathering properties of rocks (i.e., the weathering profile, patterns and length, and weathering rate). The proposed relationship between laboratory test results, field data, and in-situ observations provides a useful tool for predicting the response of slopes to weathering after excavation during the preliminary stages of design.
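The Id1/Id5-based classification and the IW5 index are defined in the study itself and are not reproduced here; as a generic illustration of how slake-durability indices map to durability classes, Gamble's classical two-cycle (Id2) scale can be coded as:

```python
# Gamble's (1971) slake-durability classification, based on the two-cycle
# retained-mass index Id2 (%); thresholds are the standard published ones.
GAMBLE = [(98, "very high"), (95, "high"), (85, "medium high"),
          (60, "medium"), (30, "low"), (0, "very low")]

def gamble_class(id2):
    """Return the durability class for a two-cycle slake index Id2 in percent."""
    if not 0 <= id2 <= 100:
        raise ValueError("Id2 is a retained-mass percentage (0-100)")
    return next(label for bound, label in GAMBLE if id2 >= bound)

print(gamble_class(99.1), gamble_class(72.5))  # very high, medium
```

A scheme based on Id1 and Id5, as proposed in the study, follows the same pattern with different indices and class boundaries.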
Abstract:
Mathematical models of coastal seabed morphology play a key role in beach nourishment projects, which have become the fundamental strategy for coastal maintenance in recent years. Accordingly, the accuracy of these models is vital to optimizing the costs of coastal regeneration projects. Planning such interventions requires methodologies that do not generate uncertainties in their interpretation. A study and comparison of mathematical simulation models of the coastline is carried out in this paper, together with an analysis of the elements of the model that are sources of uncertainty. The equilibrium profile (EP) and the offshore limit corresponding to the depth of closure (DoC) have been analyzed over different timescale ranges, and the results compared using data sets from three different periods, identified as present, past and future. The two main factors considered in this paper are the accuracy of data collection for the beach profiles and the definition of the median grain size calculated from the collected samples. These data can generate high uncertainties and reduce the accuracy of nourishment projects; together they can generate excessive costs due to a possible excess or shortage of the sand used for nourishment. The main goal of this paper is the development of a new methodology to increase the accuracy of existing equilibrium beach profile models, improving both the inputs used in such models and the fitting of the formulae used to obtain the seabed shape. This new methodology has been applied and tested on Valencia's beaches.
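The classical ingredients of such models, Dean's equilibrium profile h(y) = A*y**(2/3) and Hallermeier's depth of closure, can be sketched as follows. The fall velocity and wave-climate values are illustrative, not the Valencia data, and the improved fitting methodology proposed in the paper is not reproduced:

```python
import math

# Dean's equilibrium profile h(y) = A * y**(2/3), with the shape parameter A
# from the sediment fall velocity (Dean 1987: A = 0.51 * w**0.44, w in m/s),
# and Hallermeier's depth of closure from the effective wave height and period.
# All input values below are assumed for illustration.
w = 0.04              # sediment fall velocity, m/s (roughly a 0.3 mm sand)
Hs12, Ts = 3.0, 9.0   # 12-h exceedance wave height (m) and period (s)
g = 9.81

A = 0.51 * w ** 0.44
doc = 2.28 * Hs12 - 68.5 * Hs12 ** 2 / (g * Ts ** 2)   # Hallermeier DoC, m

def depth(y):
    """Equilibrium depth (m) at distance y (m) offshore."""
    return A * y ** (2.0 / 3.0)

# Offshore extent of the active profile: solve h(y) = DoC for y.
y_doc = (doc / A) ** 1.5
print(A, doc, y_doc)
```

Because A depends on the median grain size through the fall velocity, the abstract's point about grain-size uncertainty translates directly into uncertainty in the computed profile and sand volume.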