908 results for NUCLEOL (Computer program language)


Relevance:

100.00%

Publisher:

Abstract:

Spasticity is a common disorder in people with upper motor neuron injury. The involvement may occur at different levels. The Modified Ashworth Scale (MAS) is the most widely used method to measure the level of involvement, but it corresponds to a subjective evaluation. Mechanomyography (MMG) is an objective technique that quantifies muscle vibration during contraction and stretching events, so it may assess the level of spasticity accurately. This study aimed to investigate the correlation between spasticity levels determined by the MAS and the MMG signal in spastic and non-spastic muscles. In the experimental protocol, we evaluated 34 limbs of 22 volunteers of both genders, with a mean age of 39.91 ± 13.77 years. We evaluated the levels of spasticity by MAS in flexor and extensor muscle groups of the knee and/or elbow, where one muscle group was the agonist and the other the antagonist. Simultaneously with the MAS assessment, the MMG signals were acquired. We used custom MMG equipment, configured on the LabVIEW platform, to acquire and record the signals. The MMG signals were processed in MATLAB in the time domain (median energy) and the spectral domain (median frequency) for the three motion axes: X (transversal), Y (longitudinal) and Z (perpendicular). For bandwidth delimitation, we used a 3rd-order Butterworth filter with a 5-50 Hz passband. Statistical tests such as Spearman's correlation coefficient, the Kruskal-Wallis test and the linear correlation test were applied. In the time domain, the Kruskal-Wallis test showed differences in median energy (MMGME) between MAS groups. The linear correlation test showed a high linear correlation between MAS and MMGME for the agonist as well as the antagonist muscle group. The largest linear correlation occurred between the MAS and the MMGME for the Z axis of the agonist muscle group (R² = 0.9557) and the lowest correlation occurred in the X axis for the antagonist muscle group (R² = 0.8862). The Spearman correlation test also confirmed a high correlation for all axes in the time-domain analysis. In the spectral domain, the analysis showed an increase in the median frequency (MMGMF) at higher MAS levels. The highest correlation coefficient between the MAS and the MMGMF signal occurred in the Z axis for the agonist muscle group (R² = 0.4883), and the lowest value occurred in the Y axis for the antagonist group (R² = 0.1657). By the Spearman correlation test, the highest correlation occurred for the Y axis of the agonist group (0.6951; p < 0.001) and the lowest value for the X axis of the antagonist group (0.3592; p < 0.001). We conclude that there was a significantly high correlation between the MMGME and the MAS in both muscle groups. A significant correlation also occurred between the MMGMF and the MAS, although moderate for the agonist group and low for the antagonist group. Thus, the MMGME proved to be a more appropriate descriptor to correlate with the degree of spasticity defined by the MAS.
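As a minimal illustration of the processing chain described above (a 3rd-order, 5-50 Hz Butterworth band-pass followed by time- and spectral-domain descriptors), here is a hedged Python/SciPy sketch for one MMG axis. The sampling rate, window length, synthetic signal, and the exact definition of "median energy" are assumptions made for the example; the study's own pipeline ran in LabVIEW and MATLAB.

```python
# Sketch only: band-pass an MMG channel at 5-50 Hz and extract two descriptors.
import numpy as np
from scipy.signal import butter, filtfilt, welch

def bandpass_5_50(signal, fs):
    """3rd-order Butterworth band-pass, 5-50 Hz, applied zero-phase."""
    b, a = butter(3, [5.0, 50.0], btype="bandpass", fs=fs)
    return filtfilt(b, a, signal)

def median_frequency(signal, fs):
    """Frequency that splits the power spectrum into two halves of equal power."""
    f, pxx = welch(signal, fs=fs, nperseg=min(len(signal), 1024))
    cumulative = np.cumsum(pxx)
    return f[np.searchsorted(cumulative, cumulative[-1] / 2.0)]

# Synthetic one-axis MMG record: 2 s at 1 kHz (illustrative values only).
fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
mmg_z = np.sin(2 * np.pi * 12 * t) + 0.3 * np.random.randn(t.size)

filtered = bandpass_5_50(mmg_z, fs)
print("median frequency (Hz):", median_frequency(filtered, fs))
print("median energy (a.u.):", np.median(filtered ** 2))  # one plausible definition
```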

Relevance:

100.00%

Publisher:

Abstract:

Rapid social, economic, cultural, and environmental changes have brought about significant shifts in lifestyles and have contributed to the growth and generalization of eating food and meals outside the home. Portugal follows this trend of increasing food consumption away from home: meals out, which a few years ago were an occasional event, are today a regular practice of Portuguese families, not only during the working week but also at weekends. Visits to shopping centres, which have become a habit in our country, include a stop at the food courts, spaces notable for their food diversity, where fast-food meals predominate. However, an adequate and balanced choice of the foods to be consumed is essential. The present work sought to evaluate the habits and perceptions of fast-food consumers based on a specific menu whose main food is bread. Subsequently, and according to consumption preferences, a nutritional evaluation of the choices was carried out. The study involved 150 individuals who visited a fast-food restaurant located in the food court of a shopping centre in Viseu. A self-administered questionnaire, developed by us and divided into 4 parts, was applied: sociodemographic characterization; respondents' consumption habits; products chosen by the respondents; and degree of satisfaction with the chosen products. Statistical analyses were performed using the Statistical Package for the Social Sciences - SPSS® for Windows, version 22. Chi-square tests with Monte Carlo simulation were carried out, considering a significance level of 0.05. Based on the most frequent choices made by respondents, the nutritional evaluation of the menus was carried out using the DIAL 1.19 program, version 1, and, when information was not found there, the online Portuguese food composition table (INSA, 2010) was used. The values obtained for the Total Caloric Value (VCT), macronutrients, fibre, cholesterol, and sodium were compared with the Recommended Daily Allowances (RDA). The sample comprised 68.7% women and 31.3% men, with a mean age of 29.9 ± 3 years, mostly employed (64.7%). The education level of most respondents (54.7%) was higher education. A large part of the sample did not consider themselves regular fast-food consumers and also reported frequently eating a balanced diet; only 5% visited the premises more than once a week. Among the available products, the preference was for sandwiches and French fries, with lunch being the moment of greatest consumption. The nutritional evaluation of the respondents' preferred choices showed that the VCT of the menu that includes water as the drink falls within the caloric limits recommended for lunch, except for the menus including the hot chicken sandwich on oregano bread and the cold fresh-cheese sandwich, which stand out for presenting a value below the recommended minimum limit. In contrast, including a soft drink in the menu increases the VCT by 18%, regardless of the sandwich considered. A detailed analysis shows that these menus are unbalanced: 33.3% of them present protein values above the RDA, while carbohydrate and lipid values are mostly within the limits, with only 13.3% of the menus outside those values. Regarding fibre and sodium intake, 86.7% of the menus are out of range, with excessive sodium values and fibre values 33% below the recommended minimum limit. As this is a case study including only a single food-court restaurant, which serves bread-based menus (sandwiches), the results are interpreted cautiously and without generalization. We can nevertheless conclude, given the results obtained, that the salt content of the menus needs to be reduced. Furthermore, so that consumers can compare food options and make informed decisions, it seems essential to us to make the nutritional information of the proposed menus available.
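As a rough illustration of the statistical procedure mentioned above (chi-square tests with Monte Carlo simulation at a 0.05 significance level, run in SPSS in the study), the following Python sketch computes a Monte Carlo chi-square p-value for an invented 2x2 contingency table; none of the counts come from the study.

```python
# Sketch of a chi-square test with a Monte Carlo (permutation) p-value.
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(0)
table = np.array([[30, 20],    # hypothetical counts, e.g. habit by sex
                  [73, 27]])

chi2_obs, _, _, _ = chi2_contingency(table, correction=False)

# Resample tables with the same margins by permuting one set of labels.
rows = np.repeat([0, 1], table.sum(axis=1))
cols = np.repeat([0, 1], table.sum(axis=0))
n_sim, exceed = 10000, 0
for _ in range(n_sim):
    perm = rng.permutation(cols)
    sim = np.array([[np.sum((rows == r) & (perm == c)) for c in (0, 1)]
                    for r in (0, 1)])
    chi2_sim, _, _, _ = chi2_contingency(sim, correction=False)
    exceed += chi2_sim >= chi2_obs

print("Monte Carlo p-value:", (exceed + 1) / (n_sim + 1))
```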

Relevance:

100.00%

Publisher:

Abstract:

Accurate estimation of road pavement geometry and layer material properties through the use of proper nondestructive testing and sensor technologies is essential for evaluating a pavement's structural condition and determining options for maintenance and rehabilitation. For these purposes, pavement deflection basins produced by the nondestructive Falling Weight Deflectometer (FWD) test are commonly used. The nondestructive FWD test drops weights on the pavement to simulate traffic loads and measures the resulting pavement deflection basins. Backcalculation of pavement geometry and layer properties using FWD deflections is a difficult inverse problem, and its solution with conventional mathematical methods is often challenging due to the ill-posed nature of the problem. In this dissertation, a hybrid algorithm was developed to seek robust and fast solutions to this inverse problem. The algorithm is based on soft computing techniques, mainly Artificial Neural Networks (ANNs) and Genetic Algorithms (GAs), as well as numerical analysis techniques to properly simulate the geomechanical system. The widely used pavement layered analysis program ILLI-PAVE was employed in the analyses of various flexible pavement types, including full-depth asphalt and conventional flexible pavements, built on either lime stabilized soils or untreated subgrade. Nonlinear properties of the subgrade soil and the base course aggregate as transportation geomaterials were also considered. A computer program, the Soft Computing Based System Identifier (SOFTSYS), was developed. In SOFTSYS, ANNs were used as surrogate models to provide faster approximations of the solutions of the nonlinear finite element program ILLI-PAVE. The deflections obtained from FWD tests in the field were matched with the predictions obtained from the numerical simulations to develop the SOFTSYS models. The solution to the inverse problem for multi-layered pavements is computationally hard to achieve and is often not feasible due to field variability and the quality of the collected data. The primary difficulty in the analysis arises from the substantial increase in the degree of non-uniqueness of the mapping from the pavement layer parameters to the FWD deflections. The insensitivity of some layer properties lowered SOFTSYS model performance. Still, SOFTSYS models were shown to work effectively with the synthetic data obtained from ILLI-PAVE finite element solutions. In general, SOFTSYS solutions very closely matched the ILLI-PAVE mechanistic pavement analysis results. For SOFTSYS validation, field-collected FWD data were successfully used to predict pavement layer thicknesses and layer moduli of in-service flexible pavements. Some of the very promising SOFTSYS results indicated average absolute errors on the order of 2%, 7%, and 4% for the Hot Mix Asphalt (HMA) thickness estimation of full-depth asphalt pavements, full-depth pavements on lime stabilized soils, and conventional flexible pavements, respectively. The field validations of SOFTSYS also produced meaningful results: the thickness data obtained from Ground Penetrating Radar testing matched reasonably well with predictions from the SOFTSYS models. The differences observed in the HMA and lime stabilized soil layer thicknesses were attributed to the variability of the FWD deflection data. The backcalculated asphalt concrete layer thickness results matched better for full-depth asphalt flexible pavements built on lime stabilized soils than for conventional flexible pavements. Overall, SOFTSYS was capable of producing reliable thickness estimates despite the variability of field-constructed asphalt layer thicknesses.
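To make the surrogate-plus-search idea concrete, here is a hedged Python sketch of a genetic algorithm that adjusts layer parameters until a fast forward-model surrogate reproduces a measured deflection basin. The surrogate below is a stand-in function, and the parameter names, units, and ranges are invented for the example; SOFTSYS itself uses ANNs trained on ILLI-PAVE finite element runs.

```python
# Toy surrogate-assisted backcalculation: GA search against a stand-in forward model.
import numpy as np

rng = np.random.default_rng(1)
BOUNDS = np.array([[100.0, 500.0],      # layer thickness (mm), illustrative range
                   [1000.0, 10000.0]])  # layer modulus (MPa), illustrative range

def surrogate_deflections(params):
    """Stand-in for an ANN surrogate mapping layer parameters to a deflection basin."""
    thickness, modulus = params
    sensors = np.arange(1, 7)
    return 1000.0 / (np.sqrt(modulus) * (1.0 + 0.002 * thickness * sensors))

measured = surrogate_deflections(np.array([250.0, 4000.0]))  # synthetic "field" basin

def fitness(params):
    return -np.sum((surrogate_deflections(params) - measured) ** 2)

# Plain real-coded GA: tournament selection, blend crossover, Gaussian mutation.
pop = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(40, 2))
for _ in range(100):
    scores = np.array([fitness(p) for p in pop])
    winners = [max(rng.choice(len(pop), 3), key=lambda i: scores[i])
               for _ in range(len(pop))]
    parents = pop[winners]
    alpha = rng.random((len(pop), 1))
    children = alpha * parents + (1.0 - alpha) * parents[::-1]
    children += rng.normal(0.0, 0.01, children.shape) * (BOUNDS[:, 1] - BOUNDS[:, 0])
    pop = np.clip(children, BOUNDS[:, 0], BOUNDS[:, 1])

best = pop[np.argmax([fitness(p) for p in pop])]
print("recovered (thickness, modulus):", best)
```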

Relevance:

100.00%

Publisher:

Abstract:

This work addresses the application of SPEA2, a multiobjective optimization method, to the computation of a dosing schedule for the chemotherapeutic treatment of a tumor mass; a dosing schedule is understood as the specification of the cytotoxic agent or agents, their doses, and the times at which they should be administered. The optimization problem solved here is a multiobjective one, since the dosing schedule to be computed must minimize not only the size of the tumor, but also the toxicity remaining at the end of the treatment, its cost, and so on. SPEA2 is a genetic algorithm that applies the Pareto criterion; therefore, what it computes is an approximation of the Pareto front, solutions from which the user can choose the "best" one. In the course of this research, SoT-Q was built, a software tool consisting of two main modules: an optimizer to compute the optimal dosing schedules, and a simulator to apply those schedules to a (simulated) patient with a tumor mass; the operation of the simulator is based on a pharmacodynamic model that represents the tumor. In the future, once extensively tested and debugged, the SoT-Q program could assist oncologists in decision making regarding chemotherapeutic treatments; it could also serve as a pedagogical aid in the training of new health professionals. The results obtained were very good; in all the test cases used, both the tumor size and the toxicity remaining at the end of the treatment were significantly reduced; in some cases the reduction was of three orders of magnitude.
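Because SPEA2 rests on the Pareto criterion, a minimal sketch of Pareto dominance and of extracting the non-dominated set may help. The objective vectors below (tumor size, residual toxicity, cost) are invented for illustration; SPEA2 itself adds fitness assignment, an external archive, and environmental selection on top of this basic notion.

```python
# Pareto dominance and the non-dominated front, with all objectives minimized.
from typing import List, Tuple

def dominates(a: Tuple[float, ...], b: Tuple[float, ...]) -> bool:
    """a dominates b if it is no worse in every objective and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points: List[Tuple[float, ...]]) -> List[Tuple[float, ...]]:
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# (tumor size, residual toxicity, cost) for four hypothetical dosing schedules.
schedules = [(0.10, 0.8, 3.0), (0.05, 1.2, 2.5), (0.10, 0.9, 3.5), (0.02, 2.0, 4.0)]
print(pareto_front(schedules))   # the third schedule is dominated by the first
```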

Relevance:

100.00%

Publisher:

Abstract:

This work employed free-license hardware and software tools to set up a low-cost, easy-to-deploy cellular base station (BTS). Starting from the technical concepts that facilitate the installation of the OpenBTS system and employing the USRP N210 (Universal Software Radio Peripheral) hardware, a network analogous to the mobile telephony standard (GSM) was deployed. The mobile phones were used as SIP (Session Initiation Protocol) extensions from Asterisk, making it possible to place calls between terminals, send text messages (SMS), and place calls from an OpenBTS terminal to another mobile operator, among other services.
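The following is a hedged sketch of the kind of Asterisk configuration implied by treating handsets as SIP extensions; the extension numbers, secrets, and context name are invented and do not come from the described deployment.

```ini
; Illustrative only: two handsets registered as SIP extensions that can dial each other.

; sip.conf
[2101]
type=friend
host=dynamic
context=openbts-phones
secret=changeme

[2102]
type=friend
host=dynamic
context=openbts-phones
secret=changeme

; extensions.conf
[openbts-phones]
exten => 2101,1,Dial(SIP/2101,20)
exten => 2102,1,Dial(SIP/2102,20)
```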

Relevance:

100.00%

Publisher:

Abstract:

Master's dissertation, Qualidade em Análises, Faculdade de Ciências e Tecnologia, Universidade do Algarve, 2014

Relevance:

100.00%

Publisher:

Abstract:

In mechanical engineering there is an ever-growing need to use and to predict the behavior of thermal machines, more specifically internal combustion engines, especially in the area of maintenance and of failure prevention in one of the vital components of a 4-stroke engine: the crankshaft. This situation has been widely observed in the naval mechanical industry, namely in the Portuguese Navy, and, given its high degree of importance for the performance of any engine, it was decided to focus the work of this thesis on the study of the S.E.M.T. Pielstick diesel engines of the Portuguese Navy's naval units, more specifically of the "João Coutinho"-class and "Baptista de Andrade"-class corvettes, due to the history of crankshaft failures in these ship classes and in others of the Portuguese Navy. To carry out this study, all data concerning the failure history of these engines were used, as well as all data available from the engine manufacturer, in order to reproduce as faithfully as possible a three-dimensional model of the crankshaft in the CAD modeling program SolidWorks®, and to enable the kinematic analysis of the crankshaft. In this way, it was possible to simulate the engine's operating conditions, as well as to analyze and determine the cause of crankshaft failure, with the aim of extending the service life of the crankshafts, contributing not only to lower maintenance costs but also to increased operational availability of these ships.
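For context on the kinematic side of such an analysis, the short Python sketch below evaluates the textbook slider-crank relations for piston position, velocity, and acceleration. The crank radius, rod length, and speed are invented numbers; this is not the thesis's SolidWorks model, only the standard relation x(θ) = r·cosθ + sqrt(L² - r²·sin²θ).

```python
# Textbook slider-crank kinematics for an illustrative crank geometry.
import numpy as np

r, L = 0.105, 0.42                 # crank radius and rod length (m), invented values
rpm = 1000.0
omega = 2.0 * np.pi * rpm / 60.0   # crank angular speed (rad/s)

theta = np.linspace(0.0, 2.0 * np.pi, 721)
x = r * np.cos(theta) + np.sqrt(L**2 - (r * np.sin(theta))**2)  # piston position
v = np.gradient(x, theta) * omega                               # piston velocity
a = np.gradient(v, theta) * omega                               # piston acceleration

print("stroke (m):", x.max() - x.min())        # equals 2*r for the ideal mechanism
print("peak piston speed (m/s):", np.abs(v).max())
print("peak piston acceleration (m/s^2):", np.abs(a).max())
```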

Relevance:

100.00%

Publisher:

Abstract:

Sequence problems are among the most challenging interdisciplinary topics of our time. They are ubiquitous in science and daily life and occur, for example, in the form of DNA sequences encoding all the information of an organism, as text (natural or formal), or in the form of a computer program. Therefore, sequence problems occur in many variations in computational biology (drug development), coding theory, data compression, and quantitative and computational linguistics (e.g., machine translation). In recent years, several proposals have appeared to formulate sequence problems such as the closest string problem (CSP) and the farthest string problem (FSP) as an Integer Linear Programming Problem (ILPP). In the present talk we present a novel general approach to reduce the size of the ILPP by grouping isomorphous columns of the string matrix together. The approach is of practical use, since the solution of sequence problems is very time consuming, in particular when the sequences are long.
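A small Python sketch of the column-grouping idea follows. Columns of the string matrix whose symbol pattern is identical up to renaming contribute interchangeable blocks of variables and constraints to the ILP, so they can be collapsed into one column weighted by its multiplicity. The canonicalization used here is one plausible reading of "isomorphous columns", and the example strings are invented.

```python
# Group string-matrix columns that are identical up to renaming of symbols.
from collections import Counter

def canonical_pattern(column):
    """Relabel symbols by order of first appearance: ('A', 'C', 'A') -> (0, 1, 0)."""
    mapping = {}
    return tuple(mapping.setdefault(ch, len(mapping)) for ch in column)

def group_columns(strings):
    columns = zip(*strings)        # iterate over the string matrix column by column
    return Counter(canonical_pattern(col) for col in columns)

seqs = ["ACGTAC",
        "ACGTTC",
        "AGGTAC"]
groups = group_columns(seqs)
print("original columns:", len(seqs[0]), "-> grouped columns:", len(groups))
for pattern, multiplicity in groups.items():
    print(pattern, "x", multiplicity)
```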

Relevance:

100.00%

Publisher:

Abstract:

The objective of this research is to synthesize structural composites designed with particular areas defined by custom modulus, strength, and toughness values in order to improve the overall mechanical behavior of the composite. Such composites are defined and referred to as 3D-designer composites. These composites will be formed from liquid crystalline polymers and carbon nanotubes. The fabrication process is a variation of the rapid prototyping process, which is a layered, additive-manufacturing approach. Composites formed using this process can be custom designed by apt modeling methods for superior performance in advanced applications. The focus of this research is on enhancement of Young's modulus in order to make the final composite stiffer. Strength and toughness of the final composite with respect to various applications are also discussed. We have taken into consideration the mechanical properties of the final composite at different fiber volume contents as well as at different orientations and lengths of the fibers. The orientation of the LC monomers is intended to be carried out using electric or magnetic fields. A computer program incorporating the Mori-Tanaka modeling scheme was developed to generate the stiffness matrix of the final composite. The final properties are then deduced from the stiffness matrix using composite micromechanics. Eshelby's tensor, required to calculate the stiffness tensor using the Mori-Tanaka method, is calculated using a numerical scheme that determines the components of Eshelby's tensor (Gavazzi and Lagoudas 1990). The numerical integration is solved using a Gaussian quadrature scheme and is worked out using MATLAB as well. MATLAB provides a wide range of commands and algorithms that can be used efficiently to carry the formulation through to its full extent. Graphs are plotted using different combinations of the results and of the parameters involved in obtaining them.
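The Gaussian quadrature step can be illustrated with a short Python sketch: Gauss-Legendre nodes and weights are mapped onto the spherical angles over which Eshelby-tensor components are integrated. The integrand below is a trivial stand-in (a constant, recovering the sphere's surface area), not the Eshelby kernel, and the node count is an illustrative choice.

```python
# Gauss-Legendre quadrature over the unit sphere as a sanity-checkable example.
import numpy as np

def gauss_legendre_sphere(f, n=16):
    """Integrate f(theta, phi) * sin(theta) over theta in [0, pi], phi in [0, 2*pi]."""
    x, w = np.polynomial.legendre.leggauss(n)
    theta, w_theta = 0.5 * np.pi * (x + 1.0), 0.5 * np.pi * w   # map [-1, 1] -> [0, pi]
    phi, w_phi = np.pi * (x + 1.0), np.pi * w                   # map [-1, 1] -> [0, 2*pi]
    total = 0.0
    for ti, wi in zip(theta, w_theta):
        for pj, wj in zip(phi, w_phi):
            total += wi * wj * f(ti, pj) * np.sin(ti)
    return total

# Integrating 1 over the sphere should return its surface area, 4*pi.
print(gauss_legendre_sphere(lambda th, ph: 1.0), 4.0 * np.pi)
```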

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Universal newborn hearing screening was implemented worldwide largely on modeled, not measured, long-term benefits. Comparative quantification of population benefits would justify its high cost.

METHODS: Natural experiment comparing 3 population approaches to detecting bilateral congenital hearing loss (>25 dB, better ear) in Australian states with similar demographics and services: (1) universal newborn hearing screening, New South Wales 2003-2005, n = 69; (2) risk factor screening (neonatal intensive care screening + universal risk factor referral), Victoria 2003-2005, n = 65; and (3) largely opportunistic detection, Victoria 1991-1993, n = 86. Children in (1) and (2) were followed at age 5 to 6 years and in (3) at 7 to 8 years. Outcomes were compared between states using adjusted linear regression.

RESULTS: Children were diagnosed younger with universal than risk factor screening (adjusted mean difference -8.0 months, 95% confidence interval -12.3 to -3.7). For children without intellectual disability, moving from opportunistic to risk factor to universal screening incrementally improved age of diagnosis (22.5 vs 16.2 vs 8.1 months, P < .001), receptive (81.8 vs 83.0 vs 88.9, P = .05) and expressive (74.9 vs 80.7 vs 89.3, P < .001) language and receptive vocabulary (79.4 vs 83.8 vs 91.5, P < .001); these nonetheless remained well short of cognition (mean 103.4, SD 15.2). Behavior and health-related quality of life were unaffected.

CONCLUSIONS: With new randomized trials unlikely, this may represent the most definitive population-based evidence supporting universal newborn hearing screening. Although outperforming risk factor screening, school entry language still lagged cognitive abilities by nearly a SD. Prompt intervention and efficacy research are needed for children to reach their potential.

Relevance:

100.00%

Publisher:

Abstract:

Introduction: Knowing the experience of abuse, the contextual determinants that led to the rupture of the situation, and the attempts to build a more harmonious future is essential in order to raise awareness and better understand victims of domestic violence. Objectives: To understand the suffering of women victims of violence. Methods: This is an intentional sample of 21 women who were at a shelter home or in the community. The data were collected through interviews guided by a script organized into four themes. The interviews were audio-recorded with the permission of the participants, fully transcribed, and analyzed as two different corpuses, depending on the context in which they occurred. The analysis was conducted using the ALCESTE computer program. The study obtained a favorable opinion from the Committee on Health and Welfare of the University of Évora. Results: From the analysis of the first sample, five classes emerged. The association of the words gave the meaning of each class, which we have named as Class 1 - Precipitating Events; Class 2 - Experience of abuse; Class 3 - Two feet in the present and looking into the future; Class 4 - The present and learning from the experience of abuse; and Class 5 - Violence in general. From the analysis of the sample in the community, four classes emerged, which we have named as Class 1 - Violence in general; Class 2 - Precipitating Events; Class 3 - Experience of abuse; and Class 4 - Support in the process. Conclusions: Women who are at a shelter home, with this experience of violence and its entire context, are very focused on their experiences, and the future is distant and unclear. Women in the community have a more comprehensive view of the phenomenon of violence as a whole; they can decentre from their personal experiences and recognize the importance of support in the future construction process.

Relevance:

50.00%

Publisher:

Abstract:

The CIL compiler for core Standard ML compiles whole programs using a novel typed intermediate language (TIL) with intersection and union types and flow labels on both terms and types. The CIL term representation duplicates portions of the program where intersection types are introduced and union types are eliminated. This duplication makes it easier to represent type information and to introduce customized data representations. However, duplication incurs compile-time space costs that are potentially much greater than are incurred in TILs employing type-level abstraction or quantification. In this paper, we present empirical data on the compile-time space costs of using CIL as an intermediate language. The data shows that these costs can be made tractable by using sufficiently fine-grained flow analyses together with standard hash-consing techniques. The data also suggests that non-duplicating formulations of intersection (and union) types would not achieve significantly better space complexity.
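As a generic illustration of the hash-consing technique the paper relies on, the Python sketch below interns structurally equal tree nodes so that duplicated fragments are stored once; it is a toy, not CIL's actual term or type representation.

```python
# Toy hash-consing: structurally equal nodes are constructed once and shared.
class Node:
    _table = {}                                  # (tag, children) -> shared node

    def __new__(cls, tag, *children):
        key = (tag, children)
        node = cls._table.get(key)
        if node is None:
            node = super().__new__(cls)
            node.tag, node.children = tag, children
            cls._table[key] = node
        return node

    def __repr__(self):
        return self.tag if not self.children else f"{self.tag}{list(self.children)}"

# Two "duplicated" arrow types end up as the very same object.
t1 = Node("arrow", Node("int"), Node("int"))
t2 = Node("arrow", Node("int"), Node("int"))
print(t1 is t2)            # True: only one copy is stored
print(len(Node._table))    # 2 distinct nodes in total: "int" and the arrow
```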

Relevance:

50.00%

Publisher:

Abstract:

The effects of two types of small-group communication, synchronous computer-mediated and face-to-face, on the quantity and quality of verbal output were compared. Quantity was defined as the number of turns taken per minute, the number of Analysis-of-Speech units (AS-units) produced per minute, and the number of words produced per minute. Quality was defined as the number of words produced per AS-unit. In addition, the interaction of gender and type of communication was explored for any differences that existed in the output produced. Questionnaires were also given to participants to determine attitudes toward computer-mediated and face-to-face communication. Thirty intermediate-level students from the Intensive English Language Program (IELP) at Brock University participated in the study, including 15 females and 15 males. Nonparametric tests, including the Wilcoxon matched-pairs test, Mann-Whitney U test, and Friedman test, were used to test for significance at the p < .05 level. No significant differences were found in the effects of computer-mediated and face-to-face communication on the output produced during follow-up speaking sessions. However, the quantity and quality of interaction were significantly higher during face-to-face sessions than during computer-mediated sessions. No significant differences were found in the output produced by males and females in these two conditions. While participants felt that the use of computer-mediated communication may aid in the development of certain language skills, they generally preferred face-to-face communication. These results differed from previous studies that found a greater quantity and quality of output, in addition to greater equality of interaction, produced during computer-mediated sessions in comparison to face-to-face sessions (Kern, 1995; Warschauer, 1996).
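For readers unfamiliar with the tests named above, here is a hedged Python/SciPy sketch of a Wilcoxon matched-pairs comparison (paired, within-subject conditions) and a Mann-Whitney U comparison (independent groups); all the numbers are invented and are not the study's data.

```python
# Nonparametric comparisons of words-per-minute under two conditions (invented data).
import numpy as np
from scipy.stats import wilcoxon, mannwhitneyu

rng = np.random.default_rng(42)
face_to_face = rng.normal(38.0, 6.0, size=30)              # hypothetical words/minute
computer_mediated = face_to_face - rng.normal(4.0, 5.0, size=30)

stat, p = wilcoxon(face_to_face, computer_mediated)        # paired, same participants
print(f"Wilcoxon matched-pairs: W = {stat:.1f}, p = {p:.4f}")

u, p_u = mannwhitneyu(face_to_face[:15], face_to_face[15:])  # e.g. two independent groups
print(f"Mann-Whitney U = {u:.1f}, p = {p_u:.4f}")
```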

Relevance:

50.00%

Publisher:

Abstract:

Tensor3D is a geometric modeling program with the capacity to simulate and visualize in real time the deformation, specified through a tensor matrix, applied to triangulated models representing geological bodies. 3D visualization allows the study of deformational processes that are traditionally conducted in 2D, such as simple and pure shears. Besides geometric objects that are immediately available in the program window, the program can read other models from disk, thus being able to import objects created with different open-source or proprietary programs. A strain ellipsoid and a bounding box are simultaneously shown and instantly deformed with the main object. The principal axes of strain are visualized as well to provide graphical information about the orientation of the tensor's normal components. The deformed models can also be saved, retrieved later and deformed again, in order to study different steps of progressive strain, or to make these data available to other programs. The shape of stress ellipsoids and the corresponding Mohr circles defined by any stress tensor can also be represented. The application was written using the Visualization ToolKit, a powerful scientific visualization library in the public domain. This development choice, allied to the use of the Tcl/Tk programming language, which is independent of the host computational platform, makes the program a useful tool for the study of geometric deformations directly in three dimensions in teaching as well as research activities. (C) 2007 Elsevier Ltd. All rights reserved.
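The core operation described (deforming a triangulated model with a tensor and recovering the strain ellipsoid) can be sketched in a few lines of NumPy; the simple-shear tensor and the toy vertex set below are illustrative and are not taken from Tensor3D, which is built on the Visualization ToolKit and Tcl/Tk.

```python
# Apply a deformation-gradient tensor to mesh vertices and get principal stretches.
import numpy as np

gamma = 1.0
F = np.array([[1.0, gamma, 0.0],   # simple shear in the x-y plane
              [0.0, 1.0,  0.0],
              [0.0, 0.0,  1.0]])

# Toy "triangulated model": a few vertices (any mesh's vertex array would do).
vertices = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
deformed = vertices @ F.T                     # x' = F x for every vertex

# Strain ellipsoid from the left Cauchy-Green tensor B = F F^T.
B = F @ F.T
eigenvalues, eigenvectors = np.linalg.eigh(B)
stretches = np.sqrt(eigenvalues)              # semi-axes of the strain ellipsoid

print("deformed vertices:\n", deformed)
print("principal stretches:", stretches)
```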

Relevance:

50.00%

Publisher:

Abstract:

Field-Programmable Gate Arrays (FPGAs) are becoming increasingly important in embedded and high-performance computing systems. They allow performance levels close to the ones obtained with Application-Specific Integrated Circuits, while still keeping design and implementation flexibility. However, to efficiently program FPGAs, one needs the expertise of hardware developers in order to master hardware description languages (HDLs) such as VHDL or Verilog. Attempts to furnish a high-level compilation flow (e.g., from C programs) still have to address open issues before broader efficient results can be obtained. Bearing in mind an FPGA's available resources, LALP (Language for Aggressive Loop Pipelining), a novel language to program FPGA-based accelerators, has been developed together with its compilation framework, including mapping capabilities. The main ideas behind LALP are to provide a higher abstraction level than HDLs, to exploit the intrinsic parallelism of hardware resources, and to allow the programmer to control execution stages whenever the compiler techniques are unable to generate efficient implementations. Those features are particularly useful for implementing loop pipelining, a well-regarded technique used to accelerate computations in several application domains. This paper describes LALP and shows how it can be used to achieve high-performance computing solutions.
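As a back-of-envelope illustration of why loop pipelining pays off, the short Python sketch below compares cycle counts for a sequentially executed loop and a pipelined one with a given initiation interval; the stage and iteration counts are generic and do not model LALP or any particular accelerator.

```python
# Cycle-count model: S pipeline stages, N iterations, initiation interval II.
def sequential_cycles(n_iterations: int, stages: int) -> int:
    return stages * n_iterations                 # each iteration runs start to finish

def pipelined_cycles(n_iterations: int, stages: int, ii: int = 1) -> int:
    return stages + ii * (n_iterations - 1)      # iterations overlap, one starts per II

N, S = 1000, 5
print("sequential:", sequential_cycles(N, S), "cycles")
print("pipelined (II = 1):", pipelined_cycles(N, S), "cycles")
print("speedup: %.2fx" % (sequential_cycles(N, S) / pipelined_cycles(N, S)))
```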