899 results for Data distribution


Relevance:

60.00%

Publisher:

Abstract:

Background: Regardless of the regulatory function of microRNAs (miRNAs), their differential expression patterns have been used to define miRNA signatures and to disclose disease biomarkers. To address the question of whether patients presenting the different types of diabetes mellitus could be distinguished on the basis of their miRNA and mRNA expression profiles, we obtained peripheral blood mononuclear cell (PBMC) RNAs from 7 type 1 (T1D), 7 type 2 (T2D), and 6 gestational diabetes (GDM) patients, which were hybridized to Agilent miRNA and mRNA microarrays. Data quantification and quality control were performed with the Feature Extraction software, and the data distribution was normalized using the quantile function implemented in the aroma.light package. Differentially expressed miRNAs/mRNAs were identified using Rank products, comparing T1D vs. GDM, T2D vs. GDM, and T1D vs. T2D. Hierarchical clustering was performed using the average linkage criterion with Pearson uncentered distance as the metric. Results: The use of the same microarray platform permitted the identification of sets of shared or specific miRNA/mRNA interactions for each type of diabetes. Nine miRNAs (hsa-miR-126, hsa-miR-1307, hsa-miR-142-3p, hsa-miR-142-5p, hsa-miR-144, hsa-miR-199a-5p, hsa-miR-27a, hsa-miR-29b, and hsa-miR-342-3p) were shared among T1D, T2D and GDM, and additional specific miRNAs were identified for T1D (20 miRNAs), T2D (14) and GDM (19) patients. ROC curves allowed the identification of specific and relevant (greater AUC values) miRNAs for each type of diabetes, including: i) hsa-miR-1274a, hsa-miR-1274b and hsa-let-7f for T1D; ii) hsa-miR-222, hsa-miR-30e and hsa-miR-140-3p for T2D; and iii) hsa-miR-181a and hsa-miR-1268 for GDM. Many of these miRNAs targeted mRNAs associated with diabetes pathogenesis. Conclusions: These results indicate that PBMCs can be used as reporter cells to characterize the miRNA expression profiles disclosed by the different manifestations of diabetes mellitus. Shared miRNAs may characterize diabetes as a metabolic and inflammatory disorder, whereas specific miRNAs may represent biological markers for each type of diabetes and deserve further attention.
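
As a hedged illustration of the normalization step described above (quantile normalization of the array data distribution), the following Python sketch applies a minimal quantile normalization to a synthetic probes-by-samples matrix; the data and dimensions are invented for illustration and this is not the authors' aroma.light pipeline.

```python
# Minimal sketch of quantile normalization across microarray samples
# (synthetic data; illustrative only, not the study's actual pipeline).
import numpy as np

def quantile_normalize(expr):
    """expr: probes x samples matrix; returns a quantile-normalized copy."""
    ranks = np.argsort(np.argsort(expr, axis=0), axis=0)   # rank of each probe within its sample
    mean_quantiles = np.sort(expr, axis=0).mean(axis=1)    # mean value at each rank across samples
    return mean_quantiles[ranks]                           # map every probe to its rank's mean

rng = np.random.default_rng(0)
expr = rng.lognormal(mean=2.0, sigma=1.0, size=(1000, 20))  # 1000 probes, 20 hypothetical PBMC samples
norm = quantile_normalize(expr)
# after normalization every sample shares the same empirical distribution
assert np.allclose(np.sort(norm[:, 0]), np.sort(norm[:, 1]))
```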

Relevance:

60.00%

Publisher:

Abstract:

Many research fields are pushing the engineering of large-scale, mobile, and open systems towards the adoption of techniques inspired by self-organisation: pervasive computing, but also distributed artificial intelligence, multi-agent systems, social networks, peer-to-peer and grid architectures exploit adaptive techniques to make global system properties emerge in spite of the unpredictability of interactions and behaviour. Such a trend is also visible in coordination models and languages, whenever a coordination infrastructure needs to cope with managing interactions in highly dynamic and unpredictable environments. As a consequence, self-organisation can be regarded as a feasible metaphor for defining a radically new conceptual coordination framework. The resulting framework defines a novel coordination paradigm, called self-organising coordination, based on the idea of spreading coordination media over the network and charging them with services that manage interactions according to local criteria, so that desired and fruitful global coordination properties of the system emerge. Features like topology, locality, time-reactiveness, and stochastic behaviour play a key role both in the definition of such a conceptual framework and in the consequent development of self-organising coordination services. According to this framework, the thesis presents several self-organising coordination techniques developed during the PhD course, mainly concerning data distribution in tuple-space-based coordination systems. Some of these techniques have also been implemented in ReSpecT, a coordination language for tuple spaces based on logic tuples and reactions to events occurring in a tuple space. In addition, the key role played by simulation and formal verification has been investigated, leading to an analysis of how automatic verification techniques like probabilistic model checking can be exploited to formally prove the emergence of desired behaviours in coordination approaches based on self-organisation. To this end, a concrete case study is presented and discussed.
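
A minimal, hypothetical Python sketch of the "spreading coordination media" idea (it does not use ReSpecT or logic tuples): each node hosts a local tuple set and a stochastic local reaction that diffuses tuples to its neighbours, so a global distribution pattern emerges from purely local rules.

```python
# Illustrative sketch of local, stochastic tuple diffusion (not ReSpecT itself).
import random

class Node:
    def __init__(self, name):
        self.name = name
        self.tuples = set()
        self.neighbours = []

    def react(self, spread_prob=0.5):
        """Local reaction: probabilistically spread each tuple to neighbours."""
        for t in list(self.tuples):
            for n in self.neighbours:
                if random.random() < spread_prob:
                    n.tuples.add(t)

# a small ring topology of six coordination nodes
nodes = [Node(i) for i in range(6)]
for i, node in enumerate(nodes):
    node.neighbours = [nodes[(i - 1) % 6], nodes[(i + 1) % 6]]

nodes[0].tuples.add(("sensor", 42))        # inject one tuple at node 0
for _ in range(10):                        # a few asynchronous rounds
    for node in random.sample(nodes, len(nodes)):
        node.react()
print({n.name: len(n.tuples) for n in nodes})   # the tuple has spread across the ring
```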

Relevance:

60.00%

Publisher:

Abstract:

The wide diffusion of cheap, small, and portable sensors integrated into an unprecedentedly large variety of devices, together with the availability of almost ubiquitous Internet connectivity, makes it possible to collect an unprecedented amount of real-time information about the environment we live in. These data streams, if properly and promptly analyzed, can be exploited to build new intelligent and pervasive services with the potential to improve people's quality of life in a variety of domains such as entertainment, health care, or energy management. The large heterogeneity of application domains, however, calls for a middleware-level infrastructure that can effectively support their different quality requirements. In this thesis we study the challenges related to the provisioning of differentiated quality of service (QoS) during the processing of data streams produced in pervasive environments. We analyze the trade-offs between guaranteed quality, cost, and scalability in stream distribution and processing by surveying existing state-of-the-art solutions and by identifying and exploring their weaknesses. We propose an original model for QoS-centric distributed stream processing in data centers and present Quasit, its prototype implementation, offering a scalable and extensible platform that researchers can use to implement and validate novel QoS-enforcement mechanisms. To support our study, we also explore an original class of weaker quality guarantees that can reduce costs when application semantics do not require strict quality enforcement. We validate the effectiveness of this idea in a practical use-case scenario that investigates partial fault-tolerance policies in stream processing, through a large experimental study on the prototype of our novel LAAR dynamic replication technique. Our modeling, prototyping, and experimental work demonstrates that, by providing data distribution and processing middleware with application-level knowledge of the different quality requirements associated with different pervasive data flows, it is possible to improve system scalability while reducing costs.
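
A toy Python sketch of the partial fault-tolerance trade-off mentioned above; it is not the LAAR algorithm, and the operator names and quality weights are invented: replication is granted only to the operators whose loss would degrade result quality most, bounding quality degradation while reducing replication cost.

```python
# Toy sketch of partial fault tolerance in stream processing (not the actual LAAR technique):
# replicate only the operators with the highest quality weight, within a replication budget.
def choose_replicas(operators, budget):
    """operators: dict name -> quality weight; budget: how many replicas we can afford."""
    ranked = sorted(operators, key=operators.get, reverse=True)
    return set(ranked[:budget])

ops = {"parse": 0.9, "filter": 0.2, "aggregate": 0.8, "enrich": 0.4, "sink": 0.6}
replicated = choose_replicas(ops, budget=2)
# worst-case quality loss if any single unreplicated operator fails
exposure = max(w for name, w in ops.items() if name not in replicated)
print(replicated, exposure)   # e.g. {'parse', 'aggregate'} 0.6
```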

Relevance:

60.00%

Publisher:

Abstract:

Exposimeters are increasingly applied in bioelectromagnetic research to determine personal radiofrequency electromagnetic field (RF-EMF) exposure. The main advantages of exposimeter measurements are their convenient handling for study participants and the large amount of personal exposure data that can be obtained for several RF-EMF sources. However, the large proportion of measurements below the detection limit is a challenge for data analysis. With the robust ROS (regression on order statistics) method, summary statistics can be calculated by fitting an assumed distribution to the observed data. We used a preliminary sample of 109 weekly exposimeter measurements from the QUALIFEX study to compare summary statistics computed by robust ROS with a naïve approach, in which values below the detection limit were replaced by the value of the detection limit. For total RF-EMF exposure, differences between the naïve approach and robust ROS were moderate for the 90th percentile and the arithmetic mean. However, exposure contributions from minor RF-EMF sources were considerably overestimated with the naïve approach. This results in an underestimation of the exposure range in the population, which may bias the evaluation of potential exposure-response associations. We conclude from our analyses that summary statistics of exposimeter data calculated by robust ROS are more reliable and more informative than estimates based on a naïve approach. Nevertheless, estimates of source-specific medians or even lower percentiles depend on the assumed data distribution and should be considered with caution.
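
The following Python sketch illustrates the contrast described above, assuming lognormally distributed exposures with a single detection limit; it follows the general regression-on-order-statistics recipe rather than the exact analysis code used in the QUALIFEX study.

```python
# Hedged sketch: ROS-style imputation of nondetects vs. naive substitution of the detection limit.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
true = rng.lognormal(mean=-1.0, sigma=1.0, size=200)   # simulated exposure values
dl = 0.3                                               # single detection limit
detected = np.sort(true[true >= dl])
n, n_det = true.size, detected.size

# plotting positions of the detected values within the full sample (Weibull, i/(n+1))
ranks = np.arange(n - n_det + 1, n + 1)
z = stats.norm.ppf(ranks / (n + 1))

# regress log(detected) on normal quantiles -> lognormal fit from the detected part only
slope, intercept, *_ = stats.linregress(z, np.log(detected))

# impute the censored observations from the fitted distribution, keep detected values as observed
z_cens = stats.norm.ppf(np.arange(1, n - n_det + 1) / (n + 1))
imputed = np.exp(intercept + slope * z_cens)
ros_mean = np.mean(np.concatenate([imputed, detected]))

naive_mean = np.mean(np.where(true >= dl, true, dl))   # substitute the DL for nondetects
print(f"true {true.mean():.3f}  ROS {ros_mean:.3f}  naive {naive_mean:.3f}")
```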

Relevance:

60.00%

Publisher:

Abstract:

Free space optical (FSO) communication links can experience extreme signal degradation due to atmospheric-turbulence-induced spatial and temporal irradiance fluctuations (scintillation) in the laser wavefront. In addition, turbulence can cause the laser beam centroid to wander, resulting in power fading and sometimes complete loss of the signal. Spreading of the laser beam and jitter are also artifacts of atmospheric turbulence. To accurately predict the signal fading that occurs in a laser communication system, and to get a true picture of how this affects crucial performance parameters like the bit error rate (BER), it is important to analyze the probability density function (PDF) of the integrated irradiance fluctuations at the receiver. In addition, it is desirable to find a theoretical distribution that accurately models these fluctuations under all propagation conditions. The PDF of integrated irradiance fluctuations is calculated from numerical wave-optics simulations of a laser after propagation through atmospheric turbulence, to investigate the evolution of the distribution as the aperture diameter is increased. The simulated data distribution is compared to theoretical gamma-gamma and lognormal PDF models under a variety of scintillation regimes from weak to very strong. Our results show that the gamma-gamma PDF provides a good fit to the simulated data distribution for all aperture sizes studied from weak through moderate scintillation. In strong scintillation, the gamma-gamma PDF is a better fit to the distribution for point-like apertures, and the lognormal PDF is a better fit for apertures the size of the atmospheric spatial coherence radius ρ0 or larger. In addition, the PDF of received power from a Gaussian laser beam, adaptively compensated at the transmitter before propagation to the receiver of an FSO link in the moderate scintillation regime, is investigated. The complexity of the adaptive optics (AO) system is increased in order to investigate the changes in the distribution of the received power and how this affects the BER. For the 10 km link, the non-reciprocal nature of the propagation path means that the optimal beam to transmit is unknown. These results show that, for non-reciprocal paths, a low-order level of AO complexity provides a better estimate of the optimal beam to transmit than a higher order. For the 20 km link distance it was found that, although the improvement was minimal, all AO complexity levels provided an equivalent improvement in BER, and that no AO complexity level provided the correction needed for the optimal beam to transmit. Finally, the temporal power spectral density of received power from an FSO communication link is investigated. Simulated and experimental results for the coherence time calculated from the temporal correlation function are presented. Results for both simulated and experimental data show that the coherence time increases as the receiving aperture diameter increases. For finite apertures the coherence time also increases as the communication link distance is increased. We conjecture that this is due to the increasing speckle size within the pupil plane of the receiving aperture for an increasing link distance.
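
A hedged Python sketch of the PDF comparison: a gamma-gamma sample is generated as the product of two unit-mean gamma variates, and both the analytical gamma-gamma PDF and a fitted lognormal are compared against its histogram; the α and β values are illustrative, not the thesis' simulation parameters.

```python
# Hedged sketch: gamma-gamma vs. lognormal fit to simulated normalized irradiance.
import numpy as np
from scipy import stats, special

def gamma_gamma_pdf(I, alpha, beta):
    """Gamma-gamma PDF for unit-mean normalized irradiance I."""
    ab = alpha * beta
    coef = 2.0 * ab ** ((alpha + beta) / 2.0) / (special.gamma(alpha) * special.gamma(beta))
    return coef * I ** ((alpha + beta) / 2.0 - 1.0) * special.kv(alpha - beta, 2.0 * np.sqrt(ab * I))

rng = np.random.default_rng(2)
alpha, beta = 4.0, 2.0
# product of two independent unit-mean gamma variates is gamma-gamma distributed
I = rng.gamma(alpha, 1.0 / alpha, 100_000) * rng.gamma(beta, 1.0 / beta, 100_000)

x = np.linspace(0.05, 5.0, 200)
gg = gamma_gamma_pdf(x, alpha, beta)
shape, loc, scale = stats.lognorm.fit(I, floc=0)        # lognormal fit to the same sample
ln = stats.lognorm.pdf(x, shape, loc, scale)

hist, edges = np.histogram(I, bins=x, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
# crude goodness-of-fit: mean absolute error of each model against the histogram
print("gamma-gamma MAE:", np.mean(np.abs(np.interp(centers, x, gg) - hist)))
print("lognormal   MAE:", np.mean(np.abs(np.interp(centers, x, ln) - hist)))
```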

Relevance:

60.00%

Publisher:

Abstract:

PURPOSE: To investigate retrograde axonal degeneration for its potential to cause microcystic macular edema (MME), a maculopathy previously described in patients with demyelinating disease, and to identify risk factors for MME and expand the anatomic knowledge on MME. DESIGN: Retrospective case series. PARTICIPANTS: We included 117 consecutive patients and 180 eyes with confirmed optic neuropathy of variable etiology. Patients with glaucoma were excluded. METHODS: We determined age, sex, visual acuity, etiology of the optic neuropathy, and the temporal and spatial characteristics of MME. Eyes with MME were compared with eyes with optic neuropathy alone and with healthy fellow eyes. With retinal layer segmentation we quantitatively measured the intraretinal anatomy. MAIN OUTCOME MEASURES: Demographic data, distribution of MME in the retina, and thickness of the retinal layers were analyzed. RESULTS: We found MME in 16 eyes (8.8%) from 9 patients, none of whom had multiple sclerosis or neuromyelitis optica. The MME was restricted to the inner nuclear layer (INL) and had a characteristic perifoveal circular distribution. Compared with healthy controls, MME was associated with significant thinning of the ganglion cell layer and nerve fiber layer, as well as thickening of the INL and the deeper retinal layers. Young age is a significant risk factor for MME. CONCLUSIONS: Microcystic macular edema is not specific to demyelinating disease. It is a sign of optic neuropathy irrespective of its etiology. The distinctive intraretinal anatomy suggests that MME is caused by retrograde degeneration of the inner retinal layers, resulting in impaired fluid resorption in the macula.

Relevance:

60.00%

Publisher:

Abstract:

A variety of occupational hazards are indigenous to academic and research institutions, ranging from traditional life-safety concerns, such as fire safety and fall protection, to specialized occupational hygiene issues such as exposure to carcinogenic chemicals, radiation sources, and infectious microorganisms. Institutional health and safety programs are constantly challenged to establish and maintain adequate protective measures for this wide array of hazards. A unique subset of academic and research institutions is classified as historically Black universities, which provide educational opportunities primarily to minority populations. State-funded minority schools receive fewer resources than their non-minority counterparts, resulting in a reduced ability to provide certain programs and services. Comprehensive health and safety services may be one of the services compromised at these institutions, resulting in uncontrolled exposures to various workplace hazards. Such a result would also be contrary to the national health status objectives to improve preventive health care measures for minority populations.
To determine whether differences exist, a cross-sectional survey was performed to evaluate the relative status of health and safety programs within minority and non-minority state-funded academic and research institutions. Data were obtained from direct-mail questionnaires, supplemented by data from publicly available sources. Parameters for comparison included reported numbers of full- and part-time health and safety staff, reported OSHA 200 log (or equivalent) values, and reported workers' compensation experience modifiers. The relative impact of institutional minority status, institution size, and OSHA regulatory environment was also assessed. Additional health and safety program descriptors were solicited in an attempt to develop a preliminary profile of the hazards present in this unique work setting.
Survey forms were distributed to 24 minority and 51 non-minority institutions. A total of 72% of the questionnaires were returned, with 58% of the minority and 78% of the non-minority institutions participating. The mean number of reported full-time health and safety staff was 1.14 for the responding minority institutions, compared to 3.12 for the responding non-minority institutions. Data distribution variances were stabilized using log-normal transformations, and although subsequent analysis indicated statistically significant differences, the differences were found to be predicted by institution size only, and not by minority status or OSHA regulatory environment. Similar results were noted for estimated full-time-equivalent health and safety staffing levels. Significant differences were not noted between reported OSHA 200 log (or equivalent) data, and a lack of information on workers' compensation experience modifiers prevented comparisons of insurance premium expenditures. Other descriptive information obtained on the health and safety programs served to validate the study's presupposition that the inclusion criteria would encompass organizations with occupational risks from all four major hazard categories. Worker medical surveillance programs appeared to exist at most institutions, but the specific tests completed were not readily identifiable.
The results of this study serve as a preliminary description of the health and safety programs of a unique set of workplaces that has not been previously investigated. Numerous opportunities for further research are noted, including efforts to quantify the relative amount of each hazard present, further definition of the programs reported to be in place, determination of other means of measuring health outcomes on campuses, and comparisons among other culturally diverse workplaces.
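
The regression-style comparison described above can be sketched in Python with entirely hypothetical staffing and enrolment numbers: the skewed staffing counts are log-transformed to stabilize variance and then regressed on log institution size and minority status.

```python
# Illustrative sketch (hypothetical data): does minority status predict staffing once size is included?
import numpy as np

rng = np.random.default_rng(3)
size = rng.integers(1_000, 40_000, 60)                       # enrolment (hypothetical)
minority = np.repeat([1, 0], [20, 40])                       # 1 = minority institution
staff = np.exp(0.9 * np.log(size) - 7.0 + rng.normal(0, 0.4, 60))  # staffing driven by size only

y = np.log(staff)                                            # log transform stabilizes the variance
X = np.column_stack([np.ones(60), np.log(size), minority])   # intercept, log size, status
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("log-size coefficient:", round(coef[1], 2))            # ~0.9: size matters
print("minority coefficient:", round(coef[2], 2))            # ~0: status adds nothing
```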

Relevance:

60.00%

Publisher:

Abstract:

Postcruise X-ray diffraction (XRD) data for 95 whole-rock samples from Holes 1188A, 1188F, 1189A, and 1189B are presented. The samples represent the alteration types recovered during Leg 193. The data set is incorporated into the shipboard XRD data set. Based on the newly obtained XRD data, the distribution of alteration phases was redrawn for Ocean Drilling Program Sites 1188 and 1189.

Relevance:

60.00%

Publisher:

Abstract:

The Self-Organizing Map (SOM) is a neural network model that performs an ordered projection of a high-dimensional input space onto a low-dimensional topological structure. The process by which such a mapping is formed is defined by the SOM algorithm, which is a competitive, unsupervised and nonparametric method, since it does not make any assumption about the input data distribution. The feature maps provided by this algorithm have been successfully applied to vector quantization, clustering and high-dimensional data visualization. However, the initialization of the network topology and the selection of the SOM training parameters are two difficult tasks, because the distribution of the input signals is unknown. A misconfiguration of these parameters can generate a low-quality feature map, so some measure of the degree of adaptation of the SOM network to the input data model is needed. Topology preservation is the concept most commonly used to implement this measure. Several qualitative and quantitative methods have been proposed for measuring the degree of SOM topology preservation, particularly for Kohonen's model. In this work, two methods for measuring the topology preservation of the Growing Cell Structures (GCS) model are proposed: the topographic function and the topology preserving map.
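
As a hedged illustration of topology-preservation measurement, the Python sketch below computes the classical topographic error on a rectangular Kohonen grid with random (untrained) weights; the GCS-specific measures proposed in this work would replace the grid-adjacency test with the GCS connectivity graph.

```python
# Minimal sketch of one common topology-preservation measure, the topographic error.
import numpy as np

def topographic_error(data, weights, grid_shape):
    """Fraction of samples whose two best-matching units are not grid neighbours."""
    rows, cols = grid_shape
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)])
    errors = 0
    for x in data:
        d = np.linalg.norm(weights - x, axis=1)
        bmu1, bmu2 = np.argsort(d)[:2]                       # best and second-best matching units
        if np.abs(coords[bmu1] - coords[bmu2]).sum() > 1:    # not 4-adjacent on the grid
            errors += 1
    return errors / len(data)

rng = np.random.default_rng(4)
data = rng.random((500, 3))
weights = rng.random((10 * 10, 3))                           # untrained 10x10 map, for illustration
print("topographic error:", topographic_error(data, weights, (10, 10)))
```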

Relevance:

60.00%

Publisher:

Abstract:

Natural regeneration in managed stone pine (Pinus pinea L.) forests in the Spanish Northern Plateau is not achieved successfully under current silvicultural practices, which is a main concern for forest managers. We modelled the spatio-temporal features of primary dispersal to test whether (a) present low stand densities constrain natural regeneration success and (b) seed release is a climate-controlled process. The study is based on data collected in a six-year seed-trap experiment considering different regeneration felling intensities. From a spatial perspective, we tested alternative established kernels under different data distribution assumptions to fit a spatial model able to predict P. pinea seed rain. Because of the umbrella-like crown of P. pinea, the models were adapted to account for crown effects by correcting the distances between potential seed arrival locations and seed sources. In addition, individual tree fecundity was assessed independently of existing models, improving the stability of parameter estimation. Simulation of the seed rain enabled the calculation of seed dispersal indices for diverse silvicultural regeneration treatments. The selected spatial model of best fit (Weibull, Poisson assumption) predicted a highly clumped dispersal pattern that resulted in a proportion of gaps where no seed arrival is expected (dispersal limitation) between 0.25 and 0.30 for intermediate-intensity regeneration fellings and over 0.50 for intense fellings. To describe the temporal pattern, the proportion of seeds released during monthly intervals was modelled as a function of climate variables – rainfall events – through a linear model that accounted for temporal autocorrelation, whereas cone opening took place above a temperature threshold. Our findings suggest the application of less intensive regeneration fellings, to be carried out after years of successful seedling establishment and, seasonally, subsequent to the main rainfall period (late fall). This schedule would avoid dispersal limitation and would allow for complete seed release. These modifications to present silvicultural practices would produce a more efficient seed shadow in managed stands.
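
A minimal Python sketch of the kind of seed-rain model described above, assuming a generic 2D Weibull dispersal kernel and Poisson counts in the traps; tree positions, fecundities and kernel parameters are illustrative, not the fitted values from this study.

```python
# Hedged sketch of a spatial seed-rain model: Weibull dispersal kernel + Poisson trap counts.
import numpy as np

def weibull_kernel(r, scale=10.0, shape=1.5):
    """2D dispersal density at distance r (per m^2, per dispersed seed)."""
    f_r = (shape / scale) * (r / scale) ** (shape - 1) * np.exp(-(r / scale) ** shape)
    return f_r / (2.0 * np.pi * np.maximum(r, 1e-9))        # spread the distance density over the ring

rng = np.random.default_rng(5)
trees = rng.uniform(0, 100, (30, 2))                         # stem map of a 1-ha plot (illustrative)
fecundity = rng.lognormal(5.0, 0.5, 30)                      # seeds produced per tree (illustrative)
traps = rng.uniform(0, 100, (50, 2))                         # seed-trap locations
trap_area = 0.25                                             # m^2 per trap

dist = np.linalg.norm(traps[:, None, :] - trees[None, :, :], axis=2)   # 50 x 30 trap-tree distances
expected = trap_area * (fecundity[None, :] * weibull_kernel(dist)).sum(axis=1)
counts = rng.poisson(expected)                               # simulated seed rain per trap
print("mean seeds per trap:", counts.mean(), " empty traps:", (counts == 0).sum())
```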

Relevance:

60.00%

Publisher:

Abstract:

The final aim of the research presented in this doctoral thesis is the estimation of the total ice volume of the more than 1600 glaciers of Svalbard, in the Arctic, and thus their potential contribution to sea-level rise under a global-warming scenario. The most accurate calculations of glacier volumes are those based on ice thicknesses measured by ground-penetrating radar (GPR). However, such measurements are not viable for very large sets of glaciers, owing to their cost, logistic difficulties, and time requirements, especially in polar or mountain regions. In contrast, the calculation of glacier areas from satellite images is perfectly viable at global and regional scales, so volume-area scaling relationships are the most useful tool for determining glacier volumes at those scales, as done for Svalbard in this thesis. As part of the PhD work, we compiled an inventory of the radio-echo-sounded glaciers in Svalbard, and we calculated the ice volume of more than 80 glacier basins in Svalbard from GPR data. These volumes were used to calibrate the volume-area relationships derived in this dissertation. The GPR data were obtained during fieldwork campaigns carried out by international teams, often led by the Group of Numerical Simulation in Science and Engineering of the Technical University of Madrid, to which the PhD candidate and her supervisors belong. Furthermore, we developed a methodology to estimate the error in the volume calculation, which includes a novel technique for calculating the interpolation error for data sets of the type produced by GPR profiling, which show very characteristic spatial distribution patterns but very irregular data density. We derived scaling relationships specific to Svalbard glaciers, exploring the sensitivity of the scaling parameters to different glacier morphologies and incorporating new variables. In particular, we performed experiments aimed at verifying whether scaling relationships obtained by characterizing individual glaciers by shape, slope and size imply significant differences in the estimated volume of the total population of Svalbard glaciers, and whether this partitioning implies any noticeable pattern in the scaling-relationship parameters.
Our results indicate that, for a fixed value of the multiplicative factor in the scaling relationship, the exponent of the area in the volume-area relationship decreases as slope and shape factor increase, whereas size-based classifications do not reveal any clear trend. This means that steep and cirque-type glaciers are less sensitive to changes in glacier area. Moreover, the volumes of the total population of Svalbard glaciers calculated by partitioning into subgroups by size and slope are 1-4% smaller than the volume obtained considering all glaciers without partitioning into subgroups, whereas the volumes calculated by partitioning into subgroups by shape are 3-5% larger. We also performed multivariate experiments attempting to optimally predict the volume of Svalbard glaciers from a combination of different predictors. Our results show that a simple power-law V-A model explains 98.6% of the variance. Only the predictor glacier length provides statistical significance when used in addition to glacier area, though the coefficient of determination decreases compared with the simpler V-A model. The predictor elevation range does not provide additional information when used in addition to glacier area. Our estimates of the volume of the entire population of Svalbard glaciers, using the different scaling relationships derived in this thesis, range within 6890-8106 km3, with estimated relative errors in total volume of the order of 6.6-8.1%. The average of all our estimates, which can be taken as our best estimate of the volume, is 7504 km3. In terms of sea-level equivalent (SLE), our volume estimates correspond to a potential contribution to sea-level rise of 17-20 mm SLE, averaging 19 ± 2 mm SLE, where the quoted error corresponds to our estimated relative error in volume. For comparison, the estimates using the V-A scaling relationships found in the literature range within 13-26 mm SLE, averaging 20 ± 2 mm SLE, where the quoted error represents the standard deviation of the different estimates.
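
A minimal Python sketch of calibrating a volume-area scaling relationship V = c·A^γ by log-log least squares and applying it to a glacier population; the areas and volumes below are synthetic, not the Svalbard GPR-derived data used in the thesis.

```python
# Minimal sketch of fitting and applying a volume-area scaling law (synthetic data only).
import numpy as np

rng = np.random.default_rng(6)
area = rng.lognormal(1.0, 1.2, 80)                               # glacier areas, km^2 (synthetic)
volume = 0.03 * area ** 1.36 * np.exp(rng.normal(0, 0.2, 80))    # noisy power law, km^3 (synthetic)

gamma, log_c = np.polyfit(np.log(area), np.log(volume), 1)       # slope = exponent, intercept = log(c)
c = np.exp(log_c)
print(f"V = {c:.3f} * A^{gamma:.2f}")

# regional total: apply the calibrated relationship to every glacier's area
total_volume = (c * area ** gamma).sum()
print("estimated regional volume (km^3):", round(total_volume, 1))
```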

Relevance:

60.00%

Publisher:

Abstract:

Neuromuscular electrical stimulation (NMES) is a recent therapeutic technique in the treatment of oropharyngeal dysphagia. Few studies have used NMES in oncological cases, and many questions remain about the method of application and the outcomes of different stimulation conditions in this population. This study aimed to verify the immediate effect of sensory and motor NMES on the oral and pharyngeal phases of swallowing in patients treated for head and neck cancer. To this end, a cross-sectional interventional study was carried out including 11 adult and elderly patients (median age 59 years) affected by head and neck cancer. All subjects underwent videofluoroscopic swallowing examination, in which, in randomized order, they were asked to swallow 5 ml of food in liquid, honey, and pudding consistencies under three distinct conditions: without stimulation, with sensory NMES, and with motor NMES. The degree of swallowing dysfunction was classified using the Dysphagia Outcome and Severity Scale (DOSS), together with the presence of food stasis (Eisenhuber scale), laryngeal penetration, and laryngotracheal aspiration (Penetration and Aspiration Scale - PAS), in addition to the measurement of oral and pharyngeal transit times (in seconds). To compare the results across the three applied stimuli on the residue scale, the penetration-aspiration scale, the DOSS, and the oral and pharyngeal transit times, the Friedman test or repeated-measures analysis of variance was applied (according to the data distribution). A significance level of 5% was adopted for all tests. The results showed improvement with sensory and motor stimulation on the DOSS and PAS scales for one patient treated for oral cancer and one treated for laryngeal cancer, and worsening on both scales for two patients (oral cancer), one under motor stimulation and the other under sensory stimulation. Application of the Eisenhuber scale showed that NMES, at both the sensory and motor levels, changed the presence of residue variably in the oral cancer cases, whereas for the patient with laryngeal cancer there was a reduction of residue in the vallecula/tongue base with sensory and motor stimulation, as well as an increase of residue on the posterior pharyngeal wall with motor stimulation. In addition, no statistically significant difference was found in oral and pharyngeal transit times among the different stimulation conditions for any of the consistencies tested (p>0.05). Given these findings, it was concluded that NMES, at both the sensory and motor levels, had a variable immediate impact on the oral and pharyngeal phases of swallowing, and may improve the swallowing function of patients with significant dysphagia after treatment for head and neck cancer with respect to the degree of dysphagia and the presence of penetration and aspiration.
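
The repeated-measures comparison described above can be sketched in Python with the Friedman test from SciPy; the transit-time values below are hypothetical placeholders, not the study's measurements.

```python
# Illustrative sketch of a Friedman test across the three stimulation conditions (hypothetical data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
baseline = rng.normal(1.2, 0.3, 11)                    # oral transit time (s), no stimulation
sensory = baseline + rng.normal(0.0, 0.1, 11)          # sensory NMES
motor = baseline + rng.normal(0.0, 0.1, 11)            # motor NMES

stat, p = stats.friedmanchisquare(baseline, sensory, motor)
print(f"Friedman chi2 = {stat:.2f}, p = {p:.3f}")      # p > 0.05: no detectable effect
```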

Relevance:

60.00%

Publisher:

Abstract:

In this paper, parallel Relaxed and Extrapolated algorithms based on the Power method for accelerating the PageRank computation are presented. Different parallel implementations of the Power method and the proposed variants are analyzed using different data distribution strategies. The reported experiments show the behavior and effectiveness of the designed algorithms for realistic test data using either OpenMP, MPI, or a hybrid OpenMP/MPI approach to exploit the benefits of shared memory inside the nodes of current SMP supercomputers.
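
A serial Python sketch of a relaxed Power iteration for PageRank (ω = 1 recovers the standard Power method); the paper's contribution is the parallel OpenMP/MPI implementations and their data distribution strategies, which are not reproduced here, and the tiny link matrix and ω value are illustrative only.

```python
# Serial sketch of a relaxed Power iteration for PageRank (illustrative, not the paper's parallel code).
import numpy as np

def relaxed_pagerank(links, alpha=0.85, omega=1.0, tol=1e-10, max_iter=1000):
    n = links.shape[0]
    out_deg = links.sum(axis=0)
    P = np.where(out_deg > 0, links / np.maximum(out_deg, 1), 1.0 / n)   # column-stochastic matrix
    x = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        gx = alpha * P @ x + (1 - alpha) / n                  # one Google-matrix product
        x_new = (1 - omega) * x + omega * gx                  # relaxation step
        x_new /= x_new.sum()
        if np.abs(x_new - x).sum() < tol:
            break
        x = x_new
    return x

links = np.array([[0, 1, 1, 0],      # links[i, j] = 1 if page j links to page i
                  [1, 0, 0, 1],
                  [1, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
print(relaxed_pagerank(links, omega=1.1).round(3))            # mild over-relaxation for this toy graph
```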

Relevance:

60.00%

Publisher:

Abstract:

Virtual learning environments (VLEs) have evolved considerably, namely regarding their capabilities and the tools and activities they provide. VLEs give us access to large quantities of data resulting from the activities that both students and teachers develop in these environments. Monitoring undergraduates' activities in VLEs is important because it allows us to present, in a structured way, a number of indicators that may be taken into account to understand the learning process more deeply and to propose improvements in teaching and learning strategies as well as in the institution's virtual environment. Although VLEs provide several sectorial data statistics, they do not provide knowledge regarding the institution's evolution. Therefore, we consider the analysis of VLE activity logs over a period of five years to be paramount. This paper focuses on the analysis of the activities developed by students in a virtual learning environment, from a sample of approximately 7000 undergraduate students per year, over a period of five academic years, from 2009/2010 to 2013/2014. The main aims of this research work are to assess the evolution of activity logs in the virtual learning environment of a Portuguese public higher education institution, in order to identify possible gaps and to hold out the prospect of new forms of use of the environment. The results of the data analysis show that, overall, the number of accesses to the virtual learning environment increased over the five years under study. The most used tools were Resources, Messages and Assignments, and the most frequent activities developed with these tools were, respectively, consulting information, sending messages and submitting assignments. The frequency of accesses to the virtual learning environment was characterized according to the number of accesses in the activity log. The data distribution was divided into five frequency categories, named very low, low, moderate, high and very high, determined by the 20th, 40th, 60th, 80th and 100th percentiles, respectively. The study of the activity logs of virtual learning environments is important not only because they provide real knowledge of the use that undergraduates make of these environments, but also because of the possibilities they create for identifying the need for new pedagogical approaches or for reinforcing previously consolidated ones.
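
A minimal Python sketch of the percentile-based frequency categorization described above; the access counts are synthetic placeholders for the real activity-log totals.

```python
# Minimal sketch: bin yearly access counts into five percentile-based frequency categories.
import numpy as np

rng = np.random.default_rng(8)
accesses = rng.negative_binomial(5, 0.05, 7000)              # yearly accesses per student (synthetic)
cuts = np.percentile(accesses, [20, 40, 60, 80])             # category boundaries
labels = np.array(["very low", "low", "moderate", "high", "very high"])
category = labels[np.digitize(accesses, cuts)]               # assign each student to a category

for lab in labels:
    print(lab, (category == lab).sum())
```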

Relevance:

60.00%

Publisher:

Abstract:

Long-term changes in the beach fauna at Duck, North Carolina, were investigated. Twenty-one stations located on three transects on the ocean side and twenty-four stations located on three transects on the sound side were sampled seasonally from November 1980 to July 1981. The data collected in this study were compared to a previous study conducted in 1976 (Matta, 1977) to investigate the potential effects of the construction of the CERC Field Research Facility pier on the adjacent beaches. No effects on the benthic fauna were found. Changes observed in the benthic macrofauna on the ocean beaches were well within the range attributable to the natural variation of an open-coast system. The ocean beach macrofauna was observed to form a single community migrating on and off the beach with the seasons. On the sound beaches, changes were detected in the benthic macrofauna; however, these were attributed to a salinity increase during the 1981 sampling year.