993 results for 796.015


Relevance: 10.00%

Abstract:

Purpose: To evaluate the clinical and radiologic response of patients with Graves' ophthalmopathy given low-dose orbital radiotherapy (RT) with a protracted fractionation. Methods and Materials: Eighteen patients (36 orbits) received orbital RT to a total dose of 10 Gy, fractionated as 1 Gy once a week over 10 weeks. Of these, 9 patients also received steroid therapy. Patients were evaluated clinically and radiologically 6 months after treatment. Clinical response was assessed using three criteria: physical examination, a modified clinical activity score, and a verbal questionnaire covering the 10 most common signs and symptoms of the disease. Radiologic response was assessed by magnetic resonance imaging. Results: Improvement in ocular pain, palpebral edema, visual acuity, and ocular motility was observed in all patients. Significant decreases in symptoms such as tearing (p < 0.001), diplopia (p = 0.008), conjunctival hyperemia (p = 0.002), and ocular grittiness (p = 0.031) also occurred. Magnetic resonance imaging showed decreases in ocular muscle thickness and in T2 signal intensity in the majority of patients. Treatments were well tolerated, and to date no complications from treatment have been observed. There was no statistical difference in clinical or radiologic response between patients receiving RT alone and those receiving RT plus steroid therapy. Conclusion: RT delivered at a low dose in a protracted scheme should be considered a useful therapeutic option for patients with Graves' ophthalmopathy. (C) 2012 Elsevier Inc.
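
The abstract does not say which statistical tests produced the quoted p-values. As a hedged illustration only, the sketch below applies a Wilcoxon signed-rank test, one common choice for paired ordinal data, to hypothetical pre-/post-treatment symptom scores; all values are invented for the example.

```python
# Invented 1-3 severity scores for 18 patients before and after RT; the
# Wilcoxon signed-rank test is one option for this paired comparison.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
before = rng.integers(1, 4, size=18)                         # baseline severity
after = np.maximum(before - rng.integers(0, 2, size=18), 0)  # post-RT severity

stat, p = wilcoxon(before, after, zero_method="zsplit")
print(f"Wilcoxon W = {stat:.1f}, p = {p:.4f}")
```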

Relevance: 10.00%

Abstract:

The large-scale production of cardiomyocytes is a key step in the development of cell therapy and tissue engineering to treat cardiovascular diseases, particularly those caused by ischemia. The main objective of this study was to establish a procedure for the efficient production of cardiomyocytes by reprogramming mesenchymal stem cells from adipose tissue. First, lentiviral vectors expressing neoR and GFP under the control of promoters expressed specifically during cardiomyogenesis were constructed to monitor cell reprogramming into precardiomyocytes and to select cells for amplification and characterization. Cellular reprogramming was performed using 5'-azacytidine followed by electroporation with plasmid pOKS2a, which expressed Oct4, Sox2, and Klf4. Under these conditions, GFP expression began only after transfection with pOKS2a, and fewer than 0.015% of cells were GFP(+). These GFP(+) cells were selected for G418 resistance and assessed for molecular markers of cardiomyocytes by RT-PCR and immunocytochemistry. Both genetic and protein markers of cardiomyocytes were present in the selected cells, with some variation among them. Cell doubling time did not change after selection. Together, these results indicate that enrichment with vectors expressing GFP and neoR under cardiomyocyte-specific promoters can produce large numbers of cardiomyocyte precursors (CMPs), which can then be terminally differentiated for cell therapy and tissue engineering.

Relevance: 10.00%

Abstract:

Offshore wind turbines supported on monopile foundations are dynamically sensitive because the overall natural frequencies of these structures are close to the various forcing frequencies imposed upon them. The structures are designed for an intended life of 25 to 30 years, but little is known about their long-term behaviour. To study this, a series of laboratory tests was conducted in which a scaled model wind turbine supported on a monopile in kaolin clay was subjected to between 32,000 and 172,000 cycles of horizontal loading, and the changes in natural frequency and damping of the model were monitored. The experimental results are presented in a non-dimensional framework based on an interpretation of the governing mechanics. The change in natural frequency was found to be strongly dependent on the shear strain level in the soil next to the pile. Practical guidance for choosing the monopile diameter is suggested based on element test results, using the concept of volumetric threshold shear strain.
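
As a rough illustration of the non-dimensional frequency tracking described above (not the paper's actual model or data), the sketch below treats the turbine-monopile system as a single degree of freedom whose foundation stiffness changes with cycle count and reports the frequency ratio f_N/f_0; the mass, stiffness and degradation law are assumed values.

```python
# A single-degree-of-freedom stand-in for the turbine-monopile system
# (assumed values throughout, not the paper's model or data).
import numpy as np

def natural_freq_hz(k, m):
    """f = sqrt(k/m) / (2*pi) for an SDOF oscillator."""
    return np.sqrt(k / m) / (2 * np.pi)

m_eff = 2.3e5    # effective modal mass, kg (assumed)
k0 = 5.0e7       # initial foundation stiffness, N/m (assumed)
f0 = natural_freq_hz(k0, m_eff)

# Hypothetical logarithmic stiffness evolution with cycle count N; in clay
# the sign and rate depend on the shear strain level next to the pile.
for N in (1e3, 1e4, 1e5):
    kN = k0 * (1.0 - 0.02 * np.log10(N))
    print(f"N = {N:>8.0f}   f_N / f_0 = {natural_freq_hz(kN, m_eff) / f0:.3f}")
```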

Relevance: 10.00%

Abstract:

The appropriation of digital artefacts involves use that has changed, evolved or developed beyond their original design. Thus, to understand appropriation, we must understand use. We define use as the active, purposive exploitation of the affordances offered by the technology, and from this perspective appropriation emerges as a natural consequence of this enactive use. Enaction tells us that perception is an active process: it is something we do, not something that happens to us. On this reading, use becomes the active exploitation of the affordances offered us by the artefact, system or service. In turn, we define appropriation as engagement with these actively disclosed affordances, disclosed as a consequence not just of seeing but of seeing as. We present a small case study that highlights instances of perception as an actively engaged skill. We conclude that appropriation is a simple consequence of enactive perception.

Relevance: 10.00%

Abstract:

Background: Good blood pressure (BP) control reduces the risk of recurrence of stroke/transient ischaemic attack (TIA). Although there is strong evidence that BP telemonitoring helps achieve good control, none of the major trials have considered its effectiveness in stroke/TIA survivors. We therefore conducted a feasibility study for a trial of BP telemonitoring for stroke/TIA survivors with uncontrolled BP in primary care. Method: Phase 1 was a pilot trial involving 55 patients stratified by stroke/TIA and randomised 3:1 to BP telemonitoring for 6 months or usual care. Phase 2 was a qualitative evaluation comprising semi-structured interviews with 16 trial participants who received telemonitoring and 3 focus groups with 23 members of stroke support groups and 7 carers. Results: Overall, 125 patients (60 stroke patients, 65 TIA patients) were approached and 55 (44%) were randomised, including 27 stroke patients and 28 TIA patients. Fifty-two participants (95%) attended the 6-month follow-up appointment, but one declined the second daytime ambulatory blood pressure monitoring (ABPM) measurement, giving a 93% completion rate for ABPM, the proposed primary outcome measure for a full trial. Adherence to telemonitoring was good; of the 40 participants who were telemonitoring, 38 continued to provide readings throughout the 6 months. There was a mean reduction of 10.1 mmHg in systolic ABPM in the telemonitoring group compared with 3.8 mmHg in the control group, suggesting the potential for a substantial effect from telemonitoring. Our qualitative analysis found that many stroke patients were concerned about their BP, and that telemonitoring increased their engagement and was easy, convenient and reassuring. Conclusions: A full-scale trial is feasible, likely to recruit well, and likely to have good rates of compliance and follow-up.
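
A quick arithmetic check of the recruitment and completion figures reported above (the 51 completed ABPM measurements are inferred from the one refusal among the 52 attendees):

```python
# Recruitment and completion rates as reported in the abstract.
approached, randomised = 125, 55
followed_up = 52                 # attended the 6-month appointment
abpm_completed = 51              # one attendee declined the second ABPM

print(f"recruited:      {randomised / approached:.0%}")    # 44%
print(f"followed up:    {followed_up / randomised:.0%}")   # 95%
print(f"ABPM completed: {abpm_completed / randomised:.0%}")  # 93%
```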

Relevance: 10.00%

Abstract:

Background: The increasing prevalence of overweight and obesity represents a global pandemic. As the largest occupational group in international healthcare systems, nurses are at the forefront of health promotion to address this pandemic. However, nurses' own health behaviours are known to influence the extent to which they engage in health promotion and the public's confidence in the advice offered. Estimating the prevalence of overweight and obesity among nurses is therefore important. To date, however, prevalence estimates have been based on non-representative samples, and internationally no studies have compared the prevalence of overweight and obesity among nurses to that of other healthcare professionals using representative data. Objectives: To estimate overweight and obesity prevalence among nurses in Scotland, and to compare it with other healthcare professionals and those working in non-health-related occupations. Design: Cross-sectional study using a nationally representative sample of five aggregated annual rounds (2008-2012) of the Scottish Health Survey. Setting: Scotland. Participants: 13,483 adults aged 17 to 65 who indicated they had worked in the past 4 weeks, classified into four occupational groups: nurses (n = 411), other healthcare professionals (n = 320), unqualified care staff (n = 685), and individuals employed in non-health-related occupations (n = 12,067). Main outcome measures: Prevalence of overweight and obesity, defined as Body Mass Index ≥ 25.0. Methods: Estimates of overweight and obesity prevalence in each occupational group were calculated with 95% confidence intervals (CI). A logistic regression model was then built to compare the odds of being overweight or obese for nurses against the other occupational categories. Data were analysed using SAS 9.1.3. Results: 69.1% (95% CI 64.6-73.6) of Scottish nurses were overweight or obese. Prevalence of overweight and obesity was higher in nurses than in other healthcare professionals (51.3%, CI 45.8-56.7), unqualified care staff (68.5%, CI 65.0-72.0) and those in non-health-related occupations (68.9%, CI 68.1-69.7). A logistic regression model adjusted for socio-demographic composition indicated that, compared with nurses, the odds of being overweight or obese were statistically significantly lower for other healthcare professionals (Odds Ratio [OR] 0.45, CI 0.33-0.61) and those in non-health-related occupations (OR 0.78, CI 0.62-0.97). Conclusions: The prevalence of overweight and obesity among Scottish nurses is worryingly high, and significantly higher than in other healthcare professionals and those in non-health-related occupations. High prevalence of overweight and obesity potentially harms nurses' own health and hampers the effectiveness of nurses' health promotion role. Interventions are therefore urgently required to address overweight and obesity among the Scottish nursing workforce.
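
The analysis described above was run in SAS 9.1.3; as a hedged Python analogue, the sketch below fits the same kind of logistic regression of overweight/obesity status on occupational group with nurses as the reference category, and converts coefficients to odds ratios with 95% CIs. The column names and synthetic data are assumptions for illustration, and the socio-demographic adjustment covariates are omitted.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def odds_ratios(df):
    """Logit of BMI >= 25 on occupation; returns ORs and 95% CIs vs nurses."""
    df = df.assign(overweight=(df["bmi"] >= 25.0).astype(int))
    model = smf.logit(
        "overweight ~ C(occupation, Treatment(reference='nurse'))", data=df
    ).fit(disp=False)
    ci = model.conf_int()
    return pd.DataFrame({"OR": np.exp(model.params),
                         "2.5%": np.exp(ci[0]),
                         "97.5%": np.exp(ci[1])})

# Synthetic data purely for illustration (not the Scottish Health Survey).
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "occupation": rng.choice(
        ["nurse", "other_hcp", "care_staff", "non_health"], size=2000),
    "bmi": rng.normal(27, 4, size=2000),
})
print(odds_ratios(df))
```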

Relevance: 10.00%

Abstract:

The influence of process variables (pea starch, guar gum and glycerol) on the viscosity (V), solubility (SOL), moisture content (MC), transparency (TR), Hunter parameters (L, a, and b), total color difference (ΔE), yellowness index (YI), and whiteness index (WI) of pea starch based edible films was studied using a three-factor, three-level Box–Behnken response surface design. The individual linear effects of pea starch, guar gum and glycerol were significant (p < 0.05) for all the responses. The a value, however, was significantly (p < 0.05) affected only by pea starch and guar gum, through positive and negative linear terms, respectively. The starch × glycerol interaction also had a significant (p < 0.05) effect on the TR of the edible films, and the starch × guar gum interaction had a significant impact on the b and YI values. The quadratic regression coefficient of pea starch showed a significant effect (p < 0.05) on V, MC, L, b, ΔE, YI, and WI; that of glycerol on ΔE and WI; and that of guar gum on ΔE and SOL. The results were analyzed by Pareto analysis of variance (ANOVA), and second-order polynomial models were developed from the experimental design with reliable and satisfactory fits to the corresponding experimental data and high coefficient of determination (R²) values (>0.93). Three-dimensional response surface plots were constructed to investigate the relationship between the process variables and the responses. The optimized conditions, with the goal of maximizing TR and minimizing SOL, YI and MC, were 2.5 g pea starch, 25% glycerol and 0.3 g guar gum. The results reveal that pea starch/guar gum edible films with appropriate physical and optical characteristics can be produced effectively and applied successfully in the food packaging industry.
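
As a minimal sketch of the modelling step described above, the code below fits a second-order polynomial to one response over a coded three-factor Box–Behnken design by least squares and reports R²; the design is standard, but the response values are placeholders, not the paper's data.

```python
import numpy as np

# Coded Box-Behnken levels (-1, 0, +1) for starch (x1), glycerol (x2),
# guar gum (x3): 12 edge points plus 3 center points.
X = np.array([[-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
              [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
              [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
              [0, 0, 0], [0, 0, 0], [0, 0, 0]], dtype=float)
y = np.random.default_rng(2).normal(50, 5, size=len(X))  # placeholder response

def quad_terms(X):
    """Intercept, linear, interaction and quadratic terms of the RSM model."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1 * x2, x1 * x3, x2 * x3,
                            x1 ** 2, x2 ** 2, x3 ** 2])

A = quad_terms(X)
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ beta
r2 = 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print("coefficients:", np.round(beta, 2), " R^2 =", round(r2, 3))
```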

Relevance: 10.00%

Abstract:

Agronomic Engineering

Relevance: 10.00%

Abstract:

In the first part of this paper we reviewed the fingerprint classification literature from two different perspectives: feature extraction and classifier learning. Aiming to answer the question of which of the reviewed methods would perform best in a real implementation, we arrived at a discussion that showed the difficulty of answering this question: no previous comparison exists in the literature, and comparisons among papers are made with different experimental frameworks. Moreover, published methods are difficult to implement because of the lack of detail in their descriptions and parameter settings, and because no source code is shared. For this reason, in this paper we carry out a deep experimental study following the proposed double perspective. To do so, we have carefully implemented some of the most relevant feature extraction methods according to the explanations found in the corresponding papers, and we have tested their performance with different classifiers, including the specific proposals made by the original authors. Our aim is to develop an objective experimental study in a common framework, which has not been done before and which can serve as a baseline for future work on the topic. In this way, we test not only the quality of these methods but also their reusability by other researchers, and we are able to indicate which proposals could be considered for future developments. Furthermore, we show that combining different feature extraction models in an ensemble can lead to superior performance, significantly improving the results obtained by individual models.
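
A minimal sketch of the ensemble idea in the closing sentence: train one classifier per feature-extraction model and combine their class probabilities. The "feature extraction models" here are stand-in views of a synthetic dataset, not the fingerprint descriptors studied in the paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Three stand-in "feature extraction models": disjoint slices of one dataset.
X, y = make_classification(n_samples=900, n_features=30, n_informative=12,
                           n_classes=5, random_state=0)
views = [X[:, :10], X[:, 10:20], X[:, 20:]]

idx_tr, idx_te = train_test_split(np.arange(len(y)), test_size=0.3,
                                  random_state=0)
probas = []
for V in views:
    clf = LogisticRegression(max_iter=1000).fit(V[idx_tr], y[idx_tr])
    print(f"single-view accuracy: {clf.score(V[idx_te], y[idx_te]):.3f}")
    probas.append(clf.predict_proba(V[idx_te]))

# Ensemble: average class probabilities across views, then take the argmax.
ensemble_pred = np.mean(probas, axis=0).argmax(axis=1)
print("ensemble accuracy:", np.mean(ensemble_pred == y[idx_te]))
```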

Relevance: 10.00%

Abstract:

The final publication is available at Springer via http://dx.doi.org/10.1007/s10693-015-0230-1

Relevance: 10.00%

Abstract:

This paper investigates the power of genetic algorithms at solving the MAX-CLIQUE problem. We measure the performance of a standard genetic algorithm on an elementary set of problem instances consisting of embedded cliques in random graphs. We indicate the need for improvement, and introduce a new genetic algorithm, the multi-phase annealed GA, which exhibits superior performance on the same problem set. As we scale up the problem size and test on "hard" benchmark instances, we notice a degraded performance in the algorithm caused by premature convergence to local minima. To alleviate this problem, a sequence of modifications is implemented, ranging from changes in input representation to systematic local search. The most recent version, called union GA, incorporates the features of union cross-over, greedy replacement, and diversity enhancement. It shows a marked speed-up in the number of iterations required to find a given solution, as well as some improvement in the clique size found. We discuss issues related to the SIMD implementation of the genetic algorithms on a Thinking Machines CM-5, which was necessitated by the intrinsically high time complexity (O(n^3)) of the serial algorithm for computing one iteration. Our preliminary conclusions are: (1) a genetic algorithm needs to be heavily customized to work "well" for the clique problem; (2) a GA is computationally very expensive, and its use is only recommended if it is known to find larger cliques than other algorithms; (3) although our customization effort is bringing forth continued improvements, there is no clear evidence, at this time, that a GA will have better success in circumventing local minima.
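
For orientation, here is a minimal GA for MAX-CLIQUE in the spirit of the approach described above (greedy repair plays a role loosely analogous to the paper's greedy replacement; this is not the multi-phase annealed or union GA itself), run on a small random graph with a planted clique:

```python
import numpy as np

rng = np.random.default_rng(3)
n, planted = 60, 10

# Random graph with edge probability 0.3 and a planted 10-clique on 0..9.
A = rng.random((n, n)) < 0.3
A = np.triu(A, 1)
A = A | A.T
A[np.ix_(range(planted), range(planted))] = True
np.fill_diagonal(A, False)

def repair(bits):
    """Greedily drop vertices until the selected set is a clique."""
    order = list(np.flatnonzero(bits))
    rng.shuffle(order)
    clique = []
    for v in order:
        if all(A[v, u] for u in clique):
            clique.append(v)
    out = np.zeros(n, dtype=bool)
    out[clique] = True
    return out

pop = rng.random((40, n)) < 0.2          # initial population of vertex subsets
for gen in range(200):
    pop = np.array([repair(ind) for ind in pop])
    fit = pop.sum(axis=1)                # fitness = clique size after repair
    # Binary tournament selection.
    i, j = rng.integers(0, len(pop), size=(2, len(pop)))
    parents = pop[np.where(fit[i] >= fit[j], i, j)]
    # Uniform crossover with a shifted partner, then bit-flip mutation.
    mask = rng.random(pop.shape) < 0.5
    children = np.where(mask, parents, np.roll(parents, 1, axis=0))
    pop = children ^ (rng.random(pop.shape) < 0.01)

best = max((repair(ind) for ind in pop), key=lambda c: c.sum())
print("best clique size found:", int(best.sum()))  # planted clique has size 10
```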

Relevance: 10.00%

Abstract:

We describe our work on shape-based image database search using the technique of modal matching. Modal matching employs a deformable shape decomposition that allows users to select example objects and have the computer efficiently sort the set of objects based on the similarity of their shape. Shapes are compared in terms of the types of nonrigid deformations (differences) that relate them. The modal decomposition provides deformation "control knobs" for flexible matching and thus allows for selecting weighted subsets of shape parameters that are deemed significant for a particular category or context. We demonstrate the utility of this approach for shape comparison in 2-D image databases; however, the general formulation is applicable to signals of any dimensionality.
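
A loose sketch of the modal flavour of this comparison, assuming a much-simplified stand-in for the paper's finite-element formulation: each shape's sample points yield a Gaussian affinity matrix, its low-order eigenvalue spectrum serves as the modal description, and a weight vector plays the role of the deformation "control knobs".

```python
import numpy as np

def modal_spectrum(points, k=8, sigma=0.2):
    """k largest eigenvalues of a Gaussian affinity over the shape samples."""
    pts = points - points.mean(axis=0)
    pts = pts / np.linalg.norm(pts)        # remove translation and scale
    d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    return np.linalg.eigvalsh(W)[-k:]

def shape_distance(a, b, weights=None):
    """Weighted distance between modal spectra; weights act as 'control knobs'."""
    sa, sb = modal_spectrum(a), modal_spectrum(b)
    w = np.ones_like(sa) if weights is None else np.asarray(weights)
    return float(np.sqrt((w * (sa - sb) ** 2).sum()))

t = np.linspace(0, 2 * np.pi, 50, endpoint=False)
circle = np.c_[np.cos(t), np.sin(t)]
ellipse = np.c_[2 * np.cos(t), np.sin(t)]
print(shape_distance(circle, circle))    # 0.0 for identical shapes
print(shape_distance(circle, ellipse))   # larger for dissimilar shapes
```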

Relevance: 10.00%

Abstract:

We consider the problem of architecting a reliable content delivery system across an overlay network using TCP connections as the transport primitive. We first argue that natural designs based on store-and-forward principles that tightly couple TCP connections at intermediate end-systems impose fundamental performance limitations, such as dragging down all transfer rates in the system to the rate of the slowest receiver. In contrast, the ROMA architecture we propose incorporates the use of loosely coupled TCP connections together with fast forward error correction techniques to deliver a scalable solution that better accommodates a set of heterogeneous receivers. The methods we develop establish chains of TCP connections, whose expected performance we analyze through equation-based methods. We validate our analytical findings and evaluate the performance of our ROMA architecture using a prototype implementation via extensive Internet experimentation across the PlanetLab distributed testbed.
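
As an equation-based illustration of why a tightly coupled store-and-forward chain is dragged down to its slowest hop, the sketch below scores each hop with the Mathis et al. square-root TCP throughput formula, rate ≈ (MSS/RTT)·√(3/2)/√p, and takes the minimum; the per-hop RTTs and loss rates are invented, and this formula is a generic stand-in for the paper's own analysis.

```python
import math

def tcp_rate_bps(mss_bytes, rtt_s, loss_p):
    """Mathis et al. approximation: rate = (MSS/RTT) * sqrt(3/2) / sqrt(p)."""
    return (mss_bytes * 8 / rtt_s) * math.sqrt(1.5 / loss_p)

# (MSS bytes, RTT seconds, loss probability) per hop -- invented values.
hops = [(1460, 0.02, 1e-3), (1460, 0.08, 5e-3), (1460, 0.05, 1e-3)]
rates = [tcp_rate_bps(*h) for h in hops]
print("per-hop Mbit/s:", [round(r / 1e6, 2) for r in rates])
# A tightly coupled store-and-forward chain runs at the slowest hop's rate.
print("coupled chain Mbit/s:", round(min(rates) / 1e6, 2))
```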

Relevance: 10.00%

Abstract:

In many multi-camera vision systems the effect of camera locations on the task-specific quality of service is ignored. Researchers in Computational Geometry have proposed elegant solutions for some sensor location problem classes. Unfortunately, these solutions utilize unrealistic assumptions about the cameras' capabilities that make these algorithms unsuitable for many real-world computer vision applications: unlimited field of view, infinite depth of field, and/or infinite servo precision and speed. In this paper, the general camera placement problem is first defined with assumptions that are more consistent with the capabilities of real-world cameras. The region to be observed by cameras may be volumetric, static or dynamic, and may include holes that are caused, for instance, by columns or furniture in a room that can occlude potential camera views. A subclass of this general problem can be formulated in terms of planar regions that are typical of building floorplans. Given a floorplan to be observed, the problem is then to efficiently compute a camera layout such that certain task-specific constraints are met. A solution to this problem is obtained via binary optimization over a discrete problem space. In experiments the performance of the resulting system is demonstrated with different real floorplans.
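
A small sketch of the binary-optimization view of the problem, under simplifying assumptions not made in the paper (omnidirectional cameras, grid-sampled line-of-sight visibility, and a greedy set-cover heuristic instead of exact binary optimization):

```python
import numpy as np

floor = np.zeros((12, 16), dtype=bool)   # False = free space
floor[4:8, 6:8] = True                   # a column blocking views

def visible(cam, cell):
    """Line-of-sight test: sample points along the segment cam -> cell."""
    for t in np.linspace(0.0, 1.0, 40):
        r = cam[0] + t * (cell[0] - cam[0])
        c = cam[1] + t * (cell[1] - cam[1])
        if floor[int(round(r)), int(round(c))]:
            return False
    return True

free = [tuple(x) for x in np.argwhere(~floor)]
candidates = free                         # allow a camera at any free cell
cover = {cam: {cell for cell in free if visible(cam, cell)}
         for cam in candidates}

# Greedy set cover: repeatedly place the camera that sees the most
# still-uncovered cells until every free cell is observed.
uncovered, layout = set(free), []
while uncovered:
    best = max(candidates, key=lambda cam: len(cover[cam] & uncovered))
    layout.append(best)
    uncovered -= cover[best]
print("cameras placed at:", layout)
```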

Relevance: 10.00%

Abstract:

Recently the notion of self-similarity has been shown to apply to wide-area and local-area network traffic. In this paper we examine the mechanisms that give rise to self-similar network traffic. We present an explanation for traffic self-similarity by using a particular subset of wide area traffic: traffic due to the World Wide Web (WWW). Using an extensive set of traces of actual user executions of NCSA Mosaic, reflecting over half a million requests for WWW documents, we show evidence that WWW traffic is self-similar. Then we show that the self-similarity in such traffic can be explained based on the underlying distributions of WWW document sizes, the effects of caching and user preference in file transfer, the effect of user "think time", and the superimposition of many such transfers in a local area network. To do this we rely on empirically measured distributions both from our traces and from data independently collected at over thirty WWW sites.
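
The mechanism described above is easy to reproduce in miniature: superimpose ON/OFF sources whose ON periods are Pareto (heavy-tailed) and estimate the Hurst parameter of the aggregate with the variance-time method. The sketch below does exactly that with invented parameters; for Pareto shape α the theory predicts H ≈ (3 − α)/2.

```python
import numpy as np

rng = np.random.default_rng(4)
T, n_src, alpha = 2 ** 14, 50, 1.4

# Superimpose ON/OFF sources: heavy-tailed ON (transfer) periods,
# exponential OFF ("think time") periods.
traffic = np.zeros(T)
for _ in range(n_src):
    t = 0
    while t < T:
        on = int(rng.pareto(alpha) + 1)       # heavy-tailed transfer time
        off = int(rng.exponential(10))        # think time between transfers
        traffic[t:t + on] += 1.0
        t += on + off

# Variance-time method: Var of the m-aggregated series scales like m^(2H-2).
ms = [2 ** k for k in range(1, 8)]
vars_ = [traffic[: T - T % m].reshape(-1, m).mean(axis=1).var() for m in ms]
slope = np.polyfit(np.log(ms), np.log(vars_), 1)[0]
print(f"estimated H = {1 + slope / 2:.2f}  (theory ~{(3 - alpha) / 2:.2f})")
```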