73 results for LEVEL SET METHODS


Relevance:

30.00%

Publisher:

Abstract:

Background: The aim of the SPHERE study is to design, implement and evaluate tailored practice and personal care plans to improve the process of care and objective clinical outcomes for patients with established coronary heart disease (CHD) in general practice across two different health systems on the island of Ireland. CHD is a common cause of death and a significant cause of morbidity in Ireland. Secondary prevention has been recommended as a key strategy for reducing levels of CHD mortality, and general practice has been highlighted as an ideal setting for secondary prevention initiatives. Current indications suggest that there is considerable room for improvement in the provision of secondary prevention for patients with established heart disease on the island of Ireland. The review literature recommends structured programmes with continued support and follow-up of patients; the provision of training, tailored to practice needs, on access to evidence of the effectiveness of secondary prevention; structured recall programmes that also take account of individual practice needs; and patient-centred consultations accompanied by attention to disease management guidelines.

Methods: SPHERE is a cluster randomised controlled trial, with practice-level randomisation to intervention and control groups, recruiting 960 patients from 48 practices in three study centres (Belfast, Dublin and Galway). Primary outcomes are blood pressure, total cholesterol, physical and mental health status (SF-12) and hospital re-admissions. The intervention takes place over two years and data is collected at baseline, one-year and two-year follow-up. Data is obtained from medical charts, consultations with practitioners, and patient postal questionnaires. The SPHERE intervention involves the implementation of a structured systematic programme of care for patients with CHD attending general practice. It is a multi-faceted intervention that has been developed to respond to barriers and solutions to optimal secondary prevention identified in preliminary qualitative research with practitioners and patients. General practitioners and practice nurses attend training sessions in facilitating behaviour change and medication prescribing guidelines for secondary prevention of CHD. Patients are invited to attend regular four-monthly consultations over two years, during which targets and goals for secondary prevention are set and reviewed. The analysis will be strengthened by economic, policy and qualitative components.
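The practice-level (cluster) randomisation described above can be sketched as follows. The stratification and balancing details of the actual trial are not given in the abstract, so this is a minimal illustration and all names (e.g. `randomise_practices`) are hypothetical:

```python
import random

def randomise_practices(practice_ids, seed=1):
    """Assign whole practices (clusters) to intervention or control.

    In a cluster randomised trial every patient in a practice receives
    whichever arm the practice itself was allocated to.
    """
    rng = random.Random(seed)
    ids = list(practice_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {"intervention": ids[:half], "control": ids[half:]}

# 48 practices, as recruited across the three study centres
practices = [f"practice_{i:02d}" for i in range(1, 49)]
arms = randomise_practices(practices)
```

Randomising at the practice rather than patient level avoids contamination between arms within a practice, at the cost of needing the cluster design reflected in the analysis.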

Relevance:

30.00%

Publisher:

Abstract:

A novel application-specific instruction set processor (ASIP) for use in the construction of modern signal processing systems is presented. This is a flexible device that can be used in the construction of array processor systems for the real-time implementation of functions such as singular-value decomposition (SVD) and QR decomposition (QRD), as well as other important matrix computations. It uses a coordinate rotation digital computer (CORDIC) module to perform arithmetic operations, and several approaches are adopted to achieve high performance, including pipelining of the micro-rotations, the use of parallel instructions and a dual-bus architecture. In addition, a novel method for scale factor correction is presented which only needs to be applied once at the end of the computation. This also reduces computation time and enhances performance. Methods are described which allow this processor to be used in reduced-dimension (i.e., folded) array processor structures that allow tradeoffs between hardware and performance. The net result is a flexible matrix computational processing element (PE) whose functionality can be changed under program control for use in a wider range of scenarios than previous work. Details are presented of the results of a design study, which considers the application of this decomposition PE architecture in a combined SVD/QRD system and demonstrates that a combination of high performance and efficient silicon implementation is achievable. © 2005 IEEE.
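For readers unfamiliar with CORDIC, a scalar software model of rotation-mode CORDIC illustrates the micro-rotations and the deferred scale-factor correction mentioned above. The hardware pipelines these shift-and-add steps; this sketch is purely illustrative and not the paper's architecture:

```python
import math

def cordic_rotate(x, y, angle, n=24):
    """Rotate (x, y) by `angle` radians using CORDIC micro-rotations.

    Each micro-rotation uses only shifts and adds; the accumulated gain
    K = prod(sqrt(1 + 2**-2i)) is corrected once at the end, mirroring
    the deferred scale-factor correction described in the abstract.
    """
    z = angle
    for i in range(n):
        d = 1.0 if z >= 0 else -1.0
        x, y = x - d * y * 2.0**-i, y + d * x * 2.0**-i
        z -= d * math.atan(2.0**-i)
    # Scale-factor correction, applied once after all micro-rotations
    k = 1.0
    for i in range(n):
        k *= math.sqrt(1.0 + 2.0**(-2 * i))
    return x / k, y / k

cx, cy = cordic_rotate(1.0, 0.0, math.pi / 6)  # ~ (cos 30 deg, sin 30 deg)
```

Deferring the correction to a single final multiply is what saves per-iteration hardware and cycles, as the abstract notes.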

Relevance:

30.00%

Publisher:

Abstract:

There is a perception that teaching space in universities is a rather scarce resource. However, some studies have revealed that in many institutions it is actually chronically under-used. Often, rooms are occupied only half the time, and even when in use they are often only half full. This is usually measured by the ‘utilization’, which is defined as the percentage of available ‘seat-hours’ that are employed. Within real institutions, studies have shown that this utilization can often take values as low as 20–40%. One consequence of such a low level of utilization is that space managers are under pressure to make more efficient use of the available teaching space. However, better management is hampered because there does not appear to be a good understanding within space management (near-term planning) of why this happens. This is accompanied, within space planning (long-term planning), by a lack of expertise on how best to accommodate the expected low utilizations. This motivates our two main goals: (i) to understand the factors that drive down utilizations; (ii) to set up methods to provide better space planning. Here, we provide quantitative evidence that constraints arising from timetabling and location requirements easily have the potential to explain the low utilizations seen in reality. Furthermore, on considering the decision question ‘Can this given set of courses all be allocated in the available teaching space?’ we find that the answer depends on the associated utilization in a way that exhibits threshold behaviour: there is a sharp division between regions in which the answer is ‘almost always yes’ and those of ‘almost always no’.
Through analysis and understanding of the space of potential solutions, our work suggests that better use of space within universities will come about through an understanding of the effects of timetabling constraints and when it is statistically likely that it will be possible for a set of courses to be allocated to a particular space. The results presented here provide a firm foundation for university managers to take decisions on how space should be managed and planned for more effectively. Our multi-criteria approach and new methodology together provide new insight into the interaction between the course timetabling problem and the crucial issue of space planning.
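The utilization measure itself is straightforward: occupied seat-hours divided by available seat-hours. A minimal sketch (hypothetical names and data, chosen to reproduce the "half the time, half full" scenario quoted above):

```python
def utilization(room_capacities, hours_available, bookings):
    """Seat-hour utilization: fraction of available seat-hours occupied.

    room_capacities: dict room -> seats
    hours_available: timetabled hours per room
    bookings: list of (room, hours_used, seats_occupied)
    """
    available = sum(c * hours_available for c in room_capacities.values())
    used = sum(h * s for _room, h, s in bookings)
    return used / available

rooms = {"A": 100, "B": 50}           # seats per room
# Rooms occupied half the time and, when used, only half full:
bookings = [("A", 20, 50), ("B", 20, 25)]
u = utilization(rooms, 40, bookings)  # 40 timetabled hours per week
```

Half occupancy of half-full rooms multiplies out to 25%, which is exactly the low end of the 20–40% range reported for real institutions.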

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: To investigate whether failure to suppress the prostate-specific antigen (PSA) level to ≤1 ng/mL after ≥2 months of neoadjuvant luteinizing hormone-releasing hormone agonist therapy in patients scheduled to undergo external beam radiotherapy for localized prostate carcinoma is associated with reduced biochemical failure-free survival. METHODS AND MATERIALS: A retrospective case note review of consecutive patients with intermediate- or high-risk localized prostate cancer treated between January 2001 and December 2002 with neoadjuvant hormonal deprivation therapy, followed by concurrent hormonal therapy and radiotherapy, was performed. Patient data were divided for analysis according to whether the PSA level in Week 1 of radiotherapy was ≤1 ng/mL or >1 ng/mL; it was >1 ng/mL in 52. At a median follow-up of 49 months, the 4-year actuarial biochemical failure-free survival rate was 84% vs. 60% (p = 0.0016) in favor of the patients with a PSA level after neoadjuvant hormonal deprivation therapy of ≤1 ng/mL. CONCLUSION: Patients with a PSA level >1 ng/mL at the beginning of external beam radiotherapy after ≥2 months of neoadjuvant luteinizing hormone-releasing hormone agonist therapy have a significantly greater rate of biochemical failure and lower survival rate compared with those with a PSA level of ≤1 ng/mL.

Relevance:

30.00%

Publisher:

Abstract:

Background. Post-renal transplant anaemia is a potentially reversible cardiovascular risk factor. Graft function, immunosuppressive agents and inhibition of the renin-angiotensin system have been implicated in its aetiology. The evaluation of erythropoietin (EPO) levels may contribute to understanding the relative contributions of these factors. Methods. Two hundred and seven renal transplant recipients attending the Belfast City Hospital were studied. Clinical and laboratory data were extracted from the medical records and laboratory systems. Results. Of the 207 patients (126 male), 47 (22.7%) were found to be anaemic (males, haemoglobin (Hb) <12 g/dl; females, Hb <11 g/dl). The anaemic group had a significantly higher mean serum creatinine level (162.8 µmol/l vs 131.0 µmol/l, P <0.001) and lower mean estimated glomerular filtration rate (eGFR) (41.5 ml/min vs 54.9 ml/min, P <0.001) than the non-anaemic group. Individual immunosuppressive regimens were comparable between those with and those without anaemia. Angiotensin converting enzyme inhibitor (ACE-I) or angiotensin receptor blocker (ARB) administration was not more prevalent in those with anaemia compared with those without (36.2% vs 38.8%, P = 0.88). There was a significant inverse correlation between Hb levels and serum EPO levels (R = -0.29, P <0.001), but not between EPO levels and eGFR (R = 0.02, P = 0.74). Higher EPO levels were predictive of anaemia, independent of eGFR, in multivariate analysis. Conclusion. Anaemia is common in post-renal transplant patients. The levels of renal function and serum EPO, and not immunosuppressive regimens or ACE-I/ARB use, are strong and independent predictors of anaemia. © The Author [2007]. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
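The reported associations (R = -0.29 for Hb vs EPO, R = 0.02 for EPO vs eGFR) are Pearson correlation coefficients. For reference, the coefficient is computed as below; the data here are illustrative only, not the study's:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# Illustrative inverse relationship (e.g. Hb-like vs EPO-like values)
r = pearson_r([1, 2, 3, 4], [8, 6, 5, 1])
```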

Relevance:

30.00%

Publisher:

Abstract:

Aims. We aim to provide the atmospheric parameters and rotational velocities for a large sample of O- and early B-type stars, analysed in a homogeneous and consistent manner, for use in constraining theoretical models. Methods: Atmospheric parameters, stellar masses, and rotational velocities have been estimated for approximately 250 early B-type stars in the Large (LMC) and Small (SMC) Magellanic Clouds from high-resolution VLT-FLAMES data using the non-LTE TLUSTY model atmosphere code. This data set has been supplemented with our previous analyses of some 50 O-type stars (Mokiem et al. 2006, 2007) and 100 narrow-lined early B-type stars (Hunter et al. 2006; Trundle et al. 2007) from the same survey, providing a sample of ~400 early-type objects. Results: Comparison of the rotational velocities with evolutionary tracks suggests that the end of core hydrogen burning occurs later than currently predicted and we argue for an extension of the evolutionary tracks. We also show that the large number of luminous blue supergiants observed in the fields are unlikely to have directly evolved from main-sequence massive O-type stars, as neither their low rotational velocities nor their position on the H-R diagram is predicted. We suggest that blue loops or mass-transfer binary systems may populate the blue supergiant regime. By comparing the rotational velocity distributions of the Magellanic Cloud stars to a similar Galactic sample, we find (at the 3σ confidence level) that massive stars (above 8 M⊙) in the SMC rotate faster than those in the solar neighbourhood. However, there appears to be no significant difference between the rotational velocity distributions in the Galaxy and the LMC. We find that the v sin i distributions in the SMC and LMC can be modelled with an intrinsic rotational velocity distribution that is a Gaussian peaking at 175 km s⁻¹ (SMC) and 100 km s⁻¹ (LMC) with a 1/e half-width of 150 km s⁻¹. We find that in NGC 346 in the SMC, the 10-25 M⊙ main-sequence stars appear to rotate faster than their higher-mass counterparts. It is not expected that O-type stars spin down significantly through angular momentum loss via stellar winds at SMC metallicity, hence this could be a reflection of mass-dependent birth spin rates. Recently Yoon et al. (2006) have determined rates of gamma-ray bursts (GRBs) by modelling rapidly rotating massive star progenitors. Our measured rotational velocity distribution for the 10-25 M⊙ stars is peaked at slightly higher velocities than they assume, supporting the idea that GRBs could come from rapid rotators with initial masses as low as 14 M⊙ at low metallicities.
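As a rough illustration of the quoted intrinsic distribution: a Gaussian with 1/e half-width w has standard deviation σ = w/√2, and random isotropic inclinations correspond to cos i uniform on [0, 1]. The sketch below (hypothetical helper `sample_vsini`, pure Python) projects the SMC parameters into a v sin i sample; it is a simplification, not the fitting procedure of the paper:

```python
import math
import random

def sample_vsini(peak, half_width_1e, n=100000, seed=0):
    """Draw projected rotational velocities v sin i (km/s).

    Intrinsic v ~ Gaussian(peak, sigma) with 1/e half-width w, so
    sigma = w / sqrt(2); negative draws are rejected. Inclinations
    are isotropic: cos i uniform on [0, 1].
    """
    sigma = half_width_1e / math.sqrt(2.0)
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        v = rng.gauss(peak, sigma)
        if v < 0:          # physical velocities are non-negative
            continue
        cos_i = rng.random()
        out.append(v * math.sqrt(1.0 - cos_i**2))
    return out

smc = sample_vsini(175.0, 150.0)   # SMC parameters from the abstract
```

Since the mean of sin i over isotropic inclinations is π/4, the sample mean of v sin i sits noticeably below the intrinsic peak.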

Relevance:

30.00%

Publisher:

Abstract:

We present experimental results on benchmark problems in 3D cubic lattice structures with the Miyazawa-Jernigan energy function for two local search procedures that utilise the pull-move set: (i) population-based local search (PLS) that traverses the energy landscape with greedy steps towards (potential) local minima followed by upward steps up to a certain level of the objective function; (ii) simulated annealing with a logarithmic cooling schedule (LSA). The parameter settings for PLS are derived from short LSA runs executed in pre-processing, and the procedure utilises tabu lists generated for each member of the population. In terms of the total number of energy function evaluations both methods perform equally well; however, PLS has the potential of being parallelised with an expected speed-up in the region of the population size. Furthermore, both methods require a significantly smaller number of function evaluations when compared to Monte Carlo simulations with kink-jump moves. (C) 2009 Elsevier Ltd. All rights reserved.
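A logarithmic cooling schedule of the kind used by LSA, T(t) = c / log(t + 2), can be sketched on a toy one-dimensional landscape. This is not the Miyazawa-Jernigan energy function or the pull-move neighbourhood; the energy, neighbourhood and all names here are illustrative assumptions:

```python
import math
import random

def lsa(energy, neighbour, x0, steps=20000, c=1.0, seed=0):
    """Simulated annealing with logarithmic cooling T(t) = c / log(t + 2).

    Accepts downhill moves always and uphill moves with probability
    exp(-(E(y) - E(x)) / T); returns the best state visited.
    """
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    for t in range(steps):
        temp = c / math.log(t + 2)
        y = neighbour(x, rng)
        ey = energy(y)
        if ey <= e or rng.random() < math.exp((e - ey) / temp):
            x, e = y, ey
            if e < best_e:
                best_x, best_e = x, e
    return best_x, best_e

# Toy landscape: integer line with a unique minimum at x = 7
xmin, emin = lsa(lambda x: (x - 7) ** 2,
                 lambda x, r: x + r.choice((-1, 1)),
                 x0=-20)
```

Logarithmic cooling is the schedule for which asymptotic convergence guarantees exist; in practice its slow temperature decay is why the pre-processing runs mentioned above are kept short.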

Relevance:

30.00%

Publisher:

Abstract:

Margins are used in radiotherapy to assist in the calculation of planning target volumes. These margins can be determined by analysing the geometric uncertainties inherent to the radiotherapy planning and delivery process. An important part of this process is the study of electronic portal images collected throughout the course of treatment. Set-up uncertainties were determined for prostate radiotherapy treatments at our previous site and the new purpose-built centre, with margins determined using a number of different methods. In addition, the potential effect of reducing the action level from 5 mm to 3 mm for changing a patient set-up, based on off-line bony anatomy-based portal image analysis, was studied. Margins generated using different methodologies were comparable. It was found that set-up errors were reduced following relocation to the new centre. Although a significant increase in the number of corrections to a patient's set-up was predicted if the action level was reduced from 5 mm to 3 mm, minimal reduction in patient set-up uncertainties would be seen as a consequence. Prescriptive geometric uncertainty analysis not only supports calculation and justification of the margins used clinically to generate planning target volumes, but may also best be used to monitor trends in clinical practice or audit changes introduced by new equipment, technology or practice. Simulations on existing data showed that a 3 mm rather than a 5 mm action level during off-line, bony anatomy-based portal imaging would have had a minimal benefit for the patients studied in this work.
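One widely used way of turning measured set-up uncertainties into a CTV-to-PTV margin, not necessarily the exact formalism used in this study, is the van Herk recipe, M = 2.5Σ + 0.7σ, where Σ is the standard deviation of systematic errors and σ that of random errors:

```python
def ctv_ptv_margin(systematic_sd, random_sd):
    """Van Herk margin recipe: M = 2.5 * Sigma + 0.7 * sigma (all in mm).

    Sigma: SD of systematic (preparation) set-up errors across patients.
    sigma: SD of random (day-to-day execution) set-up errors.
    """
    return 2.5 * systematic_sd + 0.7 * random_sd

# Illustrative values only, not data from the study:
m = ctv_ptv_margin(systematic_sd=2.0, random_sd=3.0)  # margin in mm
```

The 2.5 weighting on systematic errors reflects that they shift the whole dose distribution for every fraction, whereas random errors merely blur it.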

Relevance:

30.00%

Publisher:

Abstract:

Complexity is conventionally defined as the level of detail or intricacy contained within a picture. The study of complexity has received relatively little attention, in part because of the absence of an acceptable metric. Traditionally, normative ratings of complexity have been based on human judgments. However, this study demonstrates that published norms for visual complexity are biased. Familiarity and learning influence the subjective complexity scores for nonsense shapes, with a significant training × familiarity interaction [F(1,52) = 17.53, p <.05]. Several image-processing techniques were explored as alternative measures of picture and image complexity. A perimeter detection measure correlates strongly with human judgments of the complexity of line drawings of real-world objects and nonsense shapes and captures some of the processes important in judgments of subjective complexity, while removing the bias due to familiarity effects.
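A crude perimeter-detection measure of the kind discussed can be sketched as a boundary-edge count on a binarised image. The function `perimeter_measure` is a hypothetical stand-in; real implementations would operate on grey-scale line drawings with proper edge detection:

```python
def perimeter_measure(img):
    """Count boundary edges in a binary image (list of rows of 0/1).

    Each foreground pixel contributes one unit of perimeter for every
    4-neighbour that is background or lies off the image.
    """
    h, w = len(img), len(img[0])
    p = 0
    for r in range(h):
        for c in range(w):
            if img[r][c]:
                for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    rr, cc = r + dr, c + dc
                    if not (0 <= rr < h and 0 <= cc < w) or not img[rr][cc]:
                        p += 1
    return p

square = [[1, 1], [1, 1]]   # a 2x2 block has perimeter 8
```

Unlike familiarity-contaminated human ratings, such a measure depends only on the image itself, which is the point the abstract is making.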

Relevance:

30.00%

Publisher:

Abstract:

The article investigates the relationships between technological regimes and firm-level productivity performance, and it explores how such a relationship differs in different Schumpeterian patterns of innovation. The analysis makes use of a rich dataset containing data on innovation and other economic characteristics of a large representative sample of Norwegian firms in manufacturing and service industries for the period 1998–2004. First, we decompose TFP growth into technical progress and efficiency changes by means of data envelopment analysis. We then estimate an empirical model that relates these two productivity components to the characteristics of technological regimes and a set of other firm-specific factors. The results indicate that: (i) TFP growth has mainly been achieved through technical progress, while technical efficiency has on average decreased; (ii) the characteristics of technological regimes are important determinants of firm-level productivity growth, but their impacts on technical progress are different from the effects on efficiency change; (iii) the estimated model works differently in the two Schumpeterian regimes. Technical progress has been more dynamic in Schumpeter Mark II industries, while efficiency change has been more important in Schumpeter Mark I markets.
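The decomposition of TFP growth into technical progress and efficiency change can be illustrated in Malmquist-index style, where TFP change is the product of the frontier shift (technical change, TC) and the catch-up term (efficiency change, EC). This is a stylised sketch, not the DEA computation the paper performs:

```python
def decompose_tfp(eff_t, eff_t1, frontier_shift):
    """Malmquist-style decomposition: TFP change = EC * TC.

    EC = eff_t1 / eff_t measures catch-up to the frontier;
    TC = frontier_shift measures the outward movement of the
    frontier itself (technical progress).
    """
    ec = eff_t1 / eff_t
    tc = frontier_shift
    return ec * tc, ec, tc

# Illustrative numbers echoing finding (i): frontier advances (TC > 1)
# while average technical efficiency declines (EC < 1).
tfp, ec, tc = decompose_tfp(eff_t=0.80, eff_t1=0.76, frontier_shift=1.10)
```

With these numbers TFP still grows (1.045) because the frontier shift outweighs the efficiency loss, the same qualitative pattern the abstract reports for Norwegian firms.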

Relevance:

30.00%

Publisher:

Abstract:

In this work we present the theoretical framework for the solution of the time-dependent Schrödinger equation (TDSE) of atomic and molecular systems under strong electromagnetic fields, with the configuration space of the electron’s coordinates separated over two regions, regions I and II. In region I the solution of the TDSE is obtained by an R-matrix basis set representation of the time-dependent wave function. In region II a grid representation of the wave function is considered and propagation in space and time is obtained through the finite-difference method. With this, a combination of basis set and grid methods is put forward for tackling multiregion time-dependent problems. In both regions, a high-order explicit scheme is employed for the time propagation. While no approximation is introduced by this separation in a purely hydrogenic system, in multielectron systems the validity and the usefulness of the present method rely on the basic assumption of R-matrix theory, namely, that beyond a certain distance (encompassing region I) a single ejected electron is distinguishable from the other electrons of the multielectron system and evolves there (region II) effectively as a one-electron system. The method is developed in detail for single-active-electron systems and applied to the exemplar case of the hydrogen atom in an intense laser field.
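A region-II-style grid propagation can be sketched in one dimension with a simple second-order explicit (leapfrog) scheme; the paper uses a high-order explicit propagator, so this is only a minimal stand-in. Free particle, hard-wall box, ħ = m = 1, grid endpoints held at zero, and all names hypothetical:

```python
import cmath

def propagate(psi0, dx, dt, steps):
    """Leapfrog propagation of i d(psi)/dt = -1/2 d2(psi)/dx2.

    psi(t+dt) = psi(t-dt) - 2i dt H psi(t), stable while
    dt * E_max <= 1 for the three-point stencil.
    """
    n = len(psi0)

    def h(psi):  # three-point stencil for H = -1/2 d2/dx2, psi = 0 at edges
        out = [0j] * n
        for j in range(1, n - 1):
            out[j] = -0.5 * (psi[j - 1] - 2 * psi[j] + psi[j + 1]) / dx**2
        return out

    prev = psi0[:]
    # one first-order (Euler) step to start the two-step scheme
    cur = [p - 1j * dt * hp for p, hp in zip(prev, h(prev))]
    for _ in range(steps - 1):
        hp = h(cur)
        prev, cur = cur, [pr - 2j * dt * hc for pr, hc in zip(prev, hp)]
    return cur

n, dx, dt = 200, 0.1, 0.001
x0 = n * dx / 2
# Gaussian wavepacket, unit width, momentum k = 1
psi0 = [cmath.exp(-((j * dx - x0) ** 2) / 2 + 1j * j * dx) for j in range(n)]
norm0 = sum(abs(p) ** 2 for p in psi0) * dx
psi = propagate(psi0, dx, dt, 500)
norm = sum(abs(p) ** 2 for p in psi) * dx
```

Explicit schemes like this avoid the matrix inversions of implicit propagators, which is what makes them attractive for large multiregion grids; the price is the time-step stability limit noted in the docstring.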

Relevance:

30.00%

Publisher:

Abstract:

Background: The long-term effects of adjuvant polychemotherapy regimens in oestrogen-receptor-poor (ER-poor) breast cancer, and the extent to which these effects are modified by age or tamoxifen use, can be assessed by an updated meta-analysis of individual patient data from randomised trials. Methods: Collaborative meta-analyses of individual patient data for about 6000 women with ER-poor breast cancer in 46 trials of polychemotherapy versus not (non-taxane-based polychemotherapy, typically about six cycles; trial start dates 1975-96, median 1984) and about 14 000 women with ER-poor breast cancer in 50 trials of tamoxifen versus not (some trials in the presence and some in the absence of polychemotherapy; trial start dates 1972-93, median 1982). Findings: In women with ER-poor breast cancer, polychemotherapy significantly reduced recurrence, breast cancer mortality, and death from any cause, in those younger than 50 years and those aged 50-69 years at entry into trials of polychemotherapy versus not. In those aged younger than 50 years (1907 women, 15% node-positive), the 10-year risks were: recurrence 33% versus 45% (ratio of 10-year risks 0·73, 2p<0·00001), breast cancer mortality 24% versus 32% (ratio 0·73, 2p=0·0002), and death from any cause 25% versus 33% (ratio 0·75, 2p=0·0003). In women aged 50-69 years (3965 women, 58% node-positive), the 10-year risks were: recurrence 42% versus 52% (ratio 0·82, 2p<0·00001), breast cancer mortality 36% versus 42% (ratio 0·86, 2p=0·0004), and death from any cause 39% versus 45% (ratio 0·87, 2p=0·0009). Few were aged 70 years or older. Tamoxifen had little effect on recurrence or death in women who were classified in these trials as having ER-poor disease, and did not significantly modify the effects of polychemotherapy. 
Interpretation: In women who had ER-poor breast cancer, and were either younger than 50 years or between 50 and 69 years, these older adjuvant polychemotherapy regimens were safe (ie, had little effect on mortality from causes other than breast cancer) and produced substantial and definite reductions in the 10-year risks of recurrence and death. Current and future chemotherapy regimens could well yield larger proportional reductions in breast cancer mortality.
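The quoted ratios of 10-year risks are consistent with dividing the two absolute risks; note the published figures come from the full time-to-event analysis, so dividing the rounded percentages only approximates them:

```python
def ratio_of_risks(risk_treatment, risk_control):
    """Ratio of 10-year risks, as quoted for each endpoint."""
    return risk_treatment / risk_control

# Recurrence in women younger than 50 years: 33% vs 45%
r = ratio_of_risks(0.33, 0.45)  # ~0.73, matching the quoted ratio
```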

Relevance:

30.00%

Publisher:

Abstract:

Background: A suite of 10 online virtual patients developed using the IVIMEDS ‘Riverside’ authoring tool has been introduced into our undergraduate general practice clerkship. These cases provide a multimedia-rich experience to students. Their interactive nature promotes the development of clinical reasoning skills such as discriminating key clinical features, integrating information from a variety of sources and forming diagnoses and management plans.

Aims: To evaluate the usefulness and usability of a set of online virtual patients in an undergraduate general practice clerkship.

Method: Online questionnaire completed by students after their general practice placement, incorporating the System Usability Scale questionnaire.

Results: There was a 57% response rate. Ninety-five per cent of students agreed that the online package was a useful learning tool and ranked virtual patients third out of six learning modalities. Questions and answers and the use of images and videos were all rated highly by students as useful learning methods. The package was perceived to have a high level of usability among respondents.
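The System Usability Scale score referred to above is computed with the standard formula: odd items contribute (response - 1), even items contribute (5 - response), and the sum is multiplied by 2.5 to give a 0-100 score:

```python
def sus_score(responses):
    """System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items (positively worded) score (response - 1);
    even-numbered items (negatively worded) score (5 - response);
    the total is scaled by 2.5 onto a 0-100 range.
    """
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

s = sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1])  # most favourable answers
```

A mid-scale response of 3 on every item yields 50, which is why SUS results are usually interpreted against the commonly cited average of about 68 rather than against 50.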

Conclusion: Feedback from students suggest that this implementation of virtual patients, set in primary care, is user friendly and rated as a valuable adjunct to their learning. The cost of production of such learning resources demands close attention to design.

Relevance:

30.00%

Publisher:

Abstract:

Purpose
This study was designed to investigate methods to help patients suffering from unilateral tinnitus synthesize an auditory replica of their tinnitus.

Materials and methods
Two semi-automatic methods (A and B) derived from the auditory threshold of the patient and a method (C) combining a pure tone and a narrow band-pass noise centred on an adjustable frequency were devised and rated on their likeness over two test sessions. A third test evaluated the stability over time of the synthesized tinnitus replica built with method C, and its proneness to merge with the patient's tinnitus. Patients were then asked to try and control the lateralisation of this single percept through the adjustment of the tinnitus replica level.
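Method C, a pure tone plus narrow band-pass noise centred on the same adjustable frequency, can be sketched as follows. The paper does not specify the synthesis details, so the random-phase-sinusoid noise model, the mixing parameter and all names here are assumptions for illustration:

```python
import math
import random

def tinnitus_replica(freq, bandwidth, duration, rate=44100, mix=0.5, seed=0):
    """Sketch of a method-C-style stimulus: pure tone + narrow-band noise.

    The band-pass noise is approximated as a sum of random-phase
    sinusoids spanning [freq - bandwidth/2, freq + bandwidth/2] in
    10 Hz steps; `mix` balances noise against the tone.
    """
    rng = random.Random(seed)
    n = int(duration * rate)
    comps = [(freq + df, rng.uniform(0, 2 * math.pi))
             for df in range(-bandwidth // 2, bandwidth // 2 + 1, 10)]
    out = []
    for j in range(n):
        t = j / rate
        tone = math.sin(2 * math.pi * freq * t)
        noise = sum(math.sin(2 * math.pi * f * t + ph) for f, ph in comps)
        noise /= math.sqrt(len(comps))   # keep noise power comparable
        out.append((1 - mix) * tone + mix * noise)
    return out

sig = tinnitus_replica(freq=6000, bandwidth=200, duration=0.05)
```

In a clinical setting the centre frequency, bandwidth and overall level would be the patient-adjustable parameters, mirroring the matching procedure described above.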

Results
The first two tests showed that seven out of ten patients chose the tinnitus replica built with method C as their preferred one. The third test, performed on twelve patients, revealed that pitch tuning was rather stable over a week's interval. It showed that eight patients were able to consistently match the central frequency of the synthesized tinnitus (presented to the contralateral ear) to their own tinnitus, which led to a unique tinnitus percept. The lateralisation displacement was consistent across patients and revealed an average range of 29 dB to obtain a full lateral shift from the ipsilateral to the contralateral side.

Conclusions
Although spectrally simpler than the semi-automatic methods, method C could replicate patients' tinnitus, to some extent. When a unique percept between synthesized tinnitus and patients' tinnitus arose, lateralisation of this percept was achieved.

Relevance:

30.00%

Publisher:

Abstract:

The continuing interest in semiconductor photochemistry, SPC, and the emergence of commercial products that utilise films of photocatalyst materials have created an urgent need to agree a set of methods for assessing photocatalytic activity, and international committees are now meeting to address this issue. This article provides a brief overview of two of the most popular current methods employed by researchers for assessing SPC activity, and one which has been published just recently and might gain popularity in the future, given its ease of use. These tests are: the stearic acid (SA) test, the methylene blue (MB) test and the resazurin (Rz) ink test, respectively. The basic photochemical and chemical processes that underpin each of these tests are described, along with typical results for laboratory-made sol-gel titania films and a commercial form of self-cleaning glass, Activ™. The pros and cons of their future use as possible standard assessment techniques are considered. (C) 2007 Elsevier B.V. All rights reserved.