897 results for "Transfusion threshold"


Abstract:

Purpose: This work introduces the concept of very small field size. Output factor (OPF) measurements at these field sizes require extremely careful experimental methodology, including measurement of the dosimetric field size at the same time as each OPF measurement. Two quantifiable scientific definitions of the threshold of very small field size are presented.

Methods: A practical definition was established by quantifying the effect that a 1 mm error in field size or detector position has on OPFs, with the acceptable uncertainty in OPF set at 1%. Alternatively, for a theoretical definition of very small field size, the OPFs were separated into constituent factors to investigate the specific effects of lateral electronic disequilibrium, photon scatter in the phantom and source occlusion. The dominant effect was identified and formed the basis of a theoretical definition of very small fields. Each factor was obtained using Monte Carlo simulations of a Varian iX linear accelerator for square field sizes of side length 4 mm to 100 mm, at a nominal photon energy of 6 MV.

Results: According to the practical definition established in this project, field sizes < 15 mm were considered very small for 6 MV beams when the maximal field-size uncertainty was 1 mm. If the acceptable uncertainty in the OPF was increased from 1.0% to 2.0%, or the field-size uncertainty was reduced to 0.5 mm, field sizes < 12 mm were considered very small. Lateral electronic disequilibrium in the phantom was the dominant cause of change in OPF at very small field sizes; the theoretical definition of very small field size therefore coincided with the field size at which lateral electronic disequilibrium clearly caused a greater change in OPF than any other effect. This was found to occur at field sizes < 12 mm. Source occlusion also caused a large change in OPF for field sizes < 8 mm. Based on these results, field sizes < 12 mm were considered theoretically very small for 6 MV beams.

Conclusions: Extremely careful experimental methodology, including measurement of the dosimetric field size at the same time as the output factor measurement for each field-size setting, together with very precise detector alignment, is required at field sizes < 12 mm, or more conservatively < 15 mm, for 6 MV beams. These recommendations should be applied in addition to all the usual considerations for small field dosimetry, including careful detector selection.
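The practical definition above reduces to a sensitivity test: find the field size below which a 1 mm field-size error shifts the output factor by more than the 1% tolerance. A minimal sketch of such a test is given below in Python; the tabulated OPF curve and the linear interpolation are illustrative assumptions, not data from the study.

```python
import numpy as np

# Illustrative 6 MV output factor curve (field side length in mm -> OPF).
# These values are placeholders for demonstration, not measured data.
side_mm = np.array([4, 6, 8, 10, 12, 15, 20, 30, 50, 100], dtype=float)
opf = np.array([0.65, 0.78, 0.85, 0.89, 0.92, 0.94, 0.96, 0.98, 0.99, 1.00])

def opf_at(s):
    """Linearly interpolate the OPF at field side length s (mm)."""
    return np.interp(s, side_mm, opf)

def relative_opf_change(s, ds=1.0):
    """Relative OPF change (%) caused by a field-size error of ds mm."""
    return np.abs(opf_at(s + ds) - opf_at(s)) / opf_at(s) * 100.0

# "Very small" threshold: largest field size at which a 1 mm error
# pushes the OPF outside a 1% tolerance.
tolerance_pct = 1.0
fine_grid = np.linspace(4.0, 99.0, 500)
very_small = fine_grid[relative_opf_change(fine_grid) > tolerance_pct]
if very_small.size:
    print(f"Fields below ~{very_small.max():.1f} mm are 'very small' "
          f"for a 1 mm field-size uncertainty and {tolerance_pct}% tolerance.")
```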

Abstract:

This study used automated data processing techniques to calculate a set of novel treatment plan accuracy metrics and to investigate their usefulness as predictors of quality assurance (QA) success and failure. A total of 151 beams from 23 prostate and cranial IMRT treatment plans were used in this study. These plans had been evaluated before treatment using measurements with a diode array system. The TADA software suite was adapted to allow automatic batch calculation of several proposed plan accuracy metrics, including mean field area, small-aperture, off-axis and closed-leaf factors. All of these results were compared with the gamma pass rates from the QA measurements and correlations were investigated. The mean field area factor provided a threshold field size (5 cm², equivalent to a 2.2 × 2.2 cm square field) below which all beams failed the QA tests. The small-aperture score provided a useful predictor of plan failure when averaged over all beams, despite being weakly correlated with gamma pass rates for individual beams. By contrast, the closed-leaf and off-axis factors provided information about the geometric arrangement of the beam segments but were not useful for distinguishing between plans that passed and failed QA. This study has provided some simple tests for plan accuracy, which may help minimise time spent on QA assessments of treatments that are unlikely to pass.
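A mean field area metric of this kind can be computed directly from the MLC aperture geometry of each control point. The sketch below is a rough illustration rather than the TADA implementation: the leaf thickness, gap data and the unweighted average are simplified assumptions (a real calculation would weight control points by monitor units), and the 5 cm² threshold is the one reported above.

```python
from typing import List

LEAF_THICKNESS_CM = 0.5   # assumed uniform MLC leaf width (cm)
THRESHOLD_CM2 = 5.0       # threshold field area reported in the study

def aperture_area(leaf_gaps_cm: List[float]) -> float:
    """Open area (cm^2) of one control point: sum of per-leaf-pair gaps
    multiplied by the leaf thickness."""
    return sum(gap * LEAF_THICKNESS_CM for gap in leaf_gaps_cm if gap > 0)

def mean_field_area(control_points: List[List[float]]) -> float:
    """Mean aperture area (cm^2) over all control points of a beam."""
    areas = [aperture_area(cp) for cp in control_points]
    return sum(areas) / len(areas)

# Toy beam: three control points, each a list of per-leaf-pair gaps (cm).
beam = [
    [0.0, 1.2, 2.0, 1.5, 0.0],
    [0.5, 2.2, 3.0, 2.0, 0.3],
    [0.0, 0.8, 1.5, 0.9, 0.0],
]
mfa = mean_field_area(beam)
print(f"Mean field area = {mfa:.2f} cm^2 -> "
      f"{'likely QA failure' if mfa < THRESHOLD_CM2 else 'above threshold'}")
```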

Abstract:

Shared Material on Dying is a trio/solo work commissioned from choreographer Liz Roche by Jenny Roche and the Dublin Dance Festival in 2008. Touring widely since its creation, it continues to be a rich research environment for Jenny Roche's interrogation of the dancer's first-person perspective in choreographic production and performance. The research perspective drawn from the live performance of this iteration was how the same dancing moment might be expressed from multiple perspectives by three different dancers, and what this might reveal about the dancer's inner configuring of the performance environment. Erin Manning (2009) describes the unstable and emergent moment before movement materialises as the preacceleration of the movement, when the potentialities of the gesture collapse and stabilise into form. It is this threshold of potentiality that is of interest: the moment before the dance happens, when the configuring process of the dancer brings it into being. Cynthia Roses-Thema (2007), after neuroscientist Antonio Damasio, writes that embodied experience is mapped as it unfolds, altering from moment to moment in line with a constantly changing internal milieu. As a performer in this piece, I explored the inner terrain of the three performers (myself included) by externalising these inner states. This research contributed to a paper presented at the Digital Resources for the Arts and Humanities Conference, UK, 2013.

Abstract:

Reliability of carrier-phase ambiguity resolution (AR) in an integer least-squares (ILS) problem depends on the ambiguity success rate (ASR), which in practice can be well approximated by the success probability of the integer bootstrapping solution. With the current GPS constellation, a sufficiently high ASR for a geometry-based model is achievable only for a certain percentage of the time, so high AR reliability cannot be assured by a single constellation. With a dual-constellation system (DCS), for example GPS and Beidou, which provides more satellites in view, users can expect significant performance benefits such as improved AR reliability and high-precision positioning solutions. Simply using all the satellites in view for AR and positioning is a straightforward solution, but it does not necessarily deliver the hoped-for reliability. This paper presents an alternative approach that selects a subset of the visible satellites, instead of using all of them, to achieve higher reliability of the AR solutions in a multi-GNSS environment. Traditionally, satellite selection algorithms are mostly based on the position dilution of precision (PDOP) in order to meet accuracy requirements. In this contribution, reliability criteria are introduced for GNSS satellite selection, and a novel satellite selection algorithm for reliable ambiguity resolution (SARA) is developed. The SARA algorithm allows receivers to select a subset of satellites that achieves a high ASR, such as above 0.99. Numerical results from simulated dual-constellation cases show that with the SARA procedure, the percentage of ASR values in excess of 0.99 and the percentage of ratio-test values passing the threshold of 3 are both higher than when all satellites in view are used directly; in the dual-constellation case, the percentages of ASR values (>0.99) and ratio-test values (>3) reached 98.0% and 98.5% respectively, compared with 18.1% and 25.0% without the satellite selection process. It is also worth noting that SARA is simple to implement and computationally cheap, so it can be applied in most real-time data processing applications.
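The bootstrapping approximation to the ASR mentioned above has the well-known closed form P = ∏_i (2Φ(1/(2σ_i)) − 1), where the σ_i are the conditional standard deviations of the ambiguities. The Python sketch below computes this and wraps it in a toy greedy subset search; it is a stand-in for the reliability-driven selection idea, not the SARA algorithm itself, and it simplifies by treating the covariance as one ambiguity per satellite and by skipping decorrelation.

```python
import numpy as np
from scipy.stats import norm

def bootstrap_success_rate(Q):
    """Integer-bootstrapping success rate for an ambiguity covariance Q:
    P = prod_i (2*Phi(1/(2*sigma_i)) - 1), where the conditional standard
    deviations sigma_i are the diagonal of the Cholesky factor of Q.
    (In practice Q would first be decorrelated, e.g. by the LAMBDA
    Z-transform; that step is skipped in this toy.)"""
    G = np.linalg.cholesky(Q)               # Q = G @ G.T, G lower triangular
    sigma = np.diag(G)
    return float(np.prod(2.0 * norm.cdf(1.0 / (2.0 * sigma)) - 1.0))

def greedy_select(Q_full, target=0.99, min_size=5):
    """Toy reliability-driven subset search (NOT the SARA algorithm itself):
    repeatedly drop the ambiguity whose removal most improves the
    bootstrapping success rate, until the target ASR or a minimum size."""
    idx = list(range(Q_full.shape[0]))
    def asr(sel):
        return bootstrap_success_rate(Q_full[np.ix_(sel, sel)])
    while asr(idx) < target and len(idx) > min_size:
        trials = [asr([j for j in idx if j != k]) for k in idx]
        idx.pop(int(np.argmax(trials)))      # drop the worst contributor
    return idx, asr(idx)

# Synthetic positive-definite "ambiguity covariance" for 8 satellites,
# scaled to cycle^2-level variances (illustrative values only).
rng = np.random.default_rng(0)
A = rng.normal(size=(8, 8))
Q = 0.01 * A @ A.T + 0.02 * np.eye(8)
kept, asr = greedy_select(Q)
print(f"kept {len(kept)} ambiguities, ASR = {asr:.4f}")
```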

Abstract:

We construct two efficient Identity-Based Encryption (IBE) systems that admit selective-identity security reductions without random oracles in groups equipped with a bilinear map. Selective-identity secure IBE is a slightly weaker security model than the standard security model for IBE: the adversary must commit ahead of time to the identity that it intends to attack, whereas in an adaptive-identity attack the adversary is allowed to choose this identity adaptively. Our first system, BB1, is based on the well-studied decisional Bilinear Diffie–Hellman assumption, and extends naturally to systems with hierarchical identities (HIBE). Our second system, BB2, is based on a stronger assumption, which we call the Bilinear Diffie–Hellman Inversion assumption, and provides another approach to building IBE systems. The first system, BB1, is very versatile and well suited to practical applications: the basic hierarchical construction can be efficiently secured against chosen-ciphertext attacks and further extended to support efficient non-interactive threshold decryption, among other features, all without using random oracles. Both systems, BB1 and BB2, can be modified generically to provide "full" IBE security (i.e., against adaptive-identity attacks), either using random oracles or in the standard model at the expense of a non-polynomial but easy-to-compensate security reduction.
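For concreteness, here is a sketch of the basic BB1 mechanism as it is commonly presented (notation ours; consult the paper for the precise scheme). The public parameters contain generators g, g_1 = g^α, g_2, h of a bilinear group of prime order p with pairing e, and α is the master secret:

```latex
\begin{align*}
\textsf{KeyGen}(\mathrm{ID}):\ & d_{\mathrm{ID}} = (d_0, d_1)
   = \bigl(g_2^{\alpha}\,(g_1^{\mathrm{ID}} h)^{r},\; g^{r}\bigr),
   & r &\leftarrow \mathbb{Z}_p \\
\textsf{Encrypt}(\mathrm{ID}, M):\ & C = (C_0, C_1, C_2)
   = \bigl(M \cdot e(g_1, g_2)^{s},\; g^{s},\; (g_1^{\mathrm{ID}} h)^{s}\bigr),
   & s &\leftarrow \mathbb{Z}_p \\
\textsf{Decrypt}(d_{\mathrm{ID}}, C):\ & M = C_0 \cdot \frac{e(d_1, C_2)}{e(d_0, C_1)}
\end{align*}
```

Correctness holds because e(d_0, C_1)/e(d_1, C_2) = e(g_2, g)^{αs} = e(g_1, g_2)^s, which is exactly the blinding factor on C_0; the committed identity enters only through the term g_1^ID h, which is what the selective-identity reduction exploits.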

Abstract:

Breast cancer is the cancer that most commonly affects women worldwide. This type of cancer is genetically complex, but is strongly linked to steroid hormone signalling systems. Because microRNAs act as translational regulators of multiple genes, including the steroid nuclear receptors, single nucleotide polymorphisms (SNPs) in microRNA genes can have potentially wide-ranging influences on breast cancer development. This study was therefore conducted to investigate the relationships between breast cancer risk in Caucasian Australian women and six SNPs (rs6977848, rs199981120, rs185641358, rs113054794, rs66461782, and rs12940701) located in four miRNA genes predicted to target the estrogen receptor (miR-148a, miR-221, miR-186, and miR-152). Using high resolution melt (HRM) analysis and polymerase chain reaction-restriction fragment length polymorphism (PCR-RFLP), 487 samples comprising 225 controls and 262 cases were genotyped. Analysis of genotype and allele frequencies indicated that the differences between case and control populations were not significant for rs6977848, rs66461782, and rs12940701, with p-values of 0.81, 0.93, and 0.10 respectively, all above the significance threshold (p = 0.05). Our data thus suggest that these SNPs do not affect breast cancer risk in the tested population. In addition, rs199981120, rs185641358, and rs113054794 were not found in this population, suggesting that these SNPs do not occur in Caucasian Australians.
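A case-control comparison of this kind is commonly run as a chi-square test on genotype count tables. A minimal sketch follows; the counts are hypothetical placeholders, not the study's data.

```python
from scipy.stats import chi2_contingency

# Hypothetical genotype counts (AA, Aa, aa) for one SNP; illustrative only.
cases = [120, 110, 32]     # 262 cases
controls = [100, 98, 27]   # 225 controls

chi2, p, dof, expected = chi2_contingency([cases, controls])
print(f"chi2 = {chi2:.3f}, p = {p:.3f}, dof = {dof}")
if p > 0.05:
    print("No significant case-control difference at the p = 0.05 threshold.")
```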

Abstract:

Recently, a convex hull-based human identification protocol was proposed by Sobrado and Birget, whose steps can be performed by humans without additional aid. The main part of the protocol involves the user mentally forming a convex hull of secret icons within a set of graphical icons and then clicking randomly inside this convex hull. While some rudimentary security issues of this protocol have been discussed, a comprehensive security analysis has been lacking. In this paper, we analyze the security of this convex hull-based protocol. In particular, we show two probabilistic attacks that reveal the user's secret after the observation of only a handful of authentication sessions. These attacks can be efficiently implemented, as their time and space complexities are considerably lower than those of a brute-force attack. We show that while the first attack can be mitigated through appropriately chosen values of the system parameters, the second attack succeeds with non-negligible probability even with large system parameter values that cross the threshold of usability.
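To see why observed clicks leak the secret, consider a toy elimination attack (an illustration of the underlying principle, not a reimplementation of the paper's two attacks): every click must fall inside the convex hull of the secret icons, so any candidate secret set whose hull excludes an observed click can be discarded. The Python sketch below uses 3 secret icons out of 20, with a fresh random layout per session.

```python
import itertools
import random

def in_triangle(p, a, b, c):
    """Point-in-triangle test via signs of cross products."""
    def cross(o, u, v):
        return (u[0]-o[0])*(v[1]-o[1]) - (u[1]-o[1])*(v[0]-o[0])
    s1, s2, s3 = cross(a, b, p), cross(b, c, p), cross(c, a, p)
    return (s1 >= 0 and s2 >= 0 and s3 >= 0) or (s1 <= 0 and s2 <= 0 and s3 <= 0)

random.seed(1)
n_icons, secret = 20, (3, 7, 11)          # 3 secret icons out of 20
candidates = set(itertools.combinations(range(n_icons), 3))

for session in range(10):
    # Fresh random icon layout each session, as in the graphical protocol.
    pos = [(random.random(), random.random()) for _ in range(n_icons)]
    a, b, c = (pos[i] for i in secret)
    # Honest user clicks a uniformly random point inside the secret hull.
    w = sorted([random.random(), random.random()])
    click = tuple(w[0]*a[d] + (w[1]-w[0])*b[d] + (1-w[1])*c[d] for d in range(2))
    # Attacker discards candidate sets whose hull excludes the click.
    candidates = {t for t in candidates
                  if in_triangle(click, pos[t[0]], pos[t[1]], pos[t[2]])}
    print(f"session {session+1}: {len(candidates)} candidate secrets remain")
```

The true secret always survives this filter, and the pool of consistent candidates shrinks rapidly with each observed session, which is the intuition behind the "handful of sessions" result.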

Abstract:

In this chapter we continue the exposition of crypto topics that was begun in the previous chapter. This chapter covers secret sharing, threshold cryptography, signature schemes, and finally quantum key distribution and quantum cryptography. As in the previous chapter, we have focused only on the essentials of each topic. We have selected in the bibliography a list of representative items, which can be consulted for further details. First we give a synopsis of the topics that are discussed in this chapter. Secret sharing is concerned with the problem of how to distribute a secret among a group of participating individuals, or entities, so that only predesignated collections of individuals are able to recreate the secret by collectively combining the parts of the secret that were allocated to them. There are numerous applications of secret-sharing schemes in practice. One example of secret sharing occurs in banking. For instance, the combination to a vault may be distributed in such a way that only specified collections of employees can open the vault by pooling their portions of the combination. In this way the authority to initiate an action, e.g., the opening of a bank vault, is divided for the purposes of providing security and for added functionality, such as auditing, if required. Threshold cryptography is a relatively recently studied area of cryptography. It deals with situations where the authority to initiate or perform cryptographic operations is distributed among a group of individuals. Many of the standard operations of single-user cryptography have counterparts in threshold cryptography. Signature schemes deal with the problem of generating and verifying (electronic) signatures for documents. A subclass of signature schemes is concerned with the shared generation and shared verification of signatures, where a collaborating group of individuals is required to perform these actions. A new paradigm of security has recently been introduced into cryptography with the emergence of the ideas of quantum key distribution and quantum cryptography. While classical cryptography employs various mathematical techniques to restrict eavesdroppers from learning the contents of encrypted messages, in quantum cryptography the information is protected by the laws of physics.
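The bank-vault example corresponds to a (t, n) threshold scheme such as Shamir's: the secret is the constant term of a random polynomial of degree t − 1 over a prime field, each participant receives one evaluation, and any t shares recover the secret by Lagrange interpolation at zero. A minimal sketch follows; the prime and parameters are illustrative, and a real implementation would use a cryptographic RNG.

```python
import random

P = 2**127 - 1  # a Mersenne prime, large enough for a toy field

def make_shares(secret, t, n):
    """Split `secret` into n shares, any t of which reconstruct it.
    Demo only: use the `secrets` module for real shares."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P - 2, P) is the modular inverse (Fermat's little theorem)
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = make_shares(secret=123456789, t=3, n=5)
print(reconstruct(shares[:3]))       # any 3 of the 5 shares -> 123456789
print(reconstruct(shares[1:4]))      # a different triple works too
```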

Abstract:

Recently, a convex hull-based human identification protocol was proposed by Sobrado and Birget, whose steps can be performed by humans without additional aid. The main part of the protocol involves the user mentally forming a convex hull of secret icons within a set of graphical icons and then clicking randomly inside this convex hull. In this paper we show two efficient probabilistic attacks on this protocol that reveal the user's secret after the observation of only a handful of authentication sessions. We show that while the first attack can be mitigated through appropriately chosen values of the system parameters, the second attack succeeds with non-negligible probability even with large system parameter values that cross the threshold of usability.

Abstract:

Secure multi-party computation (MPC) protocols enable a set of n mutually distrusting participants P_1, ..., P_n, each with a private input x_i, to compute a function Y = F(x_1, ..., x_n) such that, at the end of the protocol, all participants learn the correct value of Y while the secrecy of the private inputs is maintained. Classical results in unconditionally secure MPC indicate that in the presence of an active adversary, every function can be computed if and only if the number of corrupted participants, t_a, is smaller than n/3. Relaxing the requirement of perfect secrecy and utilising broadcast channels, this bound improves to t_a < n/2. All existing MPC protocols assume that uncorrupted participants are truly honest, i.e., they are not even curious about learning the other participants' secret inputs. Based on this assumption, some MPC protocols are designed in such a way that, after the elimination of all misbehaving participants, the remaining ones learn all information in the system. This is not consistent with maintaining the privacy of the participants' inputs. Furthermore, an improvement of the classical results by Fitzi, Hirt, and Maurer indicates that in addition to t_a actively corrupted participants, the adversary may simultaneously corrupt some participants passively, which contradicts the assumption that participants not corrupted by an active adversary are truly honest. This paper examines the privacy of MPC protocols and introduces the notion of an omnipresent adversary, which cannot be eliminated from the protocol. The omnipresent adversary can be passive, active, or mixed. We assume that up to a minority of the participants who are not corrupted by an active adversary can be corrupted passively, with the restriction that at any time the number of corrupted participants does not exceed a predetermined threshold. We also show that the existence of a t-resilient protocol for a group of n participants implies the existence of a t′-private protocol for a group of n′ participants; that is, the elimination of misbehaving participants from a t-resilient protocol leads to the decomposition of the protocol. Our adversary model stipulates that an MPC protocol never operates with a set of truly honest participants, which is a more realistic scenario. Therefore, the privacy of all participants who properly follow the protocol is maintained. We present a novel disqualification protocol to avoid a loss of privacy for participants who properly follow the protocol.
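As a minimal concrete instance of this setting, the sum Y = x_1 + ... + x_n can be computed with additive secret sharing, which keeps each input private against a single honest-but-curious participant in this toy, no-collusion model. A single-process simulation (the modulus and inputs are illustrative):

```python
import random

Q = 2**61 - 1  # modulus for additive sharing (illustrative choice)

def share(x, n):
    """Split x into n additive shares summing to x mod Q."""
    parts = [random.randrange(Q) for _ in range(n - 1)]
    return parts + [(x - sum(parts)) % Q]

# Each participant P_i holds a private input x_i.
inputs = [42, 17, 99, 5]
n = len(inputs)

# Round 1: P_i sends share j of x_i to P_j (simulated as a matrix).
shares = [share(x, n) for x in inputs]

# Round 2: each P_j locally sums the shares it received ...
partial = [sum(shares[i][j] for i in range(n)) % Q for j in range(n)]

# ... and only the partial sums are published and combined into Y.
Y = sum(partial) % Q
print(Y)  # 163: the sum is revealed, the individual x_i are not
```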

Abstract:

In moderate to high sea states, the effectiveness of ship fin stabilizers can deteriorate severely due to nonlinear effects arising from the unsteady hydrodynamic characteristics of the fins: dynamic stall. These nonlinear effects take the form of a hysteresis, and they become very significant when the effective angle of attack of the fins exceeds a certain threshold angle. Dynamic stall can result in a complete loss of control action, depending on by how much the fins exceed the threshold angle. When this is detected, it is common to reduce the gain of the controller that commands the fins. This approach is cautious, and it tends to sacrifice performance after the conditions leading to dynamic stall have disappeared. An alternative approach, which prevents these effects while maintaining high performance, consists of estimating the effective angle of attack and setting a conservative constraint on it as part of the control objectives. In this paper, we investigate the latter approach and propose the use of model predictive control (MPC) to prevent the development of these nonlinear effects by considering constraints on both the mechanical angle of the fins and the effective angle of attack.
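A minimal sketch of the constrained-MPC idea, in Python with CVXPY: the roll dynamics, the linearised relation between fin angle and effective angle of attack (the k_flow coupling term), and all numerical values are placeholder assumptions for illustration; the paper's models are more detailed.

```python
import numpy as np
import cvxpy as cp

# Toy discrete-time roll model: state = [roll angle, roll rate].
A = np.array([[1.0, 0.1],
              [-0.05, 0.95]])
B = np.array([[0.0], [0.08]])       # fin angle -> roll rate

N = 20                              # prediction horizon
u_max = np.deg2rad(25)              # mechanical fin limit
alpha_stall = np.deg2rad(18)        # effective angle-of-attack limit
k_flow = 0.6                        # assumed flow-induced coupling to roll rate

x0 = np.array([np.deg2rad(8), 0.2]) # initial roll disturbance

x = cp.Variable((2, N + 1))
u = cp.Variable((1, N))

cost, constr = 0, [x[:, 0] == x0]
for k in range(N):
    cost += cp.sum_squares(x[:, k + 1]) + 0.1 * cp.sum_squares(u[:, k])
    constr += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k]]
    constr += [cp.abs(u[:, k]) <= u_max]                  # mechanical limit
    # Effective angle of attack = fin angle plus a flow-induced component
    # (a linearised placeholder), constrained below the stall threshold.
    constr += [cp.abs(u[0, k] + k_flow * x[1, k]) <= alpha_stall]

cp.Problem(cp.Minimize(cost), constr).solve()
print("first fin command (deg):", np.rad2deg(u.value[0, 0]))
```

Only the first command would be applied before re-solving at the next sample, which is what lets the controller anticipate and avoid the stall constraint rather than react to it.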

Abstract:

Murine models with modified gene function as a result of N-ethyl-N-nitrosourea (ENU) mutagenesis have been used to study phenotypes resulting from genetic change. This study investigated genetic factors associated with red blood cell (RBC) physiology and structural integrity that may impact on blood component storage and transfusion outcome. Forward and reverse genetic approaches were employed with pedigrees of ENU-treated mice using a homozygous recessive breeding strategy. In the "forward genetic" approach, pedigree selection was based on identification of an altered phenotype, followed by exome sequencing to identify a causative mutation. In the second, "reverse genetic" approach, pedigrees were selected on the basis of mutations in genes of interest and, following breeding to homozygosity, the phenotype was assessed. Thirty-three pedigrees were screened by the forward genetic approach. One pedigree demonstrated reticulocytosis, microcytic anaemia and thrombocytosis. Exome sequencing revealed a novel single nucleotide variation (SNV) in Ank1, encoding the RBC structural protein ankyrin-1, and the pedigree was designated Ank1EX34. The reticulocytosis and microcytic anaemia observed in the Ank1EX34 pedigree were similar to clinical features of hereditary spherocytosis in humans. For the reverse genetic approach, three pedigrees with different point mutations in Spnb1, encoding the RBC protein spectrin-1β, and one pedigree with a mutation in Epb4.1, encoding band 4.1, were selected for study. When bred to homozygosity, two of the spectrin-1β pedigrees (a, b) demonstrated increased RBC count, haemoglobin (Hb) and haematocrit (HCT). The third Spnb1 mutation (spectrin-1β c) and the mutation in Epb4.1 (band 4.1) did not significantly affect the haematological phenotype, despite both having PolyPhen scores predicting that the mutations may be damaging. Exome sequencing allows rapid identification of causative mutations and the development of databases of mutations predicted to be disruptive. These tools require further refinement but provide new approaches to the study of genetically defined changes that may impact on blood component storage and transfusion outcome.

Abstract:

Background: Lumbar epidural steroid injections (ESIs) have previously been shown to provide some degree of pain relief in sciatica. The number needed to treat (NNT) to achieve 50% pain relief has been estimated at 7 from the results of randomised controlled trials. Pain relief is temporary. ESIs remain one of the most commonly provided procedures in the UK. It is unknown whether this pain relief represents good value for money.

Methods: 228 patients were randomised into a multi-centre double-blind randomised controlled trial. Subjects received up to three ESIs or intra-spinous saline, depending on response to and fall-off of benefit from the first injection. All other treatments were permitted. All patients received a review of analgesia, education and physical therapy. Quality of life was assessed using the SF-36 at six time points and compared using independent-sample t-tests. Follow-up was up to one year. Missing data were imputed using last observation carried forward (LOCF). QALYs (quality-adjusted life years) were derived from preference-based health values (summary health utility scores); the SF-6D health state classification was derived from SF-36 raw score data. Standard gambles (SG) were calculated using Model 10, on the trial results. LOCF was not used for this; instead, average SG scores were derived for the subset of patients with observations at all visits up to week 12. Incremental QALYs were derived as the difference in area between the SG curves for the active and placebo groups.

Results: The SF-36 domains showed a significant improvement in pain at week 3, but this was not sustained (mean 54 active vs 61 placebo, P < 0.05). Other domains did not show any significant gains compared with placebo. For the derivation of SG, the number in the sample differed between periods. By week 12, average SG scores for the active and placebo groups converged; in other words, the health gain for the active group, as measured by SG, was achieved by the placebo group by week 12. The incremental QALY gained by a patient under the trial protocol compared with the standard care package was 0.0059350, equivalent to an additional 2.2 days of full health. The cost per QALY gained to the provider from a patient management strategy administering one epidural, as suggested by the results, was £25,745.68. This result was derived assuming that the QALY gain calculated for patients under the trial protocol approximates that under a patient management strategy based on the trial results (one ESI). This is above the threshold suggested by some for a cost-effective treatment.

Conclusions: The transient benefit in pain relief afforded by ESIs does not appear to be cost-effective. Further work is needed to develop more cost-effective conservative treatments for sciatica.
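The conversion from incremental QALY to days of full health quoted above is direct arithmetic, shown here for clarity; the implied incremental cost per patient is our back-calculation from the two reported figures, not a number stated in the abstract.

```latex
% Days of full health implied by the incremental QALY gain:
0.0059350\ \text{QALYs} \times 365.25\ \tfrac{\text{days}}{\text{year}} \approx 2.2\ \text{days}
% Implied incremental cost per patient (back-calculated):
\pounds 25\,745.68\ \text{per QALY} \times 0.0059350\ \text{QALYs} \approx \pounds 153
```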

Abstract:

Institutional graduate capabilities and discipline threshold learning outcomes require science students to demonstrate ethical conduct and social responsibility. However, neither the teaching nor the assessment of these concepts is straightforward. Australian chemistry academics participated in a workshop in 2013 to discuss and develop teaching and assessment in these areas, and this paper reports on the outcomes of that workshop. Controversial issues discussed included: how broad is the mandate of the teacher? How should the boundaries between personal values and ethics be drawn? And how can ethics be assessed without moral judgement? In this position paper, I argue for deep engagement with ethics and social justice, achieved through case studies and assessed against criteria that require discussion and debate. Strategies to effectively assess science students' understanding of ethics and social responsibility are detailed.

Abstract:

The planning of IMRT treatments requires a compromise between dose conformity (complexity) and deliverability. This study investigates established and novel treatment complexity metrics for 122 IMRT beams from prostate treatment plans. The Treatment and Dose Assessor software was used to extract the necessary data from exported treatment plan files and to calculate the metrics. For most of the metrics, there was strong overlap between the values calculated for plans that passed and failed their quality assurance (QA) tests. However, statistically significant differences between plans that passed and failed the QA measurements were found for the established modulation index and for a novel metric describing the proportion of small apertures in each beam. This 'small-aperture score' provided threshold values that successfully distinguished deliverable treatment plans from plans that did not pass QA, with a low false-negative rate.
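A small-aperture score of this type can be read as the fraction of open leaf-pair gaps falling below a size criterion, pooled over a beam's control points. The sketch below is a plausible reading of such a metric, not the exact definition used in the study; the 1 cm criterion, the flagging threshold and the gap data are illustrative assumptions.

```python
from typing import List

SMALL_GAP_CM = 1.0       # aperture-size criterion (illustrative)
FLAG_THRESHOLD = 0.4     # flag beams above this score (illustrative)

def small_aperture_score(control_points: List[List[float]]) -> float:
    """Fraction of open leaf-pair gaps smaller than SMALL_GAP_CM,
    pooled over all control points of a beam."""
    open_gaps = [g for cp in control_points for g in cp if g > 0]
    if not open_gaps:
        return 0.0
    return sum(g < SMALL_GAP_CM for g in open_gaps) / len(open_gaps)

# Toy beam: per-leaf-pair gaps (cm) at three control points.
beam = [[0.0, 0.4, 2.0, 0.6], [0.8, 2.5, 3.0, 0.2], [0.0, 0.5, 1.8, 0.9]]
score = small_aperture_score(beam)
print(f"small-aperture score = {score:.2f} -> "
      f"{'flag for QA attention' if score > FLAG_THRESHOLD else 'ok'}")
```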