965 results for 2-adic complexity
Abstract:
The motivation for this study was to reduce physics workload relating to patient-specific quality assurance (QA). VMAT plan delivery accuracy was determined from analysis of pre- and on-treatment trajectory log files and phantom-based ionization chamber array measurements. The correlation in this combination of measurements for patient-specific QA was investigated. The relationship between delivery errors and plan complexity was investigated as a potential method to further reduce patient-specific QA workload. Thirty VMAT plans from three treatment sites - prostate only, prostate and pelvic node (PPN), and head and neck (H&N) - were retrospectively analyzed in this work. The 2D fluence delivery reconstructed from pretreatment and on-treatment trajectory log files was compared with the planned fluence using gamma analysis. Pretreatment dose delivery verification was also carried out using gamma analysis of ionization chamber array measurements compared with calculated doses. Pearson correlations were used to explore any relationship between trajectory log file (pretreatment and on-treatment) and ionization chamber array gamma results (pretreatment). Plan complexity was assessed using the MU/arc and the modulation complexity score (MCS), with Pearson correlations used to examine any relationships between complexity metrics and plan delivery accuracy. Trajectory log files were also used to further explore the accuracy of MLC and gantry positions. Pretreatment 1%/1 mm gamma passing rates for trajectory log file analysis were 99.1% (98.7%-99.2%), 99.3% (99.1%-99.5%), and 98.4% (97.3%-98.8%) (median (IQR)) for prostate, PPN, and H&N, respectively, and were significantly correlated with on-treatment trajectory log file gamma results (R = 0.989, p < 0.001). Pretreatment ionization chamber array (2%/2 mm) gamma results were also significantly correlated with on-treatment trajectory log file gamma results (R = 0.623, p < 0.001). Furthermore, all gamma results displayed a significant correlation with MCS (R > 0.57, p < 0.001), but not with MU/arc. Average MLC position and gantry angle errors were 0.001 ± 0.002 mm and 0.025° ± 0.008° over all treatment sites and were not found to affect delivery accuracy. However, variability in MLC speed was found to be directly related to MLC position accuracy. The accuracy of VMAT plan delivery assessed using pretreatment trajectory log file fluence delivery and ionization chamber array measurements was strongly correlated with on-treatment trajectory log file fluence delivery. The strong correlation between trajectory log file and phantom-based gamma results demonstrates the potential to reduce our current patient-specific QA workload. Additionally, insight into MLC and gantry position accuracy through trajectory log file analysis, and the strong correlation between gamma analysis results and the MCS, could provide further methodologies to optimize both the VMAT planning and QA process.
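As an illustration of the gamma analysis referred to above, the sketch below implements a brute-force 2D global gamma comparison between a planned and a delivered dose (or fluence) plane. The 1 mm grid spacing, 2%/2 mm criteria, 10% low-dose threshold, and the synthetic dose planes are assumptions for the example only, not values or software from the study; clinical tools use interpolation and optimized search rather than this exhaustive scan.

    # Minimal brute-force 2D global gamma analysis (illustrative sketch only).
    import numpy as np

    def gamma_pass_rate(ref, eval_, spacing_mm, dose_crit=0.02, dist_mm=2.0,
                        low_dose_cutoff=0.10):
        """Global gamma: dose tolerance taken as a fraction of the reference maximum."""
        dd = dose_crit * ref.max()                 # absolute dose tolerance
        ny, nx = ref.shape
        yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
        gammas = []
        for iy in range(ny):
            for ix in range(nx):
                if ref[iy, ix] < low_dose_cutoff * ref.max():
                    continue                        # skip low-dose region
                r2 = ((yy - iy) ** 2 + (xx - ix) ** 2) * spacing_mm ** 2
                d2 = (eval_ - ref[iy, ix]) ** 2
                gamma2 = r2 / dist_mm ** 2 + d2 / dd ** 2
                gammas.append(np.sqrt(gamma2.min()))
        gammas = np.array(gammas)
        return 100.0 * np.mean(gammas <= 1.0)

    # Example with synthetic dose planes on a 1 mm grid
    rng = np.random.default_rng(0)
    planned = np.outer(np.hanning(64), np.hanning(64))
    measured = planned + rng.normal(0, 0.002, planned.shape)
    print(f"2%/2 mm gamma pass rate: {gamma_pass_rate(planned, measured, 1.0):.1f}%")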
Abstract:
The current study reports original vapour-liquid equilibrium (VLE) data for the system {CO2 (1) + 1-chloropropane (2)}. The measurements have been performed over the entire pressure-composition range for the (303.15, 313.15 and 328.15) K isotherms. The data obtained have been used to compare four predictive approaches, namely the equation of state (EoS) of Peng and Robinson (PR), the Soave modification of the Benedict–Webb–Rubin (SBWR) EoS, the Critical Point-based Revised Perturbed-Chain Statistical Associating Fluid Theory (CP-PC-SAFT) EoS, and the Conductor-like Screening Model for Real Solvents (COSMO-RS). It has been demonstrated that the three EoS under consideration yield similar and qualitatively accurate predictions of VLE, which is not the case for the COSMO-RS model examined. Although the CP-PC-SAFT EoS exhibits only minor superiority over the PR and SBWR EoS in predicting VLE in the system under consideration, its relative complexity can be justified when the entire thermodynamic phase space is taken into account, in particular the liquid densities and sound velocities over a wider pressure-volume-temperature range.
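The Peng-Robinson (PR) equation of state mentioned above can be illustrated for the pure CO2 component: the sketch below solves the PR cubic for the compressibility factor at one of the reported isotherms. The critical constants and acentric factor are approximate literature values for CO2, and the pressure is an arbitrary example; a full VLE prediction for the binary would additionally require mixing rules and an isofugacity flash, which are omitted here.

    # Peng-Robinson EoS for a pure component: solve the cubic for Z (sketch only).
    import numpy as np

    R = 8.314462618  # J/(mol K)

    def pr_roots(T, P, Tc, Pc, omega):
        """Real compressibility-factor roots of the Peng-Robinson EoS for a pure fluid."""
        kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega ** 2
        alpha = (1.0 + kappa * (1.0 - np.sqrt(T / Tc))) ** 2
        a = 0.45724 * R ** 2 * Tc ** 2 / Pc * alpha
        b = 0.07780 * R * Tc / Pc
        A = a * P / (R * T) ** 2
        B = b * P / (R * T)
        # Z^3 - (1 - B) Z^2 + (A - 3B^2 - 2B) Z - (AB - B^2 - B^3) = 0
        coeffs = [1.0, -(1.0 - B), A - 3.0 * B ** 2 - 2.0 * B, -(A * B - B ** 2 - B ** 3)]
        roots = np.roots(coeffs)
        return np.sort(roots[np.abs(roots.imag) < 1e-10].real)

    # CO2, approximate literature constants: Tc = 304.13 K, Pc = 7.377 MPa, omega = 0.224
    Z = pr_roots(T=303.15, P=5.0e6, Tc=304.13, Pc=7.377e6, omega=0.224)
    print("real Z roots (smallest = liquid-like, largest = vapour-like):", np.round(Z, 4))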
Abstract:
Multicarrier Index Keying (MCIK) is a recently developed technique that modulates not only subcarriers but also the indices of those subcarriers. In this paper, a novel low-complexity detection scheme for subcarrier indices is proposed for an MCIK system, offering a substantial reduction in complexity over the optimal maximum likelihood (ML) detection. For the performance evaluation, a closed-form expression for the pairwise error probability (PEP) of an active subcarrier index and a tight approximation of the average PEP of multiple subcarrier indices are derived. The theoretical results are validated using simulations, agreeing to within 0.1 dB. Compared to the optimal ML detection, the proposed scheme achieves a substantial reduction in complexity with a small loss in error performance (<= 0.6 dB).
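To make the index-error quantity behind the PEP concrete, the toy Monte Carlo below estimates the index error rate for a single active subcarrier among N over Rayleigh fading with AWGN, using a simple magnitude-based detector. The number of subcarriers, SNR points, and the detector itself are illustrative assumptions, not the paper's proposed low-complexity scheme or its closed-form PEP.

    # Toy Monte Carlo for subcarrier-index detection error in an MCIK-like setup.
    import numpy as np

    def index_error_rate(N=4, snr_db=10.0, trials=200_000, seed=1):
        """Monte Carlo estimate of the active-subcarrier index error rate."""
        rng = np.random.default_rng(seed)
        snr = 10.0 ** (snr_db / 10.0)
        active = rng.integers(0, N, size=trials)                 # true active index per trial
        h = (rng.standard_normal((trials, N)) + 1j * rng.standard_normal((trials, N))) / np.sqrt(2)
        n = (rng.standard_normal((trials, N)) + 1j * rng.standard_normal((trials, N))) / np.sqrt(2)
        x = np.zeros((trials, N))
        x[np.arange(trials), active] = np.sqrt(snr)              # unit-energy symbol scaled to the SNR
        y = h * x + n                                            # per-subcarrier received samples
        detected = np.abs(y).argmax(axis=1)                      # pick the strongest subcarrier
        return np.mean(detected != active)

    for snr_db in (0, 5, 10, 15, 20):
        print(f"SNR {snr_db:2d} dB: index error rate ~ {index_error_rate(snr_db=snr_db):.4f}")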
Abstract:
OBJECTIVE: To demonstrate the benefit of complexity metrics such as the modulation complexity score (MCS) and monitor units (MUs) in multi-institutional audits of volumetric-modulated arc therapy (VMAT) delivery.
METHODS: 39 VMAT treatment plans were analysed using MCS and MU. A virtual phantom planning exercise was planned and independently measured using the PTW Octavius(®) phantom and seven29(®) 2D array (PTW-Freiburg GmbH, Freiburg, Germany). MCS and MU were compared with the median gamma index pass rates (2%/2 mm and 3%/3 mm) and plan quality. The treatment planning systems (TPS) were grouped according to whether their VMAT modelling was specifically designed for the linear accelerator manufacturer's own treatment delivery system (Type 1) or was independent of vendor for VMAT delivery (Type 2). Differences in plan complexity (MCS and MU) between TPS types were compared.
RESULTS: For Varian(®) linear accelerators (Varian(®) Medical Systems, Inc., Palo Alto, CA), MCS and MU were significantly correlated with gamma pass rates. Type 2 TPS created poorer quality, more complex plans with significantly higher MUs and MCS than Type 1 TPS. Plan quality was significantly correlated with MU for Type 2 plans. A statistically significant correlation was observed between MU and MCS for all plans (R = -0.84, p < 0.01).
CONCLUSION: MU and MCS have a role in assessing plan complexity in audits along with plan quality metrics. Plan complexity metrics give some indication of plan deliverability but should be analysed with plan quality.
ADVANCES IN KNOWLEDGE: Complexity metrics were investigated for a national rotational audit involving 34 institutions and were shown to be of value. The metrics indicated that more complex plans were created by planning systems that were independent of vendor for VMAT delivery.
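As an aside, the modulation complexity score used in both this audit and the trajectory-log study above combines a leaf-sequence variability term and an aperture-area variability term, weighted by segment MU. The sketch below is a simplified rendering of that construction on toy MLC apertures; the leaf positions, weights, and normalisation are illustrative assumptions, and the published definition should be consulted for any clinical use.

    # Simplified aperture-based modulation complexity score, in the spirit of the
    # MCS (leaf-sequence variability x aperture-area variability, MU-weighted).
    import numpy as np

    def lsv(bank):
        """Leaf-sequence variability for one MLC bank (1.0 = all leaves aligned)."""
        pos_max = bank.max() - bank.min()
        if pos_max == 0:
            return 1.0
        diffs = np.abs(np.diff(bank))
        return np.sum(pos_max - diffs) / ((len(bank) - 1) * pos_max)

    def mcs(segments, mu):
        """segments: list of (left_bank, right_bank) leaf positions per control point."""
        mu = np.asarray(mu, dtype=float)
        # maximum aperture per leaf pair across all segments, used to normalise areas
        max_gap = np.max([r - l for l, r in segments], axis=0)
        score = 0.0
        for (l, r), w in zip(segments, mu / mu.sum()):
            aav = np.sum(r - l) / np.sum(max_gap)          # aperture-area variability
            score += lsv(l) * lsv(r) * aav * w
        return score

    # Two toy control points for a 5-leaf-pair MLC (positions in cm)
    seg_open = (np.array([-3.0, -3.0, -3.0, -3.0, -3.0]), np.array([3.0, 3.0, 3.0, 3.0, 3.0]))
    seg_mod  = (np.array([-1.0, -2.5, -0.5, -3.0, -1.5]), np.array([0.5, 2.0, 1.0, 2.5, 0.0]))
    print(f"MCS(open, open) = {mcs([seg_open, seg_open], [50, 50]):.3f}")   # ~1.0, low modulation
    print(f"MCS(open, mod)  = {mcs([seg_open, seg_mod], [50, 50]):.3f}")    # < 1.0, more modulation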
Abstract:
Background Complex medication regimens may adversely affect compliance and treatment outcomes. Complexity can be assessed with the medication regimen complexity index (MRCI), which has proved to be a valid, reliable tool, with potential uses in both practice and research. Objective To use the MRCI to assess medication regimen complexity in institutionalized elderly people. Setting Five nursing homes in mainland Portugal. Methods A descriptive, cross-sectional study of institutionalized elderly people (n = 415) was performed from March to June 2009, including all inpatients aged 65 and over taking at least one medication per day. Main outcome measure Medication regimen complexity index. Results The mean age of the sample was 83.9 years (±6.6 years), and 60.2% were women. The elderly patients were taking a large number of drugs, with 76.6% taking more than five medications per day. The average medication regimen complexity was 18.2 (SD = 9.6), and was higher in the females (p < 0.001). The most decisive factors contributing to the complexity were the number of drugs and the dosage frequency. In regimens with the same number of medications, the dosing schedule was the most relevant factor in the final score (r = 0.922), followed by pharmaceutical forms (r = 0.768) and additional instructions (r = 0.742). Conclusion Medication regimen complexity proved to be high. There is clear potential for pharmacist intervention to reduce it as part of the medication review routine for all patients.
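The validated MRCI is a 65-item weighted instrument spanning dosage forms, dosing frequencies, and additional directions; the toy scorer below only illustrates how such per-section weights accumulate over a regimen. All weights and categories here are hypothetical placeholders, not the instrument's actual values.

    # Deliberately simplified regimen-complexity scorer; weights are hypothetical,
    # not those of the validated 65-item MRCI.
    FORM_WEIGHTS = {"tablet": 1, "drops": 2, "inhaler": 3, "injection": 4}      # hypothetical
    FREQ_WEIGHTS = {"once daily": 1, "twice daily": 2, "three times daily": 3}  # hypothetical
    EXTRA_WEIGHTS = {"take with food": 1, "break/crush tablet": 1}              # hypothetical

    def regimen_complexity(regimen):
        """regimen: list of dicts with 'form', 'frequency', and optional 'directions'."""
        score = 0
        for med in regimen:
            score += FORM_WEIGHTS.get(med["form"], 1)
            score += FREQ_WEIGHTS.get(med["frequency"], 1)
            score += sum(EXTRA_WEIGHTS.get(d, 0) for d in med.get("directions", []))
        return score

    regimen = [
        {"form": "tablet", "frequency": "twice daily", "directions": ["take with food"]},
        {"form": "inhaler", "frequency": "twice daily"},
        {"form": "drops", "frequency": "three times daily"},
    ]
    print("toy complexity score:", regimen_complexity(regimen))   # higher = more complex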
Abstract:
This paper investigates applications of complexity theory in the social sphere and considers its potential contribution to enhancing understanding of tourism policy making. Five concepts are identified to explore the complex social circumstances and human interactions that influence policy. Social applications of complexity suggest a move towards localised and deeper studies that explore the dynamics of policy enactment in context. The paper suggests that complexity theory might be used as a thinking tool to enable a more holistic approach to policy analysis and to investigate policy in its context, considering interactions between different policies/programmes and the implications of human agency.
Abstract:
Motion-compensated frame interpolation (MCFI) is one of the most efficient solutions to generate side information (SI) in the context of distributed video coding. However, it creates SI with rather significant motion-compensated errors for some frame regions, while errors are rather small for other regions, depending on the video content. In this paper, a low-complexity Intra mode selection algorithm is proposed to select the most 'critical' blocks in the WZ frame and help the decoder with some reliable data for those blocks. For each block, the novel coding mode selection algorithm estimates the encoding rate for the Intra-based and WZ coding modes and determines the best coding mode while maintaining a low encoder complexity. The proposed solution is evaluated in terms of rate-distortion performance, with improvements of up to 1.2 dB over a solution using the WZ coding mode only.
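A toy version of the block-level decision described above: for each block of the WZ frame, a crude cost is estimated for Intra coding and for relying on the (possibly poor) motion-compensated side information, and the cheaper mode is kept. The cost proxies, block size, and weighting factor are placeholder assumptions, not the rate estimators of the proposed algorithm.

    # Toy block-level mode decision for a WZ frame: blocks whose side information
    # is poor are flagged as Intra. Cost models are crude placeholders.
    import numpy as np

    def select_modes(wz_frame, side_info, block=8, lam=1.0):
        h, w = wz_frame.shape
        modes = np.empty((h // block, w // block), dtype=object)
        for by in range(h // block):
            for bx in range(w // block):
                sl = np.s_[by*block:(by+1)*block, bx*block:(bx+1)*block]
                sad = np.abs(wz_frame[sl].astype(float) - side_info[sl].astype(float)).sum()
                intra_cost = wz_frame[sl].std() * block * block      # rough Intra rate proxy
                wz_cost = lam * sad                                  # rough WZ (correction) proxy
                modes[by, bx] = "intra" if intra_cost < wz_cost else "wz"
        return modes

    rng = np.random.default_rng(2)
    frame = rng.integers(0, 256, (32, 32)).astype(np.uint8)
    si = frame.copy()
    si[:16, :16] = rng.integers(0, 256, (16, 16))     # simulate poor side information in one region
    print(select_modes(frame, si))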
Abstract:
Formaldehyde is a toxic component that is present in foundry resins. Its quantification is important for the characterisation of the resin (kind and degradation) as well as for the evaluation of free contaminants present in wastes generated by the foundry industry. The complexity of the matrices considered suggests the need for separative techniques. The method developed for the identification and quantification of formaldehyde in foundry resins is based on the determination of free carbonyl compounds by derivatization with 2,4-dinitrophenylhydrazine (DNPH), adapted to the considered matrices using liquid chromatography (LC) with UV detection. Formaldehyde determinations in several foundry resins gave precise results, with a mean recovery and R.S.D. of >95% and 5%, respectively. Analyses by the hydroxylamine reference method gave comparable results. The results showed that the hydroxylamine reference method is applicable only to a specific kind of resin, whereas the developed method performs well for all the resins studied.
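Quantification in an LC-UV workflow of this kind ultimately reduces to an external calibration line and recovery checks. The snippet below fits peak area against standard concentration and back-calculates a spiked sample; all concentrations and peak areas are made-up numbers for illustration, not data from the study.

    # Minimal external-calibration workflow for an LC-UV determination.
    import numpy as np

    std_conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])               # mg/L, hypothetical standards
    peak_area = np.array([10200, 20150, 40800, 101500, 203900])   # hypothetical peak areas

    slope, intercept = np.polyfit(std_conc, peak_area, 1)
    r = np.corrcoef(std_conc, peak_area)[0, 1]

    def concentration(area):
        return (area - intercept) / slope

    spiked_true = 4.0                                  # mg/L added to the matrix
    spiked_measured = concentration(82100)             # hypothetical sample peak area
    recovery = 100.0 * spiked_measured / spiked_true

    print(f"calibration: area = {slope:.0f}*c + {intercept:.0f}  (r = {r:.4f})")
    print(f"found {spiked_measured:.2f} mg/L, recovery = {recovery:.1f}%")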
Abstract:
Dissertation presented to obtain the Ph.D. degree in Biology by Universidade Nova de Lisboa, Instituto de Tecnologia Química e Biológica, Instituto Gulbenkian de Ciência.
Abstract:
The first example of a [5+2] cycloaddition reaction wherein the olefin of the vinylcyclopropyl moiety is constrained in a carbocycle was explored, and possible reasons for the lack of reactivity of the substrate were studied. A simple model substrate was synthesized and subjected to cycloaddition conditions to determine whether the lack of reactivity was related to the complexity of the substrate, or whether the lack of "conjugative character" of the cyclopropyl ring with respect to the olefin was responsible. A more complex bicyclic substrate possessing an angular methyl group at the ring junction was also synthesized and explored, with evidence supporting the current theory of deconjugation of the cyclopropyl moiety.
Abstract:
In this thesis, I show that the probability distribution of the Greenberger-Horne-Zeilinger (GHZ) quantum state under the local action of independent von Neumann measurements on each qubit is a convex combination of two distributions. The coefficients of the combination are related to the equatorial parts of the measurements, and the distributions associated with these coefficients are related to the real parts of the measurements. One possible application of this result is that it allows the simulation of the GHZ state to be split into two parts. Simulating, in the worst case or on average, a quantum state such as GHZ with random resources, shared or private, and classical communication resources, or even with exotic resources such as non-local boxes, is an important problem in quantum communication complexity. This simulation problem can be viewed as one in which several parties each receive a von Neumann measurement to apply to the subsystem of the GHZ state that they share with the other parties. Each party knows only the data describing its own measurement and in no way knows the data describing the measurements of the other parties. Each party obtains a classical random outcome. The joint distribution of these classical random outcomes follows the probability distribution found in this thesis. The goal is to simulate classically the probability distribution of the GHZ state. My result suggests a procedure that consists of first simulating the equatorial parts of the measurements in order then to know which of the distributions associated with the real parts of the measurements has to be simulated. Other researchers have found how to simulate the equatorial parts of von Neumann measurements with classical communication in the three-party case, but the simulation of the real parts still resists all attempts.
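For reference, the exact joint outcome distribution that the thesis decomposes can be computed directly: the sketch below builds the three-qubit GHZ state and applies independent local von Neumann measurements, one projector pair per qubit. The measurement angles are arbitrary example values.

    # Joint outcome distribution of the 3-qubit GHZ state under independent local
    # von Neumann measurements (arbitrary example directions).
    import itertools
    import numpy as np

    def projectors(theta, phi):
        """Rank-1 projectors onto the +/- eigenstates of the Bloch direction (theta, phi)."""
        plus = np.array([np.cos(theta / 2), np.exp(1j * phi) * np.sin(theta / 2)])
        minus = np.array([-np.exp(-1j * phi) * np.sin(theta / 2), np.cos(theta / 2)])
        return [np.outer(plus, plus.conj()), np.outer(minus, minus.conj())]

    ghz = np.zeros(8, dtype=complex)
    ghz[0] = ghz[7] = 1 / np.sqrt(2)                     # (|000> + |111>)/sqrt(2)

    # one von Neumann measurement (theta, phi) per qubit -- arbitrary example angles
    settings = [(0.3, 1.1), (1.2, 0.4), (2.0, 2.5)]
    meas = [projectors(t, p) for t, p in settings]

    dist = {}
    for outcomes in itertools.product([0, 1], repeat=3):
        P = np.kron(np.kron(meas[0][outcomes[0]], meas[1][outcomes[1]]), meas[2][outcomes[2]])
        dist[outcomes] = float(np.real(ghz.conj() @ P @ ghz))

    for o, p in dist.items():
        print(o, f"{p:.4f}")
    print("total:", f"{sum(dist.values()):.4f}")         # sums to 1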
Abstract:
Analysis by reduction is a method used in linguistics for checking the correctness of sentences of natural languages. This method is modelled by restarting automata. Here we study a new type of restarting automaton, the so-called t-sRL-automaton, which is an RL-automaton that is rather restricted in that it has a window of size 1 only, and that it works under a minimal acceptance condition. On the other hand, it is allowed to perform up to t rewrite (that is, delete) steps per cycle. We focus on the descriptional complexity of these automata, establishing two complexity measures that are both based on the description of t-sRL-automata in terms of so-called meta-instructions. We present some hierarchy results as well as a non-recursive trade-off between deterministic 2-sRL-automata and finite-state acceptors.
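One plausible way to picture the meta-instruction view of such an automaton is sketched below: a deletion meta-instruction (E_0, a_1, E_1, ..., a_t, E_t) applies when the tape word factorizes as w_0 a_1 w_1 ... a_t w_t with each w_i in the regular language E_i, and the cycle deletes a_1, ..., a_t. The regex encoding, the acceptance test, and the example instruction are illustrative guesses at the formalism rather than the paper's exact definitions.

    # Toy rendering of analysis by reduction via deletion meta-instructions.
    import re

    def apply_meta(word, meta):
        """meta = [E_0, a_1, E_1, ..., a_t, E_t] with E_i given as regex strings."""
        pattern = "".join(
            f"({part})" if i % 2 == 0 else f"({re.escape(part)})"
            for i, part in enumerate(meta)
        )
        m = re.fullmatch(pattern, word)
        if not m:
            return None
        # keep the E_i factors (the even group numbers hold the deleted letters)
        return "".join(m.group(i) for i in range(1, len(meta) + 1, 2))

    def accepts(word, metas, accept_language):
        """Iterate cycles until an accepting word is reached or no meta-instruction applies."""
        while True:
            if re.fullmatch(accept_language, word):
                return True
            for meta in metas:
                reduced = apply_meta(word, meta)
                if reduced is not None and reduced != word:
                    word = reduced
                    break
            else:
                return False

    # 2-sRL-style example for {a^n b^n}: delete one 'a' and one 'b' per cycle.
    metas = [["a*", "a", "", "b", "b*"]]
    print(accepts("aaabbb", metas, accept_language=""))   # True  (reduces to the empty word)
    print(accepts("aaabb", metas, accept_language=""))    # False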
Abstract:
1. Although the importance of plant community assemblages in structuring invertebrate assemblages is well known, the role that architectural complexity plays is less well understood. In particular, direct empirical data for a range of invertebrate taxa showing how functional groups respond to plant architecture are largely absent from the literature. 2. The significance of sward architectural complexity in determining the species richness of predatory and phytophagous functional groups of spiders, beetles, and true bugs, sampled from 135 field margin plots over 2 years, was tested. The present study compares the relative importance of sward architectural complexity with that of plant community assemblage. 3. Sward architectural complexity was found to be a determinant of species richness for all phytophagous and predatory functional groups. When individual species responses were investigated, 62.5% of the spider and beetle species, and 50.0% of the true bugs, responded to sward architectural complexity. 4. Interactions between sward architectural complexity and plant community assemblage indicate that the number of invertebrate species supported by the plant community alone could be increased by modification of sward architecture. Management practices could therefore play a key role in diversifying the architectural structure of existing floral assemblages for the benefit of invertebrate assemblages. 5. The contrasting effects of sward architecture on invertebrate functional groups characterised by either direct (phytophagous species) or indirect (predatory species) dependence on plant communities are discussed. It is suggested that for phytophagous taxa, plant community assemblage alone is likely to be insufficient to ensure successful species colonisation or persistence without appropriate development of sward architecture.
Abstract:
This article describes two studies. The first study was designed to investigate the ways in which the statutory assessments of reading for 11-year-old children in England assess inferential abilities. The second study was designed to investigate the levels of performance achieved in these tests in 2001 and 2002 by 11-year-old children attending state-funded local authority schools in one London borough. In the first study, content and questions used in the reading papers for the Standard Assessment Tasks (SATs) in the years 2001 and 2002 were analysed to see what types of inference were being assessed. This analysis suggested that the complexity involved in inference making and the variety of inference types that are made during the reading process are not adequately sampled in the SATs. Similar inadequacies are evident in the ways in which the programmes of study for literacy recommended by central government deal with inference. In the second study, scripts of completed SATs reading papers for 2001 and 2002 were analysed to investigate the levels of inferential ability evident in scripts of children achieving different SATs levels. The analysis in this article suggests that children who only just achieve the 'target' Level 4 do so with minimal use of inference skills. They are particularly weak in making inferences that require the application of background knowledge. Thus, many children who achieve the reading level (Level 4) expected of 11-year-olds are entering secondary education with insecure inference-making skills that have not been recognised.