235 results for analogy calculation
Abstract:
Background: Hydroxyurea (HU), an inhibitor of ribonucleotide reductase, may potentiate the activity of 5-fluorouracil (5-FU) and folinic acid (FA) by reducing the deoxyribonucleotide pool available for DNA synthesis and repair. However, as HU may inhibit the formation of 5-fluoro-2'-deoxyuridine-5'-monophosphate (FdUMP), one of the principal active metabolites of 5-FU, the scheduling of HU may be critical. In vitro experiments suggest that administration of HU following 5-FU, maintaining the concentration in the region of 1 mM for six or more hours, significantly enhances the efficacy of 5-FU. Patients and methods: 5-FU/FA was given as follows: on days 1 and 2, FA 250 mg/m² (max. 350 mg) over two hours, followed by 5-FU 400 mg/m² by intravenous bolus (ivb) over 15 minutes and subsequently 5-FU 400 mg/m² by intravenous infusion (ivi) over 22 hours. HU was administered on day 3 immediately after the 5-FU, with 3 g ivb over 15 minutes followed by 12 g ivi over 12 hours. Results: Thirty patients were entered into the study. Median survival was nine months (range 1-51+ months). There were eight partial responses (28%, 95% CI: 13%-47%). The median duration of response was 6.5 months (range 4-9). Grade 3-4 toxicities included neutropenia (grade 3 in eight patients and grade 4 in five), anaemia (grade 3 in one patient) and diarrhoea (grade 3 in two patients). Neutropenia was associated with pyrexia in two patients. Phlebitis at the infusion site occurred in five patients. The treatment was complicated by pulmonary embolism in one patient and deep venous thrombosis in another. Conclusion: HU administered in this schedule is well tolerated. Based on these results and those of other phase II studies, a randomised phase III study of 5-FU, FA and HU versus 5-FU and FA using the standard de Gramont schedule is recommended.
Abstract:
Stations on Bus Rapid Transit (BRT) lines ordinarily control line capacity because they act as bottlenecks. At stations with passing lanes, congestion may occur when buses maneuvering into and out of the platform stopping lane interfere with bus flow, or when a queue of buses forms upstream of the station blocking inflow. We contend that, as bus inflow to the station area approaches capacity, queuing will become excessive in a manner similar to operation of a minor movement on an unsignalized intersection. This analogy is used to treat BRT station operation and to analyze the relationship between station queuing and capacity. In the first of three stages, we conducted microscopic simulation modeling to study and analyze operating characteristics of the station under near steady state conditions through output variables of capacity, degree of saturation and queuing. A mathematical model was then developed to estimate the relationship between average queue and degree of saturation and calibrated for a specified range of controlled scenarios of mean and coefficient of variation of dwell time. Finally, simulation results were calibrated and validated.
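The queuing analogy above can be illustrated with a minimal sketch (not the authors' model): under a simple steady-state M/M/1 assumption, the average queue grows nonlinearly as the degree of saturation approaches 1, mirroring the "excessive queuing near capacity" behaviour the abstract describes for BRT stations.

```python
def average_queue(x: float) -> float:
    """Steady-state M/M/1 mean number in the system for degree of saturation x.

    A generic illustration of queue growth near saturation, not the
    calibrated relationship developed in the study."""
    if not 0.0 <= x < 1.0:
        raise ValueError("degree of saturation must lie in [0, 1)")
    return x / (1.0 - x)
```

At x = 0.5 the mean queue is 1 bus; at x = 0.95 it is about 19, showing why queuing becomes excessive as inflow approaches station capacity.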
Abstract:
The rapid development of the World Wide Web has created massive amounts of information, leading to the information overload problem. Under these circumstances, personalization techniques have been proposed to help users find content that meets their personalized interests or needs amid the massively increasing information. User profiling techniques play a core role in this research. Traditionally, most user profiling techniques create user representations in a static way. However, in real-world applications user interests may change over time. In this research we develop algorithms for mining user interests by integrating time-decay mechanisms into topic-based user interest profiling. Time-forgetting functions are integrated into the calculation of topic interest measurements at a fine-grained level. The experimental study shows that accounting for the temporal effects of user interests through time-forgetting mechanisms improves recommendation performance.
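The time-decay idea can be sketched with an exponential forgetting function (a hedged illustration, not the authors' algorithm; the half-life parameter and event format are assumptions):

```python
import math
import time

def decayed_interest(events, half_life_days=30.0, now=None):
    """Aggregate per-topic interest with exponential time decay.

    events: iterable of (topic, weight, timestamp_in_seconds).
    Older events contribute exponentially less; an event one half-life
    old counts half as much as one observed now."""
    now = time.time() if now is None else now
    lam = math.log(2.0) / (half_life_days * 86400.0)
    scores = {}
    for topic, weight, ts in events:
        scores[topic] = scores.get(topic, 0.0) + weight * math.exp(-lam * (now - ts))
    return scores
```

A topic last touched 30 days ago thus carries half the weight of one touched today, so the profile drifts toward current interests rather than remaining static.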
Abstract:
This paper describes a new method of indexing and searching large binary signature collections to efficiently find similar signatures, addressing the scalability problem in signature search. Signatures offer efficient computation with acceptable measure of similarity in numerous applications. However, performing a complete search with a given search argument (a signature) requires a Hamming distance calculation against every signature in the collection. This quickly becomes excessive when dealing with large collections, presenting issues of scalability that limit their applicability. Our method efficiently finds similar signatures in very large collections, trading memory use and precision for greatly improved search speed. Experimental results demonstrate that our approach is capable of finding a set of nearest signatures to a given search argument with a high degree of speed and fidelity.
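A minimal sketch of the operations involved (assumed for illustration, not the paper's method): Hamming distance via XOR and popcount, plus a toy prefix-bucket index that searches only one bucket per query, trading recall for speed in the spirit of the memory/precision trade-off described.

```python
from collections import defaultdict

def hamming(a: int, b: int) -> int:
    """Bitwise Hamming distance between two integer signatures."""
    return bin(a ^ b).count("1")

class PrefixIndex:
    """Bucket 64-bit signatures by their top 16 bits; candidate
    signatures are drawn only from the query's bucket, so the scan
    is far smaller than the whole collection but may miss near
    neighbours whose prefixes differ."""

    def __init__(self):
        self.buckets = defaultdict(list)

    def add(self, sig: int) -> None:
        self.buckets[sig >> 48].append(sig)

    def nearest(self, query: int):
        candidates = self.buckets.get(query >> 48, [])
        return min(candidates, key=lambda s: hamming(query, s), default=None)
```

A full search would compute `hamming` against every stored signature; the bucket restriction is what turns the linear scan into a much cheaper lookup.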
Abstract:
There are currently more than 400 cities operating bike share programs. Purported benefits of bike share programs include flexible mobility, physical activity, and reduced congestion, emissions and fuel use. Implicit or explicit in the calculation of program benefits are assumptions regarding the modes of travel replaced by bike share journeys. This paper examines the degree to which car trips are replaced by bike share, through an examination of survey and trip data from bike share programs in Melbourne, Brisbane, Washington, D.C., London, and Minneapolis/St. Paul. A secondary and unique component of this analysis examines the motor vehicle support services required for bike share fleet rebalancing and maintenance. These two components are then combined to estimate bike share’s overall contribution to changes in vehicle kilometres travelled. The results indicate that the estimated mean reduction in car use due to bike share is at least twice the distance covered by operator support vehicles, with the exception of London, in which the relationship is reversed, largely due to a low car mode substitution rate. As bike share programs mature, evaluation of their effectiveness in reducing car use may become increasingly important. This paper reveals that by increasing the convenience of bike share relative to car use and by improving perceptions of safety, the capacity of bike share programs to reduce vehicle trips and yield overall net benefits will be enhanced. Researchers can adapt the analytical approach proposed in this paper to assist in the evaluation of current and future bike share programs.
Abstract:
Hot spot identification (HSID) aims to identify potential sites—roadway segments, intersections, crosswalks, interchanges, ramps, etc.—with disproportionately high crash risk relative to similar sites. An inefficient HSID methodology might result in either identifying a safe site as high risk (false positive) or a high-risk site as safe (false negative), and consequently lead to misuse of the available public funds, poor investment decisions, and inefficient risk management practice. Current HSID methods suffer from issues such as underreporting of minor-injury and property damage only (PDO) crashes, the challenge of incorporating crash severity into the methodology, and the selection of a proper safety performance function to model crash data that is often heavily skewed by a preponderance of zeros. Addressing these challenges, this paper proposes a combination of a PDO equivalency calculation and a quantile regression technique to identify hot spots in a transportation network. In particular, issues related to underreporting and crash severity are tackled by incorporating equivalent PDO crashes, whilst the concerns related to the non-count nature of equivalent PDO crashes and the skewness of crash data are addressed by the non-parametric quantile regression technique. The proposed method identifies covariate effects on various quantiles of a population, rather than the population mean like most methods in practice, which corresponds more closely with how black spots are identified in practice. The proposed methodology is illustrated using rural road segment data from Korea and compared against the traditional EB method with negative binomial regression.
Application of a quantile regression model on equivalent PDO crashes enables identification of a set of high-risk sites that reflect the true safety costs to the society, simultaneously reduces the influence of under-reported PDO and minor injury crashes, and overcomes the limitation of traditional NB model in dealing with preponderance of zeros problem or right skewed dataset.
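The equivalent-PDO calculation can be sketched as a severity-weighted sum of crash counts. The weights below are purely illustrative placeholders, not values from the study; real EPDO weights are derived from jurisdiction-specific crash cost data.

```python
# Illustrative severity-to-PDO weights only -- NOT the study's values.
SEVERITY_WEIGHTS = {"fatal": 540.0, "injury": 12.0, "pdo": 1.0}

def equivalent_pdo(counts: dict) -> float:
    """Collapse severity-stratified crash counts into a single EPDO score,
    so that severe crashes dominate the hot-spot ranking while PDO
    crashes still contribute."""
    return sum(SEVERITY_WEIGHTS[severity] * n for severity, n in counts.items())
```

The resulting continuous (non-count) score is exactly why the paper pairs EPDO with quantile regression rather than a count model such as the negative binomial.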
Abstract:
Executive Summary Emergency Departments (EDs) locally, nationally and internationally are becoming increasingly busy. Within this context, it can be challenging to deliver a health service that is safe, of high quality and cost-effective. Whilst various models are described within the literature that aim to measure ED ‘work’ or ‘activity’, they are often not linked to a measure of the costs of providing such activity. It is important for hospital and ED managers to understand and apply this link so that optimal staffing and financial resourcing can be justifiably sought. This research is timely given that Australia has moved towards a national Activity Based Funding (ABF) model for ED activity. ABF is believed to increase transparency of care and fairness (i.e. equal work receives equal pay). ABF involves a person-, performance- or activity-based payment system, and thus a move away from historical “block payment” models that do not incentivise efficiency and quality. The aim of the Statewide Workforce and Activity-Based Funding Modelling Project in Queensland Emergency Departments (SWAMPED) is to identify and describe best-practice Emergency Department (ED) workforce models within the current context of ED funding under an ABF model. The study comprises five distinct phases. This monograph (Phase 1) comprises a systematic review of the literature that was completed in June 2013. The remaining phases include a detailed survey of Queensland hospital EDs’ resource levels, activity and operational models of care; development of new resource models; development of a user-friendly modelling interface for ED managers; and production of a final report that identifies policy implications. The anticipated deliverable outcome of this research is the development of an ABF-based Emergency Workforce Modelling Tool that will enable ED managers to profile both their workforce and operational models of care.
Additionally, the tool will assist with the ability to more accurately inform adequate staffing numbers required in the future, inform planning of expected expenditures, and be used for standardisation and benchmarking across similar EDs. Summary of the Findings Within the remit of this review of the literature, the main findings include: 1. EDs are becoming busier and more congested Rising demand, barriers to ED throughput and transitions of care all contribute to ED congestion. In addition, requests by organisational managers and the community continue to broaden the scope of services required of the ED, further increasing demand. As the population lives longer with more lifestyle diseases, its propensity to require ED care continues to grow. 2. Various models of care within EDs exist Models often vary to account for site-specific characteristics such as staffing profile, ED geographical location (e.g. metropolitan or rural site), and patient demographic profile (e.g. paediatrics, older persons, ethnicity). Existing and new models implemented within EDs often depend on the target outcome requiring change. Generally this is focussed on addressing issues at the input, throughput or output areas of the ED. Even with models targeting a similar demographic or illness, the structure and process elements underpinning the model can vary, which can affect outcomes and introduce variance in the patient and carer experience between and within EDs. Major models of care to manage throughput inefficiencies include: A. Workforce Models of Care focus on the appropriate level of staffing for a given workload to provide prompt, timely and clinically effective patient care within an emergency care setting.
The studies reviewed suggest that the early involvement of a senior medical decision-maker and/or specialised nursing roles such as Emergency Nurse Practitioners and Clinical Initiatives Nurses, or primary contact or extended-scope Allied Health Practitioners, can facilitate patient flow and improve key indicators such as length of stay and the number of patients who did not wait to be seen, amongst others. B. Operational Models of Care within EDs focus on mechanisms for streaming (e.g. fast-tracking) or otherwise grouping patient care based on acuity and complexity to assist with minimising any throughput inefficiencies. While studies support the positive impact of these models in general, it appears that they are most effective when they are adequately resourced. 3. Various methods of measuring ED activity exist Measuring ED activity requires careful consideration of models of care and staffing profile. Measuring activity requires the ability to account for factors including: patient census, acuity, length of stay, intensity of intervention, and department skill-mix, plus an adjustment for non-patient-care time. 4. Gaps in the literature Continued ED growth calls for new and innovative care delivery models that are safe, clinically effective and cost-effective. New roles and stand-alone service delivery models are often evaluated in isolation without considering the global and economic impact on staffing profiles. Whilst various models of accounting for and measuring health care activity exist, costing studies and cost-effectiveness studies are lacking for EDs, making accurate and reliable assessments of care models difficult. There is a need to further understand, refine and account for measures of ED complexity that define a workload upon which resources and appropriate staffing determinations can be made into the future. There is also a need for continued monitoring and comprehensive evaluation of newly implemented workforce modelling tools.
This research acknowledges those gaps and aims to: • Undertake a comprehensive and integrated whole of department workforce profiling exercise relative to resources in the context of ABF. • Inform workforce requirements based on traditional quantitative markers (e.g. volume and acuity) combined with qualitative elements of ED models of care; • Develop a comprehensive and validated workforce calculation tool that can be used to better inform or at least guide workforce requirements in a more transparent manner.
Abstract:
Much of what we currently understand about the structure and energetics of multiply charged anions in the gas phase is derived from the measurement of photoelectron spectra of simple dicarboxylate dianions. Here we have employed a modified linear ion-trap mass spectrometer to undertake complementary investigations of the ionic products resulting from laser-initiated electron photodetachment of two model dianions. Electron photodetachment (ePD) of the [M-2H]2- dianions formed from glutaric and adipic acid was found to result in a significant loss of ion signal overall, which is consistent with photoelectron studies that report the emission of slow secondary electrons (Xing et al., 2010). The ePD mass spectra reveal no signals corresponding to the intact [M-2H]•- radical anions; rather, [M-2H-CO2]•- ions are identified as the only abundant ionic products, indicating that spontaneous decarboxylation follows ejection of the first electron. Interestingly, however, investigations of the structure and energetics of the [M-2H-CO2]•- photoproducts by ion-molecule reaction and electronic structure calculation indicate that (i) these ions are stable with respect to secondary electron detachment and (ii) most of the ion population retains a distonic radical anion structure where the radical remains localised at the position of the departed carboxylate moiety. These observations lead to the conclusion that the mechanism for loss of ion signal involves unimolecular rearrangement reactions of the nascent [M-2H]•- carbonyloxyl radical anions that compete favourably with direct decarboxylation. Several possible rearrangement pathways that facilitate electron detachment from the radical anion are identified and are computed to be energetically accessible. Such pathways provide an explanation for prior observations of slow secondary electron features in the photoelectron spectra of the same dicarboxylate dianions.
Abstract:
This study used automated data processing techniques to calculate a set of novel treatment plan accuracy metrics, and investigate their usefulness as predictors of quality assurance (QA) success and failure. A total of 151 beams from 23 prostate and cranial IMRT treatment plans were used in this study. These plans had been evaluated before treatment using measurements with a diode array system. The TADA software suite was adapted to allow automatic batch calculation of several proposed plan accuracy metrics, including mean field area, small-aperture, off-axis and closed-leaf factors. All of these results were compared with the gamma pass rates from the QA measurements and correlations were investigated. The mean field area factor provided a threshold field size (5 cm², equivalent to a 2.2 × 2.2 cm² square field), below which all beams failed the QA tests. The small-aperture score provided a useful predictor of plan failure when averaged over all beams, despite being weakly correlated with gamma pass rates for individual beams. By contrast, the closed-leaf and off-axis factors provided information about the geometric arrangement of the beam segments but were not useful for distinguishing between plans that passed and failed QA. This study has provided some simple tests for plan accuracy, which may help minimise time spent on QA assessments of treatments that are unlikely to pass.
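A hedged sketch of how such a threshold metric might be applied in practice (the function names and segment-area input are hypothetical; only the ~5 cm² threshold comes from the abstract):

```python
def mean_field_area(segment_areas_cm2):
    """Mean field area over a beam's segments, in cm^2."""
    return sum(segment_areas_cm2) / len(segment_areas_cm2)

def predicted_qa_failure(segment_areas_cm2, threshold_cm2=5.0):
    # In the study, every beam with mean field area below ~5 cm^2
    # (about a 2.2 x 2.2 cm^2 square field) failed the QA tests.
    return mean_field_area(segment_areas_cm2) < threshold_cm2
```

Screening beams this way before measurement is the time-saving use the abstract suggests: plans flagged here are unlikely to pass and can be revised before QA effort is spent.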
Abstract:
We describe a short signature scheme that is strongly existentially unforgeable under an adaptive chosen message attack in the standard security model. Our construction works in groups equipped with an efficient bilinear map, or, more generally, an algorithm for the Decision Diffie-Hellman problem. The security of our scheme depends on a new intractability assumption we call Strong Diffie-Hellman (SDH), by analogy to the Strong RSA assumption with which it shares many properties. Signature generation in our system is fast and the resulting signatures are as short as DSA signatures for comparable security. We give a tight reduction proving that our scheme is secure in any group in which the SDH assumption holds, without relying on the random oracle model.
Abstract:
In this paper we introduce a formalization of Logical Imaging applied to IR in terms of Quantum Theory, through the use of an analogy between the states of a quantum system and the terms in text documents. Our formalization relies upon the Schrödinger Picture, creating an analogy between the dynamics of a physical system and the kinematics of the probabilities generated by Logical Imaging. By using Quantum Theory, it is possible to model contextual information more precisely, in a seamless and principled fashion, within the Logical Imaging process. While further work is needed to empirically validate this, the foundations for doing so are provided.
Abstract:
In this thesis we investigate the use of quantum probability theory for ranking documents. Quantum probability theory is used to estimate the probability of relevance of a document given a user's query. We posit that quantum probability theory can lead to a better estimation of the probability of a document being relevant to a user's query than the common approach, i.e. the Probability Ranking Principle (PRP), which is based upon Kolmogorovian probability theory. Following our hypothesis, we formulate an analogy between the document retrieval scenario and a physical scenario, that of the double slit experiment. Through this analogy, we propose a novel ranking approach, the quantum probability ranking principle (qPRP). Key to our proposal is the presence of quantum interference. Mathematically, this is the statistical deviation between empirical observations and the expected values predicted by the Kolmogorovian rule of additivity of probabilities of disjoint events, in configurations such as that of the double slit experiment. We propose an interpretation of quantum interference in the document ranking scenario, and examine how quantum interference can be effectively estimated for document retrieval. To validate our proposal and to gain more insight into approaches for document ranking, we (1) analyse PRP, qPRP and other ranking approaches, exposing the assumptions underlying their ranking criteria and formulating the conditions for the optimality of the two ranking principles; (2) empirically compare three ranking principles (i.e. PRP, interactive PRP, and qPRP) and two state-of-the-art ranking strategies in two retrieval scenarios, ad-hoc retrieval and diversity retrieval; (3) analytically contrast the ranking criteria of the examined approaches, exposing similarities and differences; and (4) study the ranking behaviours of approaches alternative to PRP in terms of the kinematics they impose on relevant documents, i.e. by considering the extent and direction of the movements of relevant documents across the ranking recorded when comparing PRP against its alternatives. Our findings show that the effectiveness of the examined ranking approaches strongly depends upon the evaluation context. In the traditional evaluation context of ad-hoc retrieval, PRP is empirically shown to be better than or comparable to alternative ranking approaches. However, when we turn to evaluation contexts that account for interdependent document relevance (i.e. when the relevance of a document is assessed also with respect to other retrieved documents, as in the diversity retrieval scenario), the use of quantum probability theory, and thus of qPRP, is shown to improve retrieval and ranking effectiveness over the traditional PRP and alternative ranking strategies such as Maximal Marginal Relevance, Portfolio theory, and Interactive PRP. This work represents a significant step forward regarding the use of quantum theory in information retrieval. It demonstrates that the application of quantum theory to problems within information retrieval can lead to improvements both in modelling power and retrieval effectiveness, allowing the construction of models that capture the complexity of information retrieval situations. Furthermore, the thesis opens up a number of lines for future research. These include: (1) investigating estimations and approximations of quantum interference in qPRP; (2) exploiting complex numbers for the representation of documents and queries; and (3) applying the concepts underlying qPRP to tasks other than document ranking.
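The quantum interference term central to qPRP can be illustrated with complex amplitudes (a generic sketch of the textbook double-slit quantity, not the thesis's estimator): it is the deviation of |φA + φB|² from the Kolmogorovian sum |φA|² + |φB|².

```python
def interference(phi_a: complex, phi_b: complex) -> float:
    """Deviation of |phi_a + phi_b|^2 from the additive (Kolmogorovian)
    sum |phi_a|^2 + |phi_b|^2.

    Expanding the square shows this equals 2|phi_a||phi_b|cos(dtheta),
    where dtheta is the phase difference between the two amplitudes:
    zero when the amplitudes are orthogonal in phase, maximal when
    they are in phase."""
    return abs(phi_a + phi_b) ** 2 - (abs(phi_a) ** 2 + abs(phi_b) ** 2)
```

In the ranking analogy, a nonzero interference term is what lets the relevance contribution of one document depend on the others retrieved alongside it, which classical additive probability cannot express.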
Abstract:
Boolean functions and their Möbius transforms are involved in logical calculation, digital communications, coding theory and modern cryptography. So far, little is known about the relations between Boolean functions and their Möbius transforms. This work is composed of three parts. In the first part, we present relations between a Boolean function and its Möbius transform that allow conversion of the truth table/algebraic normal form (ANF) to the ANF/truth table of a function under different conditions. In the second part, we focus on the special case in which a Boolean function is identical to its Möbius transform. We call such functions coincident. In the third part, we generalize the concept of coincident functions and show that any Boolean function has the coincidence property even if it is not coincident.
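The truth-table/ANF conversion mentioned above is the standard binary Möbius (butterfly) transform, sketched here for illustration; a function is coincident exactly when the transform returns its own truth table.

```python
def mobius_transform(truth_table):
    """Binary Moebius (butterfly) transform over GF(2).

    Maps a truth table of length 2^n to its ANF coefficient vector,
    and back again: over GF(2) the transform is its own inverse."""
    a = list(truth_table)
    step = 1
    while step < len(a):
        for block in range(0, len(a), 2 * step):
            for j in range(block, block + step):
                a[j + step] ^= a[j]  # XOR-accumulate the lower half into the upper
        step *= 2
    return a
```

For example, the two-variable XOR function has truth table [0, 1, 1, 0] and ANF coefficients [0, 1, 1, 0] as well, so it is coincident in the sense defined above.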
Abstract:
The ability to understand and predict how thermal, hydrological, mechanical and chemical (THMC) processes interact is fundamental to many research initiatives and industrial applications. We (1) present a new Thermal–Hydrological–Mechanical–Chemical (THMC) coupling formulation, based on non-equilibrium thermodynamics; (2) show how THMC feedback is incorporated in the thermodynamic approach; (3) suggest a unifying thermodynamic framework for multi-scaling; and (4) formulate a new rationale for assessing upper and lower bounds of dissipation for THMC processes. The technique is based on deducing time and length scales suitable for separating processes using a macroscopic finite-time thermodynamic approach. We show that if the time and length scales are suitably chosen, the calculation of entropic bounds can be used to describe three different types of material and process uncertainty: geometric uncertainties, stemming from the microstructure; process uncertainty, stemming from the correct derivation of the constitutive behavior; and uncertainties in time evolution, stemming from the path dependence of the time integration of the irreversible entropy production. Although the approach is specifically formulated here for THMC coupling, we suggest that it has a much broader applicability. In a general sense it consists of finding the entropic bounds of the dissipation, defined by the product of thermodynamic force and thermodynamic flux, which in the material sciences correspond to generalized stress and generalized strain rate, respectively.
Abstract:
In order to assist with the development of more selective and sensitive methods for thyroid hormone analysis, the [M-H]- anions of the iodothyronines T4, T3, rT3 and (3,5)-T2 and of the non-iodinated thyronine (T0) have been generated by negative ion electrospray mass spectrometry. Tandem mass spectra of these ions were recorded on a triple-quadrupole mass spectrometer and show a strong analogy with the fragmentation pathways of the parent compound, tyrosine. All iodothyronines also show significant abundances of the iodide anion in their tandem mass spectra, which represents an attractive target for multiple reaction monitoring (MRM) analysis, given that iodothyronines are the only iodine-bearing endogenous molecules. Characteristic fragments are observed at m/z 359.7 and 604.5 for rT3 but are absent from the spectrum of T3, thus differentiating the two positional isomers. The striking difference in the fragmentation patterns of these regioisomeric species is attributed to the increased acidity of the phenol moiety in rT3 compared with T3.