894 results for Balanced occlusion
Abstract:
We report on charge transport and the density of trap states (trap DOS) in ambipolar diketopyrrolopyrrole-benzothiadiazole copolymer thin-film transistors. This semiconductor exhibits high electron and hole field-effect mobilities of up to 0.6 cm²/V·s. Temperature- and gate-bias-dependent field-effect mobility measurements are employed to extract the activation energies and trap DOS and to understand its uniquely high-mobility, balanced ambipolar charge transport. The symmetry between the electron and hole transport characteristics, parameters and activation energies is remarkable. We believe that our work is the first charge transport study of an ambipolar organic/polymer-based field-effect transistor with room-temperature mobility higher than 0.1 cm²/V·s for both electrons and holes.
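Assuming thermally activated (Arrhenius-type) transport, which is a standard way of extracting activation energies from temperature-dependent mobility, the sketch below shows the fitting procedure. The abstract gives neither the model nor raw data, so the temperature and mobility values here are illustrative stand-ins only.

```python
import numpy as np

# Hypothetical temperature-dependent mobility data (stand-ins, not measured values).
T = np.array([200.0, 220.0, 240.0, 260.0, 280.0, 300.0])   # temperature, K
mu = np.array([0.08, 0.13, 0.20, 0.30, 0.43, 0.60])         # mobility, cm^2/V·s

kB = 8.617e-5   # Boltzmann constant, eV/K

# Arrhenius model: mu(T) = mu0 * exp(-Ea / (kB * T)),
# so ln(mu) is linear in 1/T with slope -Ea/kB.
slope, intercept = np.polyfit(1.0 / T, np.log(mu), 1)
Ea = -slope * kB          # activation energy in eV
mu0 = np.exp(intercept)   # mobility prefactor

print(f"activation energy Ea ~ {Ea * 1000:.0f} meV")
print(f"prefactor mu0 ~ {mu0:.2f} cm^2/V·s")
```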
Abstract:
Directors of nonprofits in most countries have legal responsibility for monitoring organisational performance (Brody, 2010), although there is typically little guidance on how this should occur. The balanced scorecard (BSC) (Kaplan & Norton, 1996, 2001) potentially provides boards with a monitoring tool (Kaplan & Norton, 2006; Lorsch, 2002). The BSC is intended to help integrate performance measurement, performance management and strategy implementation (Kaplan, 2009). The scorecard is balanced in that it should incorporate financial and non-financial measures, external and internal perspectives, short- and long-term objectives, and both lagging and leading indicators. It is a relatively simple tool, but with potentially profound implications for directing board attention and subsequent action (Ocasio, 1997; Salterio, 2012).
Abstract:
This thesis considers whether the Australian Privacy Commissioner's use of its powers supports compliance with the requirement to 'take reasonable steps' to protect personal information in National Privacy Principle 4 of the Privacy Act 1988 (Cth). Two unique lenses were used: first, the Commissioner's use of powers was assessed against the principles of transparency, balance and vigorousness, and secondly against alignment with an industry practice approach to securing information. Following a comprehensive review of publicly available materials, interviews and investigation file records, this thesis found that the Commissioner's use of its powers has not been transparent, balanced or vigorous, nor has it been supportive of an industry practice approach to securing data. Accordingly, it concludes that the Privacy Commissioner's use of its regulatory powers is unlikely to result in any significant improvement to the security of personal information held by organisations in Australia.
Abstract:
Background: Guidelines and clinical practice for the prevention of complications associated with central venous catheters (CVC) vary greatly around the world. Most institutions recommend the use of heparin to prevent occlusion; however, there is debate regarding the need for heparin, and evidence to suggest 0.9% sodium chloride (normal saline) may be as effective. The use of heparin is not without risk, may be unnecessary and is also associated with increased cost.

Objectives: To assess the clinical effects (benefits and harms) of intermittent flushing of heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants and children.

Search Methods: The Cochrane Vascular Trials Search Co-ordinator searched the Specialised Register (last searched April 2015) and the Cochrane Register of Studies (Issue 3, 2015). We also searched the reference lists of retrieved trials.

Selection Criteria: Randomised controlled trials that compared the efficacy of normal saline with heparin to prevent occlusion of long-term CVCs in infants and children aged up to 18 years were included. We excluded temporary CVCs and peripherally inserted central catheters (PICC).

Data Collection and Analysis: Two review authors independently assessed trial inclusion criteria and trial quality, and extracted data. Rate ratios were calculated for two outcome measures: occlusion of the CVC and central line-associated blood stream infection. Other outcome measures included duration of catheter placement, inability to withdraw blood from the catheter, use of urokinase or recombinant tissue plasminogen activator, incidence of removal or re-insertion of the catheter, or both, and other CVC-related complications such as dislocation of CVCs, other CVC site infections and thrombosis.

Main Results: Three trials with a total of 245 participants were included in this review. The three trials directly compared the use of normal saline and heparin; however, the studies all used different protocols for the standard and experimental arms, with different concentrations of heparin and different frequencies of flushes reported. In addition, not all studies reported on all outcomes. The quality of the evidence ranged from low to very low because there was no blinding, heterogeneity and inconsistency between studies were high, and the confidence intervals were wide. CVC occlusion was assessed in all three trials (243 participants). We were able to pool the results of two trials for the outcomes of CVC occlusion and CVC-associated blood stream infection. The estimated rate ratio for CVC occlusion per 1000 catheter days between the normal saline and heparin groups was 0.75 (95% CI 0.10 to 5.51; two studies, 229 participants; very low quality evidence). The estimated rate ratio for CVC-associated blood stream infection was 1.48 (95% CI 0.24 to 9.37; two studies, 231 participants; low quality evidence). The duration of catheter placement was reported to be similar between the two study arms in one study (203 participants).

Authors' Conclusions: The review found that there was not enough evidence to determine the effects of intermittent flushing of heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants and children. Ultimately, if this evidence were available, the development of evidence-based clinical practice guidelines and consistency of practice would be facilitated.
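The pooled effect measures above are incidence rate ratios per catheter-day. As a minimal illustration of how such a ratio and its confidence interval are computed, the sketch below uses the standard log-scale Wald interval; the event counts and catheter-days are hypothetical stand-ins, not data from the included trials.

```python
import math

def rate_ratio(events_a, days_a, events_b, days_b, z=1.96):
    """Incidence rate ratio with a Wald-type 95% CI on the log scale."""
    rr = (events_a / days_a) / (events_b / days_b)
    se_log = math.sqrt(1.0 / events_a + 1.0 / events_b)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# Hypothetical counts: 6 occlusions over 4000 catheter-days with saline
# versus 8 over 4000 catheter-days with heparin.
rr, lo, hi = rate_ratio(6, 4000, 8, 4000)
print(f"rate ratio {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```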
Abstract:
Conventional treatment of distal intestinal obstruction syndrome (DIOS) with high doses of pancreatic enzymes, mucolytic agents, and enemas is neither predictably effective nor rapid in action. In 6 cystic fibrosis patients with DIOS a balanced, non-absorbable intestinal lavage solution produced clinical and radiological improvement and striking improvement in DIOS scores. It is suggested that a balanced intestinal lavage solution should be considered as an alternative treatment for DIOS in patients with cystic fibrosis.
Abstract:
This study investigated questions related to half-occlusion processing in human stereoscopic vision: (1) How does the depth location of a half-occluding figure affect the depth localization of adjacent monocular objects? (2) Is three-dimensional slant around a vertical axis (the geometric effect) affected by half-occlusion constraints? (3) How are half-occlusion constraints and surface formation processes manifested in stereoscopic capture? Our results showed that the depth localization of binocular objects affects the depth localization of discrete monocular objects. We also showed that the visual system has a preference for a frontoparallel surface interpretation if the half-occlusion configuration allows multiple interpretation alternatives. When surface formation was constrained by textures, our results showed that a process of rematching spreading determines the resulting perception and that the spreading can be limited by illusory contours that support the presence of binocularly unmatched figures. The unmatched figures could be present if the inducing figures producing the illusory surface contained binocular image differences that provided cues for quantitative da Vinci stereopsis. These findings provide evidence of the significant role of half-occlusions in stereoscopic processing.
Abstract:
A 2 × 2 factorial combination of thinned or unthinned, and pruned or unpruned, 11-year-old Eucalyptus dunnii (DWG) and 12-year-old Corymbia citriodora subsp. variegata (CCV) was destructively sampled to provide 60 trees in total per species. Two 1.4 m long billets were cut from each tree and rotary veneered in a spindleless lathe down to a 45 mm diameter core to expose knots, which were classified as alive, partially occluded or fully occluded. Non-destructive evaluation of a wider range of thinning treatments available in these trials was undertaken with Pilodyn and Fakopp tools. Disc samples were also taken for basic density and modulus of elasticity. Differences between treatments for all wood property assessments were generally small and not significantly different. Thinning and pruning had little effect on the stem diameter growth required to achieve occlusion; therefore, occlusion would be more rapid after thinning due to more rapid stem diameter growth. The differences between the treatments of greatest management interest, thinned and pruned (T&P) and unthinned and unpruned (UT&UP), were small. The higher value clear wood produced after all knots had occluded, measured as the average stem diameter growth over occlusion of the three outermost knots, corresponded to approximately 2 cm of diameter. Two of the treatments can be ruled out as viable management alternatives: (i) the effect of thinning without pruning (T&UP) is clear, leading to a large inner core of stem wood containing knots (a large knotty core diameter), and (ii) pruning without thinning (UT&P) results in a small knotty core diameter; however, the tree and therefore log diameters are also small.
Abstract:
This research is a step forward in discovering knowledge from databases with complex structure, such as trees or graphs. Several data mining algorithms are developed based on a novel representation called Balanced Optimal Search for extracting implicit, unknown and potentially useful information, such as patterns, similarities and various relationships, from tree data; these algorithms also prove advantageous in analysing big data. This thesis focuses on analysing unordered tree data, which is robust to data inconsistency, irregularity and swift information changes and has therefore become a popular and widely used data model in the era of big data.
Abstract:
Background: Around the world, guidelines and clinical practice for the prevention of complications associated with central venous catheters (CVC) vary greatly. To prevent occlusion, most institutions recommend the use of heparin when the CVC is not in use. However, there is debate regarding the need for heparin, and evidence to suggest normal saline may be as effective. The use of heparin is not without risk, may be unnecessary and is also associated with increased costs.

Objectives: To assess the clinical effects (benefits and harms) of heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants, children and adolescents.

Design: A Cochrane systematic review of randomised controlled trials was undertaken.

Data Sources: The Cochrane Vascular Group Specialised Register (including MEDLINE, CINAHL, EMBASE and AMED) and the Cochrane Register of Studies were searched. Hand searching of relevant journals and reference lists of retrieved articles was also undertaken.

Review Methods: Data were extracted and appraisal undertaken. We included studies that compared the efficacy of normal saline with heparin to prevent occlusion. We excluded temporary CVCs and peripherally inserted central catheters. Rate ratios per 1000 catheter days were calculated for two outcomes: occlusion of the CVC and CVC-associated blood stream infection.

Results: Three trials with a total of 245 participants were included in this review. The three trials directly compared the use of normal saline and heparin; however, the studies all used different protocols, with various concentrations of heparin and frequencies of flushes. The quality of the evidence ranged from low to very low. The estimated rate ratio for CVC occlusion per 1000 catheter days between the normal saline and heparin groups was 0.75 (95% CI 0.10 to 5.51; two studies, 229 participants; very low quality evidence). The estimated rate ratio for CVC-associated blood stream infection was 1.48 (95% CI 0.24 to 9.37; two studies, 231 participants; low quality evidence).

Conclusions: It remains unclear whether heparin is necessary for CVC maintenance. More well-designed studies are required to answer this relatively simple but clinically important question. Ultimately, if this evidence were available, the development of evidence-based clinical practice guidelines and consistency of practice would be facilitated.
Abstract:
In receive antenna selection (AS), only signals from a subset of the antennas are processed at any time by the limited number of radio frequency (RF) chains available at the receiver. Hence, the transmitter needs to send pilots multiple times to enable the receiver to estimate the channel state of all the antennas and select the best subset. Conventionally, the sensitivity of coherent reception to channel estimation errors has been tackled by boosting the energy allocated to all pilots to ensure accurate channel estimates for all antennas. Energy for pilots received by unselected antennas is mostly wasted, especially since the selection process is robust to estimation errors. In this paper, we propose a novel training method uniquely tailored for AS that transmits one extra pilot symbol to generate accurate channel estimates for the antenna subset that actually receives data. Consequently, the transmitter can selectively boost the energy allocated to the extra pilot. We derive closed-form expressions for the proposed scheme's symbol error probability for MPSK and MQAM, and optimize the energy allocated to pilot and data symbols. Through an insightful asymptotic analysis, we show that the optimal solution achieves full diversity and outperforms the conventional method.
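To make the training structure concrete, below is a minimal Monte Carlo sketch of single-antenna selection with a boosted extra pilot for the selected antenna. The antenna count, pilot energies, BPSK signalling and noise model are illustrative assumptions, not the paper's exact system model or its optimized energy allocation.

```python
import numpy as np

rng = np.random.default_rng(0)

n_ant = 4          # receive antennas
n_sym = 200        # data symbols per block
n_blocks = 2000    # Monte Carlo blocks
snr_db = 10.0
es = 1.0                          # data symbol energy
n0 = es / 10 ** (snr_db / 10.0)   # noise power

ep_first = 0.5 * es   # assumed low-energy first-stage pilots (one per antenna)
ep_extra = 2.0 * es   # assumed boosted extra pilot for the selected antenna

errors_first, errors_extra = 0, 0

for _ in range(n_blocks):
    h = (rng.standard_normal(n_ant) + 1j * rng.standard_normal(n_ant)) / np.sqrt(2)

    # First-stage estimates from per-antenna pilots; used mainly for selection.
    noise1 = (rng.standard_normal(n_ant) + 1j * rng.standard_normal(n_ant)) * np.sqrt(n0 / 2)
    h_hat1 = h + noise1 / np.sqrt(ep_first)
    k = np.argmax(np.abs(h_hat1))          # select the apparently best antenna

    # Extra pilot on the selected antenna refines its estimate.
    noise2 = (rng.standard_normal() + 1j * rng.standard_normal()) * np.sqrt(n0 / 2)
    h_hat2 = h[k] + noise2 / np.sqrt(ep_extra)

    bits = rng.integers(0, 2, n_sym)
    x = 2 * bits - 1                       # BPSK symbols
    w = (rng.standard_normal(n_sym) + 1j * rng.standard_normal(n_sym)) * np.sqrt(n0 / 2)
    y = h[k] * x + w

    # Coherent detection with the first-stage estimate vs the refined estimate.
    errors_first += np.sum((np.real(np.conj(h_hat1[k]) * y) < 0) != (x < 0))
    errors_extra += np.sum((np.real(np.conj(h_hat2) * y) < 0) != (x < 0))

print("SER, first-stage estimate:", errors_first / (n_sym * n_blocks))
print("SER, extra-pilot estimate:", errors_extra / (n_sym * n_blocks))
```

The point the sketch illustrates is that selection tolerates noisy first-stage estimates, whereas coherent detection benefits from the single boosted pilot sent only to the selected antenna.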
Abstract:
Computer vision has seen a resurgence of parts-based representations for objects over the past few years. The parts are usually annotated beforehand for training. We present an annotation-free parts-based representation for pedestrians using Non-Negative Matrix Factorization (NMF). We show that NMF is able to capture the wide range of pose and clothing of pedestrians. We use a modified form of NMF, i.e. NMF with sparsity constraints on the factored matrices. We also use a Riemannian distance metric for similarity measurements in NMF space, as the basis vectors generated by NMF are not orthogonal. We show that, for a 1% drop in accuracy compared with the Histogram of Oriented Gradients (HOG) representation, we achieve robustness to partial occlusion.
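As a rough illustration of the factorization step only (not the authors' pipeline), the sketch below fits a sparse NMF to stand-in patch data with scikit-learn and compares coefficient vectors. The data, component count and regularization values are assumptions, plain cosine similarity stands in for the Riemannian metric mentioned in the abstract, and the parameter names (alpha_W, alpha_H, l1_ratio) follow recent scikit-learn releases.

```python
import numpy as np
from sklearn.decomposition import NMF

# Hypothetical data: each row is a flattened, non-negative pedestrian image patch.
rng = np.random.default_rng(0)
X = rng.random((500, 64 * 128))           # 500 stand-in patches of size 64x128

# Sparse NMF: X ~ W H, with L1 penalties encouraging part-like basis vectors.
model = NMF(n_components=30, init="nndsvda", l1_ratio=0.8,
            alpha_W=0.01, alpha_H=0.01, max_iter=400, random_state=0)
W = model.fit_transform(X)                # per-patch coefficients in NMF space
H = model.components_                     # part-like basis vectors (non-orthogonal)

# Project a new (possibly partially occluded) patch into the learned NMF space.
x_new = rng.random((1, 64 * 128))
w_new = model.transform(x_new)

def cosine_sim(a, b):
    # Simple stand-in similarity; the paper uses a Riemannian metric instead.
    a, b = a.ravel(), b.ravel()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

scores = [cosine_sim(w_new, W[i:i + 1]) for i in range(W.shape[0])]
print("best matching training patch:", int(np.argmax(scores)))
```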
Abstract:
We address the problem of allocating a single divisible good to a number of agents. The agents have concave valuation functions parameterized by a scalar type. The agents report only the type. The goal is to find allocatively efficient, strategy-proof, nearly budget-balanced mechanisms within the Groves class. Near budget balance is attained by returning as much of the received payments as possible to the agents as rebates. Two performance criteria are of interest: the maximum ratio of budget surplus to efficient surplus, and the expected budget surplus, within the class of linear rebate functions. The goal is to minimize them. Assuming that the valuation functions are known, we show that both problems reduce to convex optimization problems, where the convex constraint sets are characterized by a continuum of half-plane constraints parameterized by the vector of reported types. We then propose a randomized relaxation of these problems by sampling constraints. The relaxed problem is a linear programming problem (LP). We then identify the number of samples needed for "near-feasibility" of the relaxed constraint set. Under some conditions on the valuation function, we show that the value of the approximate LP is close to the optimal value. Simulation results show significant improvements of our proposed method over the Vickrey-Clarke-Groves (VCG) mechanism without rebates. In the special case of indivisible goods, the mechanisms in this paper fall back to those proposed by Moulin, by Guo and Conitzer, and by Gujar and Narahari, without any need for randomization. Extension of the proposed mechanisms to situations where the valuation functions are not known to the central planner is also discussed.

Note to Practitioners: Our results will be useful in all resource allocation problems that involve gathering information privately held by strategic users, where the utilities are any concave function of the allocations, and where the resource planner is not interested in maximizing revenue but in efficient sharing of the resource. Such situations arise quite often in fair sharing of internet resources, fair sharing of funds across departments within the same parent organization, auctioning of public goods, etc. We study methods to achieve near budget balance by first collecting payments according to the celebrated VCG mechanism, and then returning as much of the collected money as possible as rebates. Our focus on linear rebate functions allows for easy implementation. The resulting convex optimization problem is solved via relaxation to a randomized linear programming problem, for which several efficient solvers exist. This relaxation is enabled by constraint sampling. Keeping practitioners in mind, we identify the number of samples that assures a desired level of "near-feasibility" with the desired confidence level. Our methodology will occasionally require a subsidy from outside the system. We, however, demonstrate via simulation that, if the mechanism is repeated several times over independent instances, then past surplus can support the subsidy requirements. We also extend our results to situations where the strategic users' utility functions are not known to the allocating entity, a common situation in the context of internet users and other problems.
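As a schematic illustration of the constraint-sampling relaxation (not the paper's actual rebate formulation), the sketch below replaces a continuum of half-plane constraints parameterized by type profiles with a finite random sample and solves the resulting LP. The objective, the constraint maps and the sample size are placeholder assumptions standing in for the surplus objective and rebate-feasibility constraints derived in the paper.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

n_agents = 5           # number of agents
dim = n_agents         # one linear-rebate coefficient per agent (illustrative)
n_samples = 2000       # number of sampled type profiles

# Placeholder objective: minimize the average rebate coefficient.
c = np.ones(dim) / dim

def constraint_from_types(theta):
    """Placeholder map from a type profile to one half-plane constraint a^T r >= b."""
    a = np.sort(theta)[::-1]     # placeholder coefficient vector
    b = 0.5 * theta.sum()        # placeholder right-hand side
    return a, b

# Sample type profiles and collect the induced constraints (written as -a^T r <= -b).
A_ub, b_ub = [], []
for _ in range(n_samples):
    theta = rng.uniform(0.0, 1.0, n_agents)
    a, b = constraint_from_types(theta)
    A_ub.append(-a)
    b_ub.append(-b)

res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              bounds=[(0.0, 1.0)] * dim, method="highs")
print("sampled-LP status:", res.message)
print("rebate coefficients:", np.round(res.x, 4))
```

The design choice illustrated here is the one named in the abstract: the semi-infinite feasibility set is approximated by finitely many sampled constraints, after which any standard LP solver applies.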
Abstract:
The problem addressed in this paper concerns an important issue faced by any green-aware global company seeking to keep its emissions within a prescribed cap. The specific problem is to allocate carbon reductions to its different divisions and supply chain partners to achieve a required target of reductions in its carbon reduction program. The problem is challenging because the divisions and supply chain partners, being autonomous, may exhibit strategic behavior. We use a standard mechanism design approach to solve this problem. While designing a mechanism for the emission reduction allocation problem, the key properties that need to be satisfied are dominant strategy incentive compatibility (DSIC) (also called strategy-proofness), strict budget balance (SBB), and allocative efficiency (AE). Mechanism design theory has shown that it is not possible to achieve the above three properties simultaneously. In the literature, a mechanism that satisfies DSIC and AE while keeping the budget imbalance minimal has recently been proposed in this context. Motivated by the observation that SBB is an important requirement, in this paper we propose a mechanism that satisfies DSIC and SBB with a slight compromise in allocative efficiency. Our experimentation with a stylized case study shows that the proposed mechanism performs satisfactorily and provides an attractive alternative for carbon footprint reduction by global companies.
Abstract:
A balance between excitatory and inhibitory synaptic currents is thought to be important for several aspects of information processing in cortical neurons in vivo, including gain control, bandwidth and receptive field structure. These factors will affect the firing rate of cortical neurons and their reliability, with consequences for their information coding and energy consumption. Yet how balanced synaptic currents contribute to the coding efficiency and energy efficiency of cortical neurons remains unclear. We used single compartment computational models with stochastic voltage-gated ion channels to determine whether synaptic regimes that produce balanced excitatory and inhibitory currents have specific advantages over other input regimes. Specifically, we compared models with only excitatory synaptic inputs to those with equal excitatory and inhibitory conductances, and stronger inhibitory than excitatory conductances (i.e. approximately balanced synaptic currents). Using these models, we show that balanced synaptic currents evoke fewer spikes per second than excitatory inputs alone or equal excitatory and inhibitory conductances. However, spikes evoked by balanced synaptic inputs are more informative (bits/spike), so that spike trains evoked by all three regimes have similar information rates (bits/s). Consequently, because spikes dominate the energy consumption of our computational models, approximately balanced synaptic currents are also more energy efficient than other synaptic regimes. Thus, by producing fewer, more informative spikes approximately balanced synaptic currents in cortical neurons can promote both coding efficiency and energy efficiency.
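A minimal sketch of the comparison described above, assuming a plain leaky integrate-and-fire neuron with conductance-based Poisson inputs rather than the stochastic ion-channel models used in the study; all parameter values and input rates are illustrative assumptions chosen only to contrast the three input regimes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Single-compartment leaky integrate-and-fire neuron with synaptic conductances.
dt, T = 0.1e-3, 2.0                  # time step (s), simulated duration (s)
n_steps = int(T / dt)
C, gL, EL = 200e-12, 10e-9, -70e-3   # capacitance, leak conductance, leak reversal
Ee, Ei = 0e-3, -80e-3                # excitatory / inhibitory reversal potentials
Vth, Vreset = -50e-3, -65e-3         # spike threshold and reset

def run(rate_e, rate_i, ge_unit=0.5e-9, gi_unit=0.5e-9, tau_syn=5e-3):
    """Return the firing rate (Hz) for Poisson excitatory and inhibitory input."""
    V, ge, gi, spikes = EL, 0.0, 0.0, 0
    for _ in range(n_steps):
        ge += ge_unit * rng.poisson(rate_e * dt)   # excitatory synaptic events
        gi += gi_unit * rng.poisson(rate_i * dt)   # inhibitory synaptic events
        ge -= ge * dt / tau_syn                    # exponential synaptic decay
        gi -= gi * dt / tau_syn
        dV = (gL * (EL - V) + ge * (Ee - V) + gi * (Ei - V)) / C
        V += dV * dt
        if V >= Vth:
            V = Vreset
            spikes += 1
    return spikes / T

print("excitation only               :", run(rate_e=3000.0, rate_i=0.0), "Hz")
print("equal E and I conductances    :", run(rate_e=3000.0, rate_i=3000.0), "Hz")
print("stronger inhibition (balanced):", run(rate_e=3000.0, rate_i=6000.0), "Hz")
```

With these stand-in parameters the excitation-only regime fires most, and increasing inhibitory conductance toward balanced currents reduces the spike count, which is the qualitative effect the abstract links to coding and energy efficiency.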