969 results for Synaptonemal complex failure
Abstract:
Background Sleep disturbances, including insomnia and sleep-disordered breathing, are a common complaint in people with heart failure and impair well-being. Exercise training (ET) improves quality of life in stable heart failure patients. ET also improves sleep quality in healthy older patients, but there are no previous intervention studies in heart failure patients. Aim The aim of this study was to examine the impact of ET on sleep quality in patients recently discharged from hospital with heart failure. Methods This was a sub-study of a multisite randomised controlled trial. Participants with a heart failure hospitalisation were randomised within six weeks of discharge to a 12-week disease management programme including exercise advice (n=52) or to the same programme with twice weekly structured ET (n=54). ET consisted of two one-hour supervised aerobic and resistance training sessions, prescribed and advanced by an exercise specialist. The primary outcome was change in Pittsburgh Sleep Quality Index (PSQI) between randomisation and week 12. Results At randomisation, 45% of participants reported poor sleep (PSQI≥5). PSQI global score improved significantly more in the ET group than the control group (–1.5±3.7 vs 0.4±3.8, p=0.03). Improved sleep quality correlated with improved exercise capacity and reduced depressive symptoms, but not with changes in body mass index or resting heart rate. Conclusion Twelve weeks of twice-weekly supervised ET improved sleep quality in patients recently discharged from hospital with heart failure.
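The group comparison reported above can be reproduced approximately from summary statistics alone. A minimal sketch of Welch's two-sample t-test on the reported change scores, assuming the ± values are standard deviations and that the randomised group sizes (n=54 ET, n=52 control) were those analysed, neither of which the abstract states explicitly; the resulting p-value need not match the published p=0.03, since the authors' exact analysis is not described:

```python
import math

def welch_t_from_summary(m1, sd1, n1, m2, sd2, n2):
    """Welch's t statistic and degrees of freedom from group summaries."""
    se1, se2 = sd1**2 / n1, sd2**2 / n2
    t = (m1 - m2) / math.sqrt(se1 + se2)
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = (se1 + se2) ** 2 / (se1**2 / (n1 - 1) + se2**2 / (n2 - 1))
    return t, df

# Change in PSQI global score: ET group vs control (mean ± SD assumed)
t, df = welch_t_from_summary(-1.5, 3.7, 54, 0.4, 3.8, 52)
print(round(t, 2), round(df, 1))
```

A negative t here corresponds to a larger improvement (greater PSQI reduction) in the ET group.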
Abstract:
The author, Dean Shepherd, is a student of entrepreneurship—how entrepreneurs think, decide to act, and feel. He recently realized that while his publications in academic journals have implications for entrepreneurs, those implications have remained relatively hidden in the text of the articles and hidden in articles published in journals largely inaccessible to those involved in the entrepreneurial process. This series is designed to bring the practical implications of his research to the forefront.
Abstract:
The international tax system, designed a century ago, has not kept pace with the modern multinational entity, rendering it ineffective in taxing many modern businesses according to economic activity. One of those modern multinational entities is the multinational financial institution (MNFI). The recent global financial crisis provides a particularly relevant and significant example of the failure of the current system on a global scale. The modern MNFI is increasingly undertaking more globalised and complex trading operations. A primary reason for the globalisation of financial institutions is that they typically ‘follow-the-customer’ into jurisdictions where international capital and international investors are required. The International Monetary Fund (IMF) recently reported that from 1995-2009, foreign bank presence in developing countries grew by 122 per cent. The same study indicates that foreign banks have a 20 per cent market share in OECD countries and 50 per cent in emerging markets and developing countries. Most significantly, then, MNFIs are increasingly undertaking an intermediary role in developing economies, where they finance core business activities such as mining and tourism. IMF analysis also suggests that in the future, foreign bank expansion will be greatest in emerging economies. The difficulties for developing countries in applying current international tax rules, especially the current traditional transfer pricing regime, are particularly acute in relation to MNFIs, which are the biggest users of tax havens and offshore finance. This paper investigates whether a unitary taxation approach which reflects economic reality would more easily and effectively ensure that the profits of MNFIs are taxed in the jurisdictions which give rise to those profits.
It has previously been argued that the uniqueness of MNFIs results in a failure of the current system to accurately allocate profits, and that unitary tax as an alternative could provide a sounder allocation model for international tax purposes. This paper goes a step further and examines the practicalities of implementing unitary taxation for MNFIs in terms of the key components of such a regime, along with their implications. This paper adopts a two-step approach in considering the implications of unitary taxation as a means of improved corporate tax coordination, which requires international acceptance and agreement. First, the definitional issues of the unitary MNFI are examined and, second, an appropriate allocation formula for this sector is investigated. To achieve this, the paper asks, first, how the financial sector should be defined for the purposes of unitary taxation and what should constitute a unitary business for that sector and, second, what is the ‘best practice’ model of an allocation formula for the purposes of apportioning the profits of the unitary business of a financial institution.
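The mechanics of formulary apportionment under unitary taxation can be sketched in a few lines. The three-factor, equal-weight formula and the two-jurisdiction figures below are purely illustrative assumptions, not the formula the paper ultimately investigates for financial institutions:

```python
def apportion_profit(global_profit, factors, weights):
    """Allocate a unitary group's global profit across jurisdictions using a
    weighted formula over apportionment factors (e.g. assets, staff, sales)."""
    # share_j = sum over factors f of weight_f * (factor_{j,f} / group_total_f)
    totals = {f: sum(j[f] for j in factors.values()) for f in weights}
    shares = {
        name: sum(w * j[f] / totals[f] for f, w in weights.items())
        for name, j in factors.items()
    }
    return {name: global_profit * s for name, s in shares.items()}

# Hypothetical two-jurisdiction bank; equal-weighted three-factor formula
factors = {
    "home": {"assets": 800, "staff": 600, "sales": 500},
    "host": {"assets": 200, "staff": 400, "sales": 500},
}
weights = {"assets": 1 / 3, "staff": 1 / 3, "sales": 1 / 3}
print(apportion_profit(300.0, factors, weights))
```

The choice and weighting of factors is precisely the ‘best practice’ design question the paper raises for the financial sector, where conventional factors such as tangible assets map poorly onto banking activity.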
Abstract:
Background There are few data regarding the effectiveness of remote monitoring for older people with heart failure. We conducted a post-hoc sub-analysis of a previously published large Cochrane systematic review and meta-analysis of relevant randomized controlled trials to determine whether structured telephone support and telemonitoring were effective in this population. Methods A post-hoc sub-analysis of a systematic review and meta-analysis that applied the Cochrane methodology was conducted. Meta-analyses of all-cause mortality, all-cause hospitalizations and heart failure-related hospitalizations were performed for studies where the mean or median age of participants was 70 or more years. Results The mean or median age of participants was 70 or more years in eight of the 16 (n=2,659/5,613; 47%) structured telephone support studies and four of the 11 (n=894/2,710; 33%) telemonitoring studies. Structured telephone support (RR 0.80; 95% CI=0.63-1.00) and telemonitoring (RR 0.56; 95% CI=0.41-0.76) interventions reduced mortality. Structured telephone support interventions reduced heart failure-related hospitalizations (RR 0.81; 95% CI=0.67-0.99). Conclusion Despite a systematic bias towards recruitment of individuals younger than the epidemiological average into the randomized controlled trials, older people with heart failure did benefit from structured telephone support and telemonitoring. These post-hoc sub-analysis results were similar to the overall effects observed in the main meta-analysis. While further research is required to confirm these observational findings, the evidence at hand indicates that discrimination by age alone may not be appropriate when inviting participation in a remote monitoring service for heart failure.
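Pooled risk ratios like those above come from standard meta-analytic machinery. As a rough illustration only, a fixed-effect, inverse-variance pooling of log risk ratios; the two studies' event counts below are invented, and Cochrane reviews often use other estimators (e.g. Mantel-Haenszel) in practice:

```python
import math

def pooled_rr(studies):
    """Fixed-effect, inverse-variance pooling of risk ratios on the log scale.
    Each study is a tuple (events_tx, n_tx, events_ctl, n_ctl)."""
    num = den = 0.0
    for a, n1, c, n2 in studies:
        log_rr = math.log((a / n1) / (c / n2))
        var = 1 / a - 1 / n1 + 1 / c - 1 / n2  # approximate variance of log RR
        w = 1.0 / var                          # inverse-variance weight
        num += w * log_rr
        den += w
    pooled = math.exp(num / den)
    se = math.sqrt(1.0 / den)
    ci = (pooled * math.exp(-1.96 * se), pooled * math.exp(1.96 * se))
    return pooled, ci

# Two hypothetical trials: (deaths_intervention, n, deaths_control, n)
pooled, (lo, hi) = pooled_rr([(30, 200, 45, 200), (20, 150, 28, 150)])
print(round(pooled, 2), round(lo, 2), round(hi, 2))
```

A pooled RR below 1 with a confidence interval excluding 1 is the pattern read as a mortality benefit in the abstract.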
Abstract:
Many large-scale GNSS CORS networks have been deployed around the world to support various commercial and scientific applications. To make use of these networks for real-time kinematic positioning services, one of the major challenges is ambiguity resolution (AR) over long inter-station baselines in the presence of considerable atmospheric biases. Usually, the widelane ambiguities are fixed first, followed by determination of the narrowlane ambiguity integers based on the ionosphere-free model, in which the widelane integers are introduced as known quantities. This paper seeks to improve AR performance over long baselines through efficient procedures for improved float solutions and ambiguity fixing. The contribution is threefold: (1) instead of using ionosphere-free measurements, absolute and/or relative ionospheric constraints are introduced in the ionosphere-constrained model to enhance the model strength, thus resulting in better float solutions; (2) realistic widelane ambiguity precision is estimated by capturing the multipath effects due to observation complexity, improving the reliability of widelane AR; (3) for the narrowlane AR, partial AR is applied to a subset of ambiguities selected in order of increasing elevation. For fixing the scalar ambiguity, an error-probability-controllable rounding method is proposed. The established ionosphere-constrained model can be efficiently solved with a sequential Kalman filter. It can be either reduced to special models simply by adjusting the variances of the ionospheric constraints, or extended with more parameters and constraints. The presented methodology is tested over seven baselines of around 100 km from the USA CORS network.
The results show that the new widelane AR scheme can achieve a 99.4% successful fixing rate with a 0.6% failure rate, while the new rounding method for narrowlane AR can achieve an 89% fixing rate with a 0.8% failure rate. In summary, AR reliability can be efficiently improved with rigorously controllable probability of incorrectly fixed ambiguities.
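The idea of an error-probability-controllable rounding rule for a scalar ambiguity can be sketched as follows. The normal-error success bound and the 1% failure threshold are illustrative assumptions, not the authors' exact formulation:

```python
import math

def round_if_safe(float_amb, sigma, max_fail=0.01):
    """Fix a scalar ambiguity by rounding only when the (approximate)
    probability of an incorrect fix stays below max_fail.

    For a float estimate distributed N(true_integer, sigma^2), the success
    probability of rounding is bounded below by 2*Phi(1/(2*sigma)) - 1,
    where Phi is the standard normal CDF.
    """
    phi = 0.5 * (1.0 + math.erf(1.0 / (2.0 * sigma) / math.sqrt(2.0)))
    p_fail = 1.0 - (2.0 * phi - 1.0)
    if p_fail <= max_fail:
        return round(float_amb), p_fail   # safe to fix
    return None, p_fail                   # leave ambiguity float

print(round_if_safe(5.12, 0.15))  # precise float solution: gets fixed
print(round_if_safe(5.12, 0.40))  # noisy float solution: left unfixed
```

Tightening `max_fail` trades fixing rate for reliability, which is exactly the fix-rate/failure-rate trade-off quantified in the results above.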
Abstract:
This study used automated data processing techniques to calculate a set of novel treatment plan accuracy metrics, and investigate their usefulness as predictors of quality assurance (QA) success and failure. A total of 151 beams from 23 prostate and cranial IMRT treatment plans were used in this study. These plans had been evaluated before treatment using measurements with a diode array system. The TADA software suite was adapted to allow automatic batch calculation of several proposed plan accuracy metrics, including mean field area, small-aperture, off-axis and closed-leaf factors. All of these results were compared with the gamma pass rates from the QA measurements and correlations were investigated. The mean field area factor provided a threshold field size (5 cm2, equivalent to a 2.2 x 2.2 cm square field), below which all beams failed the QA tests. The small-aperture score provided a useful predictor of plan failure, when averaged over all beams, despite being weakly correlated with gamma pass rates for individual beams. By contrast, the closed-leaf and off-axis factors provided information about the geometric arrangement of the beam segments but were not useful for distinguishing between plans that passed and failed QA. This study has provided some simple tests for plan accuracy, which may help minimise time spent on QA assessments of treatments that are unlikely to pass.
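A mean-field-area screen of this kind is simple to implement. In this sketch, only the 5 cm2 threshold comes from the study; the MU-weighting of segments and the function names are illustrative assumptions:

```python
def mean_field_area(segment_areas_cm2, mu_weights=None):
    """Mean aperture area of a beam's segments, optionally weighted by
    monitor units (MU) so heavily-weighted segments dominate the average."""
    if mu_weights is None:
        mu_weights = [1.0] * len(segment_areas_cm2)
    total_mu = sum(mu_weights)
    return sum(a * w for a, w in zip(segment_areas_cm2, mu_weights)) / total_mu

# Threshold below which every beam failed QA in the study (5 cm2)
FIELD_AREA_THRESHOLD_CM2 = 5.0

def flag_beam(segment_areas_cm2, mu_weights=None):
    """True if the beam should be flagged as likely to fail QA."""
    return mean_field_area(segment_areas_cm2, mu_weights) < FIELD_AREA_THRESHOLD_CM2

print(flag_beam([3.0, 4.0, 4.5]))  # small apertures: flagged
print(flag_beam([8.0, 12.0]))      # larger apertures: not flagged
```

Running such a check at planning time, before any measurement, is what allows the QA effort described above to be focused on plans likely to pass.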
Abstract:
Insulated rail joints are critical for train safety as they control electrical signalling systems; unfortunately, they exhibit excessive ratchetting of the railhead near the endpost insulators. This paper reports a three-dimensional global model of these joints under wheel–rail contact pressure loading and a sub-model examining the ratchetting failures of the railhead. The sub-model employs a non-linear isotropic–kinematic elastic–plastic material model and predicts stress/strain levels in the localised railhead zone adjacent to the endpost, which is placed in the air gap between the two rail ends at the insulated rail joint. The equivalent plastic strain plot is utilised to adequately capture the progressive railhead damage. Associated field and laboratory testing results of damage to the railhead material suggest that the simulation results are reasonable.
Abstract:
Insulated rail joints (IRJs) are a primary component of rail track safety and signalling systems. Rails are supported by two fishplates which are fastened by bolts and nuts and, with the support of sleepers and track ballast, form an integrated assembly. IRJ failure can result from progressive defects, the propagation of which is influenced by residual stresses in the rail. Residual stresses change significantly during service due to the complex deformation and damage effects associated with wheel rolling, sliding and impact. IRJ failures can occur when metal flows over the insulated rail gap (typically 6-8 mm wide), bridges the electrically isolated section of track and results in malfunction of the track signalling system. In this investigation, residual stress measurements were obtained from rail ends which had undergone controlled amounts of surface plastic deformation using a full-scale wheel-on-track simulation test rig. Results were compared with those obtained from similar investigations performed on rail ends associated with ex-service IRJs. Residual stresses were measured by neutron diffraction at the Australian Nuclear Science and Technology Organisation (ANSTO). Measurements with a constant gauge volume of 3 x 3 x 3 mm3 were carried out in the central vertical plane on 5 mm thick rail slices cut by electric discharge machining (EDM). Stress evolution at the rail ends was found to exhibit characteristics similar to those of the ex-service rails, with a compressive zone about 5 mm deep counterbalanced by a tension zone beneath, extending to a depth of around 15 mm. However, in contrast to the ex-service rails, the stress distribution in the test-rig-deformed samples was of a different type, due to the localization of load under the particular test conditions. In the latter, although the stress evolution was clear, there was no obvious evolution of d0.
Since d0 reflects the long-term accumulation of crystal lattice damage and microstructural changes due to service load, the loading history of the test-rig samples had evidently not reached the same level as that of the ex-service rails. It is concluded that the wheel-on-rail simulation rig provides the capability for testing wheel–rail rolling contact conditions in rails, rail ends and insulated rail joints.
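The quantity underlying such neutron-diffraction measurements is the elastic lattice strain, the shift of the measured d-spacing relative to the stress-free spacing d0. A deliberately simplified sketch follows; a real rail analysis combines strains from three measurement directions through the triaxial form of Hooke's law, and the modulus and d-spacing values below are hypothetical:

```python
def lattice_strain(d, d0):
    """Elastic lattice strain from measured and stress-free d-spacings."""
    return (d - d0) / d0

def uniaxial_stress(d, d0, youngs_modulus_gpa=210.0):
    """Illustrative uniaxial stress estimate in MPa (a real analysis uses
    the full triaxial Hooke's law with three measurement directions)."""
    return youngs_modulus_gpa * 1e3 * lattice_strain(d, d0)

# Hypothetical d-spacings in angstroms; d < d0 indicates compression
print(round(uniaxial_stress(1.1702, 1.1706), 1))  # negative: compressive MPa
```

This sign convention is why the near-surface compressive zone reported above corresponds to measured d-spacings below the stress-free reference.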
Abstract:
Quantum-inspired models have recently attracted increasing attention in Information Retrieval. An intriguing characteristic of the mathematical framework of quantum theory is the presence of complex numbers. However, it is unclear what such numbers could or would actually represent or mean in Information Retrieval. The goal of this paper is to discuss the role of complex numbers within the context of Information Retrieval. First, we introduce how complex numbers are used in quantum probability theory. Then, we examine van Rijsbergen’s proposal of evoking complex-valued representations of information objects. We empirically show that such a representation is unlikely to be effective in practice (calling into question its usefulness in Information Retrieval). We then explore alternative proposals which may be more successful at realising the power of complex numbers.
Abstract:
Complex numbers are a fundamental aspect of the mathematical formalism of quantum physics. Quantum-like models developed outside physics have often overlooked the role of complex numbers; in particular, previous models in Information Retrieval (IR) ignored them. We argue that to advance the use of quantum models in IR, one has to lift the constraint of real-valued representations of the information space and package more information within the representation by means of complex numbers. As a first attempt, we propose a complex-valued representation for IR which explicitly uses complex-valued Hilbert spaces, in which terms, documents and queries are represented as complex-valued vectors. The proposal consists of integrating distributional semantics evidence within the real component of a term vector, while ontological information is encoded in the imaginary component. Our proposal has the merit of lifting the role of complex numbers from a computational byproduct of the model to the very mathematical texture that unifies different levels of semantic information. An empirical instantiation of our proposal is tested in the TREC Medical Records task of retrieving cohorts for clinical studies.
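The proposed representation can be sketched in a few lines. The vector values and the inner-product scoring rule below are illustrative, not the paper's exact empirical instantiation:

```python
def term_vector(distributional, ontological):
    """Complex-valued term vector: the real component carries
    distributional-semantics evidence, the imaginary component
    carries ontological evidence."""
    return [complex(d, o) for d, o in zip(distributional, ontological)]

def similarity(u, v):
    """Score as the magnitude of the Hermitian inner product <u, v>,
    so both components contribute to a single real-valued score."""
    return abs(sum(a * b.conjugate() for a, b in zip(u, v)))

query = term_vector([0.9, 0.1], [0.2, 0.7])  # illustrative weights
doc = term_vector([0.8, 0.0], [0.3, 0.5])
print(similarity(query, doc))
```

Packing two evidence channels into one vector, rather than concatenating two real vectors, is what lets the interference between the channels influence the score.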
Abstract:
Contemporary lipidomics protocols are dependent on conventional tandem mass spectrometry for lipid identification. This approach is extremely powerful for determining lipid class and identifying the number of carbons and the degree of unsaturation of any acyl-chain substituents. Such analyses are, however, blind to isomeric variants arising from different carbon–carbon bonding motifs within these chains, including double bond position, chain branching, and cyclic structures. This limitation arises from the fact that conventional, low-energy collision-induced dissociation of even-electron lipid ions does not give rise to product ions from intrachain fragmentation of the fatty acyl moieties. To overcome this limitation, we have applied radical-directed dissociation (RDD) to the study of lipids for the first time. In this approach, bifunctional molecules that contain a photocaged radical initiator and a lipid-adducting group, such as 4-iodoaniline and 4-iodobenzoic acid, are used to form noncovalent complexes (i.e., adduct ions) with a lipid during electrospray ionization. Laser irradiation of these complexes at UV wavelengths (266 nm) cleaves the carbon–iodine bond to liberate a highly reactive phenyl radical. Subsequent activation of the nascent radical ions results in RDD with significant intrachain fragmentation of acyl moieties. This approach provides diagnostic fragments that are associated with the double bond position and the positions of chain branching in glycerophospholipids, sphingomyelins and triacylglycerols, and thus can be used to differentiate isomeric lipids differing only in such motifs. RDD is demonstrated for well-defined lipid standards and also reveals lipid structural diversity in olive oil and human very-low-density lipoprotein.
Abstract:
Effective machine fault prognostic technologies can eliminate unscheduled downtime, extend machine useful life and consequently reduce maintenance costs, as well as prevent human casualties, in real engineering asset management. This paper presents a technique for accurate assessment of the remnant life of machines based on a health state probability estimation technique and historical failure knowledge embedded in a closed-loop diagnostic and prognostic system. To estimate a discrete machine degradation state which can represent the complex nature of machine degradation effectively, the proposed prognostic model employed a classification algorithm which, unlike conventional time series analysis techniques, can use a number of damage-sensitive features for accurate long-term prediction. To validate the feasibility of the proposed model, data at five severity levels for four typical faults from High Pressure Liquefied Natural Gas (HP-LNG) pumps were used to compare intelligent diagnostic performance across five different classification algorithms. In addition, two sets of impeller-rub data were analysed and employed to predict the remnant life of the pump based on health state probability estimation using the Support Vector Machine (SVM) classifier. The results obtained were very encouraging and showed that the proposed prognostic system has the potential to be used as an estimation tool for machine remnant life prediction in real-life industrial applications.
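Once a classifier supplies a posterior over discrete degradation states, the remnant-life estimate reduces to a probability-weighted average. A sketch assuming hypothetical classifier posteriors and per-state historical mean lives; the SVM itself and the feature extraction are omitted:

```python
def expected_remnant_life(state_probs, state_lives):
    """Probability-weighted remaining useful life, given a classifier's
    posterior over discrete degradation states and the historical mean
    remaining life associated with each state."""
    assert abs(sum(state_probs) - 1.0) < 1e-6, "posterior must sum to 1"
    return sum(p * life for p, life in zip(state_probs, state_lives))

# Hypothetical 4-state degradation model (hours of remaining life per state,
# drawn from historical failure records) and an SVM-style posterior
probs = [0.05, 0.15, 0.60, 0.20]
lives = [5000.0, 2000.0, 800.0, 100.0]
print(expected_remnant_life(probs, lives))  # expected hours to failure
```

As degradation advances, probability mass shifts toward the late states and the estimate shrinks toward the end-of-life value, which is what makes the closed-loop update useful for maintenance scheduling.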
Abstract:
In service-oriented architectures, business processes can be realized by composing loosely coupled services. The problem of QoS-aware service composition is widely recognized in the literature. Existing approaches for computing an optimal solution to this problem tackle structured business processes, i.e., business processes which are composed of XOR-block, AND-block, and repeat-loop orchestration components. As of yet, OR-blocks and unstructured orchestration components have not been sufficiently considered in the context of QoS-aware service composition. The work at hand addresses this shortcoming. An approach for computing an optimal solution to the service composition problem is proposed which considers structured orchestration components, such as AND/XOR/OR-blocks and repeat loops, as well as unstructured orchestration components.
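The standard response-time aggregation rules for the structured components can be written down directly. The functions below are an illustrative sketch of those well-known rules only; the OR-block and unstructured cases the paper targets are the harder part and are not shown:

```python
def qos_sequence(times):
    """Response time of a sequence: sum of component times."""
    return sum(times)

def qos_and_block(times):
    """AND-block (parallel split/join): the slowest branch dominates."""
    return max(times)

def qos_xor_block(times, probs):
    """XOR-block: expected time over mutually exclusive branches."""
    return sum(t * p for t, p in zip(times, probs))

def qos_loop(time, repeat_prob):
    """Repeat loop: expected time over a geometric number of iterations."""
    return time / (1.0 - repeat_prob)

# A small process: AND(2s, 3s), then XOR(1s or 4s, 50/50), then a loop
total = qos_sequence([
    qos_and_block([2.0, 3.0]),
    qos_xor_block([1.0, 4.0], [0.5, 0.5]),
    qos_loop(2.0, 0.25),
])
print(total)  # expected end-to-end response time in seconds
```

An OR-block resists this style of closed-form aggregation because any non-empty subset of branches may execute, which is one reason it has been neglected in optimal composition approaches.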
Abstract:
Tissue Engineering is a promising emerging field that studies the intrinsic regenerative potential of the human body and uses it to restore the functionality of damaged organs or tissues incapable of self-healing due to illness or ageing. In order to achieve regeneration using Tissue Engineering strategies, it is first necessary to study the properties of the native tissue and determine the cause of tissue failure; second, to identify an optimum population of cells capable of restoring its functionality; and third, to design and manufacture a cellular microenvironment in which those specific cells are directed towards the desired cellular functions. The design of the artificial cellular niche is tremendously important, because cells will feel and respond very differently to its biochemical and biophysical properties. In particular, the artificial niche will act as a physical scaffold for the cells, allowing their three-dimensional spatial organization; it will provide mechanical stability to the artificial construct; and it will supply biochemical and mechanical cues to control cellular growth, migration, differentiation and synthesis of natural extracellular matrix. During the last decades, many scientists have made great contributions to the field of Tissue Engineering. Even though this research has frequently been accompanied by vast investments over extended periods of time, too often these efforts have not been enough to translate the advances into new clinical therapies. More and more scientists in this field are aware of the need for rational experimental designs before carrying out complex, expensive and time-consuming in vitro and in vivo trials. This review highlights the importance of computer modeling and novel biofabrication techniques as critical key players for the rational design of artificial cellular niches in Tissue Engineering.