903 results for Fatal attacks
Abstract:
Quality of life (QoL) and health-related quality of life (HRQoL) are becoming key outcomes of health care, owing to increased respect for patients' subjective valuations and well-being and to the growing share of the ageing population living with chronic, non-fatal conditions. Preference-based HRQoL measures enable estimation of health utility, which can be useful for rational rationing, evidence-based medicine and health policy. This study aimed to compare the individual severity and public health burden of major chronic conditions in Finland, including and focusing on reliably diagnosed psychiatric conditions. The study is based on the Health 2000 survey, a representative general population survey of 8028 Finns aged 30 and over. Depressive, anxiety and alcohol use disorders were diagnosed with the Composite International Diagnostic Interview (M-CIDI). HRQoL was measured with the 15D and the EQ-5D, with an 83% response rate. People with psychiatric disorders had the lowest 15D HRQoL scores at all ages in comparison with other main groups of chronic conditions. Among 29 individual conditions, three of the four most severe (on the 15D) were psychiatric disorders; the most severe was Parkinson's disease. Of the psychiatric disorders, conditions that have sometimes been considered relatively mild - dysthymia, agoraphobia, generalized anxiety disorder and social phobia - were found to be the most severe. This was explained both by the severity of the impact of these disorders on the mental health domains of HRQoL and by the fact that decreases were widespread across most dimensions of HRQoL. Considering the public health burden of conditions, musculoskeletal disorders were associated with the largest burden, followed by psychiatric disorders; psychiatric disorders were associated with the largest burden at younger ages. Of individual conditions, the largest burden was found for depressive disorders, followed by urinary incontinence and arthrosis of the hip and knee. The public health burden increased greatly with age, so the ageing of the Finnish population means that the disease burden caused by chronic conditions will increase by a quarter by the year 2040 if morbidity patterns do not change. Investigating alcohol consumption and HRQoL revealed that although abstainers had poorer HRQoL than moderate drinkers, this was mainly because many abstainers were former drinkers, who had the poorest HRQoL; moderate drinkers did not have significantly better HRQoL than abstainers who were not former drinkers. Psychiatric disorders account for a large part of the non-fatal disease burden in Finland. In particular, anxiety disorders appear to be more severe and to carry a larger public health burden than previously thought.
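As a rough illustration of how preference-based HRQoL scores can be turned into severity and burden estimates of the kind compared above, the sketch below computes a condition's severity as the mean utility decrement among those affected and scales it by prevalence; the function burden, the names and the toy numbers are hypothetical, not the Health 2000 data or the actual 15D/EQ-5D scoring algorithms.

# Hypothetical sketch: condition severity and population burden from
# preference-based HRQoL utilities (not the 15D/EQ-5D scoring algorithms).

def mean(xs):
    return sum(xs) / len(xs)

def burden(utilities_with, utilities_without, prevalence, population):
    """Severity = mean utility decrement among the affected;
    burden = severity * number of people affected."""
    severity = mean(utilities_without) - mean(utilities_with)
    return severity * prevalence * population

# Toy numbers, for illustration only.
print(burden(
    utilities_with=[0.78, 0.81, 0.74],     # HRQoL of people with the condition
    utilities_without=[0.92, 0.95, 0.90],  # HRQoL of comparable people without it
    prevalence=0.06,                       # 6% of adults affected
    population=4_000_000,                  # adult population
))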
Abstract:
This paper presents a detailed analysis of a model for military conflicts in which the defending forces must determine an optimal partitioning of available resources to counter attacks from an adversary on two different fronts in an area-fire situation. The Lanchester linear-law attrition model is used to develop the dynamical equations governing the variation in force strength. We address a static resource allocation problem, namely Time-Zero-Allocation (TZA), in which resources are allocated only at the initial time. Numerical examples are given to support the analytical results.
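A minimal sketch of the dynamics referred to above, in my own notation (the symbols alpha_i, beta_i and lambda are not necessarily the paper's): under the Lanchester linear (area-fire) law, attrition on each front is proportional to the product of the opposing force strengths, and under Time-Zero-Allocation the defender splits a total force D between the two fronts once, at t = 0:

\frac{dD_i}{dt} = -\alpha_i A_i(t) D_i(t), \qquad \frac{dA_i}{dt} = -\beta_i A_i(t) D_i(t), \qquad i = 1, 2,

with initial conditions D_1(0) = \lambda D and D_2(0) = (1 - \lambda) D, where A_i is the attacker's strength on front i, \alpha_i and \beta_i are attrition coefficients, and \lambda \in [0, 1] is the allocation fraction chosen at time zero.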
Abstract:
Background: The leading causes of morbidity and mortality for people in high-income countries living with HIV are now non-AIDS malignancies, cardiovascular disease and other non-communicable diseases associated with ageing. This protocol describes the trial of HealthMap, a model of care for people with HIV (PWHIV) that includes use of an interactive shared health record and self-management support. The aims of the HealthMap trial are to evaluate engagement of PWHIV and healthcare providers with the model, and its effectiveness for reducing coronary heart disease risk, enhancing self-management, and improving mental health and quality of life of PWHIV. Methods/Design: The study is a two-arm cluster randomised trial involving HIV clinical sites in several states in Australia. Doctors will be randomised to the HealthMap model (immediate arm) or to proceed with usual care (deferred arm). People with HIV whose doctors are randomised to the immediate arm receive 1) new opportunities to discuss their health status and goals with their HIV doctor using a HealthMap shared health record; 2) access to their own health record from home; 3) access to health coaching delivered by telephone and online; and 4) access to a peer-moderated online group chat programme. Data will be collected from participating PWHIV (n = 710) at baseline, 6 months, and 12 months and from participating doctors (n = 60) at baseline and 12 months. The control arm will be offered the HealthMap intervention at the end of the trial. The primary study outcomes, measured at 12 months, are 1) 10-year risk of non-fatal acute myocardial infarction or coronary heart disease death as estimated by a Framingham Heart Study risk equation; and 2) the Positive and Active Engagement in Life Scale from the Health Education Impact Questionnaire (heiQ). Discussion: The study will determine the viability and utility of a novel technology-supported model of care for maintaining the health and wellbeing of people with HIV. If shown to be effective, the HealthMap model may provide a generalisable, scalable and sustainable system for supporting the care needs of people with HIV, addressing issues of equity of access. Trial registration: Universal Trial Number (UTN) U111111506489; ClinicalTrials.gov Id NCT02178930, submitted 29 June 2014.
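For context on the first primary outcome, Framingham-type risk equations generally take a proportional-hazards form; a hedged sketch of that general form (not the specific equation or coefficients used in this trial) is

\hat{p} = 1 - S_0(10)^{\exp\left(\sum_j \beta_j x_j - \sum_j \beta_j \bar{x}_j\right)},

where S_0(10) is the baseline 10-year event-free survival, the x_j are an individual's risk factor values (for example age, lipids, blood pressure, smoking and diabetes status), the \bar{x}_j are cohort means, and the \beta_j are coefficients estimated from the Framingham cohort.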
Abstract:
Pre-school children grow and develop rapidly with age, and their changing capabilities are reflected in the ways in which they are injured. Using coded and textual descriptions of transport-related injuries in children under five years of age from the Queensland Injury Surveillance Unit (QISU), this paper profiles the modes of such injuries by single year of age. The QISU collects information on all injury presentations to emergency departments in hospitals throughout Queensland using both coded information and textual descriptions. Almost all transport-related injuries in children under one year are due to motor vehicle crashes, but these become proportionately less common thereafter, while cycling injuries become proportionately more common with age. Slow-speed vehicle runovers peak at age one year but occur at all ages in the range. Bicycle-related fatalities are rare in this age group. If bicycle-related injuries are excluded, the profiles of fatal and non-fatal injuries are broadly similar. Comparison with a Queensland hospital series suggests that these results are broadly representative.
Abstract:
The MIT Lincoln Laboratory IDS evaluation methodology is a practical solution for evaluating the performance of Intrusion Detection Systems and has contributed tremendously to research progress in that field. The DARPA IDS evaluation dataset has been criticized and is considered by many to be a very outdated dataset, unable to accommodate the latest trends in attacks. The question then naturally arises whether detection systems have improved beyond detecting these older attacks; if not, can the dataset really be regarded as obsolete? This paper provides supporting evidence for the continued use of the DARPA IDS evaluation dataset. Two commonly used signature-based IDSs, Snort and Cisco IDS, and two anomaly detectors, PHAD and ALAD, are used for this evaluation, and the results support the usefulness of the DARPA dataset for IDS evaluation.
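As a hedged illustration of the kind of scoring involved in such an evaluation, the sketch below computes the detection rate and false-alarm count of one detector's alert list against a labelled set of attack instances; the data structures and toy entries are hypothetical, not the actual DARPA / Lincoln Laboratory evaluation tooling.

# Hypothetical sketch of scoring an IDS alert list against labelled ground
# truth (not the actual DARPA / Lincoln Laboratory evaluation software).

def evaluate(alerts, ground_truth_attacks):
    """alerts and ground_truth_attacks are sets of (timestamp, victim) keys."""
    detected = alerts & ground_truth_attacks
    false_alarms = alerts - ground_truth_attacks
    detection_rate = len(detected) / len(ground_truth_attacks)
    return detection_rate, len(false_alarms)

# Toy alert list and ground truth, for illustration only.
snort_alerts = {("1999-03-29 08:01", "172.16.112.50"),
                ("1999-03-29 09:15", "172.16.113.84")}
truth = {("1999-03-29 08:01", "172.16.112.50"),
         ("1999-03-29 11:42", "172.16.114.50")}
print(evaluate(snort_alerts, truth))   # -> (0.5, 1)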
Abstract:
Background: The irreversible ErbB family blocker afatinib and the reversible EGFR tyrosine kinase inhibitor gefitinib are approved for first-line treatment of EGFR mutation-positive non-small-cell lung cancer (NSCLC). We aimed to compare the efficacy and safety of afatinib and gefitinib in this setting. Methods: This multicentre, international, open-label, exploratory, randomised controlled phase 2B trial (LUX-Lung 7) was done at 64 centres in 13 countries. Treatment-naive patients with stage IIIB or IV NSCLC and a common EGFR mutation (exon 19 deletion or Leu858Arg) were randomly assigned (1:1) to receive afatinib (40 mg per day) or gefitinib (250 mg per day) until disease progression, or beyond if deemed beneficial by the investigator. Randomisation, stratified by EGFR mutation type and status of brain metastases, was done centrally using a validated number-generating system implemented via an interactive voice or web-based response system with a block size of four. Clinicians and patients were not masked to treatment allocation; independent review of tumour response was done in a blinded manner. Coprimary endpoints were progression-free survival by independent central review, time-to-treatment failure, and overall survival. Efficacy analyses were done in the intention-to-treat population and safety analyses were done in patients who received at least one dose of study drug. This ongoing study is registered with ClinicalTrials.gov, number NCT01466660. Findings: Between Dec 13, 2011, and Aug 8, 2013, 319 patients were randomly assigned (160 to afatinib and 159 to gefitinib). Median follow-up was 27·3 months (IQR 15·3–33·9). Progression-free survival (median 11·0 months [95% CI 10·6–12·9] with afatinib vs 10·9 months [9·1–11·5] with gefitinib; hazard ratio [HR] 0·73 [95% CI 0·57–0·95], p=0·017) and time-to-treatment failure (median 13·7 months [95% CI 11·9–15·0] with afatinib vs 11·5 months [10·1–13·1] with gefitinib; HR 0·73 [95% CI 0·58–0·92], p=0·0073) were significantly longer with afatinib than with gefitinib. Overall survival data are not mature. The most common treatment-related grade 3 or 4 adverse events were diarrhoea (20 [13%] of 160 patients given afatinib vs two [1%] of 159 given gefitinib), rash or acne (15 [9%] patients given afatinib vs five [3%] of those given gefitinib), and liver enzyme elevations (no patients given afatinib vs 14 [9%] of those given gefitinib). Serious treatment-related adverse events occurred in 17 (11%) patients in the afatinib group and seven (4%) in the gefitinib group. Ten (6%) patients in each group discontinued treatment due to drug-related adverse events. Fatal adverse events occurred in 15 (9%) patients in the afatinib group and ten (6%) in the gefitinib group. All but one of these deaths were considered unrelated to treatment; one patient in the gefitinib group died from drug-related hepatic and renal failure. Interpretation: Afatinib significantly improved outcomes in treatment-naive patients with EGFR-mutated NSCLC compared with gefitinib, with a manageable tolerability profile. These data are potentially important for clinical decision making in this patient population.
Abstract:
Recommender systems aggregate individual user ratings into predictions of products or services that might interest visitors. The quality of this aggregation process crucially affects the user experience and hence the effectiveness of recommenders in e-commerce. We present a characterization of nearest-neighbor collaborative filtering that allows us to disaggregate global recommender performance measures into contributions made by each individual rating. In particular, we formulate three roles (scouts, promoters, and connectors) that capture how users receive recommendations, how items get recommended, and how ratings of these two types are themselves connected, respectively. These roles find direct uses in improving recommendations for users, in better targeting of items and, most importantly, in helping monitor the health of the system as a whole. For instance, they can be used to track the evolution of neighborhoods, to identify rating subspaces that do not contribute (or contribute negatively) to system performance, to enumerate users who are in danger of leaving, and to assess the susceptibility of the system to attacks such as shilling. We argue that the three rating roles presented here provide broad primitives to manage a recommender system and its community.
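A minimal sketch of the nearest-neighbor prediction step that such rating roles are defined over: a user's predicted rating for an item is a similarity-weighted average of the neighbors' ratings for that item, so each individual rating's contribution to prediction accuracy can in principle be traced. The similarity measure and data below are illustrative choices, not the paper's exact formulation.

# Minimal user-based nearest-neighbor collaborative filtering sketch
# (illustrative choices; not the paper's exact formulation).
import math

ratings = {                       # user -> {item: rating}
    "alice": {"a": 5, "b": 3, "c": 4},
    "bob":   {"a": 4, "b": 2, "c": 5, "d": 4},
    "carol": {"a": 2, "b": 5, "d": 1},
}

def cosine(u, v):
    common = set(ratings[u]) & set(ratings[v])
    if not common:
        return 0.0
    num = sum(ratings[u][i] * ratings[v][i] for i in common)
    den = (math.sqrt(sum(r * r for r in ratings[u].values()))
           * math.sqrt(sum(r * r for r in ratings[v].values())))
    return num / den

def predict(user, item, k=2):
    neighbors = [(cosine(user, v), v) for v in ratings
                 if v != user and item in ratings[v]]
    top = sorted(neighbors, reverse=True)[:k]
    total = sum(sim for sim, _ in top)
    if total == 0:
        return None
    return sum(sim * ratings[v][item] for sim, v in top) / total

print(predict("alice", "d"))   # built from bob's and carol's ratings of "d"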
Abstract:
The motivation behind the fusion of Intrusion Detection Systems was the realization that, with increasing traffic and the increasing complexity of attacks, no present-day stand-alone Intrusion Detection System can meet the demand for a very high detection rate together with an extremely low false positive rate. Multi-sensor fusion can be used to meet these requirements by refining the combined response of different Intrusion Detection Systems. In this paper, we show how to design sensor fusion so as to best utilize the useful responses from multiple sensors by appropriately adjusting the fusion threshold. The threshold is generally chosen according to past experience or by an expert system. We show that choosing the threshold bounds according to the Chebyshev inequality performs better. This approach also helps to solve the problem of scalability and has the advantage of fail-safe capability. The paper theoretically models the fusion of Intrusion Detection Systems in order to prove the improvement in performance, supplemented with an empirical evaluation. The combination of complementary sensors is shown to detect more attacks than the individual components. Since the individual sensors chosen detect sufficiently different attacks, their results can be merged for improved performance. The combination is done in different ways: (i) taking all the alarms from each system and avoiding duplications, (ii) taking alarms from each system by fixing threshold bounds, and (iii) rule-based fusion with a priori knowledge of the individual sensor performance. A number of evaluation metrics are used, and the results indicate an overall enhancement in the performance of the combined detector using sensor fusion with the threshold bounds and significantly better performance using simple rule-based fusion.
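As a hedged sketch of how a Chebyshev-based threshold bound can work, the code below sets the alarm threshold on a fused anomaly score from attack-free traffic statistics: by the Chebyshev inequality, P(|X - mu| >= k*sigma) <= 1/k^2 for any distribution with finite variance, so choosing k = 1/sqrt(alpha) bounds the false-positive rate by alpha. The parameterisation and numbers are illustrative, not necessarily the exact bounds derived in the paper.

# Hedged sketch: a fusion threshold from attack-free score statistics using
# the Chebyshev inequality (illustrative; not the paper's exact formulation).
import statistics

def chebyshev_threshold(normal_scores, max_false_positive_rate):
    """P(|X - mu| >= k*sigma) <= 1/k**2, so k = 1/sqrt(alpha) bounds the
    false-positive rate by alpha for any finite-variance score distribution."""
    mu = statistics.mean(normal_scores)
    sigma = statistics.stdev(normal_scores)
    k = (1.0 / max_false_positive_rate) ** 0.5
    return mu + k * sigma

# Fused anomaly scores observed on attack-free traffic (toy numbers).
baseline = [0.12, 0.08, 0.15, 0.10, 0.09, 0.11, 0.13, 0.07]
threshold = chebyshev_threshold(baseline, max_false_positive_rate=0.01)
print(threshold)   # raise an alarm only when the fused score exceeds this bound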
Abstract:
Bees are well known for being industrious pollinators. Some species, however, have taken to invading the nests of other colonies to steal food, nest material or the nest site itself. Despite the potential mortality costs of fighting an aggressive opponent, the prospect of a large bounty can be worth the risk. In this review, we aim to bring together current knowledge on intercolony fighting with a view to better understanding the evolution of warfare in bees and identifying avenues for future research. A review of the literature reveals that at least 60 species of stingless bees are involved in heterospecific conflicts, either as attacking or victim colonies. The threat of invasion has led to the evolution of architectural, behavioural and morphological adaptations, such as narrow entrance tunnels, mud balls to block the entrance, decoy nests that direct invaders away from the brood chamber, fighting swarms, and soldiers that are skilled at immobilising attackers. Little is known about how victim colonies are selected, but a phylogenetically controlled analysis suggests that the notorious robber bee Lestrimelitta preferentially attacks colonies of species with more concentrated honey. Warfare among bees poses many interesting questions, including why species differ so greatly in their responses to attacks and how these alternative strategies of obtaining food or new nest sites have evolved.
Abstract:
This paper develops a model for military conflicts where the defending forces have to determine an optimal partitioning of available resources to counter attacks from an adversary in two different fronts. The Lanchester attrition model is used to develop the dynamical equations governing the variation in force strength. Three different allocation schemes - Time-Zero-Allocation (TZA), Allocate-Assess-Reallocate (AAR), and Continuous Constant Allocation (CCA) - are considered and the optimal solutions are obtained in each case. Numerical examples are given to support the analytical results.
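A minimal numerical sketch of the Time-Zero-Allocation scheme under area-fire (linear-law) attrition on two fronts: the defender splits a fixed force between the fronts at t = 0 and the coupled equations are integrated forward. The coefficients, the forward-Euler integration and the force levels are illustrative choices, not the paper's parameters or its analytical solutions.

# Illustrative two-front Lanchester area-fire simulation under
# Time-Zero-Allocation (toy parameters, not the paper's model).

def simulate_tza(total_defence, attackers, alpha, beta, split,
                 dt=0.01, t_end=10.0):
    d = [split * total_defence, (1.0 - split) * total_defence]
    a = list(attackers)
    steps = int(t_end / dt)
    for _ in range(steps):
        for i in range(2):
            loss_d = alpha[i] * a[i] * d[i] * dt   # linear-law attrition
            loss_a = beta[i] * a[i] * d[i] * dt
            d[i] = max(d[i] - loss_d, 0.0)
            a[i] = max(a[i] - loss_a, 0.0)
    return d, a

# Compare two time-zero splits of the same defending force.
for split in (0.5, 0.7):
    d, a = simulate_tza(total_defence=100.0, attackers=(60.0, 40.0),
                        alpha=(0.010, 0.012), beta=(0.015, 0.015), split=split)
    print(split, [round(x, 1) for x in d], [round(x, 1) for x in a])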
Abstract:
Plasmodium falciparum causes the most severe form of malaria, which is fatal in many cases. The emergence of drug-resistant strains of P. falciparum requires that new drug targets be identified. This review considers in detail enzymes of the glycolytic pathway, the purine salvage pathway, pyrimidine biosynthesis, and proteases involved in the catabolism of haemoglobin. Structural features of P. falciparum triosephosphate isomerase that could be exploited for parasite-specific drug development are highlighted. The utility of P. falciparum hypoxanthine-guanine phosphoribosyltransferase, adenylosuccinate synthase, dihydroorotate dehydrogenase, thymidylate synthase-dihydrofolate reductase, and cysteine and aspartic proteases is elaborated in detail. The review also briefly touches upon other potential targets in P. falciparum.
The Emergence of a Public Health Problem: Tuberculosis and the Government of Health in Finland before the Second World War
Abstract:
The study focuses on the emergence of tuberculosis as a public health problem and the development of the various methods to counteract it in Finland before the introduction of effective methods of treatment in the 1940s and 1950s. It covers the period from 1882, when the tuberculosis bacterium was identified, to the 1930s, when the early formation of tuberculosis work became established in Finland. During this time there occurred important changes in medicine, public health thinking and methods of personal health care that have been referred to as the bacteriological revolution. The study places tuberculosis prevention in this context and shows how the tuberculosis problem affected the government of health along all three of these dimensions. The study is based on Foucauldian analytics of government, supplemented with perspectives from contemporary science and technology studies. In addition, it utilises a broad array of work in medical history. The central research materials consist of medical journals, official programs and documents on tuberculosis policy, and health education texts. The general conclusions of the study are twofold. Firstly, the ensemble of tuberculosis work was formed from historically diverse and often conflicting elements. The identification of the pathogen was only the first step in the establishment of tuberculosis as a major public health problem; also important were the attention of the science of hygiene and the statistical reasoning that dominated public health thinking in the late 19th century. Furthermore, the adoption of the bacteriological tuberculosis doctrine in medicine, public health work and health education was profoundly influenced by previous understandings of the nature of the illness, of medical work, of the prevention of contagious diseases, and of personal health care. The two central institutions of tuberculosis work, the sanatorium and the dispensary, also have heterogeneous origins and multifarious functions. Secondly, bacteriology, represented in this study by tuberculosis, remodelled medical knowledge and practices, the targets and methods of public health policy, and the doctrine of personal health care. Tuberculosis provided a strong argument for specific causes (if not cures) as well as for laboratory methods in medicine. Tuberculosis prevention contributed substantially to the development whereby a comprehensive responsibility for the health of the population and public health work was added to the agenda of the state. Health advice on tuberculosis and other contagious diseases used dangerous bacteria to motivate personal health care and redefined it as protecting oneself from the attacks of external pathogens and strengthening oneself against their effects. Tuberculosis work is thus one important root of the contemporary public concern for the health of the population and the imperative of personal health care.
Abstract:
Migraine is a common cause of chronic episodic headache, affecting 12-15% of the Caucasian population (41 million Europeans and some half a million Finns), and causes considerable loss of quality of life to its sufferers, as well as being linked to increased risk for a wide range of conditions, from depression to stroke. Migraine is the 19th most severe disease in terms of disability-adjusted life years, and 9th among women. It is characterized by attacks of headache accompanied by sensitivity to external stimuli, lasting 4-72 hours, and in a third of cases by neurological aura symptoms such as loss of vision, speech or muscle function. The underlying pathophysiology, including what triggers migraine attacks and why they occur in the first place, is largely unknown. The aim of this study was to identify genetic factors associated with hereditary susceptibility to migraine, in order to gain a better understanding of migraine mechanisms. In this thesis, we report the results of genetic linkage and association analyses of a Finnish migraine patient collection as well as of migraineurs from Australia, Denmark, Germany, Iceland and the Netherlands. Altogether we studied genetic information from nearly 7,000 migraine patients and over 50,000 population-matched controls. We also developed a new migraine analysis method, trait component analysis, which is based on individual patient responses instead of the clinical diagnosis. Using this method, we detected a number of new genetic loci for migraine, including loci on chromosomes 17p13 (HLOD 4.65) and 10q22-q23 (female-specific HLOD 7.68) with significant evidence of linkage, along with five other loci (2p12, 8q12, 4q28-q31, 18q12-q22, and Xp22) with suggestive evidence of linkage. The 10q22-q23 locus was the first genetic finding in migraine to show linkage to the same locus and markers in multiple populations, with consistent detection in six different scans. Traditionally, ion channels have been thought to play a role in migraine susceptibility, but we were able to exclude any significant role for common variants in a candidate gene study of 155 ion transport genes. This was followed by the first genome-wide association study in migraine, conducted on 2,748 migraine patients and 10,747 matched controls, with replication in 3,209 patients and 40,062 controls. In this study we found results reaching genome-wide significance, providing targets for future genetic and functional studies. Overall, we identified several promising genetic loci for migraine, providing a base for future studies.
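For context on the linkage statistics cited above (HLOD 4.65 and 7.68), the standard definitions are as follows; this is the textbook form of the LOD and heterogeneity-LOD scores, not anything specific to this study's software or parametrisation. The LOD score compares the likelihood of the marker data at a recombination fraction \theta < 1/2 against free recombination, and the HLOD additionally maximises over the proportion \alpha of families in which the locus is linked:

\mathrm{LOD}(\theta) = \log_{10} \frac{L(\theta)}{L(1/2)}, \qquad
\mathrm{HLOD} = \max_{\alpha,\, \theta} \sum_i \log_{10} \frac{\alpha L_i(\theta) + (1 - \alpha) L_i(1/2)}{L_i(1/2)},

where L_i is the likelihood for family i.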
Abstract:
Financial crises have shown that dramatic movements in one financial market can have a powerful impact on other markets. The paper proposes using cobreaking to model comovements between financial markets during crises and to test for contagion. It finds evidence of cobreaking between stock returns in developed markets, a finding with implications for the diversification of international investments. For emerging market stock returns the evidence of cobreaking is mainly due to the non-financial event of the 9/11 terrorist attacks in 2001. Financial crises originating in one emerging market do not spread to other markets, i.e., there is no contagion.
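A hedged sketch of the cobreaking idea, in the standard formulation of contemporaneous mean co-breaking (my summary of the general concept, not the paper's exact specification): if a vector of returns r_t has deterministic means subject to a structural break,

r_t = \mu + \delta\, 1\{t \ge T_b\} + \varepsilon_t,

then a non-zero vector \varphi with \varphi' \delta = 0 is a co-breaking vector, since the combination \varphi' r_t is free of the break at T_b; finding such combinations across markets indicates that the markets break together during a crisis, which is the basis of the contagion test.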