Abstract:
Intra-host sequence data from RNA viruses have revealed the ubiquity of defective viruses in natural viral populations, sometimes at surprisingly high frequency. Although defective viruses have long been known to laboratory virologists, their relevance in clinical and epidemiological settings has not been established. The discovery of long-term transmission of a defective lineage of dengue virus type 1 (DENV-1) in Myanmar, first seen in 2001, raised important questions about the emergence of transmissible defective viruses and their role in viral epidemiology. By combining phylogenetic analyses and dynamical modelling, we investigate how evolutionary and ecological processes at the intra-host and inter-host scales shaped the emergence and spread of the defective DENV-1 lineage. We show that this lineage of defective viruses emerged between June 1998 and February 2001, and that the defective virus was transmitted primarily through co-transmission with the functional virus to uninfected individuals. We provide evidence that, surprisingly, this co-transmission route has a higher transmission potential than transmission of functional dengue viruses alone. Consequently, we predict that the defective lineage should increase overall incidence of dengue infection, which could account for the historically high dengue incidence reported in Myanmar in 2001-2002. Our results show the unappreciated potential for defective viruses to impact the epidemiology of human pathogens, possibly by modifying the virulence-transmissibility trade-off, or to emerge as circulating infections in their own right. They also demonstrate that interactions between viral variants, such as complementation, can open new pathways to viral emergence.
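The inter-host dynamics described above can be caricatured with a two-class branching sketch in which co-infections (functional plus defective virus) reproduce at a higher rate than functional-only infections. All parameter values here are hypothetical illustrations, not estimates from the study:

```python
# Minimal deterministic branching sketch of co-transmission dynamics.
# r_f, r_c and the starting counts are hypothetical, not fitted values.

def simulate(generations, r_f=1.1, r_c=1.3, f0=1000.0, c0=1.0):
    """Track functional-only (F) and co-infected (C) case counts and
    return the frequency of the defective lineage each generation."""
    f, c = f0, c0
    fractions = []
    for _ in range(generations):
        f, c = f * r_f, c * r_c          # each class reproduces at its own rate
        fractions.append(c / (f + c))    # frequency of the defective lineage
    return fractions

fracs = simulate(30)
```

With any r_c > r_f the defective lineage's frequency rises monotonically, which is the qualitative behaviour the abstract predicts for a co-transmission route with higher transmission potential.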
Abstract:
There is currently little information available about the reasons for contraceptive use or non-use among young Australian women, or about the reasons for choosing specific types of contraceptive methods. A comprehensive life course perspective of women's experiences in using and obtaining contraceptives is lacking, particularly relating to women's perceived or physical barriers to access. This paper presents an analysis of qualitative data gathered from free-text comments provided by women born between 1973 and 1978 as part of their participation in the Australian Longitudinal Study on Women's Health. This is a large cohort study involving over 40,000 women from three age groups (aged 18-23, 40-45 and 70-75) who were selected in 1995 from the database of Medicare, the Australian universal health insurance system. The women have been surveyed every 3 years about their health by mailed self-report surveys, and more recently online. Written comments from 690 women across five surveys from 1996 (when they were aged 18-23 years) to 2009 (aged 31-36 years) were examined. Factors relating to contraceptive use and barriers to access were identified and explored using thematic analysis. Side-effects, method satisfaction, family timing, and hormonal balance were relevant to young women using contraception. Most women who commented about a specific contraceptive method wrote about the oral contraceptive pill. While many women were positive or neutral about their method, noting its convenience or non-contraceptive benefits, many others were concerned about adverse effects, affordability, method failure, and lack of choice. Negative experiences with health services, lack of information, and cost were identified as barriers to access. As the cohort aged over time, method choice, changing patterns of use, side-effects, and negative experiences with health services remained important themes.
Side-effects, convenience, and family timing play important roles in young Australian women's experiences of contraception and barriers to access. Contrary to assumptions, barriers to contraceptive access continue to be experienced by young women as they move into adulthood. Further research is needed about how to decrease barriers to contraceptive use and minimise negative experiences in order to ensure optimal contraceptive access for Australian women.
Abstract:
Deterministic computer simulations of physical experiments are now common in science and engineering, since physical experiments are often too time-consuming, expensive or impossible to conduct. The use of complex computer models, or codes, in place of physical experiments leads to the study of computer experiments, which are used to investigate many scientific phenomena of this nature. A computer experiment consists of a number of runs of the computer code with different input choices. The design and analysis of computer experiments is a rapidly growing area of statistical experimental design. This thesis investigates some practical issues in the design and analysis of computer experiments and attempts to answer some of the questions faced by experimenters using computer experiments. In particular, it studies how many runs a computer experiment requires and how an existing design should be augmented, with attention given to the case where the response is a function over time.
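A space-filling design such as a Latin hypercube is a standard starting point for choosing the input runs of a computer experiment. A minimal stdlib-only sketch, with an arbitrary design size and input dimension:

```python
import random

def latin_hypercube(n_runs, n_inputs, seed=0):
    """One random Latin hypercube design on [0, 1)^n_inputs:
    each input dimension is split into n_runs equal strata and
    every stratum is sampled exactly once."""
    rng = random.Random(seed)
    design = [[0.0] * n_inputs for _ in range(n_runs)]
    for j in range(n_inputs):
        strata = list(range(n_runs))
        rng.shuffle(strata)              # random pairing of strata to runs
        for i in range(n_runs):
            design[i][j] = (strata[i] + rng.random()) / n_runs
    return design

design = latin_hypercube(10, 3)
```

Each of the 10 runs then supplies one input vector to the computer code; augmenting the design (the question studied in the thesis) would mean adding further runs while preserving good space-filling properties.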
Abstract:
Neutrophils serve as an intriguing model for the study of innate immune cellular activity induced by physiological stress. We measured changes in the transcriptome of circulating neutrophils following an experimental exercise trial (EXTRI) consisting of 1 h of intense cycling immediately followed by 1 h of intense running. Blood samples were taken at baseline, 3 h, 48 h, and 96 h post-EXTRI from eight healthy, endurance-trained, male subjects. RNA was extracted from isolated neutrophils. Differential gene expression was evaluated using Illumina microarrays and validated with quantitative PCR. Gene set enrichment analysis identified enriched molecular signatures chosen from the Molecular Signatures Database. Blood concentrations of muscle damage indexes, neutrophils, interleukin (IL)-6 and IL-10 were increased (P < 0.05) 3 h post-EXTRI. Upregulated groups of functionally related genes 3 h post-EXTRI included gene sets associated with the recognition of tissue damage, the IL-1 receptor, and Toll-like receptor (TLR) pathways (familywise error rate, P value < 0.05). The core enrichment for these pathways included TLRs, low-affinity immunoglobulin receptors, S100 calcium binding protein A12, and negative regulators of innate immunity, e.g., IL-1 receptor antagonist, and IL-1 receptor associated kinase-3. Plasma myoglobin changes correlated with neutrophil TLR4 gene expression (r = 0.74; P < 0.05). Neutrophils had returned to their nonactivated state 48 h post-EXTRI, indicating that their initial proinflammatory response was transient and rapidly counterregulated. This study provides novel insight into the signaling mechanisms underlying the neutrophil responses to endurance exercise, suggesting that their transcriptional activity was particularly induced by damage-associated molecular patterns, hypothetically originating from the leakage of muscle components into the circulation.
Abstract:
Related-party (RP) transactions are said to be commonly used opportunistically in business and contribute to corporate failures. While periodic disclosure is widely accepted as an effective means of monitoring such transactions, research is scant, particularly in countries where business dealings may be more susceptible to corruption. This study investigates the nature and extent of corporate RP disclosures across six countries in the Asia-Pacific region. The key finding indicates that companies in countries with stronger regulatory enforcement, shareholders’ protection, and control for corruption, have more transparent RP disclosures. This evidence potentially contributes to reforms aimed at strengthening RP disclosure and compliance.
Abstract:
This thesis used Critical Discourse Analysis to investigate how a government policy and the newsprint media constructed discussion about young people's participation in education or employment. The study found a continuous narrative across both sites that positioned government as a noble agent taking action to redress the social disruption caused by young people's disengagement. Unlike the education policy, the newsprint media blamed young people who were disengaged and failed to recognise the barriers they often face. The study points to possibilities for utilising the power of narrative to build a fairer and more rigorous discussion of issues in the public sphere.
Abstract:
Authenticated Encryption (AE) is the cryptographic process of providing simultaneous confidentiality and integrity protection to messages. This approach is more efficient than the two-step process of encrypting the message to provide confidentiality and, in a separate pass, generating a Message Authentication Code (MAC) to provide integrity protection. AE using symmetric ciphers can be provided either by stream ciphers with built-in authentication mechanisms or by block ciphers using appropriate modes of operation. However, stream ciphers have the potential for higher performance and a smaller footprint in hardware and/or software than block ciphers. This property makes stream ciphers suitable for resource-constrained environments, where storage and computational power are limited. There have been several recent stream cipher proposals that claim to provide AE. These ciphers can be analysed using existing techniques that consider confidentiality or integrity separately; however, there is currently no framework for the analysis of AE stream ciphers that considers these two properties simultaneously. This thesis introduces a novel framework for the analysis of AE using stream cipher algorithms, and analyses the mechanisms for providing confidentiality and for providing integrity in AE algorithms using stream ciphers. Greater emphasis is placed on the analysis of the integrity mechanisms, as there is little in the public literature on this in the context of authenticated encryption. The thesis has four main contributions, as follows. The first contribution is the design of a framework that can be used to classify AE stream ciphers based on three characteristics. The first classification applies Bellare and Namprempre's work on the order in which encryption and authentication processes take place.
The second classification is based on the method used for accumulating the input message (either directly or indirectly) into the internal state of the cipher to generate a MAC. The third classification is based on whether the sequence that is used to provide encryption and authentication is generated using a single key and initial vector, or two keys and two initial vectors. The second contribution is the application of an existing algebraic method to analyse the confidentiality algorithms of two AE stream ciphers, namely SSS and ZUC. The algebraic method is based on considering the nonlinear filter (NLF) of these ciphers as a combiner with memory. This method enables us to construct equations for the NLF that relate the inputs, outputs and memory of the combiner to the output keystream. We show that both of these ciphers are secure against this type of algebraic attack. We conclude that using a key-dependent S-box in the NLF twice, and using two different S-boxes in the NLF of ZUC, prevents this type of algebraic attack. The third contribution is a new general matrix-based model for MAC generation where the input message is injected directly into the internal state. This model describes the accumulation process when the input message is injected directly into the internal state of a nonlinear filter generator. We show that three recently proposed AE stream ciphers, namely SSS, NLSv2 and SOBER-128, can be considered instances of this model. Our model is more general than previous investigations into direct injection. Possible forgery attacks against this model are investigated. It is shown that using a nonlinear filter in the accumulation process of the input message, when either the input message or the initial state of the register is unknown, prevents forgery attacks based on collisions. The last contribution is a new general matrix-based model for MAC generation where the input message is injected indirectly into the internal state.
This model uses the input message as a controller to accumulate a keystream sequence into an accumulation register. We show that three current AE stream ciphers, namely ZUC, Grain-128a and Sfinks, can be considered instances of this model. We establish the conditions under which the model is susceptible to forgery and side-channel attacks.
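The two accumulation styles in the classification can be illustrated with a deliberately simplified toy register (a 16-bit LFSR with arbitrarily chosen taps, not SSS, ZUC, Grain-128a or any other cipher named above): direct injection folds message bits straight into the state, while indirect injection uses message bits to select which keystream words are accumulated. Being wholly linear, the toy also shows what the thesis analyses, since a linear accumulator of this form admits collision-based forgeries, which is why real designs place a nonlinear filter in the accumulation path:

```python
# Toy 16-bit register contrasting direct vs indirect message accumulation.
# Illustrative only -- not a secure MAC and not any real cipher's design.

MASK = 0xFFFF
TAPS = 0b1011010000000000  # feedback taps for the toy LFSR (arbitrary choice)

def lfsr_step(state):
    fb = bin(state & TAPS).count("1") & 1  # XOR of the tapped bits
    return ((state << 1) | fb) & MASK

def mac_direct(key_state, msg_bits):
    """Direct injection: each message bit is XORed into the register state."""
    s = key_state
    for b in msg_bits:
        s = lfsr_step(s) ^ b
    return s

def mac_indirect(key_state, msg_bits):
    """Indirect injection: message bits gate which keystream words
    are XOR-accumulated into a separate accumulation register."""
    s, acc = key_state, 0
    for b in msg_bits:
        s = lfsr_step(s)
        if b:
            acc ^= s
    return acc
```

In both styles the MAC is a linear function of the message here, so two messages differing in a pattern that cancels in the accumulator collide; the thesis's matrix-based models make that condition precise and show how an unknown input or initial state plus a nonlinear filter blocks such forgeries.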
Abstract:
A dual-scale model of the torrefaction of wood was developed and used to study industrial configurations. At the local scale, the computational code solves the coupled heat and mass transfer and the thermal degradation mechanisms of the wood components. At the global scale, the two-way coupling between the boards and the stack channels is treated as an integral component of the process. This model is used to investigate the effect of the stack configuration on the heat treatment of the boards. The simulations highlight that the heat released by the exothermic reactions occurring in each single board can accumulate along the stack. This phenomenon may result in a dramatic heterogeneity of the process and poses a serious risk of thermal runaway, which is often observed in industrial plants. The model is used to explain how the risk of thermal runaway can be reduced by increasing the airflow velocity or the sticker thickness, or by reversing the gas flow.
Abstract:
The latest paradigm shift in government, termed Transformational Government, puts the citizen at the centre of attention. Including citizens in the design of online one-stop portals can help governmental organisations to become more customer-focussed. This study describes the initial efforts of an Australian state government to develop an information architecture to structure the content of its future one-stop portal. To this end, card sorting exercises were conducted and analysed, utilising contemporary approaches found in academic and non-scientific literature. This paper describes the findings of the card sorting exercises in this particular case and discusses the suitability of the applied approaches in general, distinguishing between non-statistical, statistical, and hybrid approaches. On the one hand, this paper contributes to academia by describing the application of different card sorting approaches and discussing their strengths and weaknesses. On the other hand, it contributes to practice by explaining the approach taken by the authors' research partner in order to develop a customer-focussed governmental one-stop portal, thereby providing decision support for practitioners with regard to the different analysis methods that can be used to complement recent approaches in Transformational Government.
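The statistical end of card-sort analysis typically starts from an item–item co-occurrence (similarity) matrix: how often two content cards were placed in the same pile, normalised by the number of participants. A minimal sketch, with invented cards and sorts rather than the actual portal content:

```python
# Build a pairwise similarity matrix from card sorts. Cards and pile
# assignments below are made up for illustration.

from itertools import combinations

sorts = [  # each participant groups content cards into piles
    [{"births", "marriages"}, {"road tax", "licences"}],
    [{"births", "marriages", "licences"}, {"road tax"}],
    [{"births", "marriages"}, {"road tax", "licences"}],
]

def cooccurrence(sorts):
    counts = {}
    for piles in sorts:
        for pile in piles:
            for a, b in combinations(sorted(pile), 2):
                counts[(a, b)] = counts.get((a, b), 0) + 1
    # normalise by the number of participants to get a similarity in [0, 1]
    return {pair: n / len(sorts) for pair, n in counts.items()}

sim = cooccurrence(sorts)
```

Hierarchical clustering of this matrix (a common statistical approach in the literature the paper surveys) then suggests candidate groupings for the portal's information architecture.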
Abstract:
This paper offers an analysis of the character animation in Tangled to develop a deeper understanding of how Disney has approached the extension of their traditional aesthetic into the CG medium.
Abstract:
Globalised communication in society today is characterised by multimodal forms of meaning making in the context of increased cultural and linguistic diversity. This research paper responds to these imperatives, applying Halliday's (1978, 1994) categories of systemic functional linguistics - representational or ideational, interactive or interpersonal, and compositional or textual meanings. Following the work of Kress (2000), van Leeuwen (Kress and van Leeuwen, 1996), and Jewitt (2006), multimodal semiotic analysis is applied to claymation movies that were collaboratively designed by Year 6 students. The significance of this analysis is the metalanguage for textual work in the kineikonic mode - moving images.
Abstract:
Purpose – This paper aims to provide insights into the moral values embodied by a popular social networking site (SNS), Facebook. Design/methodology/approach – This study is based upon qualitative fieldwork, involving participant observation, conducted over a two-year period. The authors adopt the position that technology as well as humans has a moral character in order to disclose ethical concerns that are not transparent to users of the site. Findings – Much research on the ethics of information systems has focused on the way that people deploy particular technologies, and the consequences arising, with a view to making policy recommendations and ethical interventions. By focusing on technology as a moral actor with reach across and beyond the internet, the authors reveal the complex and diffuse nature of ethical responsibility and the consequent implications for governance of SNS. Research limitations/implications – The authors situate their research in a body of work known as disclosive ethics, and argue for an ongoing process of evaluating SNS to reveal their moral importance. Along with that of other authors in the genre, this work is largely descriptive, but the paper engages with prior research by Brey and Introna to highlight the scope for theory development. Practical implications – Governance measures that require the developers of social networking sites to revise their designs fail to address the diffuse nature of ethical responsibility in this case. Such technologies need to be opened up to scrutiny on a regular basis to increase public awareness of the issues and thereby disclose concerns to a wider audience. The authors suggest that there is value in studying the development and use of these technologies in their infancy, or if established, in the experiences of novice users. Furthermore, flash points in technological trajectories can prove useful sites of investigation. 
Originality/value – Existing research on social networking sites either fails to address ethical concerns head on or adopts a tool view of the technologies so that the focus is on the ethical behaviour of users. The authors focus upon the agency, and hence the moral character, of technology to show both the possibilities for, and limitations of, ethical interventions in such cases.
Abstract:
Matched case–control research designs can be useful because matching can increase power due to reduced variability between subjects. However, inappropriate statistical analysis of matched data could result in a change in the strength of association between the dependent and independent variables or a change in the significance of the findings. We sought to ascertain whether matched case–control studies published in the nursing literature utilized appropriate statistical analyses. Of 41 articles identified that met the inclusion criteria, 31 (76%) used an inappropriate statistical test for comparing data derived from case subjects and their matched controls. In response to this finding, we developed an algorithm to support decision-making regarding statistical tests for matched case–control studies.
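For the common case of 1:1 matching with a binary exposure, the appropriate analysis is McNemar's test on the discordant pairs, rather than an unpaired chi-square test that ignores the matching. A minimal stdlib-only sketch (the discordant-pair counts are invented for illustration):

```python
# McNemar's test for 1:1 matched case-control data (binary exposure).
# b = pairs where the case is exposed and the matched control is not;
# c = pairs where the control is exposed and the case is not.
# Concordant pairs carry no information and are ignored.

import math

def mcnemar(b, c):
    """Chi-square statistic (1 df, no continuity correction) and p-value."""
    stat = (b - c) ** 2 / (b + c)
    # survival function of the chi-square distribution with 1 df
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

stat, p = mcnemar(15, 5)  # hypothetical counts: 15 vs 5 discordant pairs
```

A conditional logistic regression generalises this to multiple covariates and other matching ratios; the point of the algorithm proposed in the article is to steer analysts toward such paired methods.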
Abstract:
A range of authors from the risk management, crisis management, and crisis communications literature have proposed different models as a means of understanding the components of crisis. A common thread across these sources is a focus on preparedness practices before disturbance events and response practices during events. This paper provides a critical analysis of three key explanatory models of how crises escalate, highlighting the strengths and limitations of each approach. The paper introduces an optimised conceptual model utilising components from the previous work under the four phases of pre-event, response, recovery, and post-event. Within these four phases, a ten-step process is introduced that can enhance understanding of the progression of distinct stages of disturbance for different types of events. This crisis evolution framework is examined as a means to provide clarity and applicability to a range of infrastructure failure contexts and to provide a path for further empirical investigation in this area.
Abstract:
KEEP CLEAR pavement markings are widely used at urban signalised intersections to direct drivers not to enter a blocked intersection. For example, ‘box junctions’ are widely used in the United Kingdom and other European countries. In Australia, however, KEEP CLEAR markings are mostly used to improve access from side roads onto a main road, especially when the side road is very close to a signalised intersection. This paper aims to reveal how KEEP CLEAR markings affect the dynamic performance of queuing vehicles on the main road when a side road access is near a signalised intersection. Raw traffic field data were collected from an intersection on the Gold Coast, Australia, and the Kanade–Lucas–Tomasi (KLT) feature tracker was used to extract dynamic vehicle data from the raw video footage. The data analysis reveals that the KEEP CLEAR markings have a positive effect on the discharge of queuing vehicles on the main road. This finding refutes the traditional viewpoint that KEEP CLEAR pavement markings delay the departure of queuing vehicles due to the enlarged queue spacing. Further studies are also suggested in this paper.