997 results for Analysis of binaries
Abstract:
Related-party (RP) transactions are said to be commonly used opportunistically in business and to contribute to corporate failures. While periodic disclosure is widely accepted as an effective means of monitoring such transactions, research is scant, particularly in countries where business dealings may be more susceptible to corruption. This study investigates the nature and extent of corporate RP disclosures across six countries in the Asia-Pacific region. The key finding is that companies in countries with stronger regulatory enforcement, shareholder protection, and control of corruption have more transparent RP disclosures. This evidence potentially contributes to reforms aimed at strengthening RP disclosure and compliance.
Abstract:
This thesis used Critical Discourse Analysis to investigate how a government policy and the newsprint media constructed discussion about young people’s participation in education or employment. The study found a continuous narrative across both sites that portrayed government as a noble agent taking action to redress the social disruption caused by young people’s disengagement. Unlike the education policy, the newsprint media blamed the young people who were disengaged and failed to recognise the barriers they often face. The study points to possibilities for utilising the power of narrative to build a fairer and more rigorous discussion of issues in the public sphere.
Abstract:
Authenticated Encryption (AE) is the cryptographic process of providing simultaneous confidentiality and integrity protection to messages. This approach is more efficient than the two-step alternative of first encrypting the message to provide confidentiality, and then, in a separate pass, generating a Message Authentication Code (MAC) to provide integrity protection. AE using symmetric ciphers can be provided either by stream ciphers with built-in authentication mechanisms or by block ciphers using appropriate modes of operation. Stream ciphers have the potential for higher performance and a smaller footprint in hardware and/or software than block ciphers. This property makes stream ciphers suitable for resource-constrained environments, where storage and computational power are limited. There have been several recent stream cipher proposals that claim to provide AE. These ciphers can be analysed using existing techniques that consider confidentiality or integrity separately; however, there is currently no framework for analysing these two properties of AE stream ciphers simultaneously. This thesis introduces a novel framework for the analysis of AE using stream cipher algorithms. It analyses the mechanisms for providing confidentiality and for providing integrity in AE algorithms using stream ciphers. There is a greater emphasis on the analysis of the integrity mechanisms, as little has been published on these in the context of authenticated encryption. The thesis makes four main contributions. The first contribution is the design of a framework that can be used to classify AE stream ciphers based on three characteristics. The first classification applies Bellare and Namprempre's work on the order in which encryption and authentication processes take place.
The second classification is based on the method used for accumulating the input message (either directly or indirectly) into the internal state of the cipher to generate a MAC. The third classification is based on whether the sequence used to provide encryption and authentication is generated using a single key and initial vector, or two keys and two initial vectors. The second contribution is the application of an existing algebraic method to analyse the confidentiality algorithms of two AE stream ciphers, namely SSS and ZUC. The algebraic method is based on treating the nonlinear filter (NLF) of these ciphers as a combiner with memory. This method enables us to construct equations for the NLF that relate the inputs, outputs and memory of the combiner to the output keystream. We show that both of these ciphers are secure against this type of algebraic attack. We conclude that using a key-dependent S-box in the NLF twice, and using two different S-boxes in the NLF of ZUC, prevents this type of algebraic attack. The third contribution is a new general matrix-based model for MAC generation where the input message is injected directly into the internal state. This model describes the accumulation process when the input message is injected directly into the internal state of a nonlinear filter generator. We show that three recently proposed AE stream ciphers can be considered as instances of this model, namely SSS, NLSv2 and SOBER-128. Our model is more general than previous investigations into direct injection. Possible forgery attacks against this model are investigated. It is shown that using a nonlinear filter in the accumulation process of the input message, when either the input message or the initial state of the register is unknown, prevents forgery attacks based on collisions. The last contribution is a new general matrix-based model for MAC generation where the input message is injected indirectly into the internal state.
This model uses the input message as a controller to accumulate a keystream sequence into an accumulation register. We show that three current AE stream ciphers can be considered as instances of this model; namely ZUC, Grain-128a and Sfinks. We establish the conditions under which the model is susceptible to forgery and side-channel attacks.
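The Bellare–Namprempre composition orders mentioned above can be made concrete with a small sketch of Encrypt-then-MAC. This is an illustrative toy only: the SHA-256 counter-mode keystream is a placeholder, not a vetted stream cipher, and the key handling is simplified.

```python
import hashlib
import hmac

def keystream(key, nonce, length):
    # Toy keystream: SHA-256 in counter mode. Placeholder only --
    # a real design would use a vetted stream cipher here.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_then_mac(enc_key, mac_key, nonce, message):
    # Encrypt first, then authenticate the ciphertext (the EtM order).
    ciphertext = bytes(m ^ k for m, k in
                       zip(message, keystream(enc_key, nonce, len(message))))
    tag = hmac.new(mac_key, nonce + ciphertext, hashlib.sha256).digest()
    return ciphertext, tag

def verify_then_decrypt(enc_key, mac_key, nonce, ciphertext, tag):
    # Reject on MAC failure before touching the plaintext.
    expected = hmac.new(mac_key, nonce + ciphertext, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failed")
    return bytes(c ^ k for c, k in
                 zip(ciphertext, keystream(enc_key, nonce, len(ciphertext))))
```

Note that the sketch uses separate keys for encryption and authentication, which corresponds to one side of the thesis's third classification (single versus dual key material).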
Abstract:
A dual-scale model of the torrefaction of wood was developed and used to study industrial configurations. At the local scale, the computational code solves the coupled heat and mass transfer and the thermal degradation mechanisms of the wood components. At the global scale, the two-way coupling between the boards and the stack channels is treated as an integral component of the process. This model is used to investigate the effect of the stack configuration on the heat treatment of the boards. The simulations highlight that the exothermic reactions occurring in each single board can accumulate along the stack. This phenomenon may result in dramatic heterogeneity of the process and poses a serious risk of thermal runaway, which is often observed in industrial plants. The model is used to explain how the risk of thermal runaway can be lowered by increasing the airflow velocity or the sticker thickness, or by reversing the gas flow.
Abstract:
The latest paradigm shift in government, termed Transformational Government, puts the citizen at the centre of attention. Including citizens in the design of online one-stop portals can help governmental organisations become more customer-focussed. This study describes the initial efforts of an Australian state government to develop an information architecture to structure the content of their future one-stop portal. To this end, card sorting exercises were conducted and analysed, utilising contemporary approaches found in academic and non-scientific literature. This paper describes the findings of the card sorting exercises in this particular case and discusses the suitability of the applied approaches in general, distinguishing between non-statistical, statistical, and hybrid approaches. Thus, on the one hand, this paper contributes to academia by describing the application of different card sorting approaches and discussing their strengths and weaknesses. On the other hand, it contributes to practice by explaining the approach taken by the authors’ research partner to develop a customer-focussed governmental one-stop portal. It thereby provides decision support for practitioners with regard to the different analysis methods that can be used to complement recent approaches in Transformational Government.
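Statistical treatment of card-sort data typically starts from a pairwise co-occurrence (or distance) matrix over the sorted items, which can then feed cluster analysis or dendrogram tools. A minimal sketch, where each participant's sort is a list of groups of item labels (the portal topics below are invented for illustration):

```python
from itertools import combinations

def cosort_distance(sorts, items):
    # Distance between two items = fraction of participants who did NOT
    # place them in the same group: 0.0 = always together, 1.0 = never.
    together = {pair: 0 for pair in combinations(sorted(items), 2)}
    for groups in sorts:
        for group in groups:
            for pair in combinations(sorted(group), 2):
                if pair in together:
                    together[pair] += 1
    n = len(sorts)
    return {pair: 1.0 - count / n for pair, count in together.items()}

# Two hypothetical participants sorting four portal topics
sorts = [
    [{"rego", "licence"}, {"schools", "childcare"}],
    [{"rego", "licence", "schools"}, {"childcare"}],
]
dist = cosort_distance(sorts, {"rego", "licence", "schools", "childcare"})
```

The resulting distance matrix is the usual input to hierarchical clustering, which is one of the statistical approaches the paper contrasts with manual (non-statistical) grouping.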
Abstract:
This paper offers an analysis of the character animation in Tangled to develop a deeper understanding of how Disney has approached the extension of their traditional aesthetic into the CG medium.
Abstract:
Globalised communication in society today is characterised by multimodal forms of meaning making in the context of increased cultural and linguistic diversity. This research paper responds to these imperatives, applying Halliday's (1978, 1994) categories of systemic functional linguistics - representational or ideational, interactive or interpersonal, and compositional or textual meanings. Following the work of Kress (2000), van Leeuwen (Kress and van Leeuwen, 1996), and Jewitt (2006), multimodal semiotic analysis is applied to claymation movies that were collaboratively designed by Year 6 students. The significance of this analysis is the metalanguage for textual work in the kineikonic mode - moving images.
Abstract:
Purpose – This paper aims to provide insights into the moral values embodied by a popular social networking site (SNS), Facebook. Design/methodology/approach – This study is based upon qualitative fieldwork, involving participant observation, conducted over a two-year period. The authors adopt the position that technology as well as humans has a moral character in order to disclose ethical concerns that are not transparent to users of the site. Findings – Much research on the ethics of information systems has focused on the way that people deploy particular technologies, and the consequences arising, with a view to making policy recommendations and ethical interventions. By focusing on technology as a moral actor with reach across and beyond the internet, the authors reveal the complex and diffuse nature of ethical responsibility and the consequent implications for governance of SNS. Research limitations/implications – The authors situate their research in a body of work known as disclosive ethics, and argue for an ongoing process of evaluating SNS to reveal their moral importance. Along with that of other authors in the genre, this work is largely descriptive, but the paper engages with prior research by Brey and Introna to highlight the scope for theory development. Practical implications – Governance measures that require the developers of social networking sites to revise their designs fail to address the diffuse nature of ethical responsibility in this case. Such technologies need to be opened up to scrutiny on a regular basis to increase public awareness of the issues and thereby disclose concerns to a wider audience. The authors suggest that there is value in studying the development and use of these technologies in their infancy, or if established, in the experiences of novice users. Furthermore, flash points in technological trajectories can prove useful sites of investigation. 
Originality/value – Existing research on social networking sites either fails to address ethical concerns head on or adopts a tool view of the technologies so that the focus is on the ethical behaviour of users. The authors focus upon the agency, and hence the moral character, of technology to show both the possibilities for, and limitations of, ethical interventions in such cases.
Abstract:
Matched case–control research designs can be useful because matching can increase power due to reduced variability between subjects. However, inappropriate statistical analysis of matched data could result in a change in the strength of association between the dependent and independent variables or a change in the significance of the findings. We sought to ascertain whether matched case–control studies published in the nursing literature utilized appropriate statistical analyses. Of 41 articles identified that met the inclusion criteria, 31 (76%) used an inappropriate statistical test for comparing data derived from case subjects and their matched controls. In response to this finding, we developed an algorithm to support decision-making regarding statistical tests for matched case–control studies.
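For paired binary outcomes, the standard appropriate choice is McNemar's test, which uses only the discordant pairs; an ordinary chi-square test on the pooled 2x2 table ignores the matching. A minimal sketch (the variable names and counts are ours, for illustration):

```python
def mcnemar_statistic(b, c, correction=True):
    # b: matched pairs where the case is exposed and its control is not
    # c: matched pairs with the reverse pattern
    # Concordant pairs carry no information about the association and drop out.
    if b + c == 0:
        raise ValueError("no discordant pairs")
    numerator = (abs(b - c) - 1) ** 2 if correction else (b - c) ** 2
    return numerator / (b + c)

# Example: 25 vs 10 discordant pairs
stat = mcnemar_statistic(25, 10)
```

The statistic is referred to the chi-square distribution with one degree of freedom, so values above 3.841 are significant at the 0.05 level; here stat = 5.6, so the matched association would be judged significant.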
Abstract:
A range of authors from the risk management, crisis management, and crisis communications literature have proposed different models as a means of understanding components of crisis. A common element across these sources is a focus on preparedness practices before disturbance events and response practices during events. This paper provides a critical analysis of three key explanatory models of how crises escalate, highlighting the strengths and limitations of each approach. The paper introduces an optimised conceptual model drawing on components of the previous work under the four phases of pre-event, response, recovery, and post-event. Within these four phases, a ten-step process is introduced that can enhance understanding of the progression of distinct stages of disturbance for different types of events. This crisis evolution framework is examined as a means of providing clarity and applicability to a range of infrastructure failure contexts and a path for further empirical investigation in this area.
Abstract:
KEEP CLEAR pavement markings are widely used at urban signalised intersections to direct drivers not to enter a blocked intersection. ‘Box junctions’, for example, are widely used in the United Kingdom and other European countries. In Australia, however, KEEP CLEAR markings are mostly used to improve access from side roads onto a main road, especially when the side road is very close to a signalised intersection. This paper aims to reveal how KEEP CLEAR markings affect the dynamic performance of queuing vehicles on the main road where a side road access is near a signalised intersection. Raw traffic field data was collected from an intersection on the Gold Coast, Australia, and the Kanade–Lucas–Tomasi (KLT) feature tracker approach was used to extract dynamic vehicle data from the raw video footage. The data analysis reveals that the KEEP CLEAR markings have a positive effect on the discharge of queuing vehicles on the main road. This finding refutes the traditional viewpoint that KEEP CLEAR pavement markings delay the departure of queuing vehicles due to the enlarged queue spacing. Directions for further study are also suggested.
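The KLT approach estimates per-feature motion between video frames by solving the Lucas–Kanade least-squares system over a local window (in practice, OpenCV exposes a pyramidal version as `calcOpticalFlowPyrLK`). A single-window sketch on synthetic frames, assuming pure translation and our own function and window parameters:

```python
import numpy as np

def lucas_kanade_step(frame1, frame2, x0, y0, win=7):
    # One Lucas-Kanade step: least-squares solve of Ix*u + Iy*v = -It
    # over a square window centred on pixel (x0, y0).
    gy, gx = np.gradient(frame1.astype(float))   # gradients along rows (y), cols (x)
    it = frame2.astype(float) - frame1.astype(float)
    h = win // 2
    window = (slice(y0 - h, y0 + h + 1), slice(x0 - h, x0 + h + 1))
    A = np.stack([gx[window].ravel(), gy[window].ravel()], axis=1)
    b = -it[window].ravel()
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol[0], sol[1]                        # estimated (u, v) displacement

# Synthetic frames: I(x, y) = x*y, second frame shifted one pixel in +x
ys, xs = np.mgrid[0:32, 0:32]
frame1 = xs * ys
frame2 = (xs - 1) * ys
u, v = lucas_kanade_step(frame1, frame2, 16, 16)
```

Applied frame to frame, such displacement estimates are what allow vehicle trajectories and discharge speeds to be extracted from the raw footage.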
Abstract:
In this study we develop a theorization of an Internet dating site as a cultural artifact. The site, Gaydar, is targeted at gay men. We argue that contemporary received representations of their sexuality figure heavily in the site’s focus by providing a cultural logic for the apparent ad hoc development trajectories of its varied commercial and non-commercial services. More specifically, we suggest that the growing sets of services related to the website are heavily enmeshed within current social practices and meanings. These practices and meanings are, in turn, shaped by the interactions and preferences of a variety of diverse groups involved in what is routinely seen within the mainstream literature as a singularly specific sexuality and cultural project. Thus, we attend to two areas – the influence of the various social engagements associated with Gaydar together with the further extension of its trajectory ‘beyond the web’. Through the case of Gaydar, we contribute a study that recognizes the need for attention to sexuality in information systems research and one which illustrates sexuality as a pivotal aspect of culture. We also draw from anthropology to theorize ICTs as cultural artifacts and provide insights into the contemporary phenomena of ICT enabled social networking.
Abstract:
The Action Lecture program is an innovative teaching method run in some nursery and primary schools in Paris and designed to improve pupils’ literacy. We report the results of an evaluation of this program. We describe the experimental protocol that was built to estimate the program’s impact on several types of indicators. Data were processed following a Differences-in-Differences (DID) method. We then use the estimated impact on academic achievement to conduct a cost-effectiveness analysis, taking a class-size reduction program as a benchmark. The results are positive for the Action Lecture program.
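The DID estimator compares the outcome change over time in the treated group with the change in the comparison group, using the latter as a proxy for what would have happened without the program. A minimal sketch with made-up score vectors (in practice DID is usually estimated by regression with group, period, and interaction terms):

```python
def did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post):
    # DID = (treated change over time) - (control change over time);
    # the control change absorbs common trends affecting both groups.
    mean = lambda xs: sum(xs) / len(xs)
    return ((mean(treat_post) - mean(treat_pre))
            - (mean(ctrl_post) - mean(ctrl_pre)))

# Hypothetical literacy scores before/after the program
effect = did_estimate([10, 12], [15, 17], [10, 10], [11, 11])
```

Here the treated group gains 5 points and the control group 1, so the estimated program effect is 4 points.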
Abstract:
Application of "advanced analysis" methods suitable for non-linear analysis and design of steel frame structures permits direct and accurate determination of ultimate system strengths, without resort to simplified elastic methods of analysis and semi-empirical specification equations. However, the application of advanced analysis methods has previously been restricted to steel frames comprising only compact sections that are not influenced by the effects of local buckling. A refined plastic hinge method suitable for practical advanced analysis of steel frame structures comprising non-compact sections is presented in a companion paper. The method implicitly accounts for the effects of gradual cross-sectional yielding, longitudinal spread of plasticity, initial geometric imperfections, residual stresses, and local buckling. The accuracy and precision of the method for the analysis of steel frames comprising non-compact sections is established in this paper by comparison with a comprehensive range of analytical benchmark frame solutions. The refined plastic hinge method is shown to be more accurate and precise than the conventional individual member design methods based on elastic analysis and specification equations.
Abstract:
This thesis is a study of whether the Australian Clean Energy Package complies with the rules of the World Trade Organization. It examines the legal framework for the Australian carbon pricing mechanism and related arrangements, using World Trade Organization law as the framework for analysis. In doing so, this thesis deconstructs the Clean Energy Package by considering the legal properties of eligible emissions units, the assistance measures introduced by the Package and the liabilities created by the carbon pricing mechanism.