960 results for European copyright code
Abstract:
Dear Editor, We thank Dr Klek for his interest in our article and for giving us the opportunity to clarify our study and share our thoughts. Our study examined the prevalence of malnutrition in an acute tertiary hospital and tracked patient outcomes prospectively.1 There are a number of reasons why we chose Subjective Global Assessment (SGA) to determine the nutritional status of patients. Firstly, we took the view that nutrition assessment tools should be used to determine nutritional status and to diagnose the presence and severity of malnutrition, whereas the purpose of nutrition screening tools is to identify individuals who are at risk of malnutrition. Nutritional assessment, rather than screening, should be used as the basis for planning and evaluating nutrition interventions for those diagnosed with malnutrition. Secondly, SGA is well accepted and validated as an assessment tool for diagnosing the presence and severity of malnutrition in clinical practice.2, 3 It has been used in many studies as a valid prognostic indicator of a range of nutritional and clinical outcomes.4, 5, 6 By contrast, the Malnutrition Universal Screening Tool (MUST)7 and Nutrition Risk Screening 2002 (NRS 2002)8 have been established as screening rather than assessment tools.
Abstract:
The system of self-regulation of advertising in the mass media was a dream scenario. If stakeholders complained and an advertisement was deemed offensive by an expert panel, it was an easy matter to withdraw the advertisement from the mass media and from public attention. This was done locally, according to the cultural values and aesthetics of the population and the mandate of the self-regulation board. To advertising regulators, the internet became their worst nightmare. The system of self-regulation was no longer closed: it could be circumvented by placing offending advertisements online. It was also no longer local, but global. All internet users had access to the same advertisements, regardless of cultural considerations. The awakening of global advertising self-regulation is something that demands discussion. It is of value to all attendees of AAA 2010 Europe, as it affects all advertising academics and all stakeholders in the advertising process. As the leading advertising body seeking to bring global advertising issues to a new venue in Europe, the AAA 2010 European Conference seems ripe for a special session on advertising self-regulation. This is especially true as the panel contributes European, US and Asia-Pacific viewpoints.
Abstract:
The global economy experienced continuous growth from 2002 to 2007, until the U.S. subprime mortgage crisis caused instability in worldwide stock markets. At the same time, global CEO turnover continued to fall, reaching 13.8 percent in 2007. In contrast, the CEO turnover rate in Australia increased to 18 percent in 2007. The purpose of this paper is to determine under what conditions a change of CEO is associated with firm performance. Succinctly, does the firm's decision to replace the CEO depend on the CEO's human capital or on firm performance? An empirical study of Australian listed firms (2005–2008) shows that firm performance is not a determinant of CEO turnover; rather, a CEO with less valuable human capital is more likely to be replaced. The study also finds that merely changing the CEO is not associated with firm performance. Rather, there is a positive association between firm performance and the successor's general human capital for firms that replace the CEO. Specifically, it is the internal successor's general human capital that is an important determinant of increasing firm performance. These results are important because they imply that CEO turnover is the result of a more active market for CEOs, and they contribute to explaining why firms retain CEOs despite poor firm performance.
Abstract:
Dose kernels may be used to calculate dose distributions in radiotherapy (as described by Ahnesjo et al., 1999). Their calculation requires the use of Monte Carlo methods, usually by forcing interactions to occur at a point. The Geant4 Monte Carlo toolkit provides a capability to force interactions to occur in a particular volume. We have modified this capability and created a Geant4 application to calculate dose kernels in Cartesian, cylindrical, and spherical scoring systems. The simulation considers monoenergetic photons incident at the origin of a 3 m × 3 m × 3 m water volume. Photons interact via Compton, photoelectric, pair production, and Rayleigh scattering processes. By default, Geant4 models photon interactions by sampling a physical interaction length (PIL) for each process. The process returning the smallest PIL is then considered to occur. In order to force the interaction to occur within a given length, L_FIL, we scale each PIL according to the formula: PIL_forced = L_FIL × (1 - exp(-PIL/PIL0)), where PIL0 is a constant. This ensures that the process occurs within L_FIL, whilst correctly modelling the relative probability of each process. Dose kernels were produced for incident photon energies of 0.1, 1.0, and 10.0 MeV. In order to benchmark the code, dose kernels were also calculated using the EGSnrc Edknrc user code. Identical scoring systems were used; namely, the collapsed cone approach of the Edknrc code. Relative dose difference images were then produced. Preliminary results demonstrate the ability of the Geant4 application to reproduce the shape of the dose kernels; median relative dose differences of 12.6, 5.75, and 12.6% were found for incident photon energies of 0.1, 1.0, and 10.0 MeV respectively.
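The PIL scaling described in this abstract can be sketched in a few lines. This is an illustrative sketch only, not the actual Geant4 implementation; the function and variable names (forced_pil, l_fil, pil0) are ours:

```python
import math

# Illustrative sketch of the forced-interaction-length scaling
# PIL_forced = L_FIL * (1 - exp(-PIL / PIL0)).
# Not the Geant4 implementation; names are our own.

def forced_pil(pil: float, l_fil: float, pil0: float) -> float:
    """Map an unforced physical interaction length into [0, l_fil).

    The mapping is strictly monotonic, so the process with the smallest
    unforced PIL still returns the smallest forced PIL, preserving the
    relative probability of each competing process.
    """
    return l_fil * (1.0 - math.exp(-pil / pil0))


# Example: two competing processes inside a forcing length of 5 units.
pils = {"compton": 2.0, "rayleigh": 7.5}
forced = {name: forced_pil(p, l_fil=5.0, pil0=3.0) for name, p in pils.items()}
# Both forced PILs lie within the forcing length, and the process with the
# smaller unforced PIL still "wins" the interaction.
```

Because 1 - exp(-PIL/PIL0) is bounded in [0, 1), the forced length can never exceed L_FIL, which is exactly the guarantee the abstract describes.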
Abstract:
The double-stranded conformation of cellular DNA is a central aspect of DNA stabilisation and protection. The helix preserves the genetic code against chemical and enzymatic degradation, metabolic activation, and the formation of secondary structures. However, there are various instances where single-stranded DNA is exposed, such as during replication or transcription, in the synthesis of chromosome ends, and following DNA damage. In these instances, single-stranded DNA binding proteins are essential for the sequestration and processing of single-stranded DNA. In order to bind single-stranded DNA, these proteins utilise a characteristic and evolutionarily conserved single-stranded DNA-binding domain, the oligonucleotide/oligosaccharide-binding (OB) fold. In this review we discuss a subset of these proteins involved in the direct maintenance of genomic stability, an important cellular process in the conservation of cellular viability and the prevention of malignant transformation. We discuss the central roles of single-stranded DNA binding proteins from the OB-fold domain family in DNA replication, the restart of stalled replication forks, DNA damage repair, cell cycle-checkpoint activation, and telomere maintenance.
Abstract:
This report provides an overview of the results of a collaborative research project titled "A model for research supervision of international students in engineering and information technology disciplines". This project aimed to identify factors influencing the success of culturally and linguistically diverse (CALD) higher degree research (HDR) students in the fields of Engineering and Information Technology at three Australian universities: Queensland University of Technology, The University of Western Australia, and Curtin University.
Abstract:
Light gauge steel frame wall systems are commonly used in industrial and commercial buildings, and there is a need for simple fire design rules to predict their load capacities and fire resistance ratings. During fire events, light gauge steel frame wall studs are subjected to non-uniform temperature distributions that cause thermal bowing, neutral axis shift and magnification effects, resulting in combined axial compression and bending actions on the studs. In this research, a series of full-scale fire tests was first conducted to evaluate the performance of light gauge steel frame wall systems with eight different wall configurations under standard fire conditions. Finite element models of light gauge steel frame walls were then developed, analysed under transient and steady-state conditions, and validated against the full-scale fire tests. Using the results from the fire tests and finite element analyses, a detailed investigation was undertaken into the prediction of the axial compression strength and failure times of light gauge steel frame wall studs in standard fires using the available fire design rules based on Australian, American and European standards. The results from both the fire tests and the finite element analyses were used to investigate the ability of these fire design rules to capture the complex effects of non-uniform temperature distributions, and their accuracy in predicting the axial compression strength of wall studs and the failure times. Suitable modifications to the fire design rules were then proposed. This article presents the details of this investigation into the fire design rules of light gauge steel frame walls and its results.
Abstract:
Recent fire research into the behaviour of light gauge steel frame (LSF) wall systems has developed fire design rules based on the Australian and European cold-formed steel design standards, AS/NZS 4600 and Eurocode 3 Part 1.3. However, these design rules are complex since the LSF wall studs are subjected to non-uniform elevated temperature distributions when the walls are exposed to fire from one side. Therefore this paper proposes an alternative design method for routine predictions of the fire resistance rating of LSF walls. In this method, suitable equations are first recommended to predict the idealised stud time-temperature profiles of eight different LSF wall configurations subject to standard fire conditions, based on full scale fire test results. A new set of equations was then proposed to find the critical hot flange (failure) temperature for a given load ratio for the same LSF wall configurations with varying steel grades and thicknesses. These equations were developed based on detailed finite element analyses that predicted the axial compression capacities and failure times of LSF wall studs subject to non-uniform temperature distributions with varying steel grades and thicknesses. This paper proposes a simple design method in which the two sets of equations developed for time-temperature profiles and critical hot flange temperatures are used to find the failure times of LSF walls. The proposed method was verified by comparing its predictions with the results from full scale fire tests and finite element analyses. This paper presents the details of this study, including the finite element models of LSF wall studs, the results from relevant fire tests and finite element analyses, and the proposed equations.
Abstract:
Deterministic computer simulation of physical experiments is now a common technique in science and engineering. Often, physical experiments are too time consuming, expensive or impossible to conduct. Complex computer models, or codes, used in place of physical experiments have led to the study of computer experiments, which are used to investigate many scientific phenomena. A computer experiment consists of a number of runs of the computer code with different input choices. The design and analysis of computer experiments is a rapidly growing area of statistical experimental design. This paper aims to discuss some practical issues when designing computer simulations and/or experiments for manufacturing systems. A case study approach is reviewed and presented.
Abstract:
Deterministic computer simulations of physical experiments are now common techniques in science and engineering. Often, physical experiments are too time consuming, expensive or impossible to conduct. Complex computer models, or codes, used in place of physical experiments have led to the study of computer experiments, which are used to investigate many scientific phenomena of this nature. A computer experiment consists of a number of runs of the computer code with different input choices. The design and analysis of computer experiments is a rapidly growing area of statistical experimental design. This thesis investigates some practical issues in the design and analysis of computer experiments and attempts to answer some of the questions faced by experimenters using computer experiments. In particular, it studies the question of how many computer runs are needed and how they should be augmented, with attention given to the case where the response is a function over time.
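As a concrete illustration of "a number of runs of the computer code with different input choices": the input points for a computer experiment are often chosen with a space-filling design such as a Latin hypercube, in which each input's range is divided into as many equal strata as there are runs and each stratum is sampled exactly once. The following minimal sketch (our own function name, standard library only; the thesis does not prescribe this particular design) generates such a design on the unit cube:

```python
import random

def latin_hypercube(n_runs: int, n_inputs: int, seed: int = 0) -> list[list[float]]:
    """One random Latin hypercube design on [0, 1)^n_inputs.

    Each input's range is split into n_runs equal strata, and each
    stratum is sampled exactly once, so the design is space-filling
    in every one-dimensional projection.
    """
    rng = random.Random(seed)
    columns = []
    for _ in range(n_inputs):
        # One point per stratum [i/n, (i+1)/n), then shuffle the strata
        # so runs are not ordered along this input.
        column = [(i + rng.random()) / n_runs for i in range(n_runs)]
        rng.shuffle(column)
        columns.append(column)
    # Transpose: one row of input values per run of the computer code.
    return [list(row) for row in zip(*columns)]


design = latin_hypercube(n_runs=5, n_inputs=2)
# Each of the 5 rows is one input setting at which the code would be run.
```

Each run of the computer code is then evaluated at one row of the design; augmenting the design with further runs while preserving this stratification is one of the questions the thesis examines.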