949 results for analytical approaches


Relevance: 20.00%

Abstract:

With increasing concern about consumer product-related injuries in Australia, product safety regulators need evidence-based research on risks and patterns to inform their decision-making. This study analysed paediatric injury data to identify and quantify product-related injuries in children, so as to inform product safety prioritisation, and it describes novel techniques for interrogating health data to identify trends and patterns in product-related injuries, informing strategic directions in this growing area of concern.

Relevance: 20.00%

Abstract:

In the finite element modelling of structural frames, external loads such as wind, dead and imposed loads usually act along the elements rather than at the nodes alone. Conventionally, when an element is subjected to these general transverse element loads, they are converted to nodal forces at the element ends by either the lumped or the consistent load approach. This treatment matters particularly for the first- and second-order elastic behaviour to which steel structures, especially thin-walled steel structures, are critically prone. Accurate first- and second-order elastic displacement solutions for the element load effect along an element are therefore crucial, but they cannot be obtained by the lumped or the consistent nodal load method alone, because neither enforces equilibrium along the element in the finite element formulation; this can impair the structural safety of the steel structure. A dedicated element load method is therefore needed to account for the element load nonlinearly. If an accurate displacement solution for first- and second-order elastic behaviour is sought on the basis of a sophisticated non-linear element stiffness formulation, numerous prescribed stiffness matrices would be required to cover the many transverse element loading patterns encountered in practice. To circumvent this shortcoming, the present paper proposes a numerical technique that includes the transverse element loading in the non-linear stiffness formulation without numerous prescribed stiffness matrices, and which is able to predict structural responses involving the first-order effect of element loads as well as the second-order coupling between the transverse load and the axial force in the element.
This paper shows that the principle of superposition can be applied to derive a generalised stiffness formulation for the element load effect, so that the form of the stiffness matrix remains unchanged for any specific loading pattern; only the loading magnitudes (element load coefficients) need to be adjusted in the stiffness formulation, and the non-linear effect of the element loading is then captured by updating these coefficients through the non-linear solution procedure. In essence, the element loading distribution is converted into a single loading magnitude at mid-span, which provides the initial perturbation that triggers the member bowing effect due to the transverse element loads. This approach sacrifices the effect of the loading distribution away from mid-span, so the load-deflection behaviour there may be less accurate than at mid-span, but the discrepancy is shown to be trivial. This novelty allows a very useful generalised stiffness formulation to be derived for a single higher-order element under arbitrary transverse loading patterns. A further contribution of the paper is the shift from purely nodal response (system analysis) to both nodal and element response (sophisticated element formulation): in the conventional finite element method, such as with the cubic element, accurate solutions are found only at the nodes, so structural safety cannot be reliably assessed within an element, which hinders engineering application. The results of the paper are verified using analytical stability function studies, as well as against numerical results reported by independent researchers on several simple frames.
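As a hedged illustration of the conventional consistent-load conversion that the paper sets out to improve upon, the sketch below computes the work-equivalent end forces for a uniformly distributed load on a two-node Euler-Bernoulli beam element, together with a single mid-span resultant in the spirit of the paper's mid-span element load coefficient. The function names and the simple resultant formula are our assumptions, not the paper's formulation.

```python
import numpy as np

def consistent_udl_loads(w, L):
    """Consistent (work-equivalent) end forces for a uniform load w
    on a 2-node Euler-Bernoulli beam element of length L.
    Ordering: [shear_1, moment_1, shear_2, moment_2]."""
    return np.array([w * L / 2, w * L**2 / 12, w * L / 2, -w * L**2 / 12])

def midspan_resultant(w, L):
    """Single equivalent load magnitude lumped at mid-span, a crude
    stand-in for the paper's mid-span perturbation idea."""
    return w * L
```

Note that the consistent end forces reproduce the load's total shear and fixed-end moments at the nodes only; the absence of equilibrium enforcement between the nodes is exactly the limitation the abstract describes.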

Relevance: 20.00%

Abstract:

This research developed and applied an evaluative framework to analyse multiple scales of decision-making for environmental management planning. It is the first exploration of the sociological theory of structural-functionalism and its usefulness in supporting evidence-based decision-making in a planning context. The framework was applied to analyse decision-making in Queensland's Cape York Peninsula and Wet Tropics regions.

Relevance: 20.00%

Abstract:

Lecturing is a traditional teaching method in discipline-based teaching environments, and its success in the legal discipline depends upon its alignment with learner backgrounds, learning objectives and the lecturing approaches used in class. Where students have no prior knowledge of a discipline that requires a particular lecturing approach, a mismatch in this alignment places learner knowledge acquisition in a challenging situation. From this perspective, this study tests the suitability of the two dominant lecturing approaches: the case-based and the law-based approach. It finds that a lecturer should place more emphasis on the case-based approach when lecturing to non-law background business students at the postgraduate level, provided that such emphasis is relative to the students' cognitive ability and their motivation for learning law units.

Relevance: 20.00%

Abstract:

The standard method for deciding bit-vector constraints is eager reduction to propositional logic, usually after first applying powerful rewrite techniques. While often efficient in practice, this method does not scale on problems for which top-level rewrites cannot reduce the problem size sufficiently. A lazy solver can target such problems by performing many satisfiability checks, each of which reasons about only a small subset of the problem. In addition, the lazy approach enables a wide range of optimization techniques that are not available to the eager approach. In this paper we describe the architecture and features of our lazy solver (LBV). We provide a comparative analysis of the eager and lazy approaches and show how they are complementary in terms of the types of problems they can efficiently solve. For this reason, we propose a portfolio approach that runs a lazy and an eager solver in parallel. Our empirical evaluation shows that the lazy solver can solve problems that none of the eager solvers can, and that the portfolio solver outperforms other solvers both in the total number of problems solved and in the time taken to solve them.
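The portfolio idea described above, running the lazy and eager solvers on the same problem in parallel and taking the first definitive answer, can be sketched as follows. The solver callables and their "sat"/"unsat"/"unknown" return convention are illustrative assumptions, not LBV's actual API.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def portfolio_solve(problem, solvers):
    """Run each solver on the same problem in parallel.
    solvers: dict mapping a name to a callable that returns
    'sat', 'unsat', or 'unknown' (a hypothetical interface).
    Returns (winner_name, result) for the first definitive answer."""
    with ThreadPoolExecutor(max_workers=len(solvers)) as pool:
        futures = {pool.submit(solve, problem): name
                   for name, solve in solvers.items()}
        for fut in as_completed(futures):
            result = fut.result()
            if result in ("sat", "unsat"):
                # A definitive answer wins; try to cancel the rest.
                for other in futures:
                    other.cancel()
                return futures[fut], result
    return None, "unknown"
```

In a real portfolio the losing solver process would be killed rather than merely cancelled, since SMT solvers do not cooperatively abandon a running check.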

Relevance: 20.00%

Abstract:

Nth-Dimensional Truncated Polynomial Ring (NTRU) is a lattice-based public-key cryptosystem that offers encryption and digital signature solutions. It was designed by Silverman, Hoffstein and Pipher. The NTRU cryptosystem was patented by NTRU Cryptosystems Inc. (later acquired by Security Innovation) and is available as the IEEE 1363.1 and X9.98 standards. NTRU is resistant to attacks based on quantum computing, to which the standard RSA and ECC public-key cryptosystems are vulnerable, and it also offers performance advantages over these cryptosystems. Given this importance, it is highly recommended to adopt NTRU as part of a cipher suite, alongside widely used cryptosystems, for internet security protocols and applications. In this paper, we present our analytical study of the implementation of the NTRU encryption scheme, which serves as a guideline for security practitioners who are new to lattice-based cryptography, or even to cryptography in general. In particular, we show some non-trivial issues that should be considered towards a secure and efficient NTRU implementation.
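NTRU arithmetic takes place in the truncated polynomial ring Z_q[x]/(x^N - 1), where multiplication is a cyclic convolution of coefficient vectors. A minimal sketch of that core operation (parameter values and naming are ours, and this is far from a secure implementation):

```python
def ring_mul(a, b, N, q):
    """Multiply polynomials a and b (coefficient lists of length N)
    in Z_q[x]/(x^N - 1): cyclic convolution with coefficients mod q."""
    c = [0] * N
    for i in range(N):
        for j in range(N):
            c[(i + j) % N] = (c[(i + j) % N] + a[i] * b[j]) % q
    return c
```

This schoolbook convolution is shown only to fix the algebra; a production implementation must use standardised parameter sets and guard against side channels, which is the kind of non-trivial issue the study examines.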

Relevance: 20.00%

Abstract:

Debates over the legitimacy and legality of prostitution have characterised human trafficking discourse for the last two decades. This article identifies the extent to which competing perspectives concerning the legitimacy of prostitution have influenced anti-trafficking policy in Australia and the United States, and argues that each nation-state’s approach to domestic sex work has influenced trafficking legislation. The legal status of prostitution in each country, and feminist influences on prostitution law reform, have had a significant impact on the nature of the legislation adopted.

Relevance: 20.00%

Abstract:

Employment on the basis of merit is the foundation of Australia’s equal opportunity legislation, beginning with the Affirmative Action (Equal Opportunity for Women) Act 1986, and continuing through the Equal Opportunity for Women in the Workplace Act 1999 to the Workplace Gender Equality Act 2012, all of which require organisations with more than 100 employees to produce an organisational program promoting employment equity for women (WGEA 2014a; Strachan, Burgess & Henderson 2007). The issue of merit was seen as critically important to the objectives of the original 1986 Act and the Affirmative Action Agency produced two monographs in 1988 written by Clare Burton: Redefining Merit (Burton 1988a) and Gender Bias in Job Evaluation (Burton 1988b) which provided practical advice. Added to this, in 1987 the Australian Government Publishing Service published Women’s Worth: Pay Equity and Job Evaluation in Australia (Burton, Hag & Thompson 1987). The equity programs set up under the 1986 legislation aimed to ‘eliminate discriminatory employment practices and to promote equal employment opportunities for women’ and this was ‘usually understood to mean that the merit principle forms the basis of appointment to positions and for promotion’ (Burton 1988a, p. 1).

Relevance: 20.00%

Abstract:

This paper discusses three different ways of applying a single-objective binary genetic algorithm (GA) to wind farm design. The different applications are introduced by altering the binary encoding methods in the GA code. The first encoding method is the traditional one with fixed wind turbine positions. The second varies the initial positions obtained from the first method, using binary digits to represent a wind turbine's coordinate on the X or Y axis. The third mixes the first encoding method with another that adds four more binary digits to represent one of the unavailable plots. The goal of this paper is to demonstrate how the single-objective binary GA can be applied and how wind turbines are distributed with the best fitness under various conditions. The discussion focuses mainly on the scenario of wind direction varying from 0° to 45°. Results show that choosing appropriate wind turbine positions is more significant than choosing the number of wind turbines, since the former has a larger influence on the fitness of the whole farm than the latter, and that the farm achieves its best fitness values, farm efficiency and total power for directions between 20° and 30°.

Relevance: 20.00%

Abstract:

This research used a case study approach to examine curriculum understandings and the processes of curriculum development at a Vietnamese university. The study proposes a participatory model for curriculum development contextualized for Vietnamese higher education. The study found that the curriculum is understood in diverse and sometimes conflicting ways by students, academics and administrative staff, and is developed in a hierarchical manner. Hence, the participatory model incorporates recommendations for effective practices of curriculum development at different levels within Vietnamese universities.

Relevance: 20.00%

Abstract:

This paper proposes adding a weighted median Fisher discriminator (WMFD) projection prior to length-normalised Gaussian probabilistic linear discriminant analysis (GPLDA) modelling in order to compensate for additional session variation. In limited microphone data conditions, a linear-weighted approach is introduced to increase the influence of the microphone speech dataset. The linear-weighted WMFD-projected GPLDA system shows improvements in EER and DCF values over the pooled LDA- and WMFD-projected GPLDA systems in the interview-interview condition, as the WMFD projection extracts more speaker-discriminant information from a limited number of sessions per speaker, and the linear-weighted GPLDA approach estimates reliable model parameters with limited microphone data.
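Two of the preprocessing ideas above, length normalisation before GPLDA and linear weighting to boost the microphone data's influence when pooling statistics, can be sketched as follows. The array shapes and the pooling formula are our assumptions for illustration, not the paper's exact estimator or the WMFD projection itself.

```python
import numpy as np

def length_normalise(X):
    """Scale each i-vector (row of X) to unit length, as done prior
    to GPLDA modelling."""
    return X / np.linalg.norm(X, axis=1, keepdims=True)

def linear_weighted_pool(S_tel, S_mic, alpha):
    """Linearly weight a microphone-data statistic (e.g. a scatter or
    covariance matrix) against its telephone counterpart; alpha > 0.5
    increases the influence of the limited microphone dataset."""
    return alpha * S_mic + (1.0 - alpha) * S_tel
```

A discriminant projection such as LDA or WMFD would then be applied to the normalised vectors (`X @ W` for a learned projection matrix `W`) before PLDA scoring.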

Relevance: 20.00%

Abstract:

Because brain structure and function are affected in neurological and psychiatric disorders, it is important to disentangle the sources of variation in these phenotypes. Over the past 15 years, twin studies have found evidence for both genetic and environmental influences on neuroimaging phenotypes, but considerable variation across studies makes it difficult to draw clear conclusions about the relative magnitude of these influences. Here we performed the first meta-analysis of structural MRI data from 48 studies on >1,250 twin pairs, and diffusion tensor imaging data from 10 studies on 444 twin pairs. The proportion of total variance accounted for by genes (A), shared environment (C), and unshared environment (E) was calculated by averaging A, C, and E estimates across studies from independent twin cohorts, weighting by sample size. The results indicated that additive genetic estimates were significantly different from zero for all meta-analyzed phenotypes, with the exception of fractional anisotropy (FA) of the callosal splenium, and cortical thickness (CT) of the uncus, left parahippocampal gyrus, and insula. For many phenotypes there was also a significant influence of C. We now have good estimates of heritability for many regional and lobar CT measures, in addition to the global volumes. Confidence intervals are wide and the number of individuals is small for many of the other phenotypes. In conclusion, while our meta-analysis shows that imaging measures are strongly influenced by genes, and that novel phenotypes such as CT measures, FA measures, and brain activation measures look especially promising, replication across independent samples and demographic groups is necessary.
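The pooling step described, averaging A, C and E estimates across independent cohorts weighted by sample size, amounts to the following simplified sketch (the actual meta-analysis also handled confidence intervals, which are omitted here):

```python
def weighted_ace(studies):
    """Sample-size-weighted average of variance-component estimates.
    studies: list of (n, A, C, E) tuples, one per independent cohort,
    where n is the sample size and A + C + E = 1 per study."""
    total_n = sum(n for n, *_ in studies)
    A = sum(n * a for n, a, _, _ in studies) / total_n
    C = sum(n * c for n, _, c, _ in studies) / total_n
    E = sum(n * e for n, _, _, e in studies) / total_n
    return A, C, E
```

Because each study's components sum to one, the weighted averages do as well, so the pooled (A, C, E) remains a valid variance decomposition.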

Relevance: 20.00%

Abstract:

Meta-analyses estimate a statistical effect size for a test or an analysis by combining results from multiple studies, without necessarily having access to each individual study's raw data. Multi-site meta-analysis is crucial for imaging genetics, as single sites rarely have a sample size large enough to pick up effects of single genetic variants associated with brain measures. However, if raw data can be shared, combining data in a "mega-analysis" is thought to improve power and precision in estimating global effects. As part of an ENIGMA-DTI investigation, we use fractional anisotropy (FA) maps from 5 studies (total N=2,203 subjects, aged 9-85) to estimate heritability. We combine the studies through meta- and mega-analyses, as well as a mixture of the two: combining some cohorts with mega-analysis and meta-analyzing the results with those of the remaining sites. A combination of mega- and meta-approaches may boost power compared to meta-analysis alone.

Relevance: 20.00%

Abstract:

Designing a school library is a complex, costly and demanding process with important educational and social implications for the whole school community. Drawing upon recent research, this paper presents contrasting snapshots of two school libraries to demonstrate the impacts of greater and lesser collaboration in the designing process. After a brief literature review, the paper outlines the research design (qualitative case study, involving collection and inductive thematic analysis of interview data and student drawings). The select findings highlight the varying experiences of each school’s teacher-librarian through the four designing phases of imagining, transitioning, experiencing and reimagining. Based on the study’s findings, the paper concludes that design outcomes are enhanced through collaboration between professional designers and key school stakeholders including teacher-librarians, teachers, principals and students. The findings and recommendations are of potential interest to teacher-librarians, school principals, education authorities, information professionals and library managers, to guide user-centred library planning and resourcing.

Relevance: 20.00%

Abstract:

The need for better and more accurate assessments of testamentary and decision-making capacity grows as Australian society ages and incidences of mentally disabling conditions increase. Capacity is a legal determination, but one on which medical opinion is increasingly being sought. The difficulties inherent within capacity assessments are exacerbated by the ad hoc approaches adopted by legal and medical professionals based on individual knowledge and skill, as well as the numerous assessment paradigms that exist. This can negatively affect the quality of assessments, and results in confusion as to the best way to assess capacity. This article begins by assessing the nature of capacity. The most common general assessment models used in Australia are then discussed, as are the practical challenges associated with capacity assessment. The article concludes by suggesting a way forward to satisfactorily assess legal capacity given the significant ramifications of getting it wrong.