957 results for "many-body perturbation theory"


Relevance:

30.00%

Publisher:

Abstract:

This thesis investigates the phenomenon of self-harm as a form of political protest using two different, but complementary, methods of inquiry: a theoretical research project and a novel. Through these two approaches to the same research problem, I examine how we can re-position the body that self-harms in political protest from weapon to voice, and in doing so find a path towards ethical and equitable dialogue between marginalised and mainstream communities. The theoretical, or academic, portion of the thesis examines self-harm as protest, positing these acts as a form of tactical self-harm and acknowledging its emergence as a voice for the otherwise silenced in the public sphere. Through the use of phenomenology and feminist theory, I examine the body as a site for political agency, the circumstances which surround the use of the body for protest, and the reaction to tactical self-harm by the individual and the state. Using Bakhtin’s concepts of dialogism and dialogic space, I propose that by ‘hearing’ the body engaged in tactical self-harm we come closer to entering into an ethical dialogue with the otherwise silenced in our communities (locally, nationally and globally). The novel, Imperfect Offerings, explores these ideas in a fictional world, and allows me to put faces, names and lives to those who are compelled to harm their bodies to be heard. Also using Bakhtin’s framework, I encourage a dialogue between the critical and creative parts of the thesis, challenging the traditional paradigm of creative PhD projects as creative work plus exegesis.

This paper examines the varying approaches and methodologies adopted when the calculation of holding costs is undertaken, focusing on greenfield development. Whilst acknowledging there may be some consistency in embracing first principles relating to holding cost theory, a review of the literature reveals a considerable lack of uniformity in this regard. There is even less clarity in quantitative determination, especially in Australia, where only limited empirical analysis has been undertaken. Despite a growing body of research into various elements of housing affordability, the matter of holding costs has not been well addressed, despite its place in the Australian Government’s highly prioritised housing research agenda. The end result has been a modicum of qualitative commentary relating to holding costs, with few attempts at finer-grained analysis that exposes a quantified level of holding cost calculated with underlying rigour. Holding costs can take many forms, but they inevitably involve the computation of the “carrying costs” of an initial outlay that has yet to fully realise its ultimate yield. Although sometimes considered a “hidden” cost, it is submitted that holding costs prospectively represent a major determinant of value. If so, then considered in the context of housing affordability, their effect is potentially pervasive.
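The “carrying cost” computation referred to above can be illustrated as simple compounding of an initial outlay at an opportunity-cost rate. The figures below are hypothetical, not drawn from the paper:

```python
def carrying_cost(outlay, annual_rate, years):
    """Opportunity cost of an initial outlay compounded until realisation."""
    return outlay * ((1 + annual_rate) ** years - 1)

# A hypothetical $500,000 greenfield outlay held for 3 years at 6% p.a.
cost = carrying_cost(500_000, 0.06, 3)
```

In practice the rate chosen (cost of debt, weighted average cost of capital, or a target yield) is itself one of the sources of non-uniformity the paper describes.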

This paper describes an automated procedure for analysing the significance of each of the many terms in the equations of motion for a serial-link robot manipulator. Significance analysis provides insight into the rigid-body dynamic effects that are significant locally or globally in the manipulator's state space. Deleting those terms that do not contribute significantly to the total joint torque can greatly reduce the computational burden for online control, and a Monte Carlo-style simulation is used to investigate the errors thus introduced. The procedures described are a hybrid of symbolic and numeric techniques, and can be readily implemented using standard computer algebra packages.
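A minimal numeric sketch of the significance screen described here: each torque term's mean absolute contribution is estimated by Monte Carlo sampling of the state space and compared against the total joint torque. The toy one-joint terms and the threshold are assumptions for illustration; the paper's actual procedure is a symbolic-numeric hybrid.

```python
import math
import random

def significant_terms(terms, sample_state, n=1000, tol=0.01, seed=0):
    """Keep only terms whose mean |contribution| exceeds tol times the
    mean |total torque|, estimated by Monte Carlo sampling of states."""
    rng = random.Random(seed)
    states = [sample_state(rng) for _ in range(n)]
    totals = [sum(t(*s) for t in terms) for s in states]
    scale = sum(abs(v) for v in totals) / n
    kept = []
    for t in terms:
        mean_abs = sum(abs(t(*s)) for s in states) / n
        if mean_abs > tol * scale:
            kept.append(t)
    return kept

# Toy one-joint example: a gravity term dominates; a tiny viscous term is dropped.
terms = [lambda q, qd: 9.81 * math.cos(q),   # gravity load
         lambda q, qd: 1e-5 * qd]            # negligible viscous friction
sample = lambda rng: (rng.uniform(-3.14, 3.14), rng.uniform(-1.0, 1.0))
kept = significant_terms(terms, sample)
```

The surviving term list is then what an online controller would evaluate, with the Monte Carlo residuals quantifying the torque error introduced by the deletions.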

This thesis is about the derivation of the addition law on an arbitrary elliptic curve, and about efficiently adding points on such a curve using the derived addition law. The outcomes of this research guarantee practical speedups in higher-level operations which depend on point additions. In particular, the contributions immediately find applications in cryptology. The theory of elliptic curves was mastered by nineteenth-century mathematicians, and its study has remained active for decades. Elliptic curves over finite fields made their way into public key cryptography in the late 1980s with independent proposals by Miller [Mil86] and Koblitz [Kob87]. Elliptic Curve Cryptography (ECC), following Miller's and Koblitz's proposals, employs the group of rational points on an elliptic curve in building discrete-logarithm-based public key cryptosystems. Starting from the late 1990s, the emergence of the ECC market has boosted research into the computational aspects of elliptic curves. This thesis falls into this same area of research, where the main aim is to speed up the addition of rational points on an arbitrary elliptic curve (over a field of large characteristic). The outcomes of this work can be used to speed up applications which are based on elliptic curves, including cryptographic applications in ECC.

The aforementioned goals of this thesis are achieved in five main steps. As the first step, this thesis brings together several algebraic tools in order to derive the unique group law of an elliptic curve. This step also includes an investigation of recent computer algebra packages and their capabilities. Although the group law is unique, its evaluation can be performed using abundant (in fact infinitely many) formulae. As the second step, this thesis advances the search for the best formulae for efficient addition of points. In the third step, the group law is stated explicitly by handling all possible summands. The fourth step presents the algorithms to be used for efficient point additions. In the fifth and final step, optimized software implementations of the proposed algorithms are presented in order to show that the theoretical speedups of step four can be obtained in practice.

In each of the five steps, this thesis focuses on five forms of elliptic curves over finite fields of large characteristic. These forms and their defining equations are as follows:
(a) Short Weierstrass form: y^2 = x^3 + ax + b
(b) Extended Jacobi quartic form: y^2 = dx^4 + 2ax^2 + 1
(c) Twisted Hessian form: ax^3 + y^3 + 1 = dxy
(d) Twisted Edwards form: ax^2 + y^2 = 1 + dx^2y^2
(e) Twisted Jacobi intersection form: bs^2 + c^2 = 1, as^2 + d^2 = 1
These forms are the most promising candidates for efficient computations and are thus considered in this work. Nevertheless, the methods employed in this thesis are capable of handling arbitrary elliptic curves.

From a high-level point of view, the following outcomes are achieved in this thesis.
- Related literature results are brought together and further revisited. For most of the cases, several missed formulae, algorithms, and efficient point representations are discovered.
- Analogies are made among all studied forms. For instance, it is shown that two sets of affine addition formulae are sufficient to cover all possible affine inputs, as long as the output is also an affine point, in any of these forms. In the literature, many special cases, especially interactions with points at infinity, were omitted from discussion. This thesis handles all of the possibilities.
- Several new point doubling/addition formulae and algorithms are introduced, which are more efficient than the existing alternatives in the literature. Most notably, the speed of the extended Jacobi quartic, twisted Edwards, and Jacobi intersection forms is improved. New unified addition formulae are proposed for short Weierstrass form. New coordinate systems are studied for the first time.
- An optimized implementation is developed using a combination of generic x86-64 assembly instructions and the plain C language. The practical advantages of the proposed algorithms are supported by computer experiments.
- All formulae presented in the body of this thesis are checked for correctness using computer algebra scripts, together with details on register allocations.
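As a small illustration of form (d) above, the standard complete addition law on a twisted Edwards curve ax^2 + y^2 = 1 + dx^2y^2 can be sketched over a toy prime field. The tiny parameters below (p = 13, a = 1, d = 2) are illustrative choices, not values from the thesis; a is a square and d a non-square mod p, which is the condition under which these formulae are complete (no special cases, no point at infinity).

```python
p, a, d = 13, 1, 2   # toy field; a is a square and d a non-square mod p,
                     # so the addition law below is complete

def on_curve(P):
    x, y = P
    return (a * x * x + y * y) % p == (1 + d * x * x * y * y) % p

def add(P, Q):
    """Complete twisted Edwards addition on ax^2 + y^2 = 1 + dx^2y^2."""
    x1, y1 = P
    x2, y2 = Q
    t = d * x1 * x2 * y1 * y2 % p
    x3 = (x1 * y2 + y1 * x2) * pow(1 + t, -1, p) % p
    y3 = (y1 * y2 - a * x1 * x2) * pow(1 - t, -1, p) % p
    return (x3, y3)

O = (0, 1)   # identity element
P = (1, 0)   # a point of order 4 on this toy curve
```

In a real implementation the field inversions would be avoided with projective or extended coordinates, which is precisely the kind of operation-count optimisation the thesis pursues.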

The reliability of Critical Infrastructure is considered a fundamental expectation of modern societies. These large-scale socio-technical systems have always, due to their complex nature, faced threats challenging their ongoing functioning. However, increasing uncertainty, in addition to the trend of infrastructure fragmentation, has made reliable service provision not only a key organisational goal but a major continuity challenge, especially given the highly interdependent network conditions that exist both regionally and globally. The notion of resilience as an adaptive capacity supporting infrastructure reliability under conditions of uncertainty and change has emerged as a critical capacity for systems of infrastructure and the organisations responsible for their reliable management. This study explores infrastructure reliability through the lens of resilience, from an organisation and system perspective, using two recognised resilience-enhancing management practices, High Reliability Theory (HRT) and Business Continuity Management (BCM), to better understand how this phenomenon manifests within a partially fragmented (corporatised) critical infrastructure industry – the Queensland Electricity Industry. The methodological approach involved a single case study design (the industry) with embedded sub-units of analysis (organisations), utilising in-depth interviews and document analysis to elicit findings. Derived from detailed assessment of BCM and reliability-enhancing characteristics, findings suggest that the industry as a whole exhibits resilient functioning; however, this was found to manifest at different levels across the industry and in different combinations. Whilst there were distinct differences with respect to resilient capabilities at the organisational level, differences were less marked at the systems (industry) level, with many common understandings carried over from the pre-corporatised operating environment.
These Heritage Factors were central to understanding the systems level cohesion noted in the work. The findings of this study are intended to contribute to a body of knowledge encompassing resilience and high reliability in critical infrastructure industries. The research also has value from a practical perspective, as it suggests a range of opportunities to enhance resilient functioning under increasingly interdependent, networked conditions.

If one clear argument emerged from my doctoral thesis in political science, it is that there is no agreement as to what democracy is. There are over 40 different varieties of democracy, ranging from those in the mainstream with subtle or minute differences to those playing by themselves in the corner. And many of these various types of democracy are very well argued, empirically supported, and highly relevant to certain polities. The irony is that the thing all of these varieties share, the ‘basic democracy’ from which all other forms of democracy stem, is elusive. There is no international agreement, in the literature or in political practice, as to what ‘basic democracy’ is, and that is problematic, as many of us use the word ‘democracy’ every day and it is a concept of tremendous importance internationally. I am still uncertain as to why this problem has not been resolved before by far greater minds than my own; it may have something to do with the recent growth in democratic theory this past decade and the innovative areas of thought my thesis required, but I think I’ve got the answer. By listing each type of democracy and pairing this list with the literature associated with each of these various styles, I amassed a large and comprehensive body of textual data. My research intended to find out what these various styles of democracy had in common and to create a taxonomy of democracy (like the ‘tree of life’ in biology), attempting to show how the various styles of democracy have ‘evolved’ over the past 5,000 years. I then ran a word-frequency analysis program, a piece of software that counts the 100 most commonly used words in the texts. This is where my logic came in, as I had to make sense of these words. How did they answer what the most fundamental commonalities are between 40 different styles of democracy?
I used a grounded theory analysis, which required that I argue my way through these words to form a ‘theory’, a plausible explanation as to why these particular words, and not others, are the important ones for answering the question. It came down to the argument that all 40 styles of democracy analysed have the following in common: (1) a concept of a citizenry; (2) a concept of sovereignty; (3) a concept of equality; (4) a concept of law; (5) a concept of communication; and (6) a concept of selecting officials. Thus, democracy is a defined citizenry with its own concept of sovereignty, which it exercises through the institutions that support the citizenry’s understandings of equality, law, communication, and the selection of officials. Once any of these six concepts is defined in a particular way, it creates a style of democracy. From this, we can also see that there can be more than one style of democracy active in a particular government, as a citizenry is composed of many different aggregates, each with its own understandings of the six concepts.
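The word-frequency step described above can be sketched in a few lines; the corpus here is a made-up stand-in for the collected democracy literature, not the thesis data:

```python
from collections import Counter
import re

def top_words(texts, n=100):
    """Count the n most common words across a body of texts."""
    words = re.findall(r"[a-z']+", " ".join(texts).lower())
    return Counter(words).most_common(n)

# Hypothetical two-document corpus standing in for the real textual data.
corpus = ["the citizenry holds sovereignty",
          "law binds the citizenry in equality"]
common = top_words(corpus, n=3)
```

The grounded-theory work then begins where this ends: the counts only surface candidate commonalities, and the argument for why a frequent word matters is made by the analyst.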

This work is a digital version of a dissertation first submitted in partial fulfillment of the Degree of Doctor of Philosophy at the Queensland University of Technology (QUT) in March 1994. The work was concerned with problems of self-organisation and organisation ranging from local to global levels of hierarchy. It considers organisations as living entities and examines, from local to global levels, the things that a living entity – more particularly, an individual, a body corporate or a body politic – must know and do to maintain an existence (that is, to remain viable) or to be sustainable. The term ‘land management’ as used in 1994 was later subsumed into the more general concept of ‘natural resource management’ and then merged with ideas about sustainable socioeconomic and sustainable ecological development. The cybernetic approach contains many cognitive elements of human observation, language and learning that combine into production processes. The approach tends to highlight instances where systems (or organisations) can fail because they have very little chance of succeeding. Thus there are logical necessities, as well as technical possibilities, in designing, constructing, operating and maintaining production systems that function reliably over extended periods. The chapter numbers and titles of the original thesis are as follows:
1. Land management as a problem of coping with complexity
2. Background theory in systems theory and cybernetic principles
3. Operationalisation of cybernetic principles in Beer's Viable System Model
4. Issues in the design of viable cadastral surveying and mapping organisation
5. An analysis of the tendency for fragmentation in surveying and mapping organisation
6. Perambulating the boundaries of Sydney – a problem of social control under poor standards of literacy
7. Cybernetic principles in the process of legislation
8. Closer settlement policy and viability in agricultural production
9. Rate of return in leasing Crown lands

This thesis applies Monte Carlo techniques to the study of X-ray absorptiometric methods of bone mineral measurement. These studies seek to obtain information that can be used in efforts to improve the accuracy of the bone mineral measurements. A Monte Carlo computer code for X-ray photon transport at diagnostic energies has been developed from first principles. This development was undertaken as there was no readily available code which included electron binding energy corrections for incoherent scattering and one of the objectives of the project was to study the effects of inclusion of these corrections in Monte Carlo models. The code includes the main Monte Carlo program plus utilities for dealing with input data. A number of geometrical subroutines which can be used to construct complex geometries have also been written. The accuracy of the Monte Carlo code has been evaluated against the predictions of theory and the results of experiments. The results show a high correlation with theoretical predictions. In comparisons of model results with those of direct experimental measurements, agreement to within the model and experimental variances is obtained. The code is an accurate and valid modelling tool. A study of the significance of inclusion of electron binding energy corrections for incoherent scatter in the Monte Carlo code has been made. The results show this significance to be very dependent upon the type of application. The most significant effect is a reduction of low angle scatter flux for high atomic number scatterers. To effectively apply the Monte Carlo code to the study of bone mineral density measurement by photon absorptiometry the results must be considered in the context of a theoretical framework for the extraction of energy dependent information from planar X-ray beams. Such a theoretical framework is developed and the two-dimensional nature of tissue decomposition based on attenuation measurements alone is explained. 
This theoretical framework forms the basis for analytical models of bone mineral measurement by dual energy X-ray photon absorptiometry techniques. Monte Carlo models of dual energy X-ray absorptiometry (DEXA) have been established. These models have been used to study the contribution of scattered radiation to the measurements. It has been demonstrated that the measurement geometry has a significant effect upon the scatter contribution to the detected signal. For the geometry of the models studied in this work, the scatter has no significant effect upon the results of the measurements. The model has also been used to study a proposed technique which involves dual energy X-ray transmission measurements plus a linear measurement of the distance along the ray path. This is designated the DPA(+) technique. The addition of the linear measurement enables the tissue decomposition to be extended to three components; bone mineral, fat and lean soft tissue are the components considered here. The results of the model demonstrate that the measurement of bone mineral using this technique is stable over a wide range of soft tissue compositions, and hence indicate the potential to overcome a major problem of the two-component DEXA technique. However, the results also show that the accuracy of the DPA(+) technique is highly dependent upon the composition of the non-mineral components of bone, and that it has poorer precision (approximately twice the coefficient of variation) than standard DEXA measurements. These factors may limit the usefulness of the technique. These studies illustrate the value of Monte Carlo computer modelling of quantitative X-ray measurement techniques. The Monte Carlo models of bone densitometry measurement have:
1. demonstrated the significant effects of the measurement geometry upon the contribution of scattered radiation to the measurements;
2. demonstrated that the statistical precision of the proposed DPA(+) three-tissue-component technique is poorer than that of the standard DEXA two-tissue-component technique;
3. demonstrated that the proposed DPA(+) technique has difficulty providing accurate simultaneous measurement of body composition in terms of a three-component model of fat, lean soft tissue and bone mineral; and
4. provided a knowledge base for input to decisions about development (or otherwise) of a physical prototype DPA(+) imaging system.
The Monte Carlo computer code, data, utilities and associated models represent a set of significant, accurate and valid modelling tools for quantitative studies of physical problems in the fields of diagnostic radiology and radiography.
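The two-component decomposition underlying DEXA can be illustrated as a 2x2 linear system: the measured log-attenuation at each energy is a weighted sum of the bone-mineral and soft-tissue areal densities. The sketch below forward-simulates the measurements and inverts them by Cramer's rule; the attenuation coefficients are invented for illustration, not measured values from this work.

```python
def dexa_decompose(A_lo, A_hi, mu):
    """Solve A_E = mu_bone(E)*t_bone + mu_soft(E)*t_soft for the two
    areal densities, given log-attenuations at two energies (Cramer's rule)."""
    (mb_lo, ms_lo), (mb_hi, ms_hi) = mu
    det = mb_lo * ms_hi - ms_lo * mb_hi
    t_bone = (A_lo * ms_hi - ms_lo * A_hi) / det
    t_soft = (mb_lo * A_hi - A_lo * mb_hi) / det
    return t_bone, t_soft

# Illustrative mass-attenuation coefficients (cm^2/g) at low/high energy.
mu = ((0.60, 0.25),   # low energy:  bone, soft tissue
      (0.30, 0.20))   # high energy: bone, soft tissue

# Forward-simulate log-attenuations for known densities, then invert.
tb, ts = 1.2, 20.0    # g/cm^2 of bone mineral and soft tissue
A_lo = mu[0][0] * tb + mu[0][1] * ts
A_hi = mu[1][0] * tb + mu[1][1] * ts
```

The DPA(+) extension described above adds a third equation (the path-length measurement), enlarging this system to three components at the cost of the precision penalty the models quantify.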

Background: Impairments in upper-body function (UBF) are common following breast cancer. However, the relationship between arm morbidity and quality of life (QoL) remains unclear. This investigation uses longitudinal data to describe UBF in a population-based sample of women with breast cancer and examines its relationship with QoL. ---------- Methods: Australian women (n = 287) with unilateral breast cancer were assessed at three-monthly intervals, from six to 18 months post-surgery (PS). Strength, endurance and flexibility were used to assess objective UBF, while the Disability of the Arm, Shoulder and Hand questionnaire and the Functional Assessment of Cancer Therapy-Breast questionnaire were used to assess self-reported UBF and QoL, respectively. ---------- Results: Although mean UBF improved over time, up to 41% of women revealed declines in UBF between six and 18 months PS. Older age, lower socioeconomic position, treatment on the dominant side, mastectomy, more extensive lymph node removal and having lymphoedema each increased the odds of decline in UBF by at least twofold (p < 0.05). Lower baseline perceived UBF, and declines in perceived UBF between six and 18 months PS, were each associated with poorer QoL at 18 months PS (p < 0.05). ---------- Conclusions: Significant upper-body morbidity is experienced by many women following breast cancer treatment, persisting longer term and adversely influencing the QoL of breast cancer survivors.

The need for effective business curricula that meet the needs of the marketplace has created an increase in the adoption of core-competency lists identifying appropriate graduate skills. Many organisations and tertiary institutions have individual graduate-capability lists including skills deemed essential for success. Skills recognised as ‘critical thinking’ are popular inclusions on core-competency and graduate-capability lists. While there is literature outlining ‘critical thinking’ frameworks, methods of teaching it, and calls for its integration into business curricula, few studies actually identify quantifiable improvements achieved in this area. This project sought to address the development of critical-thinking skills in a management degree program by embedding a process for critical thinking within a theory unit undertaken by students early in the program. Focus groups and a student survey were used to identify issues of both content and implementation and to develop a student perspective on their needs in thinking critically. A process utilising a framework of critical thinking was integrated through a workbook of weekly case studies for group analysis, discussions and experiential exercises. The experience included formative and summative assessment. Initial results indicate a greater valuation by students of their experience in the organisation theory unit, better marks for mid-semester essay assignments, and higher evaluations on the university-administered survey of student satisfaction.

Statisticians, along with other scientists, have made significant computational advances that enable the estimation of formerly intractable statistical models. The Bayesian inference framework, combined with Markov chain Monte Carlo estimation methods such as the Gibbs sampler, enables the estimation of discrete choice models such as the multinomial logit (MNL) model. MNL models are frequently applied in transportation research to model choice outcomes such as mode, destination, or route choices, or to model categorical outcomes such as crash outcomes. Recent developments allow for the modification of the potentially limiting assumptions of MNL, such as the independence from irrelevant alternatives (IIA) property. However, relatively little transportation-related research has focused on Bayesian MNL models, whose tractability is of great value to researchers and practitioners alike. This paper addresses MNL model specification issues in the Bayesian framework, such as the value of including prior information on parameters, allowing for nonlinear covariate effects, and extensions to random-parameter models, thereby relaxing the usual limiting IIA assumption. This paper also provides an example that demonstrates, using route-choice data, the considerable potential of the Bayesian MNL approach for many transportation applications. The paper concludes with a discussion of the pros and cons of this Bayesian approach and identifies when its application is worthwhile.
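A hedged illustration of the Bayesian MNL machinery described here: the sketch below fits a one-parameter logit to simulated choice data using random-walk Metropolis (a simpler MCMC scheme than the Gibbs sampler the paper discusses). The data, the normal prior, and the tuning constants are all illustrative assumptions, not the paper's route-choice data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated choice data: 3 alternatives, one covariate (e.g. travel time).
n, J = 300, 3
X = rng.uniform(10, 60, size=(n, J))          # travel time per alternative
beta_true = -0.1
u = beta_true * X + rng.gumbel(size=(n, J))   # random-utility draws
y = u.argmax(axis=1)                          # observed choices

def log_post(beta):
    """Log posterior: MNL log likelihood plus a N(0, 10^2) prior on beta."""
    v = beta * X
    ll = v[np.arange(n), y] - np.log(np.exp(v).sum(axis=1))
    return ll.sum() - beta**2 / (2 * 10**2)

# Random-walk Metropolis over the single coefficient.
beta, samples = 0.0, []
lp = log_post(beta)
for _ in range(2000):
    prop = beta + 0.02 * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        beta, lp = prop, lp_prop
    samples.append(beta)
post_mean = np.mean(samples[500:])            # discard burn-in
```

The full posterior sample, rather than a point estimate, is what makes prior information, nonlinear covariate effects, and random-parameter extensions straightforward in this framework.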

This thesis describes a discrete component of a larger mixed-method (survey and interview) study that explored the health-promotion and risk-reduction practices of younger premenopausal survivors of ovarian, breast and haematological cancers. This thesis outlines my distinct contribution to the larger study, which was to: (1) Produce a literature review that thoroughly explored all longer-term breast cancer treatment outcomes, and which outlined the health risks to survivors associated with these; (2) Describe and analyse the health-promotion and risk-reduction behaviours of nine younger female survivors of breast cancer as articulated in the qualitative interview dataset; and (3) Test the explanatory power of the Precede-Proceed theoretical framework underpinning the study in relation to the qualitative data from the breast cancer cohort. The thesis reveals that breast cancer survivors experienced many adverse outcomes as a result of treatment. While they generally engaged in healthy lifestyle practices, a lack of knowledge about many recommended health behaviours emerged throughout the interviews. The participants also described significant internal and external pressures to behave in certain ways because of the social norms surrounding the disease. This thesis also reports that the Precede-Proceed model is a generally robust approach to data collection, analysis and interpretation in the context of breast cancer survivorship. It provided plausible explanations for much of the data in this study. However, profound sociological and psychological implications arose during the analysis that were not effectively captured or explained by the theories underpinning the model. 
A sociological filter—such as Turner’s explanation of the meaning of the body and embodiment in the social sphere (Turner, 2008)—and the psychological concerns teased out in Mishel’s (1990) Uncertainty in Illness Theory, provided a useful dimension to the findings generated through the Precede-Proceed model. The thesis concludes with several recommendations for future research, clinical practice and education in this context.

The changes in economic status in Malaysia have led to many psychosocial problems, especially among young people. Counselling and psychotherapy, as practised in Western culture, have been seen as one of the solutions. Most counselling theorists believe that their theory is universal, yet there is limited research to prove it. This paper describes an ongoing study conducted in Malaysia on the applicability of one Western counselling theory, Bowen's family theory. Bowen held that differentiation of self within the family allows a person to both leave the family's boundaries in search of uniqueness and continually return to the family in order to further establish a sense of belonging. The study comprises four measures: the Differentiation of Self Inventory (DSI), the Family Inventory of Life Events (ILE), the Depression Anxiety and Stress Scale (DASS) and the Connor-Davidson Resilience Scale (CD-RISC). Preliminary findings are discussed, and their implications for enhancing the quality of teaching family counselling in universities are explored.

The purpose of this article is to examine how a consumer's weight control beliefs (WCB), a female advertising model's body size (slim or large) and product type influence consumer evaluations and consumer body perceptions. The study uses an experiment with 371 consumers. The design was a 2 (weight control belief: internal, external) x 2 (model size: larger-sized, slim) x 2 (product type: weight controlling, non-weight controlling) between-participants factorial design. Results reveal two key contributions. First, larger-sized models result in consumers feeling less pressure from society to be thin, viewing their actual shape as slimmer (relative to viewing a slim model), and wanting a thinner ideal body shape; slim models produce the opposite effects. Second, this research reveals a boundary condition for the extent to which endorser–product congruency theory can be generalized to endorsers of a larger body size. Results indicate that consumer WCB may be a useful variable to consider when marketers contemplate the use of larger models in advertising.

Under certain circumstances, an industrial hopper which operates under the "funnel-flow" regime can be converted to the "mass-flow" regime with the addition of a flow-corrective insert. This paper is concerned with calculating granular flow patterns near the outlet of hoppers that incorporate a particular type of insert, the cone-in-cone insert. The flow is considered to be quasi-static and governed by the Coulomb-Mohr yield condition together with the non-dilatant double-shearing theory. In two dimensions the hoppers are wedge-shaped, and as such the formulation for the wedge-in-wedge hopper also covers the case of asymmetrical hoppers. A perturbation approach, valid for high angles of internal friction, is used for both two-dimensional and axially symmetric flows, with analytic results possible for both the leading-order and correction terms. This perturbation scheme is compared with numerical solutions to the governing equations and is shown to work very well for angles of internal friction in excess of 45 degrees.