65 results for FORMULATIONS


Relevance: 10.00%

Abstract:

Longitudinal data, where observations are repeatedly measured over time or age, provide the foundation for the analysis of processes that evolve over time; models of such processes can be referred to as growth or trajectory models. One of the traditional ways of looking at growth models is to employ either linear or polynomial functional forms to model trajectory shape, and to account for variation around an overall mean trend by including random effects, or individual variation, on the functional shape parameters. The identification of distinct subgroups or sub-classes (latent classes) within these trajectory models, which are not based on some pre-existing individual classification, provides an important methodology with substantive implications. The identification of subgroups or classes has wide application in the medical arena, where responder/non-responder identification based on distinctly differing trajectories delivers further information for clinical processes. This thesis develops Bayesian statistical models and techniques for the identification of subgroups in the analysis of longitudinal data where the number of time intervals is limited. These models are then applied to a single case study which investigates neuropsychological cognition in early-stage breast cancer patients undergoing adjuvant chemotherapy treatment, from the Cognition in Breast Cancer Study undertaken by the Wesley Research Institute of Brisbane, Queensland. Alternative formulations to the linear or polynomial approach are taken, using piecewise linear models with a single turning point, change-point or knot at a known time point, and latent basis models for the non-linear trajectories found for the verbal memory domain of cognitive function before and after chemotherapy treatment.
Hierarchical Bayesian random effects models are used as a starting point for the latent class modelling process and are extended by incorporating covariates both in the trajectory profiles and as predictors of class membership. The Bayesian latent basis models enable the degree of recovery post-chemotherapy to be estimated for short- and long-term follow-up occasions, and the distinct class trajectories assist in the identification of breast cancer patients who may be at risk of long-term verbal memory impairment.
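The piecewise linear formulation with a knot at a known time point can be sketched with a simple fixed-effects fit. This is an illustrative toy, not the thesis's hierarchical Bayesian latent class model; the scores, occasions and knot placement below are hypothetical.

```python
import numpy as np

def piecewise_basis(t, knot):
    """Design matrix for a piecewise linear trajectory with one knot:
    intercept, pre-knot slope, and change in slope after the knot."""
    t = np.asarray(t, dtype=float)
    return np.column_stack([np.ones_like(t), t, np.maximum(t - knot, 0.0)])

# Hypothetical verbal-memory scores at four occasions (baseline, post-chemo,
# short- and long-term follow-up); the knot sits at the known
# end-of-treatment time point.
t = np.array([0.0, 1.0, 2.0, 4.0])
y = np.array([50.0, 44.0, 46.0, 49.0])

X = piecewise_basis(t, knot=1.0)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, slope_pre, slope_change = beta
print(intercept, slope_pre, slope_pre + slope_change)  # decline, then recovery
```

In the hierarchical version each patient gets random deviations on these three coefficients, and latent class membership shifts the mean trajectory.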

Relevance: 10.00%

Abstract:

During the past decade, a significant amount of research has been conducted internationally with the aim of developing, implementing, and verifying "advanced analysis" methods suitable for non-linear analysis and design of steel frame structures. Application of these methods permits comprehensive assessment of the actual failure modes and ultimate strengths of structural systems in practical design situations, without resort to simplified elastic methods of analysis and semi-empirical specification equations. Advanced analysis has the potential to extend the creativity of structural engineers and simplify the design process, while ensuring greater economy and more uniform safety with respect to the ultimate limit state. The application of advanced analysis methods has previously been restricted to steel frames comprising only members with compact cross-sections that are not subject to the effects of local buckling. This precluded the use of advanced analysis in the design of steel frames comprising a significant proportion of the most commonly used Australian sections, which are non-compact and subject to the effects of local buckling. This thesis contains a detailed description of research conducted over the past three years in an attempt to extend the scope of advanced analysis by developing methods that include the effects of local buckling in a non-linear analysis formulation, suitable for practical design of steel frames comprising non-compact sections. Two alternative concentrated plasticity formulations are presented in this thesis: the refined plastic hinge method and the pseudo plastic zone method. Both methods implicitly account for the effects of gradual cross-sectional yielding, longitudinal spread of plasticity, initial geometric imperfections, residual stresses, and local buckling.
The accuracy and precision of the methods for the analysis of steel frames comprising non-compact sections have been established by comparison with a comprehensive range of analytical benchmark frame solutions. Both the refined plastic hinge and pseudo plastic zone methods are more accurate and precise than the conventional individual member design methods based on elastic analysis and specification equations. For example, the pseudo plastic zone method predicts the ultimate strength of the analytical benchmark frames with an average conservative error of less than one percent, and has an acceptable maximum unconservative error of less than five percent. The pseudo plastic zone model can allow the design capacity to be increased by up to 30 percent for simple frames, mainly due to the consideration of inelastic redistribution. The benefits may be even more significant for complex frames with significant redundancy, which provides greater scope for inelastic redistribution. The analytical benchmark frame solutions were obtained using a distributed plasticity shell finite element model. A detailed description of this model and the results of all 120 benchmark analyses are provided. The model explicitly accounts for the effects of gradual cross-sectional yielding, longitudinal spread of plasticity, initial geometric imperfections, residual stresses, and local buckling. Its accuracy was verified by comparison with a variety of analytical solutions and the results of three large-scale experimental tests of steel frames comprising non-compact sections. A description of the experimental method and test results is also provided.

Relevance: 10.00%

Abstract:

The typical daily decision-making process of individuals regarding use of the transport system mainly involves three types of decisions: mode choice, departure time choice and route choice. This paper focuses on the mode and departure time choice processes and studies different model specifications for a combined mode and departure time choice model. The paper compares different sets of explanatory variables as well as different model structures to capture the correlation among alternatives and taste variations among commuters. The main hypothesis tested in this paper is that departure time alternatives are also correlated by the amount of delay. Correlation among different alternatives is confirmed by analyzing different nesting structures as well as error component formulations. Random coefficient logit models confirm the presence of random taste heterogeneity across commuters. Mixed nested logit models are estimated to jointly account for the random taste heterogeneity and the correlation among different alternatives. Results indicate that accounting for random taste heterogeneity as well as inter-alternative correlation improves model performance.
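The nesting idea can be sketched with a two-level nested logit in which departure-time alternatives grouped under a mode share a nest-specific scale, inducing correlation within the nest. The utilities, alternatives and scale parameters below are hypothetical, and the mixed (random-coefficient) component estimated in the paper is omitted.

```python
import math

def nested_logit_probs(utilities, nests, lambdas):
    """Choice probabilities for a two-level nested logit.
    utilities: {alt: V}, nests: {nest: [alts]}, lambdas: {nest: scale in (0, 1]}.
    Smaller lambda means stronger correlation among the nest's alternatives."""
    iv, within = {}, {}
    for nest, alts in nests.items():
        lam = lambdas[nest]
        expv = {a: math.exp(utilities[a] / lam) for a in alts}
        denom = sum(expv.values())
        within[nest] = {a: expv[a] / denom for a in alts}   # P(alt | nest)
        iv[nest] = lam * math.log(denom)                    # inclusive value
    top = {n: math.exp(v) for n, v in iv.items()}
    top_denom = sum(top.values())
    return {a: (top[n] / top_denom) * within[n][a]
            for n, alts in nests.items() for a in alts}

# Hypothetical joint mode/departure-time alternatives.
V = {"car_early": 1.0, "car_late": 0.8, "transit_early": 0.5, "transit_late": 0.4}
nests = {"car": ["car_early", "car_late"], "transit": ["transit_early", "transit_late"]}
probs = nested_logit_probs(V, nests, {"car": 0.5, "transit": 0.5})
print(probs)  # probabilities sum to 1
```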

Relevance: 10.00%

Abstract:

This paper reviews the main studies on transit users’ route choice in the context of transit assignment. The studies are categorized into three groups: static transit assignment, within-day dynamic transit assignment, and emerging approaches. The motivations and behavioural assumptions of these approaches are re-examined. The first group includes shortest-path heuristics in all-or-nothing assignment, random utility maximization route-choice models in stochastic assignment, and user equilibrium based assignment. The second group covers within-day dynamics in transit users’ route choice, transit network formulations, and dynamic transit assignment. The third group introduces the emerging studies on behavioural complexities, day-to-day dynamics, and real-time dynamics in transit users’ route choice. Future research directions are also discussed.

Relevance: 10.00%

Abstract:

We assess the performance of an exponential integrator for advancing stiff, semidiscrete formulations of the unsaturated Richards equation in time. The scheme is of second order and explicit in nature but requires the action of the matrix function φ(A), where φ(z) = [exp(z) - 1]/z, on a suitably defined vector v at each time step. When the matrix A is large and sparse, φ(A)v can be approximated by Krylov subspace methods that require only matrix-vector products with A. We prove that despite the use of this approximation the scheme remains second order. Furthermore, we provide a practical variable-stepsize implementation of the integrator by deriving an estimate of the local error that requires only a single additional function evaluation. Numerical experiments performed on two-dimensional test problems demonstrate that this implementation outperforms second-order, variable-stepsize implementations of the backward differentiation formulae.
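The scheme's core operation, approximating φ(A)v from a Krylov subspace built by Arnoldi, can be sketched as follows. The 1-D diffusion operator here is a crude stand-in for the semidiscrete Richards equation, not the paper's actual test problem, and φ is evaluated by eigendecomposition rather than a production-grade algorithm.

```python
import numpy as np

def phi(M):
    """phi(M) = M^{-1} (exp(M) - I), via eigendecomposition (M diagonalizable,
    nonsingular); adequate for the small matrices used here."""
    w, V = np.linalg.eig(M)
    return ((V * ((np.exp(w) - 1.0) / w)) @ np.linalg.inv(V)).real

def phi_v_krylov(A, v, m):
    """Approximate phi(A) v using an m-dimensional Krylov subspace (Arnoldi):
    phi(A) v ~ beta * V_m phi(H_m) e_1, needing only products with A."""
    n = len(v)
    beta = np.linalg.norm(v)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    V[:, 0] = v / beta
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):                 # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:                # happy breakdown
            m = j + 1
            break
        V[:, j + 1] = w / H[j + 1, j]
    Hm = H[:m, :m]
    e1 = np.zeros(m)
    e1[0] = 1.0
    return beta * V[:, :m] @ (phi(Hm) @ e1)

# Hypothetical stiff operator: scaled 1-D diffusion (Dirichlet boundaries).
n = 50
A = (-2.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)) * (n + 1) ** 2 * 1e-3
v = np.ones(n)
exact = phi(A) @ v
approx = phi_v_krylov(A, v, m=20)
print(np.linalg.norm(exact - approx))  # small Krylov approximation error
```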

Relevance: 10.00%

Abstract:

Recent research on multiple kernel learning has led to a number of approaches for combining kernels in regularized risk minimization. The proposed approaches include different formulations of objectives and varying regularization strategies. In this paper we present a unifying optimization criterion for multiple kernel learning and show how existing formulations are subsumed as special cases. We also derive the criterion’s dual representation, which is suitable for general smooth optimization algorithms. Finally, we evaluate multiple kernel learning in this framework analytically using a Rademacher complexity bound on the generalization error and empirically in a set of experiments.
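The shared core of these formulations, a convex combination of base kernels inside a regularized risk minimizer, can be sketched as follows. The kernels, weights and kernel ridge objective are illustrative choices; the step that makes this multiple kernel *learning*, optimizing the weights, is omitted.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    """Gaussian RBF Gram matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2)."""
    d = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def combined_kernel(kernels, weights):
    """Convex combination K = sum_k eta_k K_k with eta on the simplex --
    the common ingredient of the MKL formulations discussed above."""
    return sum(w * K for w, K in zip(weights, kernels))

def kernel_ridge_fit(K, y, lam):
    """Dual coefficients of kernel ridge regression: (K + lam I)^{-1} y."""
    return np.linalg.solve(K + lam * np.eye(len(y)), y)

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=40)

# Base kernels at several bandwidths; eta is a hypothetical fixed weighting.
Ks = [rbf_kernel(X, X, g) for g in (0.1, 1.0, 10.0)]
eta = np.array([0.2, 0.5, 0.3])
K = combined_kernel(Ks, eta)
alpha = kernel_ridge_fit(K, y, lam=1e-2)
pred = K @ alpha
print(np.mean((pred - y) ** 2))  # training error of the combined-kernel fit
```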


Relevance: 10.00%

Abstract:

Psychoanalysis and related psychodynamic psychotherapies have historically had a limited engagement with substance use and antisocial personality disorders. This in part reflects an early preoccupation with ‘transference neuroses’ and in part reflects a later de-emphasis of diagnosis in favour of therapeutic process. Nonetheless, psychoanalytic perspectives can usefully inform thinking about approaches to treatment of such disorders, and there are psychoanalytic constructs that have specific relevance to their treatment. This paper reviews some prominent strands of psychoanalytic thinking as they pertain to the treatment of substance abuse and antisocial personality disorders. It is argued that, while Freudian formulations lead to a primarily pessimistic view of the prospect of treatment of such disorders, both the British object relations and the North American self psychology traditions suggest potentially productive approaches. Finally, the limited empirical evidence from brief psychodynamically informed treatments of substance use disorders is reviewed. It is concluded that such treatments are not demonstrably effective but that, since no form of psychotherapy has established high efficacy with substance use disorders, brief psychodynamic therapies are not necessarily of lesser value than other treatments and may have specific value for particular individuals and in particular treatment contexts.

Relevance: 10.00%

Abstract:

Computer resource allocation represents a significant challenge, particularly for multiprocessor systems, which consist of shared computing resources to be allocated among co-running processes and threads. While efficient resource allocation results in a highly efficient and stable overall multiprocessor system and good individual thread performance, poor resource allocation causes significant performance bottlenecks even for systems with ample computing resources. This thesis proposes a cache-aware adaptive closed-loop scheduling framework as an efficient resource allocation strategy for this highly dynamic resource management problem, which requires instant estimation of highly uncertain and unpredictable resource patterns. Many different approaches to this problem have been developed, but neither the dynamic nature nor the time-varying and uncertain characteristics of the resource allocation problem are well considered. These approaches employ either static or dynamic optimization methods, or advanced scheduling algorithms such as the Proportional Fair (PFair) scheduling algorithm. Some of these approaches, which do consider the dynamic nature of multiprocessor systems, apply only a basic closed-loop system; hence, they fail to take the time-varying and uncertain nature of the system into account. Therefore, further research into multiprocessor resource allocation is required. Our closed-loop cache-aware adaptive scheduling framework takes resource availability and resource usage patterns into account by measuring time-varying factors such as cache miss counts, stalls and instruction counts. More specifically, the cache usage pattern of a thread is identified using the QR recursive least squares (RLS) algorithm and cache miss count time-series statistics. For the identified cache resource dynamics, our closed-loop cache-aware adaptive scheduling framework enforces instruction fairness for the threads.
Fairness in the context of our research project is defined as resource allocation equity, which reduces co-runner thread dependence in a shared resource environment. In this way, instruction count degradation due to shared cache resource conflicts is overcome. In this respect, our closed-loop cache-aware adaptive scheduling framework contributes to the research field in two major and three minor aspects. The two major contributions lead to the cache-aware scheduling system. The first major contribution is the development of the execution fairness algorithm, which reduces the co-runner cache impact on thread performance. The second is the development of relevant mathematical models, such as the thread execution pattern and cache access pattern models, which formulate the execution fairness algorithm in terms of mathematical quantities. Following the development of the cache-aware scheduling system, our adaptive self-tuning control framework is constructed to add an adaptive closed-loop aspect to the cache-aware scheduling system. This control framework consists of two main components: the parameter estimator and the controller design module. The first minor contribution is the development of the parameter estimators; the QR recursive least squares (RLS) algorithm is applied in our closed-loop cache-aware adaptive scheduling framework to estimate the highly uncertain and time-varying cache resource patterns of threads. The second minor contribution is the design of the controller design module; an algebraic controller design algorithm, pole placement, is utilized to design the relevant controller, which is able to provide the desired time-varying control action. The adaptive self-tuning control framework and the cache-aware scheduling system together constitute our final framework, the closed-loop cache-aware adaptive scheduling framework.
The third minor contribution is the validation of this cache-aware adaptive closed-loop scheduling framework's efficiency in overcoming co-runner cache dependency. Time-series statistical counters are developed for the M-Sim multi-core simulator, and the theoretical findings and mathematical formulations are implemented as MATLAB m-file software. In this way, the overall framework is tested and the experimental outcomes are analyzed. From these outcomes, it is concluded that our closed-loop cache-aware adaptive scheduling framework successfully drives the co-runner cache-dependent thread instruction count towards the co-runner-independent instruction count, with an error margin of up to 25% when the cache is highly utilized. In addition, the thread cache access pattern is estimated with 75% accuracy.
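The RLS-based pattern identification can be sketched as follows; the AR(2) cache-miss model, its coefficients and the noise level are hypothetical stand-ins for the thesis's actual hardware counters and QR-based implementation.

```python
import numpy as np

class RLS:
    """Exponentially weighted recursive least squares for tracking the
    (possibly time-varying) parameters of a cache-miss AR model."""
    def __init__(self, n, lam=0.995, delta=100.0):
        self.w = np.zeros(n)          # parameter estimate
        self.P = delta * np.eye(n)    # inverse correlation matrix
        self.lam = lam                # forgetting factor

    def update(self, x, y):
        Px = self.P @ x
        k = Px / (self.lam + x @ Px)          # Kalman-style gain
        err = y - self.w @ x                  # a priori prediction error
        self.w = self.w + k * err
        self.P = (self.P - np.outer(k, Px)) / self.lam
        return err

# Hypothetical per-quantum cache-miss counts following an AR(2) process.
rng = np.random.default_rng(1)
a_true = np.array([0.6, 0.3])
misses = [10.0, 12.0]
est = RLS(n=2)
for _ in range(2000):
    x = np.array([misses[-1], misses[-2]])
    y = a_true @ x + rng.normal(scale=0.5)
    est.update(x, y)
    misses.append(y)
print(est.w)  # approaches the true AR coefficients
```

A forgetting factor below 1 is what lets the estimator track drifting cache behaviour rather than averaging over the thread's whole history.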

Relevance: 10.00%

Abstract:

The statutory demand procedure has been a part of our corporate law from its earliest modern formulations, and it has been suggested, albeit anecdotally, that under the current regime it gives rise to more litigation than any other part of the Corporations Act. Despite this, there has been a lack of consideration of the underlying policy behind the procedure in both the case law and the literature, both of which are largely centred on the technical aspects of the process. The purpose of this article is to examine briefly the process of the statutory demand in the context of the current insolvency law in Australia. This paper argues that robust analysis of the statutory demand regime is overdue. The paper first sets out to discover whether there is a policy justification for the process and to articulate what that may be. Second, it briefly examines the current legislation and argues that the structure actually encourages litigation, which is arguably undesirable in the context of insolvency. In particular, we ask whether the current rigid legal regime is appropriate for dealing efficiently with the highly charged atmosphere of contested insolvency. Third, it examines suggested reforms in this area and whether they might offer a way forward.

Relevance: 10.00%

Abstract:

‘Wearable technology’, or the use of specialist technology in garments, is promoted by the electronics industry as the next frontier of fashion. However, the story of wearable technology’s relationship with fashion begins neither with the development of miniaturised computers in the 1970s nor with the sophisticated ‘smart textiles’ of the twenty-first century, despite what much of the rhetoric suggests. This study examines wearable technology against a longer history of fashion, highlighted by the influential techno-sartorial experiments of a group of early twentieth-century avant-gardes including the Italian Futurists Giacomo Balla and F.T. Marinetti, the Russian Constructivists Varvara Stepanova and Liubov Popova, and the Paris-based Cubist Sonia Delaunay. Through the interdisciplinary framework of fashion studies, the thesis provides a fuller picture of wearable technology framed by the idea of utopia. Using comparative analysis, and applying the theoretical formulations of Fredric Jameson, Louis Marin and Michael Carter, the thesis traces the appearance of three techno-utopian themes from their origins in the machine age experiments of Balla, Marinetti, Stepanova, Popova and Delaunay to their twenty-first-century reappearance in a dozen wearable technology projects. By exploring the central thesis that contemporary wearable technology resurrects the techno-utopian ideas and expressions of the early twentieth century, the study concludes that the abiding utopian impetus to embed technology in the aesthetics (prints, silhouettes, and fabrication) and functionality of fashion is to unify subject, society and environment under a totalising technological order.

Relevance: 10.00%

Abstract:

Pulmonary drug delivery is the focus of much research and development because of its great potential to produce maximum therapeutic benefit. Among the available options, the dry powder inhaler (DPI) is the preferred device for the treatment of an increasingly diverse number of diseases. However, as drug delivery from a DPI involves a complicated set of physical processes and the integration of drug formulations, device design and patient usage, the engineering development of this medical technology is proving to be a great challenge. Currently there is a large range of devices that are either available on the market or under development; however, none exhibits superior clinical efficacy. A major concern is the inter- and intra-patient variability of the drug dosage delivered to the deep lungs. The extent of variability depends on the drug formulation, the device design and the patient’s inhalation profile. This article reviews recent advances in DPI technology and presents the key factors which motivate and constrain the successful engineering of a universal, patient-independent DPI that is capable of efficient, reliable and repeatable drug delivery. A strong emphasis is placed on the physical processes of drug powder aerosolisation, deagglomeration and dispersion, and on the engineering of formulations and inhalers that can optimise these processes.

Relevance: 10.00%

Abstract:

Recent experience of practice-led postgraduate supervision has prompted me to conclude that the practice-led research method, as it is currently construed, produces good outcomes, especially in admitting practitioners in the creative arts, design and media into the research framework, but at the same time it also generates certain recurring difficulties. What are these difficulties? Practice-led candidates tend to rely on a narrow range of formulations, with the result that they assume: (i) the innovative nature of practice-led research; (ii) that its novelty is based in opposition to other research methods; (iii) that practice is intrinsically research, often leading to tautological formulations; and (iv) the hyper-self-reflexive nature of practice-led research. This set of guidelines was composed in order to circumvent the shortcomings that result from these recurring formulations. My belief is that, if these shortcomings are avoided, there is nothing to prevent practice-led research from further developing as a research inquiry and thus achieving rewarding and successful research outcomes. Originally composed for the purposes of postgraduate supervision, these six rules are presented here in the context of a wider analysis of the emergence of practice-led research and its current conditions of possibility as a research method.

Relevance: 10.00%

Abstract:

This article critically analyses the role that criminological theory and specific policy formulations of culture play in the New Zealand state’s response to the over-representation of Māori in the criminal justice system. Part one provides an overview of the changing criminological explanations of, and responses to, Māori offending in New Zealand from the 1980s onwards and how these understandings extended colonialist approaches to Māori and crime into the neo-colonial context. In particular, we chart the shift in policy development from theorising Māori offending as attributable to loss of cultural identity to a focus on socio-economic and institutional antecedents and, finally, through the risk factors, assessment, and criminogenic needs approaches that have gained prominence in the current policy context. In part two, the focus moves to the strategies employed by members of the academy to elevate their own epistemological constructions of Māori social reality within the policy development process. In particular, the critique scrutinises recent attempts to portray Indigenous responses to social harm as “unscientific” and, in part, responsible for the continuing over-representation of Māori in New Zealand’s criminal justice system. The purpose of this analysis is to focus the critical, criminological gaze firmly on the activities of policy makers and administrative criminologists, to examine how their policies and approaches impact on Māori as an Indigenous people.

Relevance: 10.00%

Abstract:

Hydrogels are hydrophilic, three-dimensional polymers that imbibe large quantities of water while remaining insoluble in aqueous solutions due to chemical or physical cross-linking. The polymers swell in water or biological fluids, immobilizing the bioactive agent and leading to drug release in a well-defined, specific manner. Thus the hydrogels’ elastic properties, swellability and biocompatibility make them excellent formulations for drug delivery. Currently, many drug potencies and therapeutic effects are limited or otherwise reduced because of the partial degradation that occurs before the administered drug reaches the desired site of action. On the other hand, sustained-release medications release drugs continually, rather than providing relief of symptoms and protection solely when necessary. In fact, it would be much better if drugs could be administered in a manner that precisely matches physiological needs at desired times and at the desired site (site-specific targeting). There is therefore an unmet need to develop controlled drug delivery systems, especially for delivery of peptide- and protein-bound drugs. The purpose of this project is to produce hydrogels for structural drug delivery and time-dependent sustained release of drugs (bioactive agents). We use an innovative polymerisation strategy based on native chemical ligation (NCL) to covalently cross-link polymers to form hydrogels. When mixed in aqueous solution, four-armed poly(ethylene glycol) amine (PEG-4A) end-functionalised with thioester and four-branched N-terminal cysteine peptide dendrimers spontaneously conjugate to produce biomimetic hydrogels. These hydrogels showed superior resistance to shear stress compared to an equivalent PEG macromonomer system and were shown to be proteolytically degradable with concomitant release of a model payload molecule. This is the first report of a peptide dendrimer/PEG macromonomer approach to hydrogel production and opens up the prospect of facile hydrogel synthesis together with tailored payload release.