912 results for Lagrange multiplier principle
Abstract:
The osteochondral defect is a classical model for a multiple-tissue problem[1]. Tissue engineering of either bone or cartilage imposes different demands on a scaffold concerning porosity, pore size and interconnectivity. Furthermore, local release of tissue-specific growth factors necessitates a tailored architecture. For the fabrication of an osteochondral scaffold with region specific architecture, an advanced technique is required. Stereolithography is a rapid prototyping technique that allows for the creation of such 3D polymer objects with well-defined architecture. Its working principle is the partial irradiation of a resin, causing a liquid-solid transition. By irradiating this resin by a computer-driven light source, a solid 3D object is constructed layer by layer. To make biodegradable polymers applicable in stereolithography, low-molecular weight polymers have to be functionalised with double bonds to enable photo-initiated crosslinking.
Abstract:
Immediate indefeasibility is the cornerstone of the Torrens system of land registration. However, when combined with the apparent ease with which forged mortgages become registered, the operation of this doctrine can come into question. This article argues that, rather than questioning indefeasibility, the focus should be on the verification of the identity of parties to land transactions. While no system can ever be infallible, it is suggested that by correctly imposing the responsibility for identity verification on the appropriate individual, the Torrens system can retain immediate indefeasibility as its paramount principle, yet achieve the optimum level of fairness in the allocation of responsibility and loss. With a new era of electronic conveyancing about to begin, the framework suggested here provides a model for minimising the risks of forged mortgages and appropriately allocating the loss.
Abstract:
Purpose: To identify the challenges faced by local government in Indonesia when adopting a public asset management framework. Design: A case study in the South Sulawesi Provincial Government was used as the approach to achieving the research objective. The case study involved two data collection techniques: interviews and document analysis. Findings: The results of the study indicate that there are significant challenges that Indonesian local governments need to manage when adopting a public asset management framework. Those challenges are: the absence of an institutional and legal framework to support the application of asset management; the non-profit principle of public assets; the multiple jurisdictions involved in public asset management processes; the complexity of local government objectives; the unavailability of data for managing public property; and limited human resources. Research limitations: This research is limited to one case study. It is a preliminary study from a larger research project that uses multiple case studies. The main research also investigates the opportunities available to local government in adopting and implementing public asset management. Originality/value: The findings from this study provide useful input for policy makers, academics and asset management practitioners in Indonesia seeking to establish a public asset management framework, resulting in efficient and effective organizations as well as improved quality of public services. This study has potential application for other developing countries.
Abstract:
The principle of autonomy underpins legal regulation of advance directives that refuse life-sustaining medical treatment. The primacy of autonomy in this domain is recognised expressly in the case law, through judicial pronouncement, and implicitly in most Australian jurisdictions, through enactment into statute of the right to make an advance directive. This article seeks to justify autonomy as an appropriate principle for regulating advance directives and relies on three arguments: the necessity of autonomy in a liberal democracy; the primacy of autonomy in medical ethics discourse; and the uncontested importance of autonomy in the law on contemporaneous refusal of medical treatment. This article also responds to key criticisms that autonomy is not an appropriate organising principle to underpin legal regulation of advance directives.
Abstract:
Carbon capture and storage (CCS) is considered an integral transitional measure in mitigating the global greenhouse gas emissions from our continued use of fossil fuels. Regulatory frameworks have been developed around the world and pilot projects have commenced. However, CCS processes are largely untested at commercial scale and there are many unknowns associated with the long-term risks of these storage projects. Governments, including Australia's, are struggling to develop appropriate, yet commercially viable, regulatory approaches to manage the uncertain long-term risks of CCS activities. Numerous CCS regimes have been passed at the Federal, State and Territory levels in Australia, each adopting a different approach to the delicate balance between facilitating projects and managing risk. This paper examines the relatively new onshore and offshore regimes for CCS in Australia and the legal issues arising in relation to the implementation of CCS projects. Comparisons are made with the EU CCS Directive where appropriate.
Abstract:
The present rate of technological advance continues to place significant demands on data storage devices. The sheer amount of digital data being generated each year, along with consumer expectations, fuels these demands. At present, most digital data is stored magnetically, in the form of hard disk drives or on magnetic tape. The increase in areal density (AD) of magnetic hard disk drives over the past 50 years has been of the order of 100 million times, and current devices are storing data at ADs of the order of hundreds of gigabits per square inch. However, it has been known for some time that progress in this form of data storage is approaching fundamental limits. The main limitation relates to the lower size limit that an individual bit can have for stable storage. Various techniques for overcoming these fundamental limits are currently the focus of considerable research effort. Most attempt to improve current data storage methods, or modify them slightly for higher-density storage. Alternatively, three-dimensional optical data storage is a promising field for the information storage needs of the future, offering very high density, high-speed memory. There are two ways in which data may be recorded in a three-dimensional optical medium: either bit-by-bit (similar in principle to an optical disc medium such as CD or DVD) or by using pages of bit data. Bit-by-bit techniques for three-dimensional storage offer high density but are inherently slow due to the serial nature of data access. Page-based techniques, where a two-dimensional page of data bits is written in one write operation, can offer significantly higher data rates due to their parallel nature. Holographic Data Storage (HDS) is one such page-oriented optical memory technique. This field of research has been active for several decades, but few commercial products are presently available.
Another page-oriented optical memory technique involves recording pages of data as phase masks in a photorefractive medium. A photorefractive material is one in which the refractive index can be modified by light of the appropriate wavelength and intensity, and this property can be used to store information in these materials. In phase mask storage, two-dimensional pages of data are recorded into a photorefractive crystal as refractive index changes in the medium. A low-intensity readout beam propagating through the medium will have its intensity profile modified by these refractive index changes, and a CCD camera can be used to monitor the readout beam and thus read the stored data. The main aim of this research was to investigate data storage using phase masks in the photorefractive crystal lithium niobate (LiNbO3). Firstly, the experimental methods for storing two-dimensional pages of data (a set of vertical stripes of varying lengths) in the medium are presented. The laser beam used for writing, whose intensity profile is modified by an amplitude mask containing a pattern of the information to be stored, illuminates the lithium niobate crystal, and the photorefractive effect causes the patterns to be stored as refractive index changes in the medium. These patterns are read out non-destructively using a low-intensity probe beam and a CCD camera. A common complication of information storage in photorefractive crystals is the issue of destructive readout. This is a problem particularly for holographic data storage, where the readout beam should be at the same wavelength as the beam used for writing. Since the charge carriers in the medium are still sensitive to the read light field, the readout beam erases the stored information. A method to avoid this is thermal fixing, in which the photorefractive medium is heated to temperatures above 150 °C; this process forms an ionic grating in the medium.
This ionic grating is insensitive to the readout beam and therefore the information is not erased during readout. A non-contact method for determining temperature change in a lithium niobate crystal is presented in this thesis. The temperature-dependent birefringent properties of the medium cause intensity oscillations to be observed for a beam propagating through the medium during a change in temperature. It is shown that each oscillation corresponds to a particular temperature change, and by counting the number of oscillations observed, the temperature change of the medium can be deduced. The presented technique for measuring temperature change could easily be applied to situations where thermal fixing of data in a photorefractive medium is required. Furthermore, by using an expanded beam and monitoring the intensity oscillations over a wide region, it is shown that the temperature in various locations of the crystal can be monitored simultaneously. This technique could be used to deduce temperature gradients in the medium. It is shown that the three-dimensional nature of the recording medium causes interesting degradation effects to occur when the patterns are written for a longer-than-optimal time. This degradation results in the splitting of the vertical stripes in the data pattern, and for long writing exposure times this process can result in the complete deterioration of the information in the medium. It is shown that simply by using incoherent illumination, the original pattern can be recovered from the degraded state. The reason for the recovery is that the refractive index changes causing the degradation are of a smaller magnitude, since they are induced by the write field components scattered from the written structures. During incoherent erasure, these lower-magnitude refractive index changes are neutralised first, allowing the original pattern to be recovered.
The degradation process is shown to be reversed during the recovery process, and a simple relationship is found relating the times at which particular features appear during degradation and recovery. A further outcome of this work is that a minimum stripe width of 30 µm is required for accurate storage and recovery of the information in the medium; any smaller size results in incomplete recovery. The degradation and recovery process could be applied to image scrambling or cryptography for optical information storage. A two-dimensional numerical model based on the finite-difference beam propagation method (FD-BPM) is presented and used to gain insight into the pattern storage process. The model shows that the degradation of the patterns is due to the complicated path taken by the write beam as it propagates through the crystal, and in particular the scattering of this beam from the induced refractive index structures in the medium. The model indicates that the highest quality pattern storage would be achieved with a thin 0.5 mm medium; however, this type of medium would also remove the degradation property of the patterns and the subsequent recovery process. To overcome the simplistic treatment of the refractive index change in the FD-BPM model, a fully three-dimensional photorefractive model developed by Devaux is presented. This model provides significant insight into the pattern storage, particularly for the degradation and recovery process, and confirms the theory that recovery of the degraded patterns is possible because the refractive index changes responsible for the degradation are of a smaller magnitude. Finally, a detailed analysis of the pattern formation and degradation dynamics for periodic patterns of various periodicities is presented. It is shown that stripe widths in the write beam of greater than 150 µm result in the formation of different types of refractive index changes, compared with stripes of smaller widths.
As a result, it is shown that the pattern storage method discussed in this thesis has an upper feature size limit of 150 µm for accurate and reliable pattern storage.
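The oscillation-counting temperature measurement described in this abstract can be sketched in a few lines. This is an illustrative sketch, not the thesis's analysis code: the per-oscillation temperature increment `DT_PER_CYCLE` is a hypothetical calibration constant that in practice depends on crystal thickness, wavelength and the thermo-optic properties of lithium niobate.

```python
# Sketch: estimating temperature change by counting birefringence-induced
# intensity oscillations in a transmitted beam. DT_PER_CYCLE is an assumed
# calibration constant (degrees C per full oscillation), not a measured value.
import math

DT_PER_CYCLE = 1.6  # hypothetical: temperature change per oscillation

def count_oscillations(intensity):
    """Count full oscillations as the number of local maxima in the trace."""
    peaks = 0
    for i in range(1, len(intensity) - 1):
        if intensity[i - 1] < intensity[i] >= intensity[i + 1]:
            peaks += 1
    return peaks

def temperature_change(intensity):
    """Deduce the temperature change from the number of observed oscillations."""
    return count_oscillations(intensity) * DT_PER_CYCLE

# Synthetic trace containing 5 full intensity oscillations.
trace = [math.sin(2 * math.pi * 5 * t / 1000) for t in range(1000)]
print(temperature_change(trace))  # 5 peaks -> 8.0 degrees C
```

Monitoring several such traces from different regions of an expanded beam would, as the abstract notes, give simultaneous temperature readings at multiple crystal locations.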
Abstract:
Over the last three years, in our Early Algebra Thinking Project, we have been studying Years 3 to 5 students' ability to generalise in a variety of situations, namely, compensation principles in computation, the balance principle in equivalence and equations, change and inverse-change rules with function machines, and pattern rules with growing patterns. In these studies, we have attempted to involve a variety of models and representations and to build students' abilities to switch between them (in line with the theories of Dreyfus, 1991, and Duval, 1999). The results have shown the negative effect of closure on generalisation in symbolic representations, the predominance of single-variance generalisation over covariant generalisation in tabular representations, and the reduced ability to readily identify commonalities and relationships in enactive and iconic representations. This chapter uses the results to explore the interrelation between generalisation and verbal and visual comprehension of context. The studies evidence the importance of understanding and communicating aspects of representational forms that allow commonalities to be seen across or between representations. Finally, the chapter explores the implications of the studies for a theory that describes a growth in the integration of models and representations that leads to generalisation.
Abstract:
In this paper we consider the implementation of time- and energy-efficient trajectories on a test-bed autonomous underwater vehicle. The trajectories are loosely connected to the results of applying the maximum principle to the controlled mechanical system. We use a numerical algorithm to compute efficient trajectories designed using geometric control theory to optimize a given cost function. Experimental results are shown for the time minimization problem.
Abstract:
From Pontryagin's Maximum Principle to the Duke Kahanamoku Aquatic Complex: we develop the theory and generate implementable time-efficient trajectories for a test-bed autonomous underwater vehicle (AUV). This paper is the beginning of the journey from theory to implementation. We begin by considering pure motion trajectories and move on to a rectangular trajectory, which is a concatenation of pure surge and pure sway. These trajectories are tested using our numerical model and demonstrated by our AUV in the pool. In this paper we demonstrate that the above motions are realizable through our method, and we gain confidence in our numerical model. We conclude that, using our current techniques, implementation of time-efficient trajectories is likely to succeed.
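As a minimal illustration of the kind of result the maximum principle delivers (not the papers' AUV model, which is a more complex mechanical system), consider the classic double integrator: Pontryagin's Maximum Principle yields bang-bang time-optimal controls with at most one switch, and a rest-to-rest maneuver over distance d accelerates for sqrt(d) seconds, then decelerates for the same duration.

```python
# Illustrative sketch, not the papers' AUV controller: time-optimal bang-bang
# control of the double integrator x' = v, v' = u with |u| <= 1. Pontryagin's
# Maximum Principle gives u = +1 then u = -1 with a single switch at t_s.
import math

def time_optimal_rest_to_rest(d):
    """Return (switch time, total time, control schedule) for distance d >= 0."""
    t_s = math.sqrt(d)
    schedule = [(+1.0, t_s), (-1.0, t_s)]  # (control value, duration) pairs
    return t_s, 2 * t_s, schedule

def simulate(schedule, dt=1e-4):
    """Euler-integrate the double integrator under the piecewise control."""
    x = v = 0.0
    for u, duration in schedule:
        for _ in range(int(round(duration / dt))):
            x += v * dt
            v += u * dt
    return x, v

t_s, T, sched = time_optimal_rest_to_rest(4.0)
print(t_s, T)           # 2.0 4.0
print(simulate(sched))  # approximately (4.0, 0.0): arrives at rest
```

A rectangular pool trajectory like the one in the paper would concatenate such segments, one per leg, with the vehicle at rest at each corner.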
Abstract:
Generative systems are now being proposed for addressing major ecological problems. The Complex Urban Systems Project (CUSP), founded in 2008 at the Queensland University of Technology, emphasises the ecological significance of the generative global networking of urban environments. It argues that the natural planetary systems for balancing global ecology are no longer able to respond sufficiently rapidly to the ecological damage caused by humankind, and by dense urban conurbations in particular, as evidenced by impacts such as climate change. The proposal of this research project is to provide a high-speed generative nervous system for the planet by connecting major cities globally to interact directly with natural ecosystems and engender rapid ecological response. This would be achieved through active interaction of the global urban network with the natural ecosystem according to the ecological principle of entropy. The key goal is to achieve ecologically positive cities by activating self-organising cities capable of full integration into natural ecosystems, and to network the cities globally to provide the planet with a nervous system.
Abstract:
Dr. Richard Shapcott is a senior lecturer in International Relations at the University of Queensland. His research interests concern international ethics, cosmopolitan political theory and cultural diversity. He is the author of the recently published book International Ethics: A Critical Introduction, and of several other pieces, such as “Anti-Cosmopolitanism, the Cosmopolitan Harm Principle and Global Dialogue,” in Michalis and Petito's book Civilizational Dialogue and World Order. He is also the author of “Dialogue and International Ethics: Religion, Cultural Diversity and Universalism,” in Patrick Hayden's The Ashgate Research Companion to Ethics and International Relations.
Abstract:
Assessment principles -- The capstone experience should include assessment that: 1: enables students to apply their knowledge, skills and capabilities in an authentic context; 2: tests whether or not students are able to apply knowledge, skills and capabilities in unfamiliar contexts; 3: incorporates feedback from a multitude of sources, including peers and self-reflection, to enable students to become self-reliant and to exercise their own professional judgment; 4: recognises the culminating nature of the capstone experience.
Abstract:
In Cook v Cook the Australian High Court held that the standard of reasonable care owed by a learner driver to an instructor, conscious of the driver’s lack of experience, was lower than that owed to other passengers and road users. Recently, in Imbree v McNeilly, the High Court declined to follow this principle, concluding that the driver’s status or relationship with the claimant should no longer influence or alter the standard of care owed. The decision therefore provides an opportunity to re-examine the rationale and policy behind current jurisprudence governing the standard of care owed by learner drivers. In doing so, this article considers the principles relevant to determining the standard and Imbree’s implications for other areas of tort law and claimant v defendant relationships. It argues that Imbree was influenced by changing judicial perceptions concerning the vulnerability of driving instructors and the relevance of insurance to tortious liability.
Abstract:
Planning the utilization of train-sets is one of the key tasks of transport organization for passenger dedicated railways in China. It also has strong relationships with timetable scheduling and station operation plans. To execute such a task in a railway hub pooling multiple railway lines, the characteristics of multiple routing for train-sets are discussed in terms of the semicircle of train-set turnover. In formulating the described problem, the minimum dwell time is selected as the objective, with specific constraints on train-set dispatch, the connecting conditions, the principle of uniqueness for train-sets, and priority for connection in the same direction based on a time tolerance σ. A compact connection algorithm based on the time tolerance is then designed. The feasibility of the model and the algorithm is demonstrated by a case study. The results indicate that the circulation model and algorithm for multiple routing can handle connections between train-sets from multiple directions, and reduce the impact of trains arriving at or departing from the station throat.
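The connection rule sketched in the abstract might look as follows in its simplest greedy form. This is a hypothetical illustration, not the paper's algorithm: the data, the minimum dwell time and the tolerance value are all assumed, and each arriving train-set simply takes the earliest unassigned departure that respects the minimum dwell, with same-direction departures given priority when they fall within the time tolerance σ.

```python
# Hypothetical greedy sketch of a time-tolerance connection rule for
# train-set circulation. Times are minutes from midnight; MIN_DWELL and
# SIGMA are assumed parameter values, not figures from the paper.
MIN_DWELL = 20   # minimum turnaround time at the station (assumed)
SIGMA = 30       # time tolerance for same-direction priority (assumed)

def connect(arrivals, departures):
    """arrivals/departures: lists of (train_id, time, direction) tuples."""
    free = sorted(departures, key=lambda d: d[1])
    plan = []
    for a_id, a_time, a_dir in sorted(arrivals, key=lambda a: a[1]):
        ready = a_time + MIN_DWELL
        feasible = [d for d in free if d[1] >= ready]
        if not feasible:
            continue  # no departure can be covered by this train-set
        # A same-direction departure within the tolerance takes priority.
        same = [d for d in feasible if d[2] == a_dir and d[1] <= ready + SIGMA]
        chosen = same[0] if same else feasible[0]
        free.remove(chosen)
        plan.append((a_id, chosen[0], chosen[1] - a_time))  # dwell time
    return plan

arrivals = [("G101", 480, "north"), ("G103", 500, "south")]
departures = [("G102", 525, "south"), ("G104", 530, "north")]
print(connect(arrivals, departures))
```

In this toy instance the northbound arrival waits for the later northbound departure because it lies within σ, rather than taking the earlier southbound slot; minimising total dwell over all connections is where the paper's full model comes in.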
Abstract:
Gradual authentication is a principle proposed by Meadows as a way to tackle denial-of-service attacks on network protocols by gradually increasing the confidence in clients before the server commits resources. In this paper, we propose an efficient method that allows a defending server to authenticate its clients gradually with the help of some fast-to-verify measures. Our method integrates hash-based client puzzles along with a special class of digital signatures supporting fast verification. Our hash-based client puzzle provides finer granularity of difficulty and is proven secure in the puzzle difficulty model of Chen et al. (2009). We integrate this with the fast-verification digital signature scheme proposed by Bernstein (2000, 2008). These schemes can be up to 20 times faster for client authentication compared to RSA-based schemes. Our experimental results show that, in the Secure Sockets Layer (SSL) protocol, fast verification digital signatures can provide a 7% increase in connections per second compared to RSA signatures, and our integration of client puzzles with client authentication imposes no performance penalty on the server since puzzle verification is a part of signature verification.
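A generic hash-based client puzzle of the kind this abstract builds on can be sketched as follows. This is a minimal illustration, not the specific construction proven secure in the difficulty model of Chen et al. (2009): the server issues a fresh nonce and a difficulty k, and the client searches for a solution whose hash has at least k leading zero bits. Counting zero bits rather than whole zero bytes is what gives the finer granularity of difficulty; solving costs about 2^k hash evaluations on average, while verification costs a single hash.

```python
# Minimal hash-based client puzzle sketch (generic construction, assumed
# parameters): find s such that SHA-256(nonce || s) has >= k leading zero bits.
import hashlib
import os

def leading_zero_bits(digest):
    """Count the leading zero bits of a byte string."""
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
            continue
        bits += 8 - byte.bit_length()  # zeros inside the first nonzero byte
        break
    return bits

def solve(server_nonce, k):
    """Client side: brute-force search, ~2^k hashes on average."""
    s = 0
    while True:
        digest = hashlib.sha256(server_nonce + s.to_bytes(8, "big")).digest()
        if leading_zero_bits(digest) >= k:
            return s
        s += 1

def verify(server_nonce, k, s):
    """Server side: a single hash evaluation."""
    digest = hashlib.sha256(server_nonce + s.to_bytes(8, "big")).digest()
    return leading_zero_bits(digest) >= k

nonce = os.urandom(16)
solution = solve(nonce, 12)
print(verify(nonce, 12, solution))  # True
```

The asymmetry (expensive solve, cheap verify) is what lets a server impose work on clients before committing resources; the paper goes further by folding puzzle verification into signature verification so the server pays no extra cost at all.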