383 results for INVARIANCE-PRINCIPLE
Abstract:
In Chapters 1 through 9 of the book (with the exception of a brief discussion on observers and integral action in Section 5.5 of Chapter 5) we considered constrained optimal control problems for systems without uncertainty, that is, with no unmodelled dynamics or disturbances, and where the full state was available for measurement. More realistically, however, it is necessary to consider control problems for systems with uncertainty. This chapter addresses some of the issues that arise in this situation. As in Chapter 9, we adopt a stochastic description of uncertainty, which associates probability distributions with the uncertain elements, that is, disturbances and initial conditions. (See Section 12.6 for references to alternative approaches to modelling uncertainty.) When state information is incomplete, a popular observer-based control strategy in the presence of stochastic disturbances is to use the certainty equivalence (CE) principle, introduced in Section 5.5 of Chapter 5 for deterministic systems. In the stochastic framework, CE consists of estimating the state and then using the estimates as if they were the true state in the control law that would result if the problem were formulated as a deterministic problem (that is, without uncertainty). This strategy is motivated by the unconstrained problem with a quadratic objective function, for which CE is indeed the optimal solution (Åström 1970, Bertsekas 1976). One of the aims of this chapter is to explore the issues that arise from the use of CE in RHC in the presence of constraints. We then turn to the obvious question of the optimality of the CE principle. We show that CE is, in fact, not optimal in general.
We also analyse the possibility of obtaining truly optimal solutions for single-input linear systems with input constraints and uncertainty arising from output feedback and stochastic disturbances. We first find the optimal solution for the case of horizon N = 1, and then indicate the complications that arise in the case of horizon N = 2. Our conclusion is that, for linear constrained systems, the extra effort involved in the optimal feedback policy is probably not justified in practice. Indeed, we show by example that CE can give near-optimal performance. We thus advocate this approach in real applications.
Abstract:
Traction force microscopy (TFM) is commonly used to estimate cells’ traction forces from the deformation that they cause on their substrate. The accuracy of TFM depends strongly on the computational methods used to measure the deformation of the substrate and estimate the forces, and also on the specifics of the experimental set-up. Computer simulations can be used to evaluate the effect of both the computational methods and the experimental set-up without the need to perform numerous experiments. Here, we present one such TFM simulator that addresses several limitations of the existing ones. As a proof of principle, we recreate a TFM experimental set-up and apply a classic 2D TFM algorithm to recover the forces. In summary, our simulator provides a valuable tool to study the performance of TFM methods, refine experimental set-ups, and guide the extraction of biological conclusions from TFM experiments.
Abstract:
The role of law in managing public health challenges such as influenza pandemics poses special difficulties. This article reviews Australian plans in the context of the H1N1 09 experience to assess whether risk management was facilitated or inhibited by the "number" of levels or phases of management, the degree of prescriptive detail for particular phases, the number of plans, the clarity of the relationship between them, and the role of the media. Despite differences in the content and form of the plans at the time of the H1N1 09 emerging pandemic, the article argues that in practice the plans proved to be responsive and robust bases for managing pandemic risks. It is suggested that this was because the plans proved to be frameworks for coordination rather than prescriptive straitjackets, to be only one component of the regulatory response, and to offer a varied toolbox of possible responses, as called for by the theory of responsive regulation. Consistent with the principle of subsidiarity, it is argued that the plans did not inhibit localised responses such as selective school closures or rapid responses to selected populations such as cruise ship passengers.
Abstract:
In the finite element modelling of steel frames, external loads usually act along the members rather than only at the nodes. Conventionally, when a member is subjected to these transverse loads, they are converted to nodal forces acting at the ends of the elements into which the member is discretised, by either lumped or consistent nodal load approaches. For a contemporary geometrically non-linear analysis in which the axial force in the member is large, accurate solutions are achieved by discretising the member into many elements, which can have unfavourable consequences for the efficiency of the method when analysing large steel frames. Herein, a numerical technique is proposed to include the transverse loading in the non-linear stiffness formulation for a single element, which is able to predict the structural responses of steel frames involving the effects of first-order member loads as well as the second-order coupling effect between the transverse load and the axial force in the member. This allows for a minimal discretisation of a frame for second-order analysis. For those conventional analyses that do include transverse member loading, prescribed stiffness matrices must be used for the plethora of specific loading patterns encountered. This paper shows, however, that the principle of superposition can be applied to the equilibrium condition, so that the form of the stiffness matrix remains unchanged, with only the magnitude of the loading needing to be changed in the stiffness formulation. This novelty allows a very useful generalised stiffness formulation to be derived for a single higher-order element with arbitrary transverse loading patterns. The results are verified using analytical stability function studies, as well as against numerical results reported by independent researchers on several simple structural frames.
Abstract:
Recently, the capture and storage of CO2 have attracted research interest as a strategy to reduce global emissions of greenhouse gases. It is crucial to find suitable materials to achieve efficient CO2 capture. Here we report our study of CO2 adsorption on boron-doped C60 fullerene in the neutral state and in the 1e−-charged state. We use first-principles density functional calculations to simulate the CO2 adsorption. The results show that CO2 can form weak interactions with the BC59 cage in its neutral state, and that these interactions can be enhanced significantly by introducing an extra electron into the system.
Abstract:
In policy terms, community media are known as the “third sector” of the media. The description reflects the historical expectation that community media can fulfill a need not met by the commercial and public service broadcasters. A defining element of this “need” has been the means to production for nonprofessionals, particularly groups not represented in the mainstream media. The historical construction of community media reveals production to be a guiding principle; both a means and an end in itself. This chapter examines the various rationales underpinning community media production, including empowerment, media diversity, and the independent producer movement. Using case studies from youth media, the chapter critiques producer-centric models of community media. In the contemporary media environment, production alone cannot meet the social needs that community media were established to address. Instead, I propose a rationale that combines both production and consumption ethics.
Abstract:
The assumptions underlying the Probability Ranking Principle (PRP) have led to a number of alternative approaches that cater for, or compensate for, the PRP’s limitations. All alternatives deviate from the PRP by incorporating dependencies. This results in a re-ranking that promotes or demotes documents depending upon their relationship with the documents that have already been ranked. In this paper, we compare and contrast the behaviour of state-of-the-art ranking strategies and principles. To do so, we tease out analytical relationships between the ranking approaches and we investigate the document kinematics to visualise the effects of the different approaches on document ranking.
Abstract:
In this paper we describe the approaches adopted to generate the runs submitted to ImageCLEFPhoto 2009, with the aim of promoting document diversity in the rankings. Four of our runs are text-based approaches that employ textual statistics extracted from the captions of images: MMR [1], a state-of-the-art method for result diversification; two approaches that combine relevance information and clustering techniques; and an instantiation of the Quantum Probability Ranking Principle. The fifth run exploits visual features of the provided images to re-rank the initial results by means of Factor Analysis. The results reveal that our methods based only on text captions consistently improve the performance of the respective baselines, while the approach that combines visual features with textual statistics shows lower levels of improvement.
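As a minimal illustration of the MMR criterion cited above (a standard greedy diversification method, not the authors' specific implementation), the re-ranking loop can be sketched as follows; the relevance scores and similarity function in the usage below are hypothetical placeholders:

```python
def mmr_rerank(candidates, relevance, similarity, lam=0.7):
    """Greedy Maximal Marginal Relevance (MMR) re-ranking.

    candidates: iterable of document ids
    relevance:  dict mapping doc id -> query relevance score
    similarity: function (doc_a, doc_b) -> similarity in [0, 1]
    lam:        trade-off between relevance and novelty
    """
    selected = []
    remaining = list(candidates)
    while remaining:
        def mmr_score(d):
            # Penalise documents similar to those already ranked.
            max_sim = max((similarity(d, s) for s in selected), default=0.0)
            return lam * relevance[d] - (1 - lam) * max_sim
        best = max(remaining, key=mmr_score)
        selected.append(best)
        remaining.remove(best)
    return selected
```

With lam = 1 this reduces to ranking purely by relevance (the PRP ordering); lowering lam demotes documents that resemble those already ranked, which is the dependency-aware behaviour discussed above.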
Abstract:
Introduction: Natural product provenance is important in the food, beverage and pharmaceutical industries, for consumer confidence and for its health implications. Raman spectroscopy has powerful molecular fingerprinting abilities. The sharp peaks of surface-enhanced Raman spectroscopy (SERS) allow distinction between minimally different molecules, so it should be suitable for this purpose. Methods: Naturally caffeinated beverages containing Guarana extract and coffee, with Red Bull energy drink as a synthetic caffeinated beverage for comparison (20 µL each), were reacted 1:1 with gold nanoparticles functionalised with an anti-caffeine antibody (ab15221) for 10 minutes, air dried and analysed in a micro-Raman instrument. The spectral data were processed using Principal Component Analysis (PCA). Results: The PCA showed that Guarana-sourced caffeine varied significantly from synthetic caffeine (Red Bull) on component 1 (containing 76.4% of the variance in the data); see Figure 1. The coffee-containing beverages, in particular Robert Timms (instant coffee), were very similar on component 1, though the barista espresso showed minor variance on component 1. Both coffee-sourced caffeine samples varied from Red Bull on component 2 (20% of variance). [Figure 1: PCA comparing a naturally caffeinated beverage containing Guarana with coffee.] Discussion: PCA is an unsupervised multivariate statistical method that determines patterns within data. Figure 1 shows that caffeine in Guarana is notably different from synthetic caffeine. Other researchers have shown that caffeine in Guarana plants is complexed with tannins. In Figure 1, naturally sourced/lightly processed caffeine (Monster Energy, espresso) is more inherently different from synthetic (Red Bull)/highly processed (Robert Timms) caffeine, which is consistent with this finding and demonstrates the technique's applicability.
Guarana provenance is important because it is still largely hand-produced and demand is escalating with recognition of its benefits. This could be a powerful technique for Guarana provenance, and may extend to other industries where provenance or authentication is required, e.g. the wine or natural pharmaceutical industries.
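The PCA step applied to the spectral data above can be sketched generically (a minimal SVD-based version assuming one spectrum per row of the matrix; this is a sketch of the method, not the specific pipeline used in the study):

```python
import numpy as np

def pca(X, n_components=2):
    """Principal Component Analysis via SVD of the mean-centred data.

    X: (n_samples, n_features) matrix, e.g. one spectrum per row.
    Returns (scores, explained_variance_ratio) for the top components.
    """
    Xc = X - X.mean(axis=0)                    # centre each feature
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T          # project onto top components
    var = s ** 2 / (X.shape[0] - 1)            # variance along each component
    return scores, (var / var.sum())[:n_components]
```

The explained-variance ratios returned here correspond to the "76.4% of the variance on component 1" style of statement in the results above; the scores are what a plot like Figure 1 displays.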
Abstract:
Tissue engineering of vascularized constructs has great utility in reconstructive surgery. While we have been successful in generating vascularized granulation-like tissue and adipose tissue in an in vivo tissue engineering chamber, production of other differentiated tissues in a stable construct remains a challenge. One approach is to utilize potent differentiation factors, which can influence the base tissue. Endothelial precursor cells (EPCs) have the ability to both carry differentiation factors and home to developing vasculature. In this study, proof-of-principle experiments demonstrate that such cells can be recruited from the circulation into an in vivo tissue engineering chamber. CXC chemokine ligand 12 (CXCL12)/stromal cell-derived factor 1 was infused into the chamber through Alzet osmotic pumps and chamber cannulation between days 0 and 7, and facilitated recruitment of systemically inoculated exogenous human EPCs injected on day 6. CXCL12 infusion resulted in an eightfold increase in EPC recruitment at 2 (p = 0.03) and 7 days post-infusion (p = 0.008). Delivery of chemotactic/proliferation and/or differentiation factors and appropriately timed introduction of effective cells may allow us to better exploit the regenerative potential of the established chamber construct. © 2009 Mary Ann Liebert, Inc.
Abstract:
This paper presents Sequence Matching Across Route Traversals (SMART), a generally applicable sequence-based place recognition algorithm. SMART provides invariance to changes in illumination and vehicle speed while also providing moderate pose invariance and robustness to environmental aliasing. We evaluate SMART on vehicles travelling at highly variable speeds in two challenging environments: first, on an all-terrain vehicle on an off-road forest track; and second, in a passenger car traversing an urban environment across day and night. We provide comparative results against the current state-of-the-art SeqSLAM algorithm and investigate the effects of altering SMART’s image matching parameters. Additionally, we conduct an extensive study of the relationship between image sequence length and SMART’s matching performance. Our results show viable place recognition performance in both environments with short 10-metre sequences, and up to 96% recall at 100% precision across extreme day-night cycles when longer image sequences are used.
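The core idea of sequence-based place recognition, scoring candidate alignments by the summed descriptor distance over a window of consecutive frames, can be sketched as follows. This is a hypothetical constant-speed simplification for illustration, not the SMART or SeqSLAM implementation (SMART additionally handles variable vehicle speed):

```python
import numpy as np

def sequence_match(query_seq, ref_descriptors, seq_len=10):
    """Match a short query image sequence against a reference traversal.

    query_seq:       (seq_len, d) array of per-image descriptors
    ref_descriptors: (n_ref, d) array for the reference traversal
    Scores each reference offset by the summed descriptor distance over
    seq_len consecutive frames; returns the best offset and all costs.
    """
    q = np.asarray(query_seq, dtype=float)
    r = np.asarray(ref_descriptors, dtype=float)
    n = len(r) - seq_len + 1
    costs = np.array([
        np.linalg.norm(q - r[i:i + seq_len], axis=1).sum()
        for i in range(n)
    ])
    return int(costs.argmin()), costs
```

Summing over longer sequences averages out aliasing between visually similar single frames, which is consistent with the reported improvement in recall as sequence length grows.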
Abstract:
In 2005, governments around the world unanimously agreed to the principle of the responsibility to protect (R2P), which holds that all states have a responsibility to protect their populations from genocide and mass atrocities, that the international community should assist them to fulfil this duty, and that the international community should take timely and decisive measures to protect populations from such crimes when their host state fails to do so. Progressing R2P from words to deeds requires international consensus about the principle’s meaning and scope. To achieve a global consensus on this, we need to better understand the position of governments around the world, including in the Asia-Pacific region, which has long been associated with an enduring commitment to a traditional concept of sovereignty. The present article contributes to such an endeavour through its three sections. The first part charts the nature of the international consensus on R2P and examines the UN secretary-general’s approach. The second looks in detail at the positions of the Asia-Pacific region’s governments on the R2P principle. The final part explores the way forward for progressing the R2P principle in the Asia-Pacific region.
Abstract:
The International Law Commission (ILC) study on the protection of persons in the event of disasters has been ongoing since 2006. During this period, there has been continuous debate in the literature and in consultations with States as to whether the study should explore the Responsibility to Protect (R2P) persons in the event of natural disasters. In this article, the rationale for this continuing argument is explored, given that the ILC has repeatedly stated since 2008 that the study’s topic – assistance in the event of natural disasters – has no legal relationship with the R2P principle. In the final section it is proposed that the real knowledge gap in the ILC discussion and study is the positive affirmation of the rights of those most affected by natural disasters – women.
Abstract:
"The Responsibility to Protect (R2P) is a major new international principle, adopted unanimously in 2005 by Heads of State and Government. Whilst it is broadly acknowledged that the principle has an important and intimate relationship with international law, especially the law relating to sovereignty, peace and security, human rights and armed conflict, there has yet to be a volume dedicated to this question. The Responsibility to Protect and International Law fills that gap by bringing together leading scholars from North America, Europe and Australia to examine R2P’s legal content. The Responsibility to Protect and International Law focuses on questions relating to R2P’s legal quality, its relationship with sovereignty, and the question of whether the norm establishes legal obligations. It also aims to introduce readers to different legal perspectives, including feminism, and pressing practical questions such as how the law might be used to prevent genocide and mass atrocities, and punish the perpetrators."--publisher website