Abstract:
This work considers the effect of hardware constraints that typically arise in practical power-aware wireless sensor network systems. A rigorous methodology is presented that quantifies the effect of output power limit and quantization constraints on bit error rate performance. The approach uses a novel, intuitively appealing means of addressing the output power constraint, wherein the attendant saturation block is mapped from the output of the plant to its input and compensation is then achieved using a robust anti-windup scheme. A priori levels of system performance are attained using a quantitative feedback theory approach on the initial, linear stage of the design paradigm. This hybrid design is assessed experimentally using a fully compliant 802.15.4 testbed where mobility is introduced through the use of autonomous robots. A benchmark comparison between the new approach and a number of existing strategies is also presented.
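A minimal sketch of the back-calculation anti-windup idea the abstract alludes to, assuming a generic discrete-time PI controller and a saturating transmit-power actuator; the gains, limits and first-order plant below are illustrative placeholders, not the thesis's QFT design.

```python
# Illustrative back-calculation anti-windup for a saturating power loop.
# All gains, limits, and the toy first-order plant are hypothetical.

def saturate(u, u_min=0.0, u_max=1.0):
    """Model the hardware output-power limit as a saturation block."""
    return min(max(u, u_min), u_max)

def simulate(setpoint=0.9, steps=50, kp=0.8, ki=0.5, kt=0.4, dt=0.1):
    y, integ = 0.0, 0.0           # plant output and integrator state
    for _ in range(steps):
        e = setpoint - y
        u_unsat = kp * e + ki * integ
        u = saturate(u_unsat)      # output-power constraint
        # Back-calculation: bleed the integrator by the saturation excess
        # so it cannot wind up while the actuator is clipped.
        integ += dt * (e + kt * (u - u_unsat))
        y += dt * (u - y)          # toy first-order plant response
    return y

print(f"steady output ~ {simulate():.3f}")
```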
Abstract:
Though the motivation for developing Ambient Assisted Living (AAL) systems is incontestable, significant challenges exist in realizing the ambience that is essential to the success of such systems. By definition, an AAL system must be omnipresent, tracking occupant activities in the home and identifying those situations where assistance is needed or would be welcomed. Embedded sensors offer an attractive mechanism for realizing ambience, as their form factor and harnessing of wireless technologies aid their seamless integration into pre-existing environments. However, the heterogeneity of the end-user population, their disparate needs and the differing environments in which they live all pose particular problems for sensor integration and management.
Abstract:
The power consumption of wireless sensor network (WSN) modules is an important practical concern in building energy management (BEM) system deployments. A set of metrics is created to assess the power profiles of WSNs under real-world conditions. The aim of this work is to understand, and eventually eliminate, the uncertainties in WSN power consumption during long-term deployments, and to assess compatibility with existing and emerging energy harvesting technologies. This paper investigates the key metrics in data processing, wireless data transmission, data sensing and duty-cycle parameters to understand the system power profile from a practical deployment perspective. Based on the proposed analysis, the impact of each individual metric on power consumption in a typical BEM application is presented and the subsequent low-power solutions are investigated.
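The duty-cycle arithmetic behind such a power profile can be made concrete with a small per-cycle energy budget; the current draws and durations below are hypothetical placeholders, not measurements from the deployments described.

```python
# Hypothetical per-cycle energy budget for a WSN node (values illustrative).
V = 3.0  # supply voltage, volts

# (current in amps, time per cycle in seconds) for each activity
phases = {
    "sense":    (0.002, 0.05),
    "process":  (0.008, 0.02),
    "transmit": (0.020, 0.01),
    "sleep":    (0.000005, 59.92),
}

cycle_time = sum(t for _, t in phases.values())            # seconds
cycle_energy = sum(V * i * t for i, t in phases.values())  # joules
avg_power = cycle_energy / cycle_time

print(f"average power: {avg_power * 1e6:.1f} uW per {cycle_time:.0f}-s cycle")
# Comparing avg_power against an energy harvester's output indicates
# whether the node can run indefinitely without battery replacement.
```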
Abstract:
This work considers the static calculation of a program’s average-case time. The number of systems that currently tackle this research problem is quite small due to the difficulties inherent in average-case analysis. While each of these systems makes a pertinent contribution, and each is discussed individually in this work, only one of them forms the basis of this research. That particular system is known as MOQA. The MOQA system consists of the MOQA language and the MOQA static analysis tool. Its technique for statically determining average-case behaviour centres on maintaining strict control over both the data structure type and the labelling distribution. This research develops and evaluates the MOQA language implementation, and adds to the functions already available in the language. Furthermore, the theory that backs MOQA is generalised, and the range of data structures for which the MOQA static analysis tool can determine average-case behaviour is increased. Some of the MOQA applications and extensions suggested in other works are also examined here: for example, the accuracy of classifying the MOQA language as reversible is investigated, along with the feasibility of incorporating duplicate labels into the MOQA theory. Finally, the analyses carried out during this research reveal some of MOQA’s strengths and weaknesses. This thesis aims to be pragmatic when evaluating the current MOQA theory, the advancements set forth in the following work and the benefits of MOQA compared to similar systems. Succinctly, this work’s significant expansion of the MOQA theory is accompanied by a realistic assessment of MOQA’s accomplishments and a serious deliberation of the opportunities available to MOQA in the future.
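As a toy illustration of the quantity such a tool derives statically (this is not MOQA code, just a brute-force baseline), the average-case comparison count of a sort can be obtained by enumerating all equally likely input orders, i.e. a uniform labelling distribution:

```python
# Brute-force average-case comparisons of insertion sort over all
# n! equally likely input orders (the uniform labelling distribution).
from itertools import permutations
from math import factorial

def comparisons(seq):
    a, count = list(seq), 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            count += 1                          # one comparison
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
            else:
                break
    return count

n = 5
avg = sum(comparisons(p) for p in permutations(range(n))) / factorial(n)
print(f"average comparisons for n={n}: {avg:.3f}")
# A static analyser aims to produce this expectation without
# running the program on every input.
```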
Abstract:
This dissertation offers a general overview of the meditative practice of zazen and of analytic philosophy of mind, while suggesting a potential bridge between them in the form of an analysis of the practicality of realising impermanence. By the end of my argument I hope to have offered some compelling evidence in favour of the idea that analytic philosophy would benefit greatly from adopting principles which are best learned and expressed through the practice of, and scholarship around, Zen Buddhism, and in particular its treatment of the concept of impermanence. I demonstrate the Western philosophical tendency to make dichotomous assumptions about the nature of mind, even when explicitly denying a dualist framework. I do so by examining the historical and philosophical precedent for dualistic thinking in the work of figures such as Plato and Descartes. I expand on this idea by examining the psychology of categorisation - i.e. the creation of mental categories and boundaries - and demonstrating how such categorisations feed back into behaviour in practical ways, both positive and negative. The Zen Buddhist principle of impermanence states that all phenomena are impermanent and therefore lack essential nature; this includes intellectual concepts such as the metaphysical framework of the analytic approach to mind. Impermanence is a principle which is realised through the embodied practice of zazen. By demonstrating its application to analytic philosophy of mind, I show that zazen (and mindfulness practice in general) provides an ongoing opportunity for clearing up entrenched world views, metaphysical assumptions and dogmatic thinking. This in turn may promote a more holistic and ultimately more rewarding comprehension of the role of first-person experience in understanding the world. My argument is not limited to analytic philosophy of mind but reflects broad aspects of thinking in general, and I explain its application to issues of social importance, in particular education systems.
Abstract:
It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes), and this figure is expected to grow by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats those characteristics as the dominant component affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity - heterogeneity of approach - partitioning the field to the extent that results from one domain are difficult to exploit in another. The question is therefore asked: can a generic solution for the monitoring and analysis of data be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner? The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating those techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating any hidden bias that may exist. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. To demonstrate these concepts, a complex real-world example involving the near real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, thus requiring the handling of subjective knowledge and inference; and thirdly, because, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
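A minimal sketch of the provenance idea described above, assuming nothing about the actual platform: each analysis step records hashes of its input and output so a third party can re-run the step and verify the derived conclusion. The step names and toy data are hypothetical.

```python
# Minimal provenance chain: every analysis step logs what it consumed,
# what it produced, and how, so conclusions can be independently verified.
import hashlib
import json

def digest(obj):
    """Stable hash of any JSON-serialisable payload."""
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()

provenance = []  # append-only record of the workflow

def run_step(name, func, data):
    result = func(data)
    provenance.append({
        "step": name,
        "input": digest(data),
        "output": digest(result),
    })
    return result

raw = [72, 75, 71, 180, 74]  # toy raw samples (one artifact at 180)
clean = run_step("remove_artifacts", lambda d: [x for x in d if x < 150], raw)
mean = run_step("mean", lambda d: sum(d) / len(d), clean)

print(mean)                              # derived conclusion
print(json.dumps(provenance, indent=2))  # auditable trail
```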
Abstract:
Cultural Marxist Theory, commonly known simply as theory, enjoyed a moment of extraordinary success in the 1970s, when the works of the leading post-war French philosophers were published in English. After relocating to Anglophone academia, however, theory disavowed its original concerns and lost its ambition to understand the world as a whole, becoming the play of heterogeneities associated with postcolonialism, multiculturalism and identity politics, commonly referred to as postmodern theory. This turn, which took place during a period that seemed to have spelt the death of Marxism, the 1990s, induced many of its supporters to engage in an ongoing funeral wake, rehearsing the merits of theory and dreaming of its resurgence. According to them, had theory been resurrected in historical circumstances completely different from those which had led to its rise, it would never have reacquired the significance that had originally characterised it. This thesis demonstrates how theory has survived its demise and entirely regained its prominence in our socio-political context, marked by the effects of the latest crisis of capitalism and by the global threat of terrorisms rooted in messianic eschatologies. In its current form, theory no longer needs to show allegiance to certain intellectual stances or political groupings in order to produce important reformulations of the projects it once gave life to. Though less overtly radical and epistemologically bounded, theory remains a necessary form of enquiry, justified by the political commitment which originated it in the first place. Its voice continues to speak to us about justice ‘where it is not yet, not yet there, where it is no longer’ (Derrida, 1993, XVIII).
Abstract:
The sonata began to lose its position of predominance among compositions in the middle of the 19th century. Having been the platform for the harmonic and thematic development of music since the late baroque period, the sonata entered a process of re-evaluation and experimentation with form. As a result, fewer sonatas were being composed, with some composers dropping the genre completely. This dissertation looks at the different approaches taken by the German, French and Russian schools of composition and compares the solo and chamber music applications of the sonata form. In the German tradition, Franz Liszt's Sonata in B minor sets the standard for the revolutionary approach to form, while the Berg Sonata is a very conservative application of form to an innovative use of extended chromaticism. Both composers chose to write one-movement, through-composed pieces, with Liszt working with a very expansive use of form and Berg being extremely compact and efficient. Among the Russian composers, Prokofieff's third sonata is also a one-movement sonata, but he falls between Liszt and Berg in terms of the length of the piece and the use of innovative musical language. Scriabin uses a two-movement approach, but keeps the element of a through-composed piece, with the same important material spanning both movements. Stravinsky is the most conservative of these, with a three-movement sonata that uses a mix of chromaticism and baroque and classical style influences. The French almost stopped composing true sonatas except in chamber music, where Franck and Fauré wrote late romantic sonatas, while Debussy is very innovative within a three-movement sonata. Estampes, by Debussy, is taken in almost as an afterthought, to illustrate the direction Debussy takes in his solo piano music. While Estampes is by definition a set of character pieces, the pieces function like a sonata with three movements.
Abstract:
With the lifetime risk of being diagnosed with prostate cancer so great, an effective chemopreventive agent could have a profound impact on the lives of men. Despite decades of searching for such an agent, physicians still do not have an approved drug to offer their patients. In this article, we outline current strategies for preventing prostate cancer in general, with a focus on the 5-α-reductase inhibitors (5-ARIs) finasteride and dutasteride. We discuss the two landmark randomized, controlled trials of finasteride and dutasteride, highlighting the controversies stemming from the results, and address the issue of 5-ARI use, including reasons why providers may be hesitant to use these agents for chemoprevention. We further discuss the recent US Food and Drug Administration ruling against the proposed new indication for dutasteride and the change to the labeling of finasteride, both of which were intended to permit physicians to use the drugs for chemoprevention. Finally, we discuss future directions for 5-ARI research.
Abstract:
Although the prognosis of ambulatory heart failure (HF) has improved dramatically, there have been few advances in the management of acute HF (AHF). Despite regional differences in patient characteristics, background therapy, and event rates, AHF clinical trial enrollment has shifted from North America and Western Europe to Eastern Europe, South America, and Asia-Pacific, where the regulatory burden and the cost of conducting research may be less prohibitive. It is unclear whether the results of clinical trials conducted outside North America are generalizable to US patient populations. This article uses AHF as a paradigm and identifies barriers, and practical solutions, to successfully conducting site-based research in North America.
Abstract:
Social network analysts have tried to capture the idea of social role explicitly by proposing a framework that gives precise conditions under which actors in a group are playing equivalent roles. They term these methods positional analysis techniques. The most general definition is regular equivalence, which captures the idea that equivalent actors are related in a similar way to equivalent alters. Regular equivalence gives rise to a whole class of partitions on a network. Given a network, we have two different computational problems. The first is how to find a particular regular equivalence. An algorithm exists to find the largest regular partition, but there is no efficient algorithm to test whether a regular k-partition exists, that is, a partition into k groups that is regular. In addition, when dealing with real data, it is unlikely that any regular partitions exist. To overcome this problem, relaxations of regular equivalence have been proposed, along with optimisation techniques to find nearly regular partitions. In this paper we review the algorithms that have been developed to find particular regular equivalences and look at some of the recent theoretical results which give an insight into the complexity of finding regular partitions.
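The defining condition of regular equivalence translates directly into a checkable predicate on a partition; the sketch below implements the standard definition (not any specific published algorithm), with a toy supervision network as illustration.

```python
# Check whether a partition of a graph's nodes is a regular equivalence:
# for any two blocks A and B, either every member of A has a neighbour
# in B or no member of A does.
def is_regular(adj, blocks):
    for a_block in blocks:
        for b_block in blocks:
            hits = [any(n in b_block for n in adj[v]) for v in a_block]
            if any(hits) and not all(hits):
                return False
    return True

# Toy hierarchy: 0 supervises 1 and 2; 1 supervises 3 and 4; 2 supervises 5.
adj = {0: {1, 2}, 1: {0, 3, 4}, 2: {0, 5}, 3: {1}, 4: {1}, 5: {2}}

print(is_regular(adj, [{0}, {1, 2}, {3, 4, 5}]))  # True: boss/managers/workers
print(is_regular(adj, [{0}, {1, 2, 3}, {4, 5}]))  # False: 3 lacks a neighbour in {4, 5}
```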
Abstract:
The results of a finite element computer modelling analysis of a micro-manufactured one-turn magnetic inductor, using the software package ANSYS 10.0, are presented. The inductor is designed for a DC-DC converter used in microelectronic devices. It consists of a copper conductor with a rectangular cross-section, plated with an insulation layer and a layer of magnetic core. The analysis focuses on the effects of the frequency and the air gaps on the inductance values and the Joule losses in the core and conductor. It has been found that an inductor with multiple small air gaps has lower losses than an inductor with a single larger gap.
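The air-gap effect reported above follows from the standard first-order magnetic-circuit model; the sketch below uses that textbook approximation with illustrative dimensions, not the thesis's ANSYS model, and ignores fringing (which is precisely where multiple small gaps win on losses).

```python
# First-order gapped-core inductance: the gap reluctance dominates and
# lowers the effective permeability. Dimensions below are illustrative.
import math

MU0 = 4e-7 * math.pi  # permeability of free space, H/m

def gapped_inductance(n_turns, mu_r, area, l_core, l_gap_total):
    """L = N^2 * mu0 * A / (l_core/mu_r + l_gap_total). To first order,
    several small gaps and one large gap of equal total length give the
    same L; the losses, driven by fringing fields, differ."""
    return n_turns**2 * MU0 * area / (l_core / mu_r + l_gap_total)

# One-turn micro-inductor with hypothetical geometry
L = gapped_inductance(n_turns=1, mu_r=800, area=1e-8,   # core area, m^2
                      l_core=2e-3, l_gap_total=10e-6)   # lengths, m
print(f"L ~ {L * 1e9:.2f} nH")
```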
Abstract:
Counter-current chromatography (CCC) is a technique that shows considerable potential for large-scale purification. Its usefulness in a "research and development" pharmaceutical environment has been investigated, and the conclusions are presented in this article. The use of CCC requires the development of an appropriate solvent system (a parameter of critical importance), a process which can be tedious. This article presents a novel strategy, combining a statistical approach and fast HPLC, to generate a three-dimensional partition coefficient map and rapidly predict an optimal solvent system. The screen is performed in half a day and involves nine experiments per solvent mixture. Test separations were performed using this screen to ensure the validity of the method.
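A sketch of the underlying selection rule, under the usual assumption that a solute is well suited to CCC when its partition coefficient K (the concentration ratio between the two phases, estimated here from HPLC peak areas) lies near 1; the solvent systems, peak areas and acceptance window are illustrative, not data from this study.

```python
# Rank candidate CCC solvent systems by how close the partition
# coefficient K of the target compound sits to the ideal K ~ 1.
# K is estimated from HPLC peak areas of upper- and lower-phase aliquots.

def partition_coefficient(area_upper, area_lower):
    return area_upper / area_lower

# Hypothetical screen results: (system, upper-phase area, lower-phase area)
screen = [
    ("heptane/EtOAc/MeOH/water 1:1:1:1", 320.0, 410.0),
    ("heptane/EtOAc/MeOH/water 3:2:3:2", 780.0, 95.0),
    ("heptane/EtOAc/MeOH/water 1:4:1:4", 60.0, 700.0),
]

for name, up, low in screen:
    k = partition_coefficient(up, low)
    ok = 0.4 <= k <= 2.5  # common working window around K = 1
    print(f"{name}: K = {k:.2f} {'<- candidate' if ok else ''}")
```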