1000 results for Architecture, Classical.
Abstract:
The aging population in many countries brings into focus rising healthcare costs and pressure on conventional healthcare services. Pervasive healthcare has emerged as a viable solution capable of providing a technology-driven approach to alleviate such problems by allowing healthcare to move from hospital-centred care to self-care, mobile care, and at-home care. The state-of-the-art studies in this field, however, lack a systematic approach for providing comprehensive pervasive healthcare solutions from data collection to data interpretation and from data analysis to data delivery. In this thesis we introduce a Context-aware Real-time Assistant (CARA) architecture that integrates novel approaches with state-of-the-art technology solutions to provide a full-scale pervasive healthcare solution, with an emphasis on context awareness to help maintain the well-being of elderly people. CARA collects information about and around the individual in a home environment and enables accurate recognition and continuous monitoring of activities of daily living. It employs an innovative reasoning engine to provide accurate real-time interpretation of the context and assessment of the current situation. Being mindful of the use of the system for sensitive personal applications, CARA includes several mechanisms to make the sophisticated intelligent components as transparent and accountable as possible; it also includes a novel cloud-based component for more effective data analysis. To deliver automated real-time services, CARA supports remote consultation based on interactive video and medical sensors. Our proposal has been validated in three application domains that are rich in pervasive contexts and real-time scenarios: (i) Mobile-based Activity Recognition, (ii) Intelligent Healthcare Decision Support Systems and (iii) Home-based Remote Monitoring Systems.
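For orientation, a minimal sketch of how a context-aware monitoring pipeline of this kind can be organised: sensor events feed an activity recogniser, whose output is combined with vital signs in a rule-based situation assessment. The class names, thresholds and rules below are illustrative assumptions, not the CARA implementation.

```python
# Illustrative sketch of a context-aware monitoring pipeline; the component
# names and rules are hypothetical, not taken from the CARA system itself.
from dataclasses import dataclass
from typing import List

@dataclass
class SensorEvent:
    source: str        # e.g. "accelerometer", "heart_rate"
    value: float
    timestamp: float

def recognise_activity(window: List[SensorEvent]) -> str:
    """Toy activity recogniser: a threshold stands in for a trained classifier."""
    accel = [e.value for e in window if e.source == "accelerometer"]
    if not accel:
        return "unknown"
    mean_accel = sum(accel) / len(accel)
    return "walking" if mean_accel > 1.2 else "resting"

def assess_situation(activity: str, heart_rate: float) -> str:
    """Toy reasoning step: context rules combine activity and vital signs."""
    if activity == "resting" and heart_rate > 120:
        return "alert: elevated heart rate at rest"
    return "normal"

if __name__ == "__main__":
    window = [
        SensorEvent("accelerometer", 0.9, 0.0),
        SensorEvent("accelerometer", 1.0, 0.5),
        SensorEvent("heart_rate", 130.0, 0.5),
    ]
    activity = recognise_activity(window)
    hr = max(e.value for e in window if e.source == "heart_rate")
    print(activity, "->", assess_situation(activity, hr))
```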
Abstract:
It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes), and this figure is expected to have grown by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store only 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising the system described above have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats these characteristics as the dominant component affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, namely heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question is therefore asked: can a generic solution for the monitoring and analysis of data be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner? The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias. To illustrate and to realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. In order to demonstrate these concepts, a complex real-world example involving the near real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, thus requiring the handling of subjective knowledge and inference; and thirdly, because, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
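A minimal sketch of the production/interpretation/consumption idea with a maintainable provenance record: each analysis step records which inputs and which technique produced a result, so a third party can retrace the derivation. The data structures and digest scheme below are illustrative assumptions, not the platform described in the abstract.

```python
# Hypothetical provenance-tracking workflow; names and structure are
# illustrative only and do not reproduce the platform described above.
import hashlib
import json
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Record:
    data: dict
    provenance: List[str] = field(default_factory=list)  # digests of inputs + technique names

def _digest(obj) -> str:
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()[:12]

def produce(raw: dict) -> Record:
    """Raw data enters the workflow with its own digest as the provenance root."""
    return Record(data=raw, provenance=[f"raw:{_digest(raw)}"])

def interpret(rec: Record, technique: str, fn: Callable[[dict], dict]) -> Record:
    """Apply an analysis technique, recording what was applied to what."""
    result = fn(rec.data)
    return Record(data=result,
                  provenance=rec.provenance + [f"{technique}:{_digest(result)}"])

if __name__ == "__main__":
    raw = produce({"samples": [1.0, 2.0, 4.0]})
    mean = interpret(raw, "mean", lambda d: {"mean": sum(d["samples"]) / len(d["samples"])})
    print(mean.data, mean.provenance)
```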
Abstract:
In the presence of a chemical potential, the physics of level crossings leads to singularities at zero temperature, even when the spatial volume is finite. These singularities are smoothed out at a finite temperature but leave behind nontrivial finite size effects which must be understood in order to extract thermodynamic quantities using Monte Carlo methods, particularly close to critical points. We illustrate some of these issues using the classical nonlinear O(2) sigma model with a coupling β and chemical potential μ on a 2+1-dimensional Euclidean lattice. In the conventional formulation this model suffers from a sign problem at nonzero chemical potential and hence cannot be studied with the Wolff cluster algorithm. However, when formulated in terms of the worldlines of particles, the sign problem is absent, and the model can be studied efficiently with the "worm algorithm." Using this method we study the finite size effects that arise due to the chemical potential and develop an effective quantum mechanical approach to capture them. As a side result we obtain energy levels of up to four particles as a function of the box size and uncover a part of the phase diagram in the (β,μ) plane. © 2010 The American Physical Society.
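For orientation, a commonly used lattice form of this model and of its worldline rewriting; the conventions and normalisations here are assumptions and may differ from those of the paper.

```latex
% Conventional formulation: the chemical potential enters the time-like links,
% making the Boltzmann weight complex (the sign problem mentioned above).
S[\theta] = -\beta \sum_{x}\sum_{\nu=1}^{3}
  \cos\!\left(\theta_{x} - \theta_{x+\hat{\nu}} - i\,\mu\,\delta_{\nu,t}\right),
\qquad
Z = \int \prod_{x} \frac{d\theta_{x}}{2\pi}\; e^{-S[\theta]} .

% Worldline (dual) form: summing over conserved integer currents k_{x,\nu}
% gives manifestly positive weights, suitable for the worm algorithm
% (I_n denotes the modified Bessel function of the first kind).
Z = \sum_{\{k\}\,:\;\nabla\cdot k = 0}\;
    \prod_{x,\nu} I_{|k_{x,\nu}|}(\beta)\;
    \exp\!\Big(\mu \sum_{x} k_{x,t}\Big).
```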
Abstract:
The variation and the fugue originated in the 15th and 16th centuries and blossomed during the Baroque and Classical Periods. In a set of variations, a theme with a particular structure precedes a series of pieces that usually have the same or a very similar structure. A fugue is a work written in imitative counterpoint in which the theme is stated successively in all voices of a polyphonic texture. Beethoven's use of variation and fugue in large-scale works greatly influenced his contemporaries. After the Classical Period, variations continued to be popular, and numerous composers employed the technique in various musical genres. Fugues had pedagogical associations, and by the middle of the 19th century they had become a requirement in conservatory instruction, modeled after Bach's Well-Tempered Clavier. In the 20th century, the fugue was revived in the spirit of neoclassicism; it was incorporated in sonatas, and sets of preludes and fugues were composed. Schubert's Wanderer Fantasy presents his song Der Wanderer through thematic transformations, including a fugue and a set of variations. Liszt was highly influenced by this, as shown by his thematic transformations, and by the fugue as one of those transformations, in his Sonata in B minor. In Schumann's Symphonic Études, Rachmaninoff's Rhapsody on a Theme of Paganini and Copland's Piano Variations, the variation serves as the basis for the entire work. Prokofiev and Schubert take a different approach in the Piano Concerto No. 3 and the Wanderer Fantasy, employing the variation within a single movement. Unlike Schubert's and Liszt's use of the fugue as part of a piece or movement, Franck's Prélude, Choral et Fugue and Shchedrin's Polyphonic Notebook use it in its independent form. Since the Classical Period, the variation and the fugue have evolved from the stylistic and technical influences of earlier composers. It is interesting and remarkable to observe the unique effect each had on a particular work. As true and dependable classic forms, they remain popular by offering the composer an organizational framework for musical imagination.
Abstract:
Although evidence of Gluck's influence on Mozart is sometimes discernible, by examining the two operas I have performed and a recital of arias by these two composers we can see clear contrasts in their approach to, and expression of, classical opera. The two operas discussed are Gluck's Armide and Mozart's Le Nozze di Figaro. Gluck and Mozart were both innovators, but in very different ways. Gluck comes from a dramatic background (his principles have been compared to those of Wagner), while Mozart brings together dramatic excellence with the greatness of his musical genius, his gift of melody, and his ensemble writing, which is arguably unequaled in the repertory. A well-rounded performer strives to understand what the composer is really trying to say with his work, what the message to the audience is, and what his particular way of conveying it is. The understanding of a composer's approach to drama and character interaction plays a huge role in character development. This applies no matter what role you are preparing, whether it is Baroque opera or late Romantic. Discovering the ideals, style, and purpose of a composer contributes to an effective and rewarding performance experience for those on stage, those in the pit, and those sitting in the seats.
Abstract:
This dissertation explores the transformation of opéra comique (as represented by the opera Carmen) and the impact of verismo style (as represented by the opera La Bohème) upon the development of operetta and American musical theater, and the resultant change in vocal style. Late nineteenth-century operetta called for a classically trained soprano voice with a clear vibrato. High tessitura and legato were expected, although the quality of the voice was usually lighter in timbre. The dissertation comprises four programs that explore the transformation of vocal and compositional style into the current vocal performance practice of American musical theater. The first two programs are operatic roles and the last two are recital presentations of nineteenth- and twentieth-century operetta and musical theater repertoire. Program one, Carmen, was presented on July 26, 2007 at the Marshall Performing Arts Center in Duluth, MN, where I sang the role of Micaëla. Program two, La Bohème, was presented on May 24, 2008 at the Randolph Road Theater in Silver Spring, MD, where I sang the role of Musetta. Program three, presented on December 2, 2008, and program four, presented on May 10, 2009, were two recitals featuring operetta and musical theater repertoire. These programs were heard in the Gildenhorn Recital Hall at the Clarice Smith Performing Arts Center in College Park, MD. Programs one and two are documented in digital video format, available on digital video disc. Programs three and four are documented in digital audio format, available on compact disc. All programs are accompanied by program notes, also available in digital format.
Abstract:
Agrégation de l'enseignement supérieur (higher education teaching qualification), sciences orientation.
Abstract:
The conception of the FUELCON architecture, a composite tool for the generation and validation of patterns for assigning fuel assemblies to the positions in the grid of a reactor core section, has undergone an evolution throughout the history of the project. Different options for the various subtasks were possible, envisioned, or actually explored or adopted. We project these successive, or even concomitant, configurations of the architecture into a meta-architecture, which not by chance happens to reflect basic choices in the field's history over the last decade.
Abstract:
This paper introduces a few architectural concepts from FUELGEN which, like the generator in the FUELCON expert system, generates a "cloud" of reload patterns but, unlike that generator, is based on a genetic algorithm. There are indications from well-researched case studies that FUELGEN may outperform FUELCON and the other tools reported in the literature, but careful comparisons still have to be carried out. This paper complements the information in two other recent papers on FUELGEN. Moreover, a sequel project is outlined.
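To make the genetic-algorithm idea concrete, here is a toy sketch of evolving reload patterns with a permutation encoding. The encoding, fitness function and operators are hypothetical stand-ins, not the FUELGEN implementation.

```python
# Toy genetic algorithm for core reload pattern generation; encoding, fitness
# and operators are illustrative stand-ins, not the FUELGEN implementation.
import random

N_POSITIONS = 8                          # core grid positions (illustrative)
ASSEMBLIES = list(range(N_POSITIONS))    # fuel assembly ids, one per position

def fitness(pattern):
    """Toy objective: prefer placing high-id (fresh) assemblies away from the centre."""
    centre = (N_POSITIONS - 1) / 2.0
    return sum(aid * abs(pos - centre) for pos, aid in enumerate(pattern))

def order_crossover(p1, p2):
    """Order crossover (OX) keeps each assembly exactly once per pattern."""
    a, b = sorted(random.sample(range(N_POSITIONS), 2))
    child = [None] * N_POSITIONS
    child[a:b] = p1[a:b]
    rest = [g for g in p2 if g not in child]
    for i in range(N_POSITIONS):
        if child[i] is None:
            child[i] = rest.pop(0)
    return child

def mutate(pattern, rate=0.2):
    """Swap two positions with a small probability."""
    if random.random() < rate:
        i, j = random.sample(range(N_POSITIONS), 2)
        pattern[i], pattern[j] = pattern[j], pattern[i]
    return pattern

def evolve(generations=50, pop_size=30):
    pop = [random.sample(ASSEMBLIES, N_POSITIONS) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)           # fittest patterns first
        parents = pop[: pop_size // 2]
        children = [mutate(order_crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("best pattern:", best, "fitness:", fitness(best))
```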
Abstract:
We continue the discussion of the decision points in the FUELCON meta-architecture. Having discussed the relation of the original expert system to its sequel projects in terms of an AND/OR tree, we consider one further domain for a neural component: parameter prediction downstream of the core reload candidate pattern generator, that is, as a replacement for the NOXER simulator currently in use in the project.
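A minimal sketch of the kind of learned surrogate suggested here: a small network trained to predict a core parameter from a candidate pattern, standing in for a simulator run. The data, features and network below are synthetic assumptions, not NOXER or its actual replacement.

```python
# Minimal sketch of a learned surrogate predicting a core parameter from a
# candidate reload pattern; everything here is synthetic and illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data: patterns encoded as feature vectors -> one scalar parameter.
X = rng.normal(size=(200, 8))                             # 200 candidate patterns, 8 features
true_w = rng.normal(size=8)
y = np.tanh(X @ true_w) + 0.05 * rng.normal(size=200)     # stand-in "simulator" output

# One-hidden-layer network trained by plain gradient descent on mean-squared error.
W1 = rng.normal(scale=0.1, size=(8, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(16, 1)); b2 = np.zeros(1)
lr = 0.05

for _ in range(2000):
    h = np.tanh(X @ W1 + b1)                  # hidden layer
    pred = (h @ W2 + b2).ravel()              # predicted parameter
    err = pred - y
    g_pred = (2.0 / len(y)) * err[:, None]    # dLoss/dpred
    gW2 = h.T @ g_pred
    gb2 = g_pred.sum(axis=0)
    g_h = g_pred @ W2.T * (1 - h ** 2)        # backprop through tanh
    gW1 = X.T @ g_h
    gb1 = g_h.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

print("final MSE:", float(np.mean(err ** 2)))
```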
Abstract:
This paper describes the use of a blackboard architecture for building a hybrid case-based reasoning (CBR) system. The Smartfire fire field modelling package has been built using this architecture and includes a CBR component. The architecture allows qualitative spatial reasoning knowledge from domain experts to be integrated into the system. The system can be used for the automatic set-up of fire field models, which enables fire safety practitioners who are not expert in modelling techniques to use a fire modelling tool. The paper discusses the integrating powers of the architecture, which is based on a common knowledge representation comprising a metric diagram and place vocabulary, and on mechanisms for adaptation and conflict resolution built on the blackboard.
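A minimal sketch of the blackboard pattern described here: knowledge sources (including a stand-in CBR step) post contributions to a shared blackboard under a simple control loop. All names and values are illustrative, not the Smartfire implementation.

```python
# Minimal blackboard sketch (illustrative only; not the Smartfire architecture).
class Blackboard:
    def __init__(self):
        self.entries = {}                # shared knowledge, keyed by name

class KnowledgeSource:
    def can_contribute(self, bb): ...
    def contribute(self, bb): ...

class GeometryKS(KnowledgeSource):
    """Derives a coarse mesh setting from the room description."""
    def can_contribute(self, bb):
        return "room" in bb.entries and "mesh" not in bb.entries
    def contribute(self, bb):
        x, y, z = bb.entries["room"]
        bb.entries["mesh"] = (int(x * 10), int(y * 10), int(z * 10))

class CaseBasedKS(KnowledgeSource):
    """Stand-in CBR step: reuse a boundary condition from a 'similar' past case."""
    def can_contribute(self, bb):
        return "mesh" in bb.entries and "boundary" not in bb.entries
    def contribute(self, bb):
        bb.entries["boundary"] = "open_doorway"   # retrieved-and-adapted case, toy value

def control_loop(bb, sources):
    """Keep invoking knowledge sources until none of them can add anything."""
    progressed = True
    while progressed:
        progressed = False
        for ks in sources:
            if ks.can_contribute(bb):
                ks.contribute(bb)
                progressed = True

if __name__ == "__main__":
    bb = Blackboard()
    bb.entries["room"] = (4.0, 3.0, 2.5)          # room dimensions in metres
    control_loop(bb, [GeometryKS(), CaseBasedKS()])
    print(bb.entries)
```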
Abstract:
This paper presents an investigation into applying case-based reasoning to multiple heterogeneous case bases using agents. The adaptive CBR process and the architecture of the system are presented, and a case study is used to illustrate and evaluate the approach. The process of creating and maintaining the dynamic data structures is discussed. The similarity metrics employed by the system support the optimisation of the collaboration between the agents, which is based on the use of a blackboard architecture. The blackboard architecture is shown to support efficient collaboration between the agents towards an efficient overall CBR solution, while case-based reasoning methods allow the overall system to adapt and "learn" new collaborative strategies for achieving the aims of the overall CBR problem-solving process.
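To illustrate the retrieval side of such a system, a toy sketch in which agents holding heterogeneous case bases score a query with a weighted similarity metric and the best match is selected. The features, weights and cases are illustrative assumptions, not those of the paper.

```python
# Toy sketch of retrieval across heterogeneous case bases via agents; the
# feature weighting and case contents are illustrative, not from the paper.
from math import sqrt

def weighted_similarity(query, case, weights):
    """Inverse-distance similarity over the features the two descriptions share."""
    shared = [k for k in query if k in case]
    if not shared:
        return 0.0
    dist = sqrt(sum(weights.get(k, 1.0) * (query[k] - case[k]) ** 2 for k in shared))
    return 1.0 / (1.0 + dist)

class CaseBaseAgent:
    def __init__(self, name, cases, weights):
        self.name, self.cases, self.weights = name, cases, weights

    def propose(self, query):
        """Return this agent's best case together with its similarity score."""
        scored = [(weighted_similarity(query, c["features"], self.weights), c)
                  for c in self.cases]
        return max(scored, key=lambda sc: sc[0])

if __name__ == "__main__":
    agents = [
        CaseBaseAgent("thermal",
                      [{"features": {"temp": 300, "area": 12}, "solution": "plan-A"}],
                      {"temp": 0.01, "area": 0.1}),
        CaseBaseAgent("layout",
                      [{"features": {"area": 20, "exits": 2}, "solution": "plan-B"}],
                      {"area": 0.1, "exits": 1.0}),
    ]
    query = {"temp": 320, "area": 15, "exits": 2}
    score, best = max((a.propose(query) for a in agents), key=lambda sc: sc[0])
    print("selected:", best["solution"], "score:", round(score, 3))
```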
Abstract:
This paper describes a highly flexible component architecture, primarily designed for automotive control systems, that supports distributed, dynamically-configurable, context-aware behaviour. The architecture enforces a separation of design-time and run-time concerns, enabling almost all decisions concerning run-time composition and adaptation to be deferred beyond deployment. Dynamic context management contributes to this flexibility. The architecture is extensible and can embed potentially many different self-management decision technologies simultaneously. The mechanism that implements the run-time configuration has been designed to be very robust, automatically and silently handling problems arising from the evaluation of self-management logic and ensuring that, in the worst case, the dynamic aspects of the system collapse down to static behaviour in totally predictable ways.
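A minimal sketch of deferring composition decisions to run time while guaranteeing a predictable static fallback, in the spirit of the robustness mechanism described above; the component, policy and context keys are hypothetical.

```python
# Illustrative sketch of run-time behaviour selection with a guaranteed static
# fallback; names and policy logic are hypothetical, not the paper's design.
from typing import Callable, Dict, Optional

class ConfigurableComponent:
    def __init__(self, static_default: str):
        self.static_default = static_default       # behaviour used if self-management fails
        self.policy: Optional[Callable[[Dict], str]] = None

    def bind_policy(self, policy: Callable[[Dict], str]) -> None:
        """Deployment-time step: attach (or later replace) a decision policy."""
        self.policy = policy

    def select_behaviour(self, context: Dict) -> str:
        """Run-time step: evaluate the policy, collapsing to the static default on any failure."""
        if self.policy is None:
            return self.static_default
        try:
            return self.policy(context)
        except Exception:
            return self.static_default              # silent, predictable degradation

if __name__ == "__main__":
    comp = ConfigurableComponent(static_default="cruise_basic")
    comp.bind_policy(lambda ctx: "cruise_adaptive" if ctx["radar_ok"] else "cruise_basic")
    print(comp.select_behaviour({"radar_ok": True}))   # adaptive behaviour selected
    print(comp.select_behaviour({}))                   # missing context -> static fallback
```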
Abstract:
This paper proposes a vehicular control system architecture that supports self-configuration. The architecture is based on dynamic mapping of processes and services to resources to meet the challenges of future demanding use scenarios, in which systems must be flexible enough to exhibit context-aware behaviour and to permit customization. The architecture comprises a number of low-level services that provide the required system functionalities, which include automatic discovery and incorporation of new devices, self-optimisation to make best use of the available processing, storage and communication resources, and self-diagnostics. The benefits and challenges of dynamic configuration and the automatic inclusion of users' Consumer Electronics (CE) devices are briefly discussed. The dynamic configuration and control-theoretic technologies used are described in outline, and the way in which the demands of highly flexible dynamic configuration and highly robust operation are simultaneously met without compromise is explained. A number of generic use-cases have been identified, each with several specific use-case scenarios. One generic use-case is described to provide an insight into the extent of the flexible reconfiguration facilitated by the architecture.
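To make the dynamic mapping idea concrete, a toy sketch that greedily assigns services to discovered resources (including a passenger's CE device) subject to capacity and feature constraints. The strategy and the device/service descriptions are illustrative assumptions, not the architecture's actual algorithm.

```python
# Toy sketch of dynamically mapping services to the resources able to host them;
# the greedy strategy and all descriptions below are illustrative only.
def map_services(services, nodes):
    """Assign each service to the node with the most free capacity that satisfies it."""
    assignment = {}
    free = {n["name"]: n["capacity"] for n in nodes}
    for svc in sorted(services, key=lambda s: s["load"], reverse=True):
        candidates = [n for n in nodes
                      if free[n["name"]] >= svc["load"]
                      and svc.get("needs", set()) <= n.get("features", set())]
        if not candidates:
            assignment[svc["name"]] = None           # no suitable node: service stays unmapped
            continue
        best = max(candidates, key=lambda n: free[n["name"]])
        free[best["name"]] -= svc["load"]
        assignment[svc["name"]] = best["name"]
    return assignment

if __name__ == "__main__":
    nodes = [
        {"name": "ecu-1", "capacity": 100, "features": {"can_bus"}},
        {"name": "ce-phone", "capacity": 60, "features": {"display", "audio"}},  # passenger CE device
    ]
    services = [
        {"name": "engine-monitor", "load": 40, "needs": {"can_bus"}},
        {"name": "media-ui", "load": 30, "needs": {"display"}},
        {"name": "diagnostics", "load": 20},
    ]
    print(map_services(services, nodes))
```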