993 results for Supramolecular architecture


Relevance:

20.00%

Publisher:

Abstract:

Focussing on Paul Rudolph's Art & Architecture Building at Yale, this thesis demonstrates how the building synthesises the architect's attitude to architectural education, urbanism and materiality. It tracks the evolution of the building from its origins, which bear a relationship to Rudolph's pedagogical ideas, to later moments when its occupants and others reacted to it in ways that could never have been foreseen. The A&A became the epicentre of the university's counterculture movement before it was ravaged by a fire of undetermined origin. Arguably, it represents the last of its kind in American architecture, a turning point at the threshold of postmodernism. Using an archive that was only made available to researchers in 2009, this is the first study to draw extensively on the research files of the late architectural writer and educator C. Ray Smith. Smith's 1981 manuscript about the A&A, entitled "The Biography of a Building," was never published. The associated research files and transcripts of discussions with some thirty interviewees, including Rudolph, provide a previously unavailable wealth of information. Following Smith's methodology, meetings were recorded with those involved in the A&A, including, where possible, some of Smith's original interviewees. When placed within other significant contexts, namely the physicality of the building itself and the literature which surrounds it, these previously untold accounts provide new perspectives and details which deepen the understanding of the building and its place within architectural discourse. Among the issues revealed is the influence of Louis Kahn's Yale Art Gallery and Yale's Collegiate Gothic campus on the building's design. Following a tumultuous first fifty years, the A&A remains an integral part of the architectural education of Yale students and, furthermore, constitutes an important didactic tool for all students of architecture.

Relevance:

20.00%

Publisher:

Abstract:

The aging population in many countries brings into focus rising healthcare costs and pressure on conventional healthcare services. Pervasive healthcare has emerged as a viable, technology-driven approach to alleviating such problems by allowing healthcare to move from hospital-centred care to self-care, mobile care, and at-home care. The state-of-the-art studies in this field, however, lack a systematic approach for providing comprehensive pervasive healthcare solutions from data collection to data interpretation and from data analysis to data delivery. In this thesis we introduce the Context-aware Real-time Assistant (CARA) architecture, which integrates novel approaches with state-of-the-art technology to provide a full-scale pervasive healthcare solution, with an emphasis on context awareness, to help maintain the well-being of elderly people. CARA collects information about and around the individual in a home environment, and enables accurate recognition and continuous monitoring of activities of daily living. It employs an innovative reasoning engine to provide accurate real-time interpretation of the context and assessment of the current situation. Mindful that the system is intended for sensitive personal applications, CARA includes several mechanisms to make its sophisticated intelligent components as transparent and accountable as possible; it also includes a novel cloud-based component for more effective data analysis. To deliver automated real-time services, CARA supports remote consultation based on interactive video and medical sensors. Our proposal has been validated in three application domains that are rich in pervasive contexts and real-time scenarios: (i) Mobile-based Activity Recognition, (ii) Intelligent Healthcare Decision Support Systems and (iii) Home-based Remote Monitoring Systems.
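The abstract gives no algorithmic detail, but the flavour of sensor-based activity recognition can be conveyed with a deliberately minimal sketch (all thresholds, names and simulated data are hypothetical, not taken from CARA):

```python
import math

def classify_activity(samples, still_thresh=0.05, walk_thresh=0.5):
    """Toy activity recognition: classify a window of 3-axis accelerometer
    samples (in g, gravity removed) by the spread of their magnitudes."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    spread = max(mags) - min(mags)
    if spread < still_thresh:
        return "resting"
    return "walking" if spread < walk_thresh else "vigorous"

# Simulated windows: a nearly constant signal vs. an oscillating one.
resting = [(0.0, 0.0, 0.01)] * 20
walking = [(0.0, 0.0, 0.1 if i % 2 else 0.3) for i in range(20)]
print(classify_activity(resting))  # resting
print(classify_activity(walking))  # walking
```

A real system of the kind described would use far richer features and learned models; the sketch only shows the window-of-samples-to-label shape of the problem.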

Relevance:

20.00%

Publisher:

Abstract:

It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes), and this figure is expected to grow by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats these characteristics as the dominant component affecting the results being sought. The multiplicity of analysis techniques introduces another layer of heterogeneity, that is, heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. This raises the question: can a generic solution for the monitoring and analysis of data be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner?
The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating any hidden bias that may exist. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. To demonstrate these concepts, a complex real-world example involving the near-real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, and thus the handling of subjective knowledge and inference; and thirdly, because, given the dearth of neurophysiologists, there is a real-world need for a solution in this domain.
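The platform itself is not described at code level, but the core idea, applying several analysis techniques to the same raw data while keeping a provenance record that a third party can verify, can be sketched as follows (function and field names are illustrative, not from the dissertation):

```python
import hashlib
import json
from datetime import datetime, timezone

def analyse_with_provenance(raw_data, analyses):
    """Apply several analysis techniques to the same raw data, recording
    enough provenance for independent third-party verification."""
    raw_bytes = json.dumps(raw_data, sort_keys=True).encode()
    raw_digest = hashlib.sha256(raw_bytes).hexdigest()
    records = []
    for name, technique in analyses.items():
        records.append({
            "technique": name,
            "input_sha256": raw_digest,  # ties each result to the exact input
            "result": technique(raw_data),
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })
    return records

# Two deliberately different techniques applied to one stream of readings.
readings = [3, 5, 4, 9, 2]
log = analyse_with_provenance(readings, {
    "mean":  lambda xs: sum(xs) / len(xs),
    "range": lambda xs: max(xs) - min(xs),
})
print([(r["technique"], r["result"]) for r in log])  # [('mean', 4.6), ('range', 7)]
```

Because every record carries the digest of the exact raw input, an independent analyst can rerun any technique against the same data and check a derived conclusion.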

Relevance:

20.00%

Publisher:

Abstract:

Molecular theories of shear thickening and shear thinning in associative polymer networks are typically united in that they involve a single kinetic parameter that describes the network: a relaxation time related to the lifetime of the associative bonds. Here we report the steady-shear behavior of two structurally identical metallo-supramolecular polymer networks for which single-relaxation-parameter models break down in dramatic fashion. The networks are formed by the addition of reversible cross-linkers to semidilute entangled solutions of poly(4-vinylpyridine) (PVP) in DMSO, and they differ only in the lifetime of the reversible cross-links. Shear thickening is observed for cross-linkers that have a slower dissociation rate (17 s⁻¹), while shear thinning is observed for samples that have a faster dissociation rate (ca. 1400 s⁻¹). The difference in the steady-shear behavior of the unentangled vs. entangled regimes reveals an unexpected, additional competing relaxation, ascribed to topological disentanglement in the semidilute entangled regime, that contributes to the rheological properties.
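Under the usual first-order picture, the characteristic lifetime of a reversible bond is simply the reciprocal of its dissociation rate, so the two quoted rates translate directly into lifetimes (a back-of-the-envelope reading of the numbers above, not a calculation from the paper):

```python
# tau = 1 / k_d: characteristic bond lifetime from a first-order
# dissociation rate (single-exponential assumption).
k_slow = 17.0     # s^-1, the shear-thickening network
k_fast = 1400.0   # s^-1, the shear-thinning network

tau_slow = 1.0 / k_slow   # ~59 ms
tau_fast = 1.0 / k_fast   # ~0.7 ms
print(f"slow cross-linker lifetime: {tau_slow * 1e3:.1f} ms")   # 58.8 ms
print(f"fast cross-linker lifetime: {tau_fast * 1e3:.2f} ms")   # 0.71 ms
print(f"lifetime ratio: {k_fast / k_slow:.0f}x")                # 82x
```

An 82-fold difference in bond lifetime between otherwise identical networks is what isolates bond kinetics as the variable behind the opposite shear responses.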

Relevance:

20.00%

Publisher:

Abstract:

We report here the nonlinear rheological properties of metallo-supramolecular networks formed by the reversible cross-linking of semi-dilute unentangled solutions of poly(4-vinylpyridine) (PVP) in dimethyl sulfoxide (DMSO). The reversible cross-linkers are bis-Pd(II) or bis-Pt(II) complexes that coordinate to the pyridine functional groups on the PVP. Under steady shear, shear thickening is observed above a critical shear rate, and that critical shear rate is experimentally correlated with the lifetime of the metal-ligand bond. The onset and magnitude of the shear thickening depend on the amount of cross-linkers added. In contrast to the behavior observed in most transient networks, the time scale of network relaxation is found to increase during shear thickening. The primary mechanism of shear thickening is ascribed to the shear-induced transformation of intrachain cross-linking to interchain cross-linking, rather than nonlinear high tension along polymer chains that are stretched beyond the Gaussian range.

Relevance:

20.00%

Publisher:

Abstract:

The conception of the FUELCON architecture, a composite tool for the generation and validation of patterns for assigning fuel assemblies to positions in the grid of a reactor core section, has undergone an evolution throughout the history of the project. Different options for various subtasks were possible, envisioned, or actually explored or adopted. We project these successive, or even concomitant, configurations of the architecture into a meta-architecture, which, not by chance, happens to reflect basic choices in the field's history over the last decade.

Relevance:

20.00%

Publisher:

Abstract:

This paper introduces a few architectural concepts from FUELGEN, which, like the generator in the FUELCON expert system, generates a "cloud" of reload patterns but, unlike that generator, is based on a genetic algorithm. There are indications that FUELGEN may outperform FUELCON and other tools reported in the literature on well-researched case studies, but careful comparisons have yet to be carried out. This paper complements the information in two other recent papers on FUELGEN. Moreover, a sequel project is outlined.
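The paper does not give FUELGEN's implementation, but the general shape of a genetic algorithm that returns a "cloud" of good candidate patterns, rather than a single optimum, can be sketched as follows (the encoding and the toy fitness are invented for illustration; a real fitness would come from a core-physics evaluation):

```python
import random

def evolve_pattern_cloud(fitness, genome_len=8, pop_size=20,
                         generations=30, cloud_size=5, seed=0):
    """Minimal genetic algorithm that returns a 'cloud' of high-fitness
    patterns instead of a single best solution."""
    rng = random.Random(seed)
    # A pattern is encoded here as a list of assembly-type indices (0-3).
    pop = [[rng.randint(0, 3) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]            # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, genome_len)
            child = a[:cut] + b[cut:]              # one-point crossover
            if rng.random() < 0.2:                 # occasional point mutation
                child[rng.randrange(genome_len)] = rng.randint(0, 3)
            children.append(child)
        pop = survivors + children
    pop.sort(key=fitness, reverse=True)
    return pop[:cloud_size]                        # the candidate "cloud"

# Toy fitness: prefer patterns whose entries sum high.
cloud = evolve_pattern_cloud(fitness=sum)
print(len(cloud), [sum(p) for p in cloud])
```

Returning a cloud rather than a single pattern mirrors the described approach of generating many candidates for downstream validation.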

Relevance:

20.00%

Publisher:

Abstract:

We continue the discussion of the decision points in the FUELCON meta-architecture. Having discussed the relation of the original expert system to its sequel projects in terms of an AND/OR tree, we consider one further domain for a neural component: parameter prediction downstream of the core reload candidate pattern generator, that is, a replacement for the NOXER simulator currently in use in the project.

Relevance:

20.00%

Publisher:

Abstract:

This paper describes the use of a blackboard architecture for building a hybrid case-based reasoning (CBR) system. The Smartfire fire field modelling package has been built using this architecture and includes a CBR component, which allows qualitative spatial reasoning knowledge from domain experts to be integrated into the system. The system can be used for the automatic set-up of fire field models, enabling fire safety practitioners who are not expert in modelling techniques to use a fire modelling tool. The paper discusses the integrating powers of the architecture, which is based on a common knowledge representation comprising a metric diagram and place vocabulary, with mechanisms for adaptation and conflict resolution built on the blackboard.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents an investigation into applying case-based reasoning to multiple heterogeneous case bases using agents. The adaptive CBR process and the architecture of the system are presented, and a case study is used to illustrate and evaluate the approach. The process of creating and maintaining the dynamic data structures is discussed. The similarity metrics employed by the system support the optimisation of the collaboration between the agents, which is based on the use of a blackboard architecture. The blackboard architecture is shown to support efficient collaboration between the agents towards an overall CBR solution, while case-based reasoning methods allow the system to adapt and "learn" new collaborative strategies for achieving the aims of the overall CBR problem-solving process.
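The essence of blackboard-mediated agent collaboration, agents that read partial results from a shared workspace and post refinements back, can be shown in a deliberately small sketch (the agents, keys and cases are all invented for illustration):

```python
class Blackboard:
    """Shared workspace on which agents post and read partial solutions."""
    def __init__(self):
        self.entries = {}

    def post(self, key, value, agent):
        self.entries[key] = (value, agent)   # remember who contributed what

    def read(self, key):
        entry = self.entries.get(key)
        return entry[0] if entry else None

def retrieval_agent(bb):
    # Consults its own local case base for the posted query.
    local_cases = {"smoke in hallway": "ventilate stairwell"}
    query = bb.read("query")
    if query in local_cases:
        bb.post("candidate", local_cases[query], "retrieval_agent")

def adaptation_agent(bb):
    # Adapts a retrieved candidate into a final solution.
    candidate = bb.read("candidate")
    if candidate:
        bb.post("solution", candidate + " and alert occupants", "adaptation_agent")

bb = Blackboard()
bb.post("query", "smoke in hallway", "user")
for agent in (retrieval_agent, adaptation_agent):   # a trivial control loop
    agent(bb)
print(bb.read("solution"))  # ventilate stairwell and alert occupants
```

Neither agent calls the other; all coordination goes through the blackboard, which is what makes it straightforward to add, remove or reorder agents.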

Relevance:

20.00%

Publisher:

Abstract:

This paper describes a highly flexible component architecture, primarily designed for automotive control systems, that supports distributed, dynamically-configurable, context-aware behaviour. The architecture enforces a separation of design-time and run-time concerns, enabling almost all decisions concerning run-time composition and adaptation to be deferred beyond deployment. Dynamic context management contributes to flexibility. The architecture is extensible, and can embed potentially many different self-management decision technologies simultaneously. The mechanism that implements the run-time configuration has been designed to be very robust, automatically and silently handling problems arising from the evaluation of self-management logic and ensuring that, in the worst case, the dynamic aspects of the system collapse down to static behaviour in totally predictable ways.
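The "collapse to static behaviour" guarantee amounts to wrapping every piece of self-management logic so that any failure falls back to a statically configured default; a minimal sketch of that pattern (the function name and the example rule are hypothetical, not from the paper):

```python
def evaluate_context_rule(rule, context, static_default):
    """Run a self-management rule; on any failure, fall back to the
    statically configured behaviour so degradation is predictable."""
    try:
        return rule(context)
    except Exception:
        # Worst case: the dynamic aspect silently collapses to static behaviour.
        return static_default

# A rule that breaks when a sensor value is missing from the context.
rule = lambda ctx: "sport_mode" if ctx["speed"] > 120 else "eco_mode"
print(evaluate_context_rule(rule, {"speed": 140}, "eco_mode"))  # sport_mode
print(evaluate_context_rule(rule, {}, "eco_mode"))              # eco_mode (fallback)
```

The fallback is silent by design: the system degrades to its static configuration rather than propagating a fault from the adaptation logic.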

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes a vehicular control system architecture that supports self-configuration. The architecture is based on dynamic mapping of processes and services to resources, to meet the challenges of future demanding use scenarios in which systems must be flexible enough to exhibit context-aware behaviour and to permit customization. The architecture comprises a number of low-level services that provide the required system functionalities, including automatic discovery and incorporation of new devices, self-optimisation to make best use of the available processing, storage and communication resources, and self-diagnostics. The benefits and challenges of dynamic configuration and of the automatic inclusion of users' Consumer Electronic (CE) devices are briefly discussed. The dynamic configuration and control-theoretic technologies used are described in outline, and the way in which the demands of highly flexible dynamic configuration and highly robust operation are simultaneously met without compromise is explained. A number of generic use cases have been identified, each with several specific use-case scenarios. One generic use case is described to provide an insight into the extent of the flexible reconfiguration facilitated by the architecture.

Relevance:

20.00%

Publisher:

Abstract:

This short position paper considers issues in developing a Data Architecture for the Internet of Things (IoT) through the medium of an exemplar project, Domain Expertise Capture in Authoring and Development Environments (DECADE). A brief discussion sets the background for the IoT and for the development of the distinction between things and computers. The paper makes a strong argument to avoid reinventing the wheel: to reuse approaches to distributed heterogeneous data architectures and the lessons learned from that work, and to apply them to this situation. DECADE requires an autonomous recording system, local data storage, a semi-autonomous verification model, a sign-off mechanism, and qualitative and quantitative analysis carried out when and where required through a web-service architecture based on ontology and analytic agents, with a self-maintaining ontology model. To develop this, we describe a web-service architecture combining a distributed data warehouse, web services for analysis agents, ontology agents and a verification engine, with a centrally verified outcome database maintained by a certifying body for qualification/professional status.