907 results for Building heating systems


Relevance: 30.00%

Abstract:

To achieve the goal of sustainable development, the building energy system was evaluated from the point of view of both the first and second laws of thermodynamics. The relationship between exergy destruction and sustainable development was discussed first, followed by a description of the resource abundance model, the life cycle analysis model, and the economic investment effectiveness model. By combining the foregoing models, a new sustainability index was proposed. Several green building case studies in the U.S. and China were presented. The influences of building function, geographic location, climate pattern, regional energy structure, and the future technology improvement potential of renewable energy were discussed. Life cycle analyses of the building envelope, HVAC system, and on-site renewable energy system were compared from energy, exergy, environmental, and economic perspectives. It was found that climate pattern had a dramatic influence on the life cycle investment effectiveness of the building envelope. The energy performance of the building HVAC system was much better than its exergy performance. To further increase exergy efficiency, renewable energy rather than fossil fuel should be used as the primary energy source. A building life cycle cost and exergy consumption regression model was set up. The optimal building insulation level could be determined by either a cost minimization or an exergy consumption minimization approach; the exergy approach called for more insulation than the cost approach. The influence of energy prices on the system selection strategy was discussed. Two photovoltaic (PV) systems, stand-alone and grid-tied, were compared using the life cycle assessment method, and the grid-tied system proved clearly superior. The analysis also showed that, over its life span, PV technology was less attractive economically because electricity prices in the U.S. and China did not fully reflect the associated environmental burden. However, if future energy price increases and PV system cost reductions are considered, the technology could be very promising for sustainable buildings.
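As an illustration of the insulation trade-off described above, the sketch below (with entirely hypothetical coefficients, not the dissertation's regression model) minimizes a simple life-cycle cost and a simple life-cycle exergy consumption over insulation thickness; the exergy-optimal thickness comes out larger than the cost-optimal one:

```python
# Illustrative sketch (not the dissertation's model): compare the optimal
# insulation thickness found by minimizing life-cycle cost vs. life-cycle
# exergy consumption. All coefficients below are hypothetical placeholders.
import numpy as np

t = np.linspace(0.01, 0.40, 400)           # insulation thickness [m]
k = 0.04                                   # conductivity [W/m.K]
R = t / k                                  # added thermal resistance [m2.K/W]

# Hypothetical life-cycle terms over the service life, per m2 of envelope
capital_cost    = 60.0 * t                 # $ for material + installation
heating_cost    = 25.0 / (0.5 + R)         # $ for purchased heating energy
embodied_exergy = 180.0 * t                # MJ to manufacture the insulation
heating_exergy  = 400.0 / (0.5 + R)        # MJ of fuel exergy for heating

cost   = capital_cost + heating_cost
exergy = embodied_exergy + heating_exergy

t_cost   = t[np.argmin(cost)]
t_exergy = t[np.argmin(exergy)]
print(f"cost-optimal thickness:   {t_cost:.3f} m")
print(f"exergy-optimal thickness: {t_exergy:.3f} m")   # comes out thicker
```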

Relevance: 30.00%

Abstract:

The Unified Modeling Language (UML) has quickly become the industry standard for object-oriented software development. It is being widely used in organizations and institutions around the world. However, UML is often found to be too complex for novice systems analysts. Although prior research has identified difficulties novice analysts encounter in learning UML, no viable solution has been proposed to address these difficulties. Sequence-diagram modeling, in particular, has largely been overlooked. The sequence diagram models the behavioral aspects of an object-oriented software system in terms of interactions among its building blocks, i.e. objects and classes. It is one of the most commonly used UML diagrams in practice. However, there has been little research on sequence-diagram modeling. The current literature scarcely provides effective guidelines for developing a sequence diagram. Such guidelines would be greatly beneficial to novice analysts who, unlike experienced systems analysts, do not possess relevant prior experience to easily learn how to develop a sequence diagram. There is thus a need for an effective sequence-diagram modeling technique for novices. This dissertation reports a research study that identified novice difficulties in modeling a sequence diagram and proposed a technique called CHOP (CHunking, Ordering, Patterning), designed to reduce cognitive load by addressing the cognitive complexity of sequence-diagram modeling. The CHOP technique was evaluated in a controlled experiment against a technique recommended in a well-known textbook, which was found to be representative of approaches provided in many textbooks as well as the practitioner literature. The results indicated that novice analysts were able to perform better using the CHOP technique. This outcome seems to have been enabled by the pattern-based heuristics provided by the technique. Meanwhile, novice analysts rated the CHOP technique as more useful, although not significantly easier to use, than the control technique. The study established that the CHOP technique is an effective sequence-diagram modeling technique for novice analysts.

Relevance: 30.00%

Abstract:

Brain-computer interfaces (BCI) have the potential to restore communication or control abilities in individuals with severe neuromuscular limitations, such as those with amyotrophic lateral sclerosis (ALS). The role of a BCI is to extract and decode relevant information that conveys a user's intent directly from brain electro-physiological signals and translate this information into executable commands to control external devices. However, the BCI decision-making process is error-prone due to noisy electro-physiological data, representing the classic problem of efficiently transmitting and receiving information via a noisy communication channel.

This research focuses on P300-based BCIs, which rely predominantly on event-related potentials (ERPs) that are elicited as a function of a user's uncertainty regarding stimulus events, in either an acoustic or a visual oddball recognition task. The P300-based BCI system enables users to communicate messages from a set of choices by selecting a target character or icon that conveys a desired intent or action. P300-based BCIs have been widely researched as a communication alternative, especially for individuals with ALS, who represent a target BCI user population. For the P300-based BCI, repeated data measurements are required to enhance the low signal-to-noise ratio of the elicited ERPs embedded in electroencephalography (EEG) data, in order to improve the accuracy of the target character estimation process. As a result, BCIs are relatively slow compared to other commercial assistive communication devices, which limits their adoption by the target user population. The goal of this research is to develop algorithms that take into account the physical limitations of the target BCI population to improve the efficiency of ERP-based spellers for real-world communication.

In this work, it is hypothesised that building adaptive capabilities into the BCI framework can potentially give the BCI system the flexibility to improve performance by adjusting system parameters in response to changing user inputs. The research in this work addresses three potential areas for improvement within the P300 speller framework: information optimisation, target character estimation and error correction. The visual interface and its operation control the method by which the ERPs are elicited through the presentation of stimulus events. The parameters of the stimulus presentation paradigm can be modified to modulate and enhance the elicited ERPs. A new stimulus presentation paradigm is developed in order to maximise the information content that is presented to the user by tuning stimulus paradigm parameters to positively affect performance. Internally, the BCI system determines the amount of data to collect and the method by which these data are processed to estimate the user's target character. Algorithms that exploit language information are developed to enhance the target character estimation process and to correct erroneous BCI selections. In addition, a new model-based method to predict BCI performance is developed, an approach which is independent of stimulus presentation paradigm and accounts for dynamic data collection. The studies presented in this work provide evidence that the proposed methods for incorporating adaptive strategies in the three areas have the potential to significantly improve BCI communication rates, and the proposed method for predicting BCI performance provides a reliable means to pre-assess BCI performance without extensive online testing.
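As a rough illustration of two of the ideas above, the sketch below (a toy model, not the algorithms developed in this work) fuses repeated noisy classifier scores for each candidate character with a hypothetical language-model prior and stops data collection once one character's posterior is confident enough:

```python
# Hedged sketch of the general idea (not this dissertation's algorithm):
# fuse repeated, noisy classifier scores for each flashed character with a
# language-model prior, and stop collecting data once one character's
# posterior is confident enough (dynamic data collection).
import numpy as np

rng = np.random.default_rng(0)
alphabet = list("ABCDEFGH")           # toy character set
target = "C"

# Hypothetical language-model prior over the next character
prior = np.array([0.05, 0.10, 0.40, 0.10, 0.10, 0.10, 0.10, 0.05])
log_post = np.log(prior)

for flash in range(50):               # up to 50 stimulus repetitions
    # Simulated classifier score per character: the target ERP scores higher
    scores = rng.normal(0.0, 1.0, len(alphabet))
    scores[alphabet.index(target)] += 0.8
    # Treat scores as log-likelihood increments (illustrative assumption)
    log_post += scores
    post = np.exp(log_post - log_post.max())
    post /= post.sum()
    if post.max() > 0.95:             # dynamic stopping criterion
        break

print(f"selected '{alphabet[np.argmax(post)]}' after {flash + 1} flashes")
```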

Relevance: 30.00%

Abstract:

This dissertation documents the results of a theoretical and numerical study of time-dependent energy storage by melting a phase change material. The heating is provided along invading lines, which change from single-line invasion to tree-shaped invasion. Chapter 2 identifies the special design feature of distributing energy storage in a time-dependent fashion over a territory, when the energy flows by fluid flow from a concentrated source to points (users) distributed equidistantly over the area. The challenge in this chapter is to determine the architecture of distributed energy storage. The chief conclusion is that the finite amount of storage material should be distributed proportionally with the distribution of the flow rate of heating agent arriving on the area. The total time needed by the source stream to ‘invade’ the area is cumulative (the sum of the storage times required at each storage site), and depends on the energy distribution paths and the sequence in which the users are served by the source stream. Chapter 3 shows theoretically that the melting process consists of two phases: “invasion” thermal diffusion along the invading line, which is followed by “consolidation” as heat diffuses perpendicularly to the invading line. This chapter also reports the duration of both phases and the evolution of the melt layer around the invading line during two-dimensional and three-dimensional invasion. It also shows that the amount of melted material increases in time according to an S-shaped curve. These theoretical predictions are validated by means of numerical simulations in chapter 4. This chapter also shows that the heat transfer rate density increases (i.e., the S curve becomes steeper) as the complexity and number of degrees of freedom of the structure are increased, in accord with the constructal law. The optimal geometric features of the tree structure are detailed in this chapter. Chapter 5 documents a numerical study of time-dependent melting where the heat transfer is convection dominated, unlike in chapters 3 and 4, where the melting is governed by pure conduction. In accord with constructal design, the search is for effective heat-flow architectures. The volume-constrained improvement of the designs for heat flow begins with assuming the simplest structure, where a single line serves as heat source. Next, the heat source is endowed with freedom to change its shape as it grows. The objective of the numerical simulations is to discover the geometric features that lead to the fastest melting process. The results show that the heat transfer rate density increases as the complexity and number of degrees of freedom of the structure are increased. Furthermore, the angles between heat invasion lines have a minor effect on the global performance compared to other degrees of freedom: number of branching levels, stem length, and branch lengths. The effect of natural convection in the melt zone is documented.
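For intuition about the S-shaped melt history, the toy sketch below uses a logistic curve as a stand-in for the derived melt-fraction evolution (it is not the chapters' analytical solution, and the rate constants are hypothetical) to show how a steeper S curve, associated with a more complex invading structure, shortens the time to reach a given melt fraction:

```python
# Illustrative sketch only: the chapters derive the S-shaped melt-fraction
# history from diffusion/convection analysis; here a logistic curve is a
# stand-in to show what "a steeper S curve" means for melting time.
import numpy as np

def melt_fraction(t, rate):
    """Toy S-curve: fraction of PCM melted vs. dimensionless time."""
    return 1.0 / (1.0 + np.exp(-rate * (t - 1.0)))

t = np.linspace(0.0, 3.0, 301)
for label, rate in [("single invading line", 3.0),
                    ("tree-shaped invasion", 6.0)]:   # hypothetical rates
    phi = melt_fraction(t, rate)
    t90 = t[np.searchsorted(phi, 0.9)]                # time to 90% melted
    print(f"{label:22s} reaches 90% melt at t = {t90:.2f}")
```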

Relevance: 30.00%

Abstract:

This thesis argues that complex adaptive social–ecological systems (SES) theory has important implications for the design of integrated ocean and coastal governance in the EU. Traditional systems of governance have struggled to deal with the global changes, complexity and uncertainties that challenge a transition towards sustainability in Europe’s maritime macro-regions. There is an apparent disconnect between governance strategies for sustainability in Europe’s maritime macro-regions and a sound theoretical basis for them. My premise is that the design of governance architecture for maritime regional sustainability should be informed by SES theory. Therefore, the aim of this research was to gain insight into a multilevel adaptive governance architecture that combines notions of sustainability and development in the context of the Atlantic Europe maritime macro-region. The central research question asked whether it is possible to achieve this insight by using a SES as a framework and analytical tool. This research adopted social ecology and sustainability science as a foundation for understanding society–nature relations. Concepts from complex adaptive systems, SES and resilience theories were integrated into a conceptual framework that guided the investigation and analysis. A study was conducted to conceptualise the European Atlantic social–ecological system (EASES). This was used to represent and understand the Atlantic Europe macro-region as a SES. The study examined the proposition that governance can be focused on building SES resilience to help achieve maritime regional sustainability. A workbook method was developed and used to elicit expert opinion regarding EASES. The study identified sources of resilience and resilience dynamics that require management in the context of multilevel adaptive governance. This research found that the Atlantic Europe macro-region is a key focal level for multilevel adaptive governance architecture. The majority of the findings are specific to Atlantic Europe and not generalisable to other maritime macro-regions in Europe.

Relevance: 30.00%

Abstract:

The section of the CN railway between Vancouver and Kamloops runs along the base of many hazardous slopes, including the White Canyon, which is located just outside the town of Lytton, BC. The slope has a history of frequent rockfall activity, which presents a hazard to the railway below. Rockfall inventories can be used to understand the frequency-magnitude relationship of events on hazardous slopes; however, it can be difficult to consistently and accurately identify rockfall source zones and volumes on large slopes with frequent activity, leaving many inventories incomplete. We have studied this slope as part of the Canadian Railway Ground Hazard Research Program and have collected remote sensing data, including terrestrial laser scanning (TLS), photographs, and photogrammetry data, since 2012, and used change detection to identify rockfalls on the slope. The objective of this thesis is to use a subset of these data to examine how rockfalls identified from TLS data can be used to characterize the frequency-magnitude relationship of rockfalls on the slope. This includes incorporating both new and existing methods into a semi-automated workflow to extract rockfall events from the TLS data. We show that these methods can be used to identify events as small as 0.01 m3 and that the duration between scans can affect the frequency-magnitude relationship of the rockfalls. We also show that by incorporating photogrammetry data into our analysis, we can create a 3D geological model of the slope and use it to classify rockfalls by lithology, to further understand rockfall failure patterns. When relating rockfall activity to triggering factors, we found that the amount of precipitation occurring over the winter affects the overall rockfall frequency for the remainder of the year. These results can provide the railways with a more complete inventory of events than records created through track inspection or rockfall monitoring systems installed on the slope. In addition, we can use the database to understand the spatial and temporal distribution of events. The results can also be used as an input to rockfall modelling programs.
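As a sketch of the kind of frequency-magnitude analysis described above (using synthetic volumes rather than the thesis inventory), one can build a cumulative frequency curve from a rockfall volume list and fit a power law:

```python
# Hedged sketch: build a cumulative frequency-magnitude curve from a
# rockfall volume inventory and fit a power law, a standard way such
# TLS-derived inventories are summarized. Volumes below are synthetic.
import numpy as np

rng = np.random.default_rng(1)
years = 3.0
volumes = rng.pareto(0.8, 500) * 0.01 + 0.01   # synthetic volumes [m3], >= 0.01

v_sorted = np.sort(volumes)[::-1]              # largest first
cum_freq = np.arange(1, len(v_sorted) + 1) / years   # events/yr with volume >= v

# Power-law fit  log10(N) = a + b*log10(V)  over the inventory
b, a = np.polyfit(np.log10(v_sorted), np.log10(cum_freq), 1)
print(f"cumulative frequency ~ V^{b:.2f}  (intercept 10^{a:.2f} events/yr)")
```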

Relevance: 30.00%

Abstract:

This paper presents a vision that allows the combined use of model-driven engineering, run-time monitoring, and animation for the development and analysis of components in real-time embedded systems. A key building block in the tool environment supporting this vision is a highly customizable code generation process. Customization is performed via a configuration specification which describes the ways in which input is provided to the component, the ways in which run-time execution information can be observed, and how these observations drive animation tools. The environment is envisioned to be suitable for activities ranging from quality assurance to supporting certification, teaching, and outreach, and will be built exclusively with open source tools to increase impact. A preliminary prototype implementation is described.
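The paper does not fix a concrete configuration format, so the following is a purely hypothetical sketch of what such a specification and a minimal generator of monitoring stubs might look like:

```python
# Hypothetical illustration only: names and structure are assumptions, not
# the paper's actual format. The configuration describes component inputs,
# observable run-time information, and how observations drive animation.
config = {
    "component": "CruiseController",
    "inputs": [                      # how input is provided to the component
        {"port": "setSpeed", "source": "test_script"},
    ],
    "observations": [                # run-time information to expose
        {"probe": "currentSpeed", "every_ms": 100},
        {"probe": "state",        "on_change": True},
    ],
    "animation": [                   # how observations drive animation tools
        {"probe": "state", "view": "state_machine_highlight"},
    ],
}

def generate_monitor_stubs(cfg):
    """Emit (as text) one monitoring hook per configured observation probe."""
    lines = [f"// auto-generated monitors for {cfg['component']}"]
    for obs in cfg["observations"]:
        lines.append(f"void observe_{obs['probe']}() {{ /* publish value */ }}")
    return "\n".join(lines)

print(generate_monitor_stubs(config))
```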

Relevance: 30.00%

Abstract:

This paper describes a substantial effort to build a real-time interactive multimodal dialogue system with a focus on emotional and non-verbal interaction capabilities. The work is motivated by the aim to provide technology with competences in perceiving and producing the emotional and non-verbal behaviours required to sustain a conversational dialogue. We present the Sensitive Artificial Listener (SAL) scenario as a setting which seems particularly suited for the study of emotional and non-verbal behaviour, since it requires only very limited verbal understanding on the part of the machine. This scenario allows us to concentrate on non-verbal capabilities without having to address at the same time the challenges of spoken language understanding, task modelling, etc. We first summarise three prototype versions of the SAL scenario, in which the behaviour of the Sensitive Artificial Listener characters was determined by a human operator. These prototypes served the purpose of verifying the effectiveness of the SAL scenario and allowed us to collect data required for building system components for analysing and synthesising the respective behaviours. We then describe the fully autonomous integrated real-time system we created, which combines incremental analysis of user behaviour, dialogue management, and synthesis of speaker and listener behaviour of a SAL character displayed as a virtual agent. We discuss principles that should underlie the evaluation of SAL-type systems. Since the system is designed for modularity and reuse, and since it is publicly available, the SAL system has potential as a joint research tool in the affective computing research community.
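The following is a minimal structural sketch (with hypothetical function names, not the SAL system's actual API) of how incremental analysis, dialogue management, and behaviour synthesis might be coupled in such a loop:

```python
# Minimal structural sketch (hypothetical names, not the SAL system's API):
# the integrated system couples incremental analysis of user behaviour,
# dialogue management, and synthesis of listener/speaker behaviour.
import random

def analyse(frame):
    """Stand-in for incremental audio-visual analysis: returns coarse cues."""
    return {"arousal": random.uniform(-1, 1), "user_speaking": frame % 3 != 0}

def manage_dialogue(cues, state):
    """Stand-in dialogue manager: choose listener vs. speaker behaviour."""
    if cues["user_speaking"]:
        return {"action": "backchannel" if cues["arousal"] > 0 else "nod"}
    state["turns"] += 1
    return {"action": "utterance", "text": f"SAL reply #{state['turns']}"}

def synthesise(behaviour):
    """Stand-in for speech/agent synthesis: here it just prints."""
    print("agent:", behaviour)

state = {"turns": 0}
for frame in range(6):                     # a few incoming analysis frames
    synthesise(manage_dialogue(analyse(frame), state))
```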

Relevance: 30.00%

Abstract:

Food production and consumption for cities has become a global concern due to the increasing number of people living in urban areas, threatening food security. There is a contention that people living in cities have become disconnected from food production, leading to reduced nutrition in diets and increased food waste. Integrating food production into cities (urban agriculture) can help alleviate some of these issues. Lack of space at ground level in high-density urban areas has accelerated the idea of using spare building surfaces for food production. Various growing methods are being used for food production on buildings, which can be split into two main types: soil-less systems and soil-based systems. This paper is a holistic assessment (underpinned by the triple bottom line of sustainable development) of these two types of systems for food production on buildings, looking at the benefits and limitations of each type in this context. The results illustrate that soil-less systems are more productive per square metre, which increases the amount of locally grown, fresh produce available in urban areas. The results also show that soil-based systems for cultivation on buildings are, overall, more environmentally and socially beneficial for urban areas than soil-less systems.
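A triple-bottom-line comparison of the two system types could, in its simplest form, look like the sketch below; the criteria scores and weights are hypothetical placeholders, not values reported in the paper:

```python
# Illustrative sketch only: a simple triple-bottom-line style scoring of the
# two growing-system types for building-based food production. All scores
# and weights are hypothetical, not results from the paper.
weights = {"environmental": 1.0, "social": 1.0, "economic": 1.0}

# Scores on a 1-5 scale (hypothetical)
systems = {
    "soil-less":  {"environmental": 3, "social": 3, "economic": 4},
    "soil-based": {"environmental": 4, "social": 4, "economic": 3},
}

for name, scores in systems.items():
    total = sum(weights[c] * scores[c] for c in weights)
    print(f"{name:10s} weighted score: {total:.1f}")
```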

Relevance: 30.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 30.00%

Abstract:

Local communities collectively managing common pool resources can play an important role in sustainable management, but they often lack the skills and context-specific tools required for such management. The complex dynamics of social-ecological systems (SES), the need for management capacities, and communities’ limited empowerment and participation skills present challenges for community-based natural resource management (CBNRM) strategies. We analyzed the applicability of prospective structural analysis (PSA), a strategic foresight tool, to support decision making and foster sustainable management and capacity building in CBNRM contexts, as well as the modifications necessary to use the tool in such contexts. By testing PSA in three SES in Colombia, Mexico, and Argentina, we gathered information on the potential of this tool and its adaptation requirements. The results suggest that the tool can be adapted to these contexts and can contribute to fostering sustainable management and capacity building. It helped identify the systems’ dynamics, thus increasing the communities’ knowledge about their SES and informing the decision-making process. Additionally, it drove a learning process that both fostered empowerment and built participation skills. The process demanded both time and effort and required external monitoring and facilitation, but community members could be trained to master it. Thus, we suggest that the PSA technique has the potential to strengthen CBNRM and that other initiatives could use it, but they must be aware of these requirements.
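The core computation behind structural analysis techniques such as PSA can be sketched as follows: a matrix of direct influences among system variables is summed by rows (influence) and by columns (dependence) to suggest each variable's role. The variables and scores below are hypothetical, not the case-study data:

```python
# Hedged sketch of the influence/dependence computation used in prospective
# structural analysis. Variables and scores are hypothetical examples.
import numpy as np

variables = ["water quality", "fishing effort", "local governance", "tourism"]
# influence[i, j] = strength (0-3) with which variable i influences variable j
influence = np.array([
    [0, 1, 0, 3],
    [3, 0, 1, 1],
    [2, 3, 0, 2],
    [1, 2, 1, 0],
])

motricity  = influence.sum(axis=1)   # how much each variable drives the system
dependence = influence.sum(axis=0)   # how much it is driven by the others

for name, m, d in zip(variables, motricity, dependence):
    # Crude labelling; full analyses place variables in quadrants of the
    # influence-dependence plane rather than comparing the two sums.
    role = "driver" if m > d else "dependent" if d > m else "relay/linked"
    print(f"{name:17s} influence={m}  dependence={d}  -> {role}")
```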

Relevance: 30.00%

Abstract:

While coaching and customer involvement can enhance the improvement of health and social care, many organizations struggle to develop their improvement capability; it is unclear how best to accomplish this. We examined one attempt at training improvement coaches. The program, set in the Esther Network for integrated care in rural Jönköping County, Sweden, included eight 1-day sessions spanning 7 months in 2011. A senior citizen joined the faculty in all training sessions. Aiming to discern which elements in the program were essential for assuming the role of improvement coach, we used a case-study design with a qualitative approach. Our focus group interviews included 17 informants: 11 coaches, 3 faculty members, and 3 senior citizens. We performed manifest content analysis of the interview data. Creating will, ideas, execution, and sustainability emerged as crucial elements. These elements were promoted by customer focus (embodied by the senior citizen trainer), shared values and a solution-focused approach, the supportive coach network, and participants' expanded systems understanding. These elements emerged as more important than specific improvement tools and are also worth considering elsewhere when seeking to develop improvement capability in health and social care organizations.

Relevance: 30.00%

Abstract:

A primary goal of context-aware systems is delivering the right information at the right place and right time to users in order to enable them to make effective decisions and improve their quality of life. There are three key requirements for achieving this goal: determining what information is relevant, personalizing it based on the users’ context (location, preferences, behavioral history, etc.), and delivering it to them in a timely manner without an explicit request from them. These requirements create a paradigm that we term “Proactive Context-aware Computing”. Most of the existing context-aware systems fulfill only a subset of these requirements. Many of these systems focus only on personalization of the requested information based on users’ current context. Moreover, they are often designed for specific domains. In addition, most of the existing systems are reactive: the users request some information and the system delivers it to them. These systems are not proactive, i.e., they cannot anticipate users’ intent and behavior and act proactively without an explicit request from them. In order to overcome these limitations, we need to conduct a deeper analysis and enhance our understanding of context-aware systems that are generic, universal, proactive and applicable to a wide variety of domains. To support this dissertation, we explore several directions. Clearly, the most significant sources of information about users today are smartphones. A large amount of users’ context can be acquired through them and they can be used as an effective means to deliver information to users. In addition, social media such as Facebook, Flickr and Foursquare provide a rich and powerful platform to mine users’ interests, preferences and behavioral history. We employ the ubiquity of smartphones and the wealth of information available from social media to address the challenge of building proactive context-aware systems. We have implemented and evaluated a few approaches, including some as part of the Rover framework, to achieve the paradigm of Proactive Context-aware Computing. Rover is a context-aware research platform which has been evolving for the last 6 years. Since location is one of the most important contexts for users, we have developed ‘Locus’, an indoor localization, tracking and navigation system for multi-story buildings. Other important dimensions of users’ context include the activities that they are engaged in. To this end, we have developed ‘SenseMe’, a system that leverages the smartphone and its multiple sensors in order to perform multidimensional context and activity recognition for users. As part of the ‘SenseMe’ project, we also conducted an exploratory study of privacy, trust, risks and other concerns of users with smartphone-based personal sensing systems and applications. To determine what information would be relevant to users’ situations, we have developed ‘TellMe’, a system that employs a new, flexible and scalable approach based on Natural Language Processing techniques to perform bootstrapped discovery and ranking of relevant information in context-aware systems. In order to personalize the relevant information, we have also developed an algorithm and system for mining a broad range of users’ preferences from their social network profiles and activities.
For recommending new information to the users based on their past behavior and context history (such as visited locations, activities and time), we have developed a recommender system and approach for performing multi-dimensional collaborative recommendations using tensor factorization. For timely delivery of personalized and relevant information, it is essential to anticipate and predict users’ behavior. To this end, we have developed a unified infrastructure, within the Rover framework, and implemented several novel approaches and algorithms that employ various contextual features and state-of-the-art machine learning techniques for building diverse behavioral models of users. Examples of generated models include classifying users’ semantic places and mobility states, predicting their availability for accepting calls on smartphones and inferring their device charging behavior. Finally, to enable proactivity in context-aware systems, we have also developed a planning framework based on HTN planning. Together, these works provide a major push in the direction of proactive context-aware computing.
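As a toy illustration of multi-dimensional collaborative recommendation via tensor factorization (a generic CP decomposition fitted by gradient descent, not the Rover/TellMe implementation), consider:

```python
# Hedged sketch of the general idea of multi-dimensional collaborative
# recommendation via tensor factorization: a toy CP decomposition of a
# user x place x time preference tensor, fitted by gradient descent.
import numpy as np

rng = np.random.default_rng(0)
n_users, n_places, n_times, k = 5, 6, 4, 3
R = rng.random((n_users, n_places, n_times))        # toy preference tensor

U = rng.normal(0, 0.1, (n_users, k))
P = rng.normal(0, 0.1, (n_places, k))
T = rng.normal(0, 0.1, (n_times, k))

lr = 0.05
for _ in range(500):
    pred = np.einsum("uk,pk,tk->upt", U, P, T)       # CP reconstruction
    err = pred - R
    U -= lr * np.einsum("upt,pk,tk->uk", err, P, T)  # gradient steps
    P -= lr * np.einsum("upt,uk,tk->pk", err, U, T)
    T -= lr * np.einsum("upt,uk,pk->tk", err, U, P)

scores = np.einsum("k,pk,k->p", U[0], P, T[2])       # user 0, time slot 2
print("recommended place for user 0 at time 2:", int(np.argmax(scores)))
```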

Relevance: 30.00%

Abstract:

Over the last two decades, experimental progress in controlling cold atoms and ions has allowed us to manipulate fragile quantum systems with an unprecedented degree of precision. This has been made possible by the ability to isolate small ensembles of atoms and ions from noisy environments, creating truly closed quantum systems which decouple from dissipative channels. However, in recent years, several proposals have considered the possibility of harnessing dissipation in open systems, not only to cool degenerate gases to currently unattainable temperatures, but also to engineer a variety of interesting many-body states. This thesis describes progress made towards building a degenerate gas apparatus that will soon be capable of realizing these proposals. An ultracold gas of ytterbium atoms, trapped in a species-selective lattice, will be immersed in a Bose-Einstein condensate (BEC) of rubidium atoms which will act as a bath. Here we describe the challenges encountered in making a degenerate mixture of rubidium and ytterbium atoms and present two experiments performed on the path to creating a controllable open quantum system. The first experiment describes the measurement of a tune-out wavelength at which the light shift of $\Rb{87}$ vanishes. This wavelength was used to create a species-selective trap for ytterbium atoms. Furthermore, the measurement of this wavelength allowed us to extract the dipole matrix element of the $5s \rightarrow 6p$ transition in $\Rb{87}$ with an extraordinary degree of precision. Our method for extracting matrix elements has found use in atomic clocks, where precise knowledge of transition strengths is necessary to account for minute blackbody radiation shifts. The second experiment presents the first realization of a degenerate Bose-Fermi mixture of rubidium and ytterbium atoms. Using a three-color optical dipole trap (ODT), we were able to create a highly tunable, species-selective potential for rubidium and ytterbium atoms, which allowed us to use $\Rb{87}$ to sympathetically cool $\Yb{171}$ to degeneracy with minimal loss. This mixture is the first milestone towards creating the lattice-bath system and will soon be used to implement novel cooling schemes and explore the rich physics of dissipation.
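For intuition about how a tune-out wavelength arises, the sketch below evaluates a simplified two-line model of the rubidium ground-state light shift, in which the D1 and D2 contributions cancel at a wavelength between them; real analyses, including this thesis, must also account for weaker terms such as the $5s \rightarrow 6p$ contribution, and the relative line strengths used here are approximate:

```python
# Simplified sketch of how a tune-out wavelength arises: in a two-line model
# of the ground-state light shift, the D1 and D2 contributions cancel at a
# wavelength between them. Numbers are approximate, for illustration only.
import numpy as np
from scipy.optimize import brentq

c = 2.998e8                              # speed of light [m/s]
lam_d1, lam_d2 = 794.98e-9, 780.24e-9    # Rb D1 and D2 wavelengths [m]
w_d1, w_d2 = 2 * np.pi * c / lam_d1, 2 * np.pi * c / lam_d2
s_d1, s_d2 = 1.0, 2.0                    # approximate relative line strengths

def alpha(lam):
    """Relative scalar polarizability in the two-line approximation."""
    w = 2 * np.pi * c / lam
    return (s_d1 * w_d1 / (w_d1**2 - w**2) +
            s_d2 * w_d2 / (w_d2**2 - w**2))

lam0 = brentq(alpha, 781e-9, 794e-9)     # zero crossing between the two lines
print(f"two-line tune-out wavelength ~ {lam0 * 1e9:.2f} nm")
```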

Relevance: 30.00%

Abstract:

The thesis uses a three-dimensional, first-principles model of the ionosphere in combination with a High Frequency (HF) raytracing model to address key topics related to the physics of HF propagation and artificial ionospheric heating. In particular, it: 1. Explores the effect of the ubiquitous electron density gradients caused by Medium Scale Traveling Ionospheric Disturbances (MSTIDs) on high-angle-of-incidence HF radio wave propagation. Previous studies neglected the all-important presence of horizontal gradients in both the cross- and down-range directions, which refract the HF waves, significantly changing their path through the ionosphere. The physics-based ionosphere model SAMI3/ESF is used to generate a self-consistently evolving MSTID that allows for the examination of the spatio-temporal progression of the HF radio waves in the ionosphere. 2. Tests the potential of, and determines engineering requirements for, ground-based high-power HF heaters to trigger and control the evolution of Equatorial Spread F (ESF). Interference from ESF on radio wave propagation through the ionosphere remains a critical issue for HF system reliability. Artificial HF heating has been shown to create plasma density cavities in the ionosphere similar to those that may trigger ESF bubbles. The work explores whether HF heating may trigger or control ESF bubbles. 3. Uses the combined ionosphere and HF raytracing models to create the first self-consistent HF heating model. This model is utilized to simulate results from an Arecibo experiment and to provide understanding of the physical mechanism behind observed phenomena. The insights gained provide engineering guidance for new artificial heaters that are being built for use in low- to middle-latitude regions. In accomplishing the above topics: (i) I generated a model MSTID using the SAMI3/ESF code and used a raytrace model to examine the effects of the MSTID gradients on radio wave propagation observables; (ii) I implemented a three-dimensional HF heating model in SAMI3/ESF and used the model to determine whether HF heating could artificially generate an ESF bubble; (iii) I created the first self-consistent model for artificial HF heating using the SAMI3/ESF ionosphere model and the MoJo raytrace model, and ran a series of simulations that successfully modeled the results of early artificial heating experiments at Arecibo.
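The basic ionospheric propagation physics underlying such raytracing can be illustrated with a simple no-field, no-collision estimate of where an obliquely launched HF ray reflects in a horizontally stratified layer; the parabolic layer parameters below are illustrative, not SAMI3/ESF output:

```python
# Hedged illustration of the basic physics a full raytracer builds on:
# in a horizontally stratified, field-free, collisionless ionosphere, a ray
# launched at a given elevation reflects where the plasma refractive index
# n drops to cos(elevation) (Snell's law / secant law).
import numpy as np

f = 8e6                                   # HF wave frequency [Hz]
elev = np.radians(30.0)                   # launch elevation angle
h = np.linspace(150e3, 350e3, 2001)       # altitude grid [m]

# Parabolic electron density layer (illustrative F-layer parameters)
hm, ym, Nmax = 300e3, 100e3, 6e11         # peak height, semi-thickness, m^-3
Ne = np.clip(Nmax * (1 - ((h - hm) / ym) ** 2), 0, None)

fp = 8.98 * np.sqrt(Ne)                   # plasma frequency [Hz], Ne in m^-3
n = np.sqrt(np.clip(1 - (fp / f) ** 2, 0, None))

reflect = h[n <= np.cos(elev)]
if reflect.size:
    print(f"ray reflects near {reflect[0] / 1e3:.0f} km altitude")
else:
    print("ray penetrates the layer at this frequency/angle")
```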