869 results for Building Design Process


Relevance:

40.00%

Publisher:

Abstract:

This thesis presents approximation algorithms for some NP-Hard combinatorial optimization problems on graphs and networks; in particular, we study problems related to Network Design. Under the widely believed complexity-theoretic assumption that P is not equal to NP, there are no efficient (i.e., polynomial-time) algorithms that solve these problems exactly. Hence, if one desires efficient algorithms for such problems, it is necessary to consider approximate solutions: an approximation algorithm for an NP-Hard problem is a polynomial-time algorithm which, for any instance of the problem, finds a solution whose value is guaranteed to be within a multiplicative factor of the value of an optimal solution to that instance. We attempt to design algorithms for which this factor, referred to as the approximation ratio of the algorithm, is as small as possible.

The field of Network Design comprises a large class of problems that deal with constructing networks of low cost and/or high capacity, routing data through existing networks, and many related issues. In this thesis, we focus chiefly on designing fault-tolerant networks. Two vertices u,v in a network are said to be k-edge-connected if deleting any set of k − 1 edges leaves u and v connected; similarly, they are k-vertex-connected if deleting any set of k − 1 other vertices or edges leaves u and v connected. We focus on building networks that are highly connected, meaning that even if a small number of edges and nodes fail, the remaining nodes will still be able to communicate. A brief description of some of our results is given below.

We study the problem of building 2-vertex-connected networks that are large and have low cost. Given an n-node graph with costs on its edges and any integer k, we give an O(log n log k) approximation for the problem of finding a minimum-cost 2-vertex-connected subgraph containing at least k nodes. We also give an algorithm of similar approximation ratio for maximizing the number of nodes in a 2-vertex-connected subgraph subject to a budget constraint on the total cost of its edges. Our algorithms are based on a pruning process that, given a 2-vertex-connected graph, finds a 2-vertex-connected subgraph of any desired size and of density comparable to the input graph, where the density of a graph is the ratio of its cost to the number of vertices it contains. This pruning algorithm is simple and efficient, and is likely to find additional applications.

Recent breakthroughs on vertex-connectivity have made use of algorithms for element-connectivity problems. We develop an algorithm that, given a graph with some vertices marked as terminals, significantly simplifies the graph while preserving the pairwise element-connectivity of all terminals; in fact, the resulting graph is bipartite. We believe that our simplification/reduction algorithm will be a useful tool in many settings. We illustrate its applicability by giving algorithms to find many trees that each span a given terminal set, while being disjoint on edges and non-terminal vertices; such problems have applications in VLSI design and other areas. We also use this reduction algorithm to analyze simple algorithms for single-sink network design problems with high vertex-connectivity requirements; we give an O(k log n)-approximation for the problem of k-connecting a given set of terminals to a common sink.

We study similar problems in which different types of links, of varying capacities and costs, can be used to connect nodes; assuming there are economies of scale, we give algorithms to construct low-cost networks with sufficient capacity or bandwidth to simultaneously support flow from each terminal to the common sink along many vertex-disjoint paths. We further investigate capacitated network design, where edges may have arbitrary costs and capacities. Given a connectivity requirement R_uv for each pair of vertices u,v, the goal is to find a low-cost network which, for each uv, can support a flow of R_uv units of traffic between u and v. We study several special cases of this problem, giving both algorithmic and hardness results.

In addition to Network Design, we consider certain Traveling Salesperson-like problems, where the goal is to find short walks that visit many distinct vertices. We give a (2 + epsilon)-approximation for Orienteering in undirected graphs, achieving the best known approximation ratio, and the first approximation algorithm for Orienteering in directed graphs. We also give improved algorithms for Orienteering with time windows, in which vertices must be visited between specified release times and deadlines, and for other related problems. These problems are motivated by applications in the fields of vehicle routing, delivery and transportation of goods, and robot path planning.
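The connectivity and density definitions above can be made concrete with a short, purely illustrative sketch (the function names and the brute-force approach are ours, not the thesis's): density is total edge cost over vertex count, and a pair is k-edge-connected exactly when no deletion of k − 1 edges separates it.

```python
from collections import defaultdict, deque
from itertools import combinations

def density(edges, cost):
    """Density, as defined above: total edge cost / number of vertices."""
    vertices = {v for edge in edges for v in edge}
    return sum(cost[e] for e in edges) / len(vertices)

def connected(edges, u, v):
    """BFS reachability from u to v over an undirected edge list."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, queue = {u}, deque([u])
    while queue:
        x = queue.popleft()
        if x == v:
            return True
        for y in adj[x] - seen:
            seen.add(y)
            queue.append(y)
    return False

def k_edge_connected(edges, u, v, k=2):
    """u,v are k-edge-connected iff removing any k - 1 edges leaves them
    connected (brute force; fine for small illustrative graphs)."""
    return all(connected([e for e in edges if e not in removed], u, v)
               for removed in combinations(edges, k - 1))
```

On the triangle a-b-c, every pair survives any single edge deletion and is therefore 2-edge-connected; on the path a-b-c, no pair is.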

Relevance:

40.00%

Publisher:

Abstract:

Automation technologies are widely acclaimed to have the potential to significantly reduce energy consumption and energy-related costs in buildings. However, despite the abundance of commercially available technologies, automation in domestic environments keeps meeting with commercial failure. The main reason for this is the development process used to build automation applications, which tends to focus more on technical aspects than on the needs and limitations of the users. An instance of this problem is the complex and poorly designed home automation front-ends that deter customers from investing in a home automation product. On the other hand, developing a usable and interactive interface is a complicated task for developers, due to the multidisciplinary challenges that need to be identified and solved. In this context, the current research work investigates the design problems associated with developing a home automation interface, as well as the existing design solutions applied to these problems. A Qualitative Data Analysis approach was used to collect data from research papers, and an open coding process was used to cluster the findings. From the analysis of the collected data, requirements for designing the interface were derived. A home energy management functionality for a Web-based home automation front-end was developed as a proof of concept, and a user evaluation was used to assess the usability of the interface. The results of the evaluation showed that this holistic approach to designing interfaces improved usability, which increases the chances of commercial success.

Relevance:

40.00%

Publisher:

Abstract:

Part 12: Collaboration Platforms

Relevance:

40.00%

Publisher:

Abstract:

Research in human-computer interaction (HCI) covers both technological and human behavioural concerns. As a consequence, the contributions made in HCI research tend to be oriented toward either engineering or the social sciences. In HCI, the purpose of practical research contributions is to reveal unknown insights about human behaviour and its relationship to technology. Practical research methods normally used in HCI include formal experiments, field experiments, field studies, interviews, focus groups, surveys, usability tests, case studies, diary studies, ethnography, contextual inquiry, experience sampling, and automated data collection. In this paper, we report on our experience using focus groups, surveys and interviews as evaluation methods, and on how we adapted these methods to develop artefacts: either interface designs or information and technological systems. Four projects exemplify the application of the different methods to gather information about users' wants, habits, practices, concerns and preferences. The goal was to build an understanding of the attitudes and satisfaction of the people who might interact with a technological artefact or information system. At the same time, we intended to design information systems and technological applications that promote resilience in organisations (a set of routines that allow recovery from obstacles) and positive user experiences. Organisations can also be viewed here through a systems approach, which means that system perturbations, and even failures, can be characterised and improved upon. The term resilience has been applied to everything from real estate to the economy, sports, events, business, psychology, and more. In this study, we highlight that resilience is also made up of a number of skills and abilities (self-awareness, creating meaning from other experiences, self-efficacy, optimism, and building strong relationships) that are foundational ingredients people should draw on in the process of enhancing an organisation's resilience. Resilience enhances knowledge of the resources available to people confronting existing problems.

Relevance:

40.00%

Publisher:

Abstract:

The current knowledge reviewed in this article describes a wide range of facilities in common use on dairy cattle farms in warm climates. A dairy cattle farm consists of several facilities, such as the housing system, yards, manure pits, the milking center, environmental protection structures, forage storage, and several machines serving the different facilities. Any facility design tends to be a compromise, often between many factors, and no single solution will be optimal for all concerned.

Relevance:

40.00%

Publisher:

Abstract:

This article aims to explore the relationship between clients' narrative transformation and the promotion of vocational decidedness and career maturity in a mid-adolescent case of Life Design Counseling (LDC). To assess LDC outcomes, the Vocational Certainty Scale and the Career Maturity Inventory – Form C were administered before and after the intervention. To analyze the process of LDC change in depth, two measures of narrative change were used: the Innovative Moments Coding System (IMCS), as a measure of innovation emergence, and the Return to the Problem Coding System (RPCS), as a measure of ambivalence towards change. The results show that the three LDC sessions produced a significant change in vocational certainty but not in career maturity. The findings confirm that the process of change, according to the IMCS, is similar to that observed in previous studies with adults. Implications for future research and practice are discussed.

Relevance:

40.00%

Publisher:

Abstract:

The outcome of the inductive decision-making process of the leading project management group (PMG) was the proposal to develop three modules, Human Resource Management and Knowledge Management, Quality Management, and Intercultural Management, each for 10 ECTS credits. As a result of the theoretical and organisational framework and the analytical phase of the project, four strategies informed the development and implementation of the modules:

1. Collaboration as a principle stemming from EU collaborative policy and finding its expression on all implementation levels (designing the modules, modes of learning, delivering the modules, evaluation process).

2. Building on the Bologna process masters-level framework to assure an appropriate academic level of outputs.

3. Development of value-based leadership of students through transformational learning in a cross-cultural setting and continual reflection on theory in practice.

4. Continual evaluation and feedback among teachers and students as a strategy to achieve a high-quality programme.

In the first phase of designing the modules, the collaborative strategy in particular was applied: each module was led by one university, but members from all other universities participated in the discussions and development of the modules. The Bologna process masters-level framework and related standards and guidelines informed the form and method of designing the modules.

Relevance:

40.00%

Publisher:

Abstract:

Bio-pedagogy is built on praxis, i.e. the interrelationship between reflection and innovative action, where the two merge in the construction of meaning to generate knowledge. The following questions then arise: How is teaching understood? How can practice be renewed through recurring cycles of action-reflection-action, in life itself? One way to search for those answers is the systematization of experiences, a modality of qualitative research. It promotes the transformation of common practice, based on knowledge building through holistic approaches to the complexity of the educational process. The systematization of bio-pedagogical experiences involves self-organization, joy, uncertainty and passion; it respects freedom and autonomy, and generates relational spaces that promote creative processes in learning.

Relevance:

40.00%

Publisher:

Abstract:

Alongside the developments in behavioural economics, the concept of nudge was introduced as an intervention able to guide individual behaviour towards better choices without using coercion or incentives. While behavioural teams were created inside governmental units and regulatory authorities, nudging emerged in regulatory discourse, being increasingly regarded as a regulatory instrument that could overcome the disadvantages of other tools. This thesis analyses the viability of incorporating nudges into regulation. In particular, it investigates the implications for regulators of bringing iterative experimental testing – a widespread nudge design methodology outside regulation – into their own design practices. Nudges outside regulation are routinely designed using experiments of all kinds. This thesis intends to answer whether design premises rooted in iterative experimentation are still valid in the regulatory space, an arena that nudging entered into and that is distinct from the one where it originally emerged. The design and provision of nudges using the premises of iterative experimental testing is possible, but at a cost and burden for regulatory nudge designers. Therefore, the thesis evaluates how this burden can be reduced, in particular how nudges can be feasibly designed and provided through regulation or, put differently, how to more efficiently design and provide nudging as a regulatory tool.
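Outside regulation, the iterative experimental testing the thesis refers to is often a sequence of randomized comparisons between nudge variants. As a minimal, purely illustrative sketch (the function name, outcome measure, and numbers are ours, not the thesis's), a two-proportion z-test can decide whether a candidate nudge outperforms the current one on a binary outcome such as take-up rate:

```python
from math import sqrt, erf

def z_test_two_proportions(success_a, n_a, success_b, n_b):
    """Two-sided two-proportion z-test, e.g. on the take-up rates
    observed under two nudge variants in a randomized trial."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value
```

A designer following this methodology would rerun such comparisons over successive design iterations, which is precisely the kind of cost and burden for regulatory nudge designers that the thesis evaluates.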

Relevance:

40.00%

Publisher:

Abstract:

This work deals with the development of calibration procedures and control systems to improve the performance and efficiency of modern spark-ignition turbocharged engines. The algorithms developed are used to optimize and manage the spark advance and the air-to-fuel ratio in order to control knock and the exhaust gas temperature at the turbine inlet. The work described falls within the activity that the research group started in previous years with the industrial partner Ferrari S.p.A. The first chapter deals with the development of a control-oriented engine simulator based on a neural network approach, with which the main combustion indexes can be simulated. The second chapter deals with the development of a procedure to calibrate, offline, the spark advance and the air-to-fuel ratio to run the engine under knock-limited conditions and with the maximum admissible exhaust gas temperature at the turbine inlet. This procedure is then converted into a model-based control system and validated with a software-in-the-loop approach using the engine simulator developed in the first chapter. Finally, it is implemented in rapid-control-prototyping hardware to manage combustion in steady-state and transient operating conditions at the test bench. The third chapter deals with the study of an innovative and cheap sensor for in-cylinder pressure measurement: a piezoelectric washer that can be installed between the spark plug and the engine head. The signal generated by this kind of sensor is studied, and a specific algorithm is developed to adjust the value of the knock index in real time. Finally, with the engine simulator developed in the first chapter, it is demonstrated that the innovative sensor can be coupled with the control system described in the second chapter and that the performance obtained can match that reachable with standard in-cylinder pressure sensors.
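The abstract does not give the control law itself, so the following is only a generic illustration of the kind of knock-limited operation it describes: a standard event-based strategy that retards the spark advance quickly when the sensor-derived knock index exceeds a threshold and re-advances it slowly otherwise. All parameter names and gains are hypothetical, not taken from the thesis:

```python
def knock_controller(knock_index, spark_advance, threshold=1.0,
                     fast_retard=2.0, slow_advance=0.1,
                     min_adv=-10.0, max_adv=40.0):
    """One step of a simple event-based knock control strategy:
    retard quickly on a knock event, re-advance slowly otherwise,
    keeping the spark advance within calibration limits (deg)."""
    if knock_index > threshold:
        spark_advance -= fast_retard   # knock detected: back off fast
    else:
        spark_advance += slow_advance  # no knock: creep back toward optimum
    return min(max(spark_advance, min_adv), max_adv)
```

Run once per combustion event, this keeps the engine hovering around the knock limit, which is the operating condition the offline calibration procedure targets.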

Relevance:

40.00%

Publisher:

Abstract:

This thesis presents a method for assessing the degree of reversibility and circularity of building components. The central concept of the study is Design for Disassembly (DfD), an approach to designing buildings according to construction criteria intended to facilitate later modifications and (complete or partial) dismantling, enabling the reuse of components and thereby reducing the environmental impact of interventions. Several current research efforts address this topic, linking it to related concepts such as the Building Information Modeling (BIM) methodology, which digitalizes the design process, its execution, and its management through models. After a state-of-the-art analysis, the work defines a set of parameters suitable for inclusion in an information model and capable of representing the circularity of a component in DfD terms. For each element of the component under analysis, a numerical value (ranging from 0.1 to 1) is assigned to each parameter. Using a suitably modified formula developed in earlier research by the Department of Architecture of the University of Bologna, a final synthetic index is obtained, named the "Express Building Circularity Indicators" (EBCI). The proposed analysis method, intended as a tool to support the design process, was validated by applying it to several façade solutions for the energy retrofit of a building selected as the Italian case study of the European Horizon 2020 project "DRIVE 0 – Driving decarbonization of the EU building stock by enhancing a consumer centred and locally based circular renovation process". The results obtained confirmed that the digitalized process can be replicated across different construction solutions and that the circularity assessment method is reliable.
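The abstract does not disclose the modified formula behind the EBCI index, so the sketch below is a loud assumption: a simple weight-averaged mean of per-element parameter scores stands in for the published formula, keeping only the stated constraint that each parameter is scored from 0.1 to 1.

```python
def ebci(elements):
    """Hypothetical synthetic circularity index (NOT the published EBCI
    formula, which the abstract does not give): weight-averaged mean of
    each element's parameter scores, each score in the stated [0.1, 1]."""
    total_weight = sum(e["weight"] for e in elements)
    index = 0.0
    for e in elements:
        scores = e["scores"]
        if any(s < 0.1 or s > 1 for s in scores):
            raise ValueError("each parameter score must lie in [0.1, 1]")
        index += e["weight"] * sum(scores) / len(scores)
    return index / total_weight
```

A façade solution would then be described as a list of elements, each with a weight and its DfD parameter scores, and compared to alternatives by this single index.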

Relevance:

30.00%

Publisher:

Abstract:

A split-plot design (SPD) and near-infrared chemical imaging were used to study the homogeneity of the drug paracetamol loaded in films prepared from mixtures of the biocompatible polymers hydroxypropyl methylcellulose, polyvinylpyrrolidone, and polyethylene glycol. The study was split into two parts. First, a partial least-squares (PLS) model was developed for a pixel-to-pixel quantification of the drug loaded into the films. Afterwards, an SPD was developed to study the influence of the polymeric composition of the films and of two process conditions related to their preparation (the percentage of the drug in the formulations and the curing temperature) on the homogeneity of the drug dispersed in the polymeric matrix. Chemical images of each formulation of the SPD were obtained by pixel-to-pixel predictions of the drug using the PLS model from the first part, and macropixel analyses were performed on each image to obtain the y-responses (homogeneity parameter). The design was modeled using PLS regression, allowing only the most relevant factors to remain in the final model. The interpretation of the SPD was enhanced by using the orthogonal PLS algorithm, whereby the y-orthogonal variation in the design was separated from the y-correlated variation.
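The macropixel analysis step can be illustrated with a small self-contained sketch (the function names and the choice of relative standard deviation as the homogeneity parameter are our assumptions, not necessarily the paper's): the predicted drug-content image is tiled into macropixels, and the spread of the macropixel means serves as the y-response.

```python
def macropixel_means(image, size):
    """Tile a 2-D image (rows of predicted drug content) into
    non-overlapping size x size macropixels; return each block's mean."""
    means = []
    for r in range(0, len(image) - size + 1, size):
        for c in range(0, len(image[0]) - size + 1, size):
            block = [image[i][j]
                     for i in range(r, r + size)
                     for j in range(c, c + size)]
            means.append(sum(block) / len(block))
    return means

def homogeneity_rsd(image, size):
    """Relative standard deviation (%) of the macropixel means:
    lower values indicate a more homogeneously dispersed drug."""
    m = macropixel_means(image, size)
    mean = sum(m) / len(m)
    var = sum((x - mean) ** 2 for x in m) / len(m)
    return 100 * var ** 0.5 / mean
```

A perfectly uniform image gives 0%; a film whose macropixels alternate between drug-rich and drug-poor regions scores much higher.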

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a rational approach to the design of a catamaran's hydrofoil, applied within a modern context of multidisciplinary optimization. The approach includes the use of response surfaces represented by neural networks and a distributed programming environment that increases the optimization speed. A rational approach to the problem simplifies the complex optimization model; combined with the distributed dynamic training used for the response surfaces, this increases the efficiency of the process. The results achieved using this approach justify this publication.
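The paper's response surfaces are neural networks; as a simpler stand-in that illustrates the surrogate idea (all names are ours, and a single design variable is assumed), the sketch below fits a quadratic response surface to sampled objective values by least squares, after which the optimizer can query the cheap surrogate instead of the expensive simulation:

```python
def fit_quadratic_surface(xs, ys):
    """Least-squares fit of y ~ a + b*x + c*x**2 via the normal equations."""
    basis = [lambda x: 1.0, lambda x: x, lambda x: x * x]
    A = [[sum(bi(x) * bj(x) for x in xs) for bj in basis] for bi in basis]
    rhs = [sum(bi(x) * y for x, y in zip(xs, ys)) for bi in basis]
    n = len(basis)
    # Gaussian elimination with partial pivoting on the 3x3 normal system.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            rhs[r] -= f * rhs[col]
    coeffs = [0.0] * n
    for r in range(n - 1, -1, -1):
        coeffs[r] = (rhs[r] - sum(A[r][c] * coeffs[c]
                                  for c in range(r + 1, n))) / A[r][r]
    return coeffs  # [a, b, c]

def surrogate(coeffs, x):
    """Evaluate the fitted surface: a cheap stand-in for the full model."""
    a, b, c = coeffs
    return a + b * x + c * x * x
```

In the paper's setting the surrogate is a neural network retrained dynamically across distributed workers, but the workflow is the same: sample the expensive model, fit the response surface, and optimize over the surrogate.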