911 results for architectural computation
Abstract:
The concept of space entered architectural history as late as 1893. Studies in art opened up the discussion, and the concept has been studied in various ways in architecture ever since. This article aims to instigate an additional reading of architectural history, one that is not supported by "isms" but based on 20th-century theories of space. The objectives of the article are to bring the concept of space and its changing paradigms to the attention of architectural researchers, to introduce a conceptual framework for classifying and clarifying theories of space, and to enrich discussions of 20th-century architecture through theories that go beyond styles. The introduction of space in architecture will revolve around subject-object relationships, three-dimensionality, and the senses. Modern space will be discussed through concepts such as empathy, perception, abstraction, and geometry. A scientific approach will follow to study the concept of place through environment, event, behavior, and design methods. Finally, the research will look at contemporary approaches to digitally supported space via concepts such as reality-virtuality, mediated experience, and the relationship with machines.
Abstract:
Architects use cycle-by-cycle simulation to evaluate design choices and to understand tradeoffs and interactions among design parameters. Efficiently exploring exponential-size design spaces with many interacting parameters remains an open problem: the sheer number of experiments required renders detailed simulation intractable. We attack this problem via an automated approach that builds accurate, confident predictive design-space models. We simulate sampled points, using the results to teach our models the function describing the relationships among design parameters. The models produce highly accurate performance estimates for other points in the space, can be queried to predict the performance impact of architectural changes, and are very fast compared to simulation, enabling efficient discovery of tradeoffs among parameters in different regions. We validate our approach via sensitivity studies on memory hierarchy and CPU design spaces: our models generally predict IPC with only 1–2% error and reduce the required simulation by two orders of magnitude. We also show the efficacy of our technique for exploring chip multiprocessor (CMP) design spaces: when trained on a 1% sample drawn from a CMP design space with 250K points and up to 55x performance swings among different system configurations, our models predict performance with only 4–5% error on average. Our approach combines with techniques that reduce the time per simulation, achieving net time savings of three to four orders of magnitude.
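The sample-then-predict loop described here is straightforward to prototype. The sketch below uses a random-forest regressor as a stand-in for the paper's model; the simulator stub, parameter names (cache size, issue width, ROB size), and sample sizes are invented purely for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def simulate_ipc(config):
    # Stand-in for one cycle-accurate simulator run (the expensive step).
    # The functional form here is invented for illustration.
    cache_kb, issue_width, rob_size = config
    return (0.5 + 0.1 * np.log2(cache_kb) + 0.05 * issue_width
            + 0.001 * rob_size + rng.normal(0, 0.01))

# Full design space: the cross product of parameter settings.
space = np.array([(c, w, r)
                  for c in (16, 32, 64, 128, 256)
                  for w in (2, 4, 8)
                  for r in (32, 64, 128, 192, 256)])

# Simulate only a small random sample of the space...
sample_idx = rng.choice(len(space), size=15, replace=False)
X_train = space[sample_idx]
y_train = np.array([simulate_ipc(c) for c in X_train])

# ...and train a predictive model on the sampled results.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# The model now gives fast IPC estimates for every unsimulated point.
ipc_pred = model.predict(space)
best = space[np.argmax(ipc_pred)]
print("predicted-best config (cache KB, width, ROB):", best)
```

Every point beyond the training sample is never simulated, which is where the two-orders-of-magnitude reduction in simulation effort comes from.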
Abstract:
Efficiently exploring exponential-size architectural design spaces with many interacting parameters remains an open problem: the sheer number of experiments required renders detailed simulation intractable. We attack this problem via an automated approach that builds accurate predictive models. We simulate sampled points, using the results to teach our models the function describing the relationships among design parameters. The models can be queried and are very fast, enabling efficient design-tradeoff discovery. We validate our approach via two uniprocessor sensitivity studies, predicting IPC with only 1–2% error. In an experimental study using the approach, training on 1% of a 250K-point CMP design space allows our models to predict performance with only 4–5% error. Our predictive modeling combines well with techniques that reduce the time taken by each simulation experiment, achieving net time savings of three to four orders of magnitude.
Abstract:
We introduce a family of Hamiltonian systems for measurement-based quantum computation with continuous variables. The Hamiltonians (i) are quadratic, and therefore two-body, (ii) are of short range, (iii) are frustration-free, and (iv) possess a constant energy gap proportional to the squared inverse of the squeezing. Their ground states are the celebrated Gaussian graph states, which are universal resources for quantum computation in the limit of infinite squeezing. These Hamiltonians constitute the basic ingredient for the adiabatic preparation of graph states and thus open new avenues for the physical realization of continuous-variable quantum computing beyond the standard optical approaches. We characterize the correlations in these systems at thermal equilibrium. In particular, we prove that the correlations across any multipartition are contained exactly in its boundary, automatically yielding a correlation area law.
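For orientation, here is a sketch of the standard nullifier construction consistent with properties (i)–(iv). The notation and normalization follow the usual continuous-variable graph-state formalism and are an assumption, not necessarily the paper's exact Hamiltonian.

```latex
% For a graph with adjacency matrix A, the finitely squeezed Gaussian
% graph state is annihilated by one operator per mode a, so a quadratic,
% two-body, frustration-free Hamiltonian with that state as its exact
% zero-energy ground state is
\[
  H \;=\; \Delta \sum_a \hat b_a^{\dagger} \hat b_a ,
  \qquad
  \hat b_a \;=\; \hat p_a - \sum_b Z_{ab}\,\hat x_b ,
  \qquad
  Z \;=\; A + \frac{i}{s^{2}}\,\mathbb{1},
\]
% where s is the squeezing parameter. Since [b_a, b_a^\dagger] = 2/s^2,
% the gap scales as \Delta \propto 1/s^{2}, and the ideal graph-state
% nullifiers \hat p_a - \sum_b A_{ab}\hat x_b are recovered as
% s \to \infty.
```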
Abstract:
Architects typically interpret Heidegger to mean that dwelling in the Black Forest was more authentic than living in an industrialised society; however, we cannot turn back the clock, so we are confronted with the reality of modernisation. Since the Second World War, production has shifted from material to immaterial assets. Increasingly, place is believed to offer resistance to this fluidity, but this belief can conversely be viewed as expressing a sublimated anxiety about our role in the world – the need to create buildings that are self-consciously contextual suggests that we may no longer be rooted in material places, but in immaterial relations.
This issue has been pondered by David Harvey in his paper From Space to Place and Back Again, where he argues that the role of place in legitimising identity is ultimately a political process, since the interpretation of its meaning depends on whose interpretation it is. Doreen Massey has found that different classes of people are more or less mobile, and that mobility is related to class and education rather than to nationality or geography. These thinkers point to a different set of questions than the usual space/place divide – how can we begin to address the economic mediation of spatial production in order to develop an ethical production of place? Part of the answer is provided by the French architectural practice Lacaton & Vassal in their book Plus. They ask themselves how to produce more space for the same cost, so that people can enjoy a better quality of life. Another French practitioner, Patrick Bouchain, has argued that architects' fees should be inversely proportional to the amount of material resources that they consume. These approaches use economics as a starting point for generating architectural form and point to more ethical possibilities for architectural practice.
Abstract:
Particle-in-cell (PIC) simulations of relativistic shocks are in principle capable of predicting the spectra of photons that are radiated incoherently by the accelerated particles. The most direct method evaluates the spectrum using the fields given by the Liénard-Wiechert potentials. However, for relativistic particles this procedure is computationally expensive. Here we present an alternative method that uses the concept of the photon formation length. The algorithm is suitable for evaluating spectra either from particles moving in a specific realization of a turbulent electromagnetic field or from trajectories given as a finite, discrete time series by a PIC simulation. The main advantage of the method is that it identifies the intrinsic spectral features and filters out those that are artifacts of the limited time resolution and finite duration of the input trajectories.
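The direct Liénard-Wiechert baseline that the formation-length method improves upon reduces, for a single trajectory and observation direction, to a Fourier integral over retarded time. Below is a minimal sketch of that baseline; physical prefactors are dropped, and the array names and shapes are assumptions rather than the paper's code.

```python
import numpy as np

# Direct spectrum from a discrete trajectory, following the standard
# far-field formula (prefactors omitted):
#   dI/(dw dOmega) ~ | INT  n x ((n - beta) x beta_dot) / (1 - n.beta)^2
#                          * exp(i w (t - n.r/c)) dt |^2
# Hypothetical inputs: t (N,), r (N,3), beta (N,3) from a PIC trajectory.
def direct_spectrum(t, r, beta, n_hat, omegas, c=1.0):
    dt = np.gradient(t)
    beta_dot = np.gradient(beta, t, axis=0)          # acceleration term
    one_minus_nb = 1.0 - beta @ n_hat
    # Vector amplitude n x ((n - beta) x beta_dot) / (1 - n.beta)^2:
    amp = np.cross(n_hat, np.cross(n_hat - beta, beta_dot)) \
          / one_minus_nb[:, None] ** 2
    phase_t = t - (r @ n_hat) / c                    # retarded-time phase
    spectra = []
    for w in omegas:
        integrand = amp * np.exp(1j * w * phase_t)[:, None] * dt[:, None]
        spectra.append(np.abs(integrand.sum(axis=0)) ** 2)
    return np.array(spectra).sum(axis=1)             # sum over components
```

The cost is one such integral per particle, frequency, and direction, which is exactly why this route becomes expensive for relativistic particles with long trajectories.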
Abstract:
White household paints are commonly encountered as evidence in the forensic laboratory, but they often cannot be readily distinguished by color alone, so Fourier transform infrared (FT-IR) microscopy is used, since it can sometimes discriminate between paints prepared with different organic resins. Here we report the first comparative study of FT-IR and Raman spectroscopy for the forensic analysis of white paint. Both techniques allowed the 51 white paint samples in the study to be classified by inspection as either belonging to distinct groups or as unique samples. FT-IR gave five groups and four unique samples; Raman gave seven groups and six unique samples. The basis for this discrimination was the type of resin and/or the inorganic pigments/extenders present. Although this allowed approximately half of the white paints to be distinguished by inspection, the other half were all based on a similar resin and did not contain the distinctive modifiers/pigments and extenders that allowed the other samples to be identified. The experimental uncertainty in the relative band intensities measured using FT-IR was similar to the variation within this large group, so no further discrimination was possible. However, the variation in the Raman spectra was larger than the uncertainty, which allowed the large group to be divided into three subgroups and four distinct spectra based on relative band intensities. The combination of increased discrimination and higher sample throughput means that the Raman method is superior to FT-IR for samples of this type.
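The subgrouping criterion here (split samples only when the spread in relative band intensities exceeds the measurement uncertainty) can be made concrete. The following toy sketch uses invented intensity ratios and an assumed uncertainty, not the study's data.

```python
import numpy as np

# Toy illustration of the discrimination criterion described above:
# two samples are separated only if their relative band-intensity
# ratios differ by more than the experimental uncertainty allows.
ratios = {                 # relative intensity of a diagnostic band
    "sample_A": 0.42,      # all values invented for illustration
    "sample_B": 0.44,
    "sample_C": 0.61,
}
sigma = 0.03               # assumed measurement uncertainty (1 s.d.)

def distinguishable(r1, r2, sigma, k=3.0):
    # Require the difference to exceed k combined standard deviations.
    return abs(r1 - r2) > k * np.hypot(sigma, sigma)

print(distinguishable(ratios["sample_A"], ratios["sample_B"], sigma))  # False
print(distinguishable(ratios["sample_A"], ratios["sample_C"], sigma))  # True
```

With FT-IR the within-group spread sat below this threshold, so no split was possible; the larger spread in the Raman ratios is what pushed some pairs over it.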
Abstract:
This paper proposes a method to assess the small-signal stability of a power system network by selective determination of the modal eigenvalues. The method uses an accelerating polynomial transform, designed using approximate eigenvalues obtained from a wavelet approximation. Application to the IEEE 14-bus network model produced computational savings of 20% over the QR algorithm.
Abstract:
This paper introduces an algorithm that calculates the dominant eigenvalues (in terms of system stability) of a linear model and neglects the exact computation of the non-dominant eigenvalues. The method estimates all of the eigenvalues using wavelet based compression techniques. These estimates are used to find a suitable invariant subspace such that projection by this subspace will provide one containing the eigenvalues of interest. The proposed algorithm is exemplified by application to a power system model.
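As a much-simplified stand-in for the selective computation described in these two abstracts, the sketch below lets a coarse estimate choose a shift and then refines only the eigenvalues nearest that shift via shift-invert Arnoldi. The wavelet estimation and polynomial transform are replaced here by an assumed target `sigma`, and the test matrix is invented.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import eigs

# Selective eigenvalue computation, simplified: never form the full
# spectrum; compute only the k eigenvalues nearest a target point
# suggested by coarse estimates. For small-signal stability, the modes
# of interest are the least-damped ones, near the imaginary axis.
n = 500
A = sparse.random(n, n, density=0.01, random_state=1, format="csc") \
    - 2.0 * sparse.identity(n, format="csc")   # push spectrum leftward

sigma = 0.0                     # target chosen from the coarse estimates
vals, _ = eigs(A, k=6, sigma=sigma)            # shift-invert Arnoldi
dominant = vals[np.argsort(-vals.real)]        # least-damped modes first
print(dominant)
```

Shift-invert factorizes (A - sigma*I) once and iterates on its inverse, so the cost scales with the handful of wanted modes rather than with the full QR-style decomposition, which is the source of the savings both abstracts report.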
Abstract:
Inter-component communication has always been of great importance in the design of software architectures, and connectors have been considered first-class entities in many approaches [1][2][3]. We present a novel architectural style that is derived from the well-established domain of computer networks. The style adopts inter-component communication protocols from that domain in a novel way that allows large-scale software reuse. It mainly targets real-time, distributed, concurrent, and heterogeneous systems.
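The abstract gives no implementation detail, but the idea of a connector as a first-class, protocol-layered entity can be sketched. Everything below (class names, the layering, the example messages) is an assumption for illustration, not the authors' design.

```python
from typing import Callable, Dict, List

# Hypothetical sketch of a first-class connector in the spirit of the
# style above: components never reference each other directly; the
# connector carries every message through a stack of protocol layers
# (e.g., marshalling, checksumming, routing), so the same connector
# can be reused across systems.
class Connector:
    def __init__(self, layers: List[Callable[[dict], dict]]):
        self.layers = layers                 # protocol stack, top to bottom
        self.endpoints: Dict[str, Callable[[dict], None]] = {}

    def attach(self, name: str, handler: Callable[[dict], None]) -> None:
        self.endpoints[name] = handler       # register a component port

    def send(self, dest: str, message: dict) -> None:
        for layer in self.layers:            # push message down the stack
            message = layer(message)
        self.endpoints[dest](message)        # deliver to the target port

# Example: one reusable connector wiring a sensor to a logger.
bus = Connector(layers=[lambda m: {**m, "checksum": sum(map(ord, m["body"]))}])
bus.attach("logger", lambda m: print("received:", m))
bus.send("logger", {"body": "temp=21C"})
```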
Abstract:
The main aim of this study is to investigate the consequences of cross-cultural adjustment in an under-researched sample of British expatriates working on international Architectural, Engineering and Construction (AEC) assignments. Adjustment is the primary outcome of an expatriate assignment; according to Bhaskar-Srinivas et al. (2005) and Harrison et al. (2004), it is viewed as affecting other work-related outcomes that could eventually predict expatriate success. To address the scarcity of literature on expatriate management in the AEC sector, an exploratory design was adopted. Phase one comprised an extensive review of the extant literature, whereas phase two was a qualitative exploration from the British expatriates' perspective, in which seven unstructured interviews were carried out. Further, cognitive mapping analysis using Banxia Decision Explorer software was conducted to develop a theoretical framework and propose various hypotheses. The findings imply that British AEC firms could sustain their already-established competitive advantage in the global marketplace by acknowledging the complexity of international assignments, prioritising expatriate management, and offering well-rounded support to facilitate expatriate adjustment and ultimately achieve critical outcomes such as performance, assignment completion, and job satisfaction.