941 results for Parallel building blocks


Relevance:

30.00%

Publisher:

Abstract:

Chordomas are very rare malignant bone tumours that have long lacked effective treatments. New treatments are now available for both the local and the metastatic phase of the disease, but the degree of uncertainty in selecting the most appropriate treatment remains high and their adoption remains inconsistent across the world, resulting in suboptimal outcomes for many patients. In December, 2013, the European Society for Medical Oncology (ESMO) convened a consensus meeting to update its clinical practice guidelines on sarcomas. ESMO also hosted a parallel consensus meeting on chordoma that included more than 40 chordoma experts from several disciplines and from both sides of the Atlantic, with the contribution and sponsorship of the Chordoma Foundation, a global patient advocacy group. The consensus reached at that meeting is presented in this position paper.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we question the homogeneity of a large parallel corpus by measuring the similarity between various sub-parts. We compare results obtained using a general measure of lexical similarity based on χ2 and by counting the number of discourse connectives. We argue that discourse connectives provide a more sensitive measure, revealing differences that are not visible with the general measure. We also provide evidence for the existence of specific characteristics defining translated texts as opposed to non-translated ones, due to a universal tendency for explicitation.
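
The contrast between a general lexical measure and a connective-based one can be sketched in a few lines of code. The following is a minimal illustration only: the connective inventory, the function names and the toy sub-corpora are assumptions made for demonstration, not the paper's actual resources or implementation.

```python
# Minimal sketch: comparing two sub-corpora with a chi-square-based lexical
# similarity score and with a simple discourse-connective rate.
# The connective list and the toy token lists are illustrative placeholders.
from collections import Counter

CONNECTIVES = {"however", "therefore", "because", "although", "moreover"}

def chi_square_distance(tokens_a, tokens_b, top_n=500):
    """Pearson chi-square statistic over the most frequent shared word types."""
    freq_a, freq_b = Counter(tokens_a), Counter(tokens_b)
    vocab = [w for w, _ in (freq_a + freq_b).most_common(top_n)]
    total_a, total_b = sum(freq_a.values()), sum(freq_b.values())
    chi2 = 0.0
    for w in vocab:
        observed_a, observed_b = freq_a[w], freq_b[w]
        pooled = (observed_a + observed_b) / (total_a + total_b)
        expected_a, expected_b = pooled * total_a, pooled * total_b
        chi2 += (observed_a - expected_a) ** 2 / expected_a
        chi2 += (observed_b - expected_b) ** 2 / expected_b
    return chi2

def connective_rate(tokens):
    """Discourse connectives per 1,000 tokens."""
    hits = sum(1 for t in tokens if t.lower() in CONNECTIVES)
    return 1000.0 * hits / len(tokens)

# Usage with toy data:
sub_a = "however the results are clear because the corpus is large".split()
sub_b = "the corpus is large and the results are therefore clear".split()
print(chi_square_distance(sub_a, sub_b), connective_rate(sub_a), connective_rate(sub_b))
```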

Relevance:

30.00%

Publisher:

Abstract:

This paper outlines the problems found in the parallelization of SPH (Smoothed Particle Hydrodynamics) algorithms using Graphics Processing Units. Results of several parallel GPU implementations are shown, in terms of speed-up and scalability, compared with the sequential CPU codes. The most problematic stage in the GPU-SPH algorithms is the one responsible for locating neighboring particles and building the vectors where this information is stored, since these specific algorithms raise many difficulties for a data-level parallelization. Because the neighbor location using linked lists does not expose enough data-level parallelism, two new approaches have been proposed to minimize bank conflicts in the writing and subsequent reading of the neighbor lists. The first strategy proposes an efficient coordination between the CPU and the GPU, using GPU algorithms for those stages that allow a straightforward parallelization, and sequential CPU algorithms for those instructions that involve some kind of vector reduction. This coordination provides a relatively orderly reading of the neighbor lists in the interactions stage, achieving a speed-up factor of x47 in this stage. However, since the construction of the neighbor lists is quite expensive, the overall speed-up achieved is x41. The second strategy seeks to maximize the use of the GPU in the neighbor-location process by executing a specific vector sorting algorithm that allows some data-level parallelism. Although this strategy has succeeded in improving the speed-up of the neighbor-location stage, the global speed-up falls in the interactions stage, due to inefficient reading of the neighbor vectors. Some changes to these strategies are proposed, aimed at maximizing the computational load of the GPU and using the GPU texture units, in order to reach the maximum speed-up for such codes. Different practical applications have been added to the GPU codes described. First, the classical dam-break problem is studied. Second, the wave impact of the sloshing fluid contained in LNG vessel tanks is simulated as a practical example of particle methods.
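
The neighbour-location stage described above is conventionally built on a cell (linked-list) search. The sketch below is a sequential, CPU-side Python illustration of that idea only, not the thesis's GPU code; the function name and the uniform-grid assumption are mine.

```python
# Minimal CPU-side sketch of a cell (linked-list) neighbour search of the kind
# GPU-SPH stages build on: particles are binned into cells of size h, and the
# neighbour list of a particle is gathered from its own and adjacent cells.
from collections import defaultdict

def build_neighbor_lists(positions, h):
    """positions: list of (x, y, z); h: smoothing length used as the cell size."""
    cells = defaultdict(list)
    for i, (x, y, z) in enumerate(positions):
        cells[(int(x // h), int(y // h), int(z // h))].append(i)

    neighbors = [[] for _ in positions]
    for i, (x, y, z) in enumerate(positions):
        cx, cy, cz = int(x // h), int(y // h), int(z // h)
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for dz in (-1, 0, 1):
                    for j in cells.get((cx + dx, cy + dy, cz + dz), []):
                        if j == i:
                            continue
                        px, py, pz = positions[j]
                        if (x - px) ** 2 + (y - py) ** 2 + (z - pz) ** 2 <= h * h:
                            neighbors[i].append(j)
    return neighbors

# Usage: three particles, two of them within one smoothing length of each other.
print(build_neighbor_lists([(0.0, 0.0, 0.0), (0.5, 0.0, 0.0), (5.0, 5.0, 5.0)], h=1.0))
```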

Relevance:

30.00%

Publisher:

Abstract:

El objetivo de la presente investigación es el desarrollo de un modelo de cálculo rápido, eficiente y preciso, para la estimación de los costes finales de construcción, en las fases preliminares del proyecto arquitectónico. Se trata de una herramienta a utilizar durante el proceso de elaboración de estudios previos, anteproyecto y proyecto básico, no siendo por tanto preciso para calcular el “predimensionado de costes” disponer de la total definición grafica y literal del proyecto. Se parte de la hipótesis de que en la aplicación práctica del modelo no se producirán desviaciones superiores al 10 % sobre el coste final de la obra proyectada. Para ello se formulan en el modelo de predimensionado cinco niveles de estimación de costes, de menor a mayor definición conceptual y gráfica del proyecto arquitectónico. Los cinco niveles de cálculo son: dos que toman como referencia los valores “exógenos” de venta de las viviendas (promoción inicial y promoción básica) y tres basados en cálculos de costes “endógenos” de la obra proyectada (estudios previos, anteproyecto y proyecto básico). El primer nivel de estimación de carácter “exógeno” (nivel .1), se calcula en base a la valoración de mercado de la promoción inmobiliaria y a su porcentaje de repercusión de suelo sobre el valor de venta de las viviendas. El quinto nivel de valoración, también de carácter “exógeno” (nivel .5), se calcula a partir del contraste entre el valor externo básico de mercado, los costes de construcción y los gastos de promoción estimados de la obra proyectada. Este contraste entre la “repercusión del coste de construcción” y el valor de mercado, supone una innovación respecto a los modelos de predimensionado de costes existentes, como proceso metodológico de verificación y validación extrínseca, de la precisión y validez de las estimaciones resultantes de la aplicación práctica del modelo, que se denomina Pcr.5n (Predimensionado costes de referencia con .5niveles de cálculo según fase de definición proyectual / ideación arquitectónica). Los otros tres niveles de predimensionado de costes de construcción “endógenos”, se estiman mediante cálculos analíticos internos por unidades de obra y cálculos sintéticos por sistemas constructivos y espacios funcionales, lo que se lleva a cabo en las etapas iniciales del proyecto correspondientes a estudios previos (nivel .2), anteproyecto (nivel .3) y proyecto básico (nivel .4). Estos cálculos teóricos internos son finalmente evaluados y validados mediante la aplicación práctica del modelo en obras de edificación residencial, de las que se conocen sus costes reales de liquidación final de obra. Según va evolucionando y se incrementa el nivel de definición y desarrollo del proyecto, desde los estudios previos hasta el proyecto básico, el cálculo se va perfeccionando en su nivel de eficiencia y precisión de la estimación, según la metodología aplicada: [aproximaciones sucesivas en intervalos finitos], siendo la hipótesis básica como anteriormente se ha avanzado, lograr una desviación máxima de una décima parte en el cálculo estimativo del predimensionado del coste real de obra. El cálculo del coste de ejecución material de la obra, se desarrolla en base a parámetros cúbicos funcionales “tridimensionales” del espacio proyectado y parámetros métricos constructivos “bidimensionales” de la envolvente exterior de cubierta/fachada y de la huella del edificio sobre el terreno. 
Los costes funcionales y constructivos se ponderan en cada fase del proceso de cálculo con sus parámetros “temáticos/específicos” de gestión (Pg), proyecto (Pp) y ejecución (Pe) de la concreta obra presupuestada, para finalmente estimar el coste de construcción por contrata, como resultado de incrementar al coste de ejecución material el porcentaje correspondiente al parámetro temático/específico de la obra proyectada. El modelo de predimensionado de costes de construcción Pcr.5n, será una herramienta de gran interés y utilidad en el ámbito profesional, para la estimación del coste correspondiente al Proyecto Básico previsto en el marco técnico y legal de aplicación. Según el Anejo I del Código Técnico de la Edificación (CTE), es de obligado cumplimiento que el proyecto básico contenga una “Valoración aproximada de la ejecución material de la obra proyectada por capítulos”, es decir, que el Proyecto Básico ha de contener al menos un “presupuesto aproximado”, por capítulos, oficios o tecnologías. El referido cálculo aproximado del presupuesto en el Proyecto Básico, necesariamente se ha de realizar mediante la técnica del predimensionado de costes, dado que en esta fase del proyecto arquitectónico aún no se dispone de cálculos de estructura, planos de acondicionamiento e instalaciones, ni de la resolución constructiva de la envolvente, por cuanto no se han desarrollado las especificaciones propias del posterior proyecto de ejecución. Esta estimación aproximada del coste de la obra, es sencilla de calcular mediante la aplicación práctica del modelo desarrollado, y ello tanto para estudiantes como para profesionales del sector de la construcción. Como se contiene y justifica en el presente trabajo, la aplicación práctica del modelo para el cálculo de costes en las fases preliminares del proyecto, es rápida y certera, siendo de sencilla aplicación tanto en vivienda unifamiliar (aisladas y pareadas), como en viviendas colectivas (bloques y manzanas). También, el modelo es de aplicación en el ámbito de la valoración inmobiliaria, tasaciones, análisis de viabilidad económica de promociones inmobiliarias, estimación de costes de obras terminadas y en general, cuando no se dispone del proyecto de ejecución y sea preciso calcular los costes de construcción de las obras proyectadas. Además, el modelo puede ser de aplicación para el chequeo de presupuestos calculados por el método analítico tradicional (estado de mediciones pormenorizadas por sus precios unitarios y costes descompuestos), tanto en obras de iniciativa privada como en obras promovidas por las Administraciones Públicas. Por último, como líneas abiertas a futuras investigaciones, el modelo de “predimensionado costes de referencia 5 niveles de cálculo”, se podría adaptar y aplicar para otros usos y tipologías diferentes a la residencial, como edificios de equipamientos y dotaciones públicas, valoración de edificios históricos, obras de urbanización interior y exterior de parcela, proyectos de parques y jardines, etc. Estas líneas de investigación suponen trabajos paralelos al aquí desarrollado, y que a modo de avance parcial se recogen en las comunicaciones presentadas en los Congresos internacionales Scieconf/Junio 2013, Rics-Cobra/Septiembre 2013 y en el IV Congreso nacional de patología en la edificación-Ucam/Abril 2014.

ABSTRACT

The aim of this research is to develop a fast, efficient and accurate calculation model for estimating the final costs of construction during the preliminary stages of the architectural project.
It is a tool to be used during the preliminary study, drafting and basic project stages; it is therefore not necessary to have the complete graphic and written definition of the project in order to calculate the cost-scaling. The working hypothesis is that, in practical application, the model will not deviate by more than 10% from the final cost of the projected work. To that end, five levels of cost estimation are formulated in the scaling model, from a lower to a higher conceptual and graphic definition of the architectural project. The five calculation levels are: two that take as their point of reference the “exogenous” sales values of the houses (initial development and basic development), and three based on calculations of the “endogenous” costs of the projected work (preliminary study, drafting and basic project). The first “exogenous” estimation level (level .1) is calculated from the market valuation of the real estate development and the proportion that the cost of land represents in the sales value of the houses. The fifth level of valuation, also an “exogenous” one (level .5), is calculated from the contrast between the basic external market value, the construction costs and the estimated development costs of the projected work. This contrast between the “repercussion of construction costs” and the market value is an innovation with respect to existing cost-scaling models, as a methodological process of extrinsic verification and validation of the accuracy and validity of the estimates obtained from the practical application of the model, which is called Pcr.5n (reference cost-scaling with .5 calculation levels according to the stage of project definition / architectural conceptualization). The other three levels of “endogenous” construction cost-scaling are estimated from internal analytical calculations by work units and synthetic calculations by construction systems and functional spaces. This is performed during the initial stages of the project, corresponding to the preliminary study (level .2), drafting (level .3) and basic project (level .4). These theoretical internal calculations are finally evaluated and validated through the practical application of the model to residential buildings whose real costs at final settlement of the works are known. As the level of definition and development of the project evolves, from preliminary study to basic project, the calculation improves in efficiency and estimation accuracy, following the applied methodology of successive approximations at finite intervals; the basic hypothesis, as stated above, is a maximum deviation of one tenth in the cost-scaling estimate of the real cost of the work. The cost of material execution of the works is calculated from “three-dimensional” cubic functional parameters of the planned space and “two-dimensional” metric construction parameters of the outer envelope of roof and facade and of the building’s footprint on the plot. The functional and construction costs are weighted at every stage of the calculation process with the “thematic/specific” parameters of management (Pg), project (Pp) and execution (Pe) of the particular work being budgeted; finally, the contract construction cost is estimated by increasing the material execution cost by the percentage corresponding to the thematic/specific parameter of the projected work.
The construction cost-scaling model Pcr.5n will be a tool of great interest and utility in the professional field for estimating the cost corresponding to the Basic Project prescribed in the applicable technical and legal framework. According to Annex I of the Technical Building Code (CTE), the basic project must contain an “approximate valuation of the material execution of the projected work, by chapters”; that is, the Basic Project must contain at least an “approximate estimate” by chapters, trades or technologies. This approximate estimate in the Basic Project necessarily has to be produced through the cost-scaling technique, given that structural calculations, services and conditioning drawings and the constructive resolution of the envelope are not yet available at this stage of the architectural project, insofar as the specifications belonging to the later execution project have not yet been developed. This approximate estimate of the cost of the works is easy to calculate through the practical application of the model developed, both for students and for professionals of the building sector. As set out and justified in this work, the application of the model for cost-scaling during the preliminary stages of the project is fast and accurate, and is easy to apply both to single-family houses (detached and semi-detached) and to collective housing (blocks and city blocks). The model can also be applied in the field of real-estate valuation, official appraisals, analysis of the economic viability of real-estate developments, estimation of the cost of finished works and, in general, whenever an execution project is not available and it is necessary to calculate the construction costs of the projected works. The model can likewise be used to check estimates calculated by the traditional analytical method (detailed bills of quantities with unit prices and cost breakdowns), both in private works and in works promoted by Public Authorities. Finally, as open lines for future research, the “reference cost-scaling with five calculation levels” model could be adapted and applied to uses and typologies other than the residential one, such as service buildings and public facilities, valuation of historical buildings, on-plot and off-plot urbanization works, park and garden projects, etc. These lines of research run parallel to the work developed here and, by way of a partial preview, are reported in the papers presented at the international congresses Scieconf/June 2013 and Rics-Cobra/September 2013 and at the IV National Congress on Building Pathology (Ucam)/April 2014.
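
To make the structure of such a calculation concrete, the sketch below shows one possible shape of a cost-scaling computation in the spirit of the Pcr.5n model: a material execution cost built from a cubic (volume) parameter and two metric (envelope and footprint) parameters, weighted by the management (Pg), project (Pp) and execution (Pe) parameters, then converted to a contract cost and checked against the 10% deviation hypothesis. All unit rates, weights and figures are invented placeholders, not values from the thesis.

```python
# Hedged, illustrative sketch of a cost pre-dimensioning calculation.
# Every numeric value here is an assumption for demonstration only.

def material_execution_cost(volume_m3, envelope_m2, footprint_m2,
                            rate_volume=80.0, rate_envelope=120.0, rate_footprint=150.0,
                            Pg=1.05, Pp=1.03, Pe=1.08):
    """Material execution cost from cubic and metric parameters, weighted by Pg, Pp, Pe."""
    base = (volume_m3 * rate_volume
            + envelope_m2 * rate_envelope
            + footprint_m2 * rate_footprint)
    return base * Pg * Pp * Pe

def contract_cost(pem, overhead_and_profit=0.19):
    """Contract cost = material execution cost increased by the corresponding percentage."""
    return pem * (1.0 + overhead_and_profit)

def within_deviation(estimate, final_cost, tolerance=0.10):
    """Check the <=10% deviation hypothesis against a known final cost."""
    return abs(estimate - final_cost) / final_cost <= tolerance

# Usage with made-up figures for a small residential building:
pem = material_execution_cost(volume_m3=1500.0, envelope_m2=900.0, footprint_m2=300.0)
estimate = contract_cost(pem)
print(round(estimate, 0), within_deviation(estimate, final_cost=370_000.0))
```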

Relevance:

30.00%

Publisher:

Abstract:

La tesis se compone de una primera parte introductoria, en la que se recogen las distintas opiniones y definiciones de la arquitectura “popular”, el estado de la cuestión, comentando los artículos y publicaciones realizados sobre la Mancha. La segunda parte profundiza en aspectos generales previos al análisis edificatorio central de la tesis, con los siguientes capítulos: -Estudio de los condicionantes físicos, históricos, socio-económicos y culturales de la comarca de la Mancha Baja. Acotando el territorio. -Una visión general sobre la arquitectura tradicional de la provincia de Ciudad Real, por comarcas. -Un estudio de las distintas tipologías edificatorias tradicionales, con ejemplos en la comarca manchega. -El análisis de materiales constructivos, elementos y sistemas utilizados en las construcciones tradicionales en la Mancha Baja. La tercera parte, desde la premisa de la representación gráfica, apoyado en un anexo con dibujos de ciento treinta y siete edificios populares de Manzanares y comarca, estudia: El trazado urbano y las casas de Manzanares; desde los levantamientos de plantas, alzados y secciones, emplazamiento en la manzana y fotografías, se realiza una descripción completa, con noventa y seis ejemplos. Además de llegar a las conclusiones derivadas del análisis de estas edificaciones, los objetivos pretendidos con este estudio serían también: Realizar un primer trabajo aproximativo, desde la visión arquitectónica, de la arquitectura tradicional manchega. Recopilar toda la información existente que pueda relacionarse con la arquitectura popular en la comarca, y citar los escritos y publicaciones de referencia para posteriores estudios. Se estudia la geomorfología, el clima, el territorio, la economía, la sociología, etc…, para obtener una información clave, además de los materiales, técnicas constructivas y morfología de las edificaciones. Se destaca el apartado de los edificios preindustriales tradicionales, como molinos de viento, de agua, palomares, pósitos y bodegas con el análisis de varios ejemplos, por su importante presencia en las poblaciones. Por último se desarrolla un amplio bloque sobre bibliografía de arquitectura popular, la consultada y la general. La arquitectura popular de la mancha baja es tapial cubierto de teja árabe, cerrada al exterior, pero abierta a grandes patios, de planta baja y cámaras altas, con elementos auxiliares de protección y acceso, que revisten la aparente simplicidad volumétrica de estos complejos, viviendas-almacén. Con un complejo programa tanto agrícola como doméstico. De gran protección frente al clima, con escasa decoración, esquemas espaciales primitivos y con mayor envergadura estructural en las dependencias agropecuarias. Una arquitectura que mezcla el uso doméstico y el productivo, pero que al evolucionar aumenta su diferenciación. Edificios que mantienen las mismas cualidades estéticas, repitiendo formas y volúmenes, pero de peculiares configuraciones espaciales, se repiten los materiales y técnicas constructivas, así como elementos arquitectónicos con pocas variaciones, pero no existen dos conjuntos similares. No podemos utilizar un ejemplo como modelo de casa manchega. Evoluciona de la casa bloque, básica y primitiva, con ejemplos escasos en las poblaciones más deprimidas, a la casa compleja, donde se separan con claridad las dependencias agropecuarias de las vivideras. 
Evoluciona de una casa rural, con los mismos esquemas, ya se ubique en el campo o en núcleos de población, a la casa urbana, entre medianerías, en la que se puede encontrar una transformación paralela, desarrollándose programas domésticos, más especializados, mezclados con arquitecturas cultas, con programas que reflejan las nuevas necesidades de la sociedad urbana del siglo XX.

ABSTRACT

The thesis is composed of an introductory first part, which collects the different views and definitions of popular architecture and the state of the art, commenting on the articles and publications about La Mancha. The second part explores general issues prior to the central building analysis of the thesis, with the following chapters: -A study of the geographic, historical, socio-economic and cultural conditions of the region of the Mancha Baja, delimiting the territory. -An overview of the traditional architecture of the province of Ciudad Real, by districts. -A study of the different traditional building types, with examples from the Manchegan district. -An analysis of the building materials, components and systems used in the traditional buildings of the Mancha Baja. The third part, from the premise of graphic representation, studies the urban layout of the towns and the houses of Manzanares: from surveyed plans, elevations and sections, the siting within the block and old photographs, a full description is made, covering a wide range of examples and highlighting the evolution of buildings of popular character during the last quarter of the twentieth century, which is the ultimate aim of the thesis. In addition to the conclusions drawn from the analysis of these buildings, the objectives pursued with this study are also the following. To carry out a first approximate study, from an architectural viewpoint, of the traditional architecture of La Mancha. To seek a working method for approaching popular architecture, different from those followed so far in studies by historians, engineers and sociologists, based on graphic representation and treating the buildings as living organisms that evolve over time. To collect all the existing information that can be related to the popular architecture of the district, and to cite the writings and publications of reference for future studies. The geomorphology, climate and topography of the place are studied to obtain key information, in addition to the materials, construction techniques and morphology of the buildings. A section is devoted to traditional pre-industrial buildings such as windmills, water mills, pigeon lofts, public granaries, threshing floors and cellars, with the analysis of several examples, given their important presence in the towns. Finally, a large section on the bibliography of popular architecture is developed, covering both the works consulted and the general literature on the subject. The popular architecture of the Mancha Baja is rammed earth roofed with pitched Arabic tiles; the buildings are closed to the outside but open onto large courtyards, with a ground floor and upper chambers, and with auxiliary elements of protection and access that clothe the apparent volumetric simplicity of these dwelling-warehouse complexes. They house a complex program of both agricultural and domestic activity: strong protection against the climate, sparse decoration, rather primitive spatial schemes for the living quarters, and greater structural scale in the units for storing and processing agricultural products, mainly wine and cereal and, to a lesser extent, oil. 
This architecture combines domestic and productive uses, but as it evolves the two become increasingly differentiated. The buildings keep the same aesthetic qualities, repeating shapes and volumes, yet each retains its own spatial configuration; the materials, building techniques and architectural elements are repeated with slight variations, but no two ensembles are alike. The architecture evolved from the block house, basic and primitive, of which few examples remain in the most deprived towns, to the complex house, in which the agricultural quarters are clearly separated from the living rooms. It developed from a rural house, following the same schemes whether located in the countryside or within a town, into an urban house between party walls, in which a parallel transformation can be found: more specialized domestic programs, mixed with learned architecture, reflecting the new needs of the urban society of the twentieth century.

Relevance:

30.00%

Publisher:

Abstract:

Although specific proteinases play a critical role in the active phase of apoptosis, their substrates are largely unknown. We previously identified poly(ADP-ribose) polymerase (PARP) as an apoptosis-associated substrate for proteinase(s) related to interleukin 1 beta-converting enzyme (ICE). Now we have used a cell-free system to characterize proteinase(s) that cleave the nuclear lamins during apoptosis. Lamin cleavage during apoptosis requires the action of a second ICE-like enzyme, which exhibits kinetics of cleavage and a profile of sensitivity to specific inhibitors that are distinct from those of the PARP proteinase. Thus, multiple ICE-like enzymes are required for apoptotic events in these cell-free extracts. Inhibition of the lamin proteinase with tosyllysine chloromethyl ketone blocks nuclear apoptosis prior to the packaging of condensed chromatin into apoptotic bodies. Under these conditions, the nuclear DNA is fully cleaved to a nucleosomal ladder. Our studies reveal that the lamin proteinase and the fragmentation nuclease function in independent parallel pathways during the final stages of apoptotic execution. Neither pathway alone is sufficient for completion of nuclear apoptosis. Instead, the various activities cooperate to drive the disassembly of the nucleus.

Relevance:

30.00%

Publisher:

Abstract:

Despite the insight gained from 2-D particle models, and given that the dynamics of crustal faults occur in 3-D space, the question remains: how do the 3-D fault gouge dynamics differ from those in 2-D? Traditionally, 2-D modeling has been preferred over 3-D simulations because of the computational cost of solving 3-D problems. However, modern high performance computing architectures, combined with a parallel implementation of the Lattice Solid Model (LSM), provide the opportunity to explore 3-D fault micro-mechanics and to advance understanding of effective constitutive relations of fault gouge layers. In this paper, macroscopic friction values from 2-D and 3-D LSM simulations, performed on an SGI Altix 3700 super-cluster, are compared. Two rectangular elastic blocks of bonded particles, with a rough fault plane and separated by a region of randomly sized non-bonded gouge particles, are sheared in opposite directions by normally loaded driving plates. The results demonstrate that the gouge particles in the 3-D models undergo significant out-of-plane motion during shear. The 3-D models also exhibit a higher mean macroscopic friction than the 2-D models for varying values of interparticle friction. 2-D LSM gouge models have previously been shown to exhibit accelerating energy release in simulated earthquake cycles, supporting the Critical Point hypothesis. The 3-D models are shown to also display accelerating energy release, and good fits of power-law time-to-failure functions to the cumulative energy release are obtained.
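
A power-law time-to-failure fit of the kind mentioned above can be reproduced on synthetic data as follows. This is an illustrative sketch only: the functional form E(t) = A - B(t_f - t)^m, the parameter names and the synthetic series are assumptions for demonstration, not the paper's data or code.

```python
# Illustrative sketch (not the paper's code): fitting a power-law time-to-failure
# function to a cumulative energy-release curve, as used to test for accelerating
# release before a simulated earthquake failure.
import numpy as np
from scipy.optimize import curve_fit

def time_to_failure(t, A, B, t_f, m):
    # Cumulative release accelerating toward the failure time t_f.
    return A - B * (t_f - t) ** m

# Synthetic cumulative energy release accelerating toward failure at t = 10.
t = np.linspace(0.0, 9.5, 200)
energy = 100.0 - 5.0 * (10.0 - t) ** 1.3 + np.random.normal(0.0, 0.5, t.size)

popt, _ = curve_fit(
    time_to_failure, t, energy,
    p0=[100.0, 5.0, 10.5, 1.0],
    bounds=([0.0, 0.0, 9.6, 0.1], [200.0, 100.0, 20.0, 3.0]),
)
A, B, t_f, m = popt
print(f"fitted failure time t_f = {t_f:.2f}, exponent m = {m:.2f}")
```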

Relevance:

30.00%

Publisher:

Abstract:

Image segmentation is one of the most computationally intensive operations in image processing and computer vision. This is because a large volume of data is involved and many different features have to be extracted from the image data. This thesis is concerned with the investigation of practical issues related to the implementation of several classes of image segmentation algorithms on parallel architectures. The Transputer is used as the basic building block of hardware architectures and Occam is used as the programming language. The segmentation methods chosen for implementation are convolution, for edge-based segmentation; the Split and Merge algorithm for segmenting non-textured regions; and the Granlund method for segmentation of textured images. Three different convolution methods have been implemented. The direct method of convolution, carried out in the spatial domain, uses the array architecture. The other two methods, based on convolution in the frequency domain, require the use of the two-dimensional Fourier transform. Parallel implementations of two different Fast Fourier Transform algorithms have been developed, incorporating original solutions. For the Row-Column method the array architecture has been adopted, and for the Vector-Radix method, the pyramid architecture. The texture segmentation algorithm, for which a system-level design is given, demonstrates a further application of the Vector-Radix Fourier transform. A novel concurrent version of the quad-tree based Split and Merge algorithm has been implemented on the pyramid architecture. The performance of the developed parallel implementations is analysed. Many of the obtained speed-up and efficiency measures show values close to their respective theoretical maxima. Where appropriate comparisons are drawn between different implementations. The thesis concludes with comments on general issues related to the use of the Transputer system as a development tool for image processing applications; and on the issues related to the engineering of concurrent image processing applications.
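
The row-column decomposition of the 2-D Fourier transform mentioned above can be illustrated with a short sequential sketch; it shows only the mathematical structure that was mapped onto the array architecture, and is written in Python rather than the thesis's Occam.

```python
# Sequential illustration of the row-column 2-D FFT: 1-D transforms are applied
# to every row and then to every column. Each row/column strip is an independent
# 1-D FFT, which is what makes the method map naturally onto an array of processors.
import numpy as np

def fft2_row_column(image):
    rows_done = np.fft.fft(image, axis=1)   # 1-D FFT of each row
    return np.fft.fft(rows_done, axis=0)    # 1-D FFT of each resulting column

# Check against the library's direct 2-D FFT on a small random image.
img = np.random.rand(8, 8)
assert np.allclose(fft2_row_column(img), np.fft.fft2(img))
print("row-column FFT matches np.fft.fft2")
```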

Relevance:

30.00%

Publisher:

Abstract:

In this thesis, details of a proposed method for the elastic-plastic failure load analysis of complete building structures are given. In order to handle the problem, a computer programme in Atlas Autocode is produced. The structures consist of a number of parallel shear walls and intermediate frames connected by floor slabs. The results of an experimental investigation are given to verify the theoretical results and to demonstrate various factors that may influence the behaviour of these structures. Large full-scale practical structures are also analysed by the proposed method, and suggestions are made for achieving design economy as well as for extending research in various aspects of this field. The existing programme for the elastic-plastic analysis of large frames is modified to allow for the effect of composite action of structural members, i.e. reinforced concrete floor slabs acting with the supporting steel beams. This modified programme is used to analyse some framed structures with composite action as well as those which incorporate plates and shear walls. The results obtained are studied to ascertain the influence of composite action and other factors on the load carrying capacity of both bare frames and complete building structures. The theoretical failure load presented in this thesis does not predict the overall failure load of the structure, nor does it predict the partial failure load of the shear walls and slabs; it merely predicts the partial failure load of a single frame and assumes that the loss of stiffness of such a frame renders the overall structure unusable. For most structures the analysis proposed in this thesis is likely to break down prematurely due to the failure of the slab and shear wall system, and this factor must be taken into account in any future work on such structures. The experimental work reported in this thesis is acknowledged to be unsatisfactory as a verification of the limited theory proposed. In particular, perspex was not found to be a suitable material for testing at high loads; micro-concrete may be more suitable.

Relevance:

30.00%

Publisher:

Abstract:

This exploratory study is concerned with the integrated appraisal of multi-storey dwelling blocks which incorporate large concrete panel systems (LPS). The first step was to look at the U.K. multi-storey dwelling stock in general, and at that under the management of Birmingham City Council in particular. The information was taken from the databases of three departments in the City of Birmingham and rearranged in a new database, using a suite of PC software called PROXIMA, for clarity and analysis. One hundred blocks of this stock were built using large concrete panel systems. Thirteen LPS blocks were chosen as case studies, mainly on the basis of the height and age of the block. A new integrated appraisal technique has been created for LPS dwelling blocks, which takes into account the main physical and social factors affecting the condition and acceptability of these blocks. This appraisal technique is built up in a hierarchical form, moving from the general approach to particular elements (a tree model). It comprises two main approaches, physical and social. In the physical approach, the building is viewed as a series of manageable elements and sub-elements covering every physical or environmental factor of the block, from which the condition of the block is analysed. A quality score system has been developed which depends mainly on the qualitative and quantitative condition of each category in the appraisal tree model, and leads to a physical ranking order of the study blocks. In the social appraisal approach, the residents' satisfaction and attitude toward their multi-storey dwelling block were analysed in relation to: a. biographical and housing-related characteristics; and b. social, physical and environmental factors associated with this sort of dwelling, block and estate in general. The random sample consisted of 268 residents living in the 13 case-study blocks. The data collected were analysed using frequency counts, percentages, means, standard deviations, Kendall's tau, r correlation coefficients, t-tests, analysis of variance (ANOVA) and multiple regression analysis. The analysis showed a marginally positive satisfaction and attitude towards living in the block. The five most significant factors associated with the residents' satisfaction and attitude, in descending order, were: the estate in general; the service categories in the block, including the heating system and lift services; vandalism; the neighbours; and the security system of the block. An important attribute of this method is that it is relatively inexpensive to implement, especially when compared to alternatives adopted by some local authorities and the BRE. It is designed to save time, money and effort, to aid decision making, and to provide a ranked priority for the multi-storey dwelling stock, in addition to many other advantages. A series of solution options to the problems of the blocks was sought for selection and testing before implementation. The traditional solutions have usually resulted in either demolition or costly physical maintenance and social improvement of the blocks. However, a new solution has now emerged, which is particularly suited to structurally sound units. The solution of 'recycling' might incorporate the reuse of an entire block or part of it, by removing panels, slabs and so forth from the upper floors in order to reconstruct them as low-rise accommodation.
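
The hierarchical (tree) scoring idea can be illustrated with a small sketch: leaf elements receive condition scores, weighted sums roll up to parent categories, and blocks are ranked by their root score. The element names, weights and scores below are invented for illustration and are not taken from the study.

```python
# Hedged sketch of a hierarchical weighted quality score and block ranking.
def score(node):
    """A node is either a leaf score (number) or a dict of
    name -> (weight, child node); weights at each level sum to 1."""
    if isinstance(node, (int, float)):
        return float(node)
    return sum(weight * score(child) for weight, child in node.values())

# Invented example trees for two blocks.
block_a = {
    "structure":   (0.4, {"panels": (0.6, 7.0), "joints": (0.4, 5.0)}),
    "services":    (0.3, {"heating": (0.5, 4.0), "lifts": (0.5, 6.0)}),
    "environment": (0.3, 8.0),
}
block_b = {
    "structure":   (0.4, 6.0),
    "services":    (0.3, 7.5),
    "environment": (0.3, 5.0),
}

ranking = sorted({"Block A": block_a, "Block B": block_b}.items(),
                 key=lambda kv: score(kv[1]), reverse=True)
print([(name, round(score(tree), 2)) for name, tree in ranking])
```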

Relevance:

30.00%

Publisher:

Abstract:

Ukraine belongs among those young countries where the beginnings of democratisation and nation-building approximately coincided. While the development of nation states in Central Europe was usually preceded by the development of nations, the biggest dilemma in Ukraine is whether a nation-state programme, pursued in parallel with the aim of state-building, is able to bring unfinished nation-building to completion. Ukraine sways between the EU and Russia with enormous amplitude. The alternating orientation between the West and the East can be ascribed to superpower ambitions reaching beyond Ukraine. Ultimately, internal and external determinants are intertwined and mutually interact with one another. The aim of the paper is to explain the dilemmas, arising from identity problems, that lie behind Ukraine's internal and external orientation.

Relevance:

30.00%

Publisher:

Abstract:

This thesis explores the relationship of architecture and water through the design of an urban spa that offers both a bodily and a poetic experience of water. Research included investigation of recent architectural projects that enhance and order the view, sound, and touch of water, as well as projects that integrate fountains, showers and reflecting pools into the experience of a building. In the design of the spa, the movement of water was based metaphorically on the natural water cycle: evaporation, condensation and collection of water in pools. The building presents fountains, rivulets, and pools in a descending sequence that represents the natural flow of water. The temperature of the water and the activities of the spa follow the same descending sequence, progressing from a warm-water bath at the top of the building to a cool swimming pool at the plaza level, in a contemporary interpretation of the experience of a Roman bath.

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

30.00%

Publisher:

Abstract:

In today’s big data world, data is being produced in massive volumes, at great velocity and from a variety of different sources such as mobile devices, sensors, a plethora of small devices hooked to the internet (Internet of Things), social networks, communication networks and many others. Interactive querying and large-scale analytics are being increasingly used to derive value out of this big data. A large portion of this data is being stored and processed in the Cloud due to the several advantages provided by the Cloud, such as scalability, elasticity, availability, low cost of ownership and the overall economies of scale. There is thus a growing need for large-scale cloud-based data management systems that can support real-time ingest, storage and processing of large volumes of heterogeneous data. However, in the pay-as-you-go Cloud environment, the cost of analytics can grow linearly with the time and resources required. Reducing the cost of data analytics in the Cloud thus remains a primary challenge. In my dissertation research, I have focused on building efficient and cost-effective cloud-based data management systems for different application domains that are predominant in cloud computing environments. In the first part of my dissertation, I address the problem of reducing the cost of transactional workloads on relational databases to support database-as-a-service in the Cloud. The primary challenges in supporting such workloads include choosing how to partition the data across a large number of machines, minimizing the number of distributed transactions, providing high data availability, and tolerating failures gracefully. I have designed, built and evaluated SWORD, an end-to-end scalable online transaction processing system that utilizes workload-aware data placement and replication to minimize the number of distributed transactions, and that incorporates a suite of novel techniques to significantly reduce the overheads incurred both during the initial placement of data and during query execution at runtime. In the second part of my dissertation, I focus on sampling-based progressive analytics as a means to reduce the cost of data analytics in the relational domain. Sampling has been traditionally used by data scientists to get progressive answers to complex analytical tasks over large volumes of data. Typically, this involves manually extracting samples of increasing data size (progressive samples) for exploratory querying. This provides the data scientists with user control, repeatable semantics, and result provenance. However, such solutions result in tedious workflows that preclude the reuse of work across samples. On the other hand, existing approximate query processing systems report early results, but do not offer the above benefits for complex ad-hoc queries. I propose a new progressive data-parallel computation framework, NOW!, that provides support for progressive analytics over big data. In particular, NOW! enables progressive relational (SQL) query support in the Cloud using unique progress semantics that allow efficient and deterministic query processing over samples, providing meaningful early results and provenance to data scientists. NOW! enables the provision of early results using significantly fewer resources, thereby enabling a substantial reduction in the cost incurred during such analytics. Finally, I propose NSCALE, a system for efficient and cost-effective complex analytics on large-scale graph-structured data in the Cloud. 
The system is based on the key observation that a wide range of complex analysis tasks over graph data require processing and reasoning about a large number of multi-hop neighborhoods or subgraphs in the graph; examples include ego network analysis, motif counting in biological networks, finding social circles in social networks, personalized recommendations, link prediction, etc. These tasks are not well served by existing vertex-centric graph processing frameworks, whose computation and execution models limit the user program to directly accessing the state of a single vertex, resulting in high execution overheads. Further, the lack of support for extracting the relevant portions of the graph that are of interest to an analysis task and loading them onto distributed memory leads to poor scalability. NSCALE allows users to write programs at the level of neighborhoods or subgraphs rather than at the level of vertices, and to declaratively specify the subgraphs of interest. It enables the efficient distributed execution of these neighborhood-centric complex analysis tasks over large-scale graphs, while minimizing resource consumption and communication cost, thereby substantially reducing the overall cost of graph data analytics in the Cloud. The results of our extensive experimental evaluation of these prototypes with several real-world data sets and applications validate the effectiveness of our techniques, which provide orders-of-magnitude reductions in the overheads of distributed data querying and analysis in the Cloud.
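
The progressive-analytics idea behind NOW! can be illustrated with a toy sketch: an aggregate is evaluated over progressively larger samples, so approximate answers arrive long before the full data set has been scanned. The code below is a conceptual illustration in Python, not the NOW! implementation; all names and figures in it are assumptions.

```python
# Hedged sketch of progressive, sampling-based aggregation: early estimates of
# a mean are emitted after each progressive sample of the shuffled data.
import random

def progressive_mean(records, batch_fraction=0.1, seed=42):
    """Yield (fraction_seen, running_estimate) after each progressive sample."""
    rng = random.Random(seed)
    shuffled = records[:]
    rng.shuffle(shuffled)                      # uniform random order ~ sampling
    batch = max(1, int(len(shuffled) * batch_fraction))
    total, count = 0.0, 0
    for i in range(0, len(shuffled), batch):
        for value in shuffled[i:i + batch]:
            total += value
            count += 1
        yield count / len(shuffled), total / count

# Usage: early estimates of the mean converge toward the exact answer.
data = [random.gauss(100.0, 15.0) for _ in range(10_000)]
for seen, estimate in progressive_mean(data):
    print(f"{seen:5.0%} of data seen -> estimated mean {estimate:7.2f}")
```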