872 results for Methodology of the conceptual elaboration ferreiriana


Relevance: 100.00%

Publisher:

Abstract:

The development of toll roads in Indonesia started around 1978. Initially, the management and development of toll roads sat directly under the Government of Indonesia (GoI) and was undertaken through PT JasaMarga, a state-owned enterprise established specifically to provide toll roads. Because of the slow growth and the limited capacity of toll roads to meet infrastructure needs in the first ten years of operation (only 2.688 km/year), in 1989 GoI changed its strategy to one of private sector participation in road delivery through a Public Private Partnership (PPP) scheme. In this latter period, PT JasaMarga played two roles, acting as both regulator and operator. However, from 1989 to 2004 the growth rate of toll roads actually fell further, to 2.300 km/year. Facing this persistently low growth rate, in 2004 GoI changed the toll road management system: the regulatory role was returned to the Government through the establishment of the Toll Road Regulatory Agency (BPJT), and the institutional framework was amended to strengthen toll road management. Despite the introduction of this new institutional framework, the growth of toll roads still showed no significant change, and this problem in toll road development has generated an urgent need for research. The aim of the research is to understand the performance of the new institutional framework in enhancing PPP-procured toll road development. The research methodology was a questionnaire survey distributed to private sector respondents involved in toll road development. The results show that several problems are inherent in the institutional framework, but the most significant stems from uncertainty over the function of the strategic executive body in the land expropriation process.


This practice-based inquiry investigates the process of composing notated scores from improvised solos by saxophonists John Butcher and Anthony Braxton. To compose with these improvised sources, I developed a new method of analysis, and through this method new compositional techniques for applying the materials to a score. The method of analysis and composition draws on the conceptual language of Gilles Deleuze and Félix Guattari found in A Thousand Plateaus. Their terms, in particular assemblage, refrain and deterritorialization, are discussed in depth to give a context for their philosophical origins and to explain how the language is used in reference to improvised music and the compositional process. The project seeks to elucidate the conceptual language through the creative practice and, in turn, for the creative practice to clarify the use of the conceptual terminology. The research resulted in four notated works: first, Gravity, for soloist and ensemble, based on the improvisational language of John Butcher; and second, a series of three studies titled Transbraxton Studies, for solo instruments, based on the improvisational-compositional language of Anthony Braxton. The analysis method has applications in a number of musical contexts: composing with improvised music; studying style and authorship in solo improvisation; analysing group improvisation; analysing textural music, including electronic music; and analysing music from different cultures, particularly those in which improvisation and performative aspects of the music are significant to the overall meaning of the work. The compositional technique that was developed has further applications as an expressive method of composing with non-metered improvised materials, one that merges well with the transcription method developed for notating pitch and sounds on a timeline. It is hoped that this research will open further lines of enquiry into the application of Deleuze and Guattari's conceptual ideas to the analysis of more forms of music.


Innovation is understood as the combination of existing ideas, or the generation of new ideas, into new processes, products and services, and is widely viewed as the main driver of growth in contemporary economies. In the age of the knowledge economy, successful economic development is intimately linked to a country's capacity to generate, acquire, absorb, disseminate and apply innovation in advanced technology products and services. This development approach is labelled knowledge-based economic development and is closely associated with the capacity embodied in a country's national innovation ecosystem. The research reported in this paper critically reviews the Australian innovation ecosystem in order to provide a better understanding of the potential impacts of policy and support mechanisms on innovation and knowledge generation capacity, placing Australia's innovation system and national-level innovation support mechanisms under the microscope. The methodology of the study is twofold. First, it undertakes a critical review of the literature and government policy documents to better understand the country's innovation policy and support mechanisms. It then conducts a survey to capture Australian innovation companies' perceptions of the role and effectiveness of existing innovation incentive programs. The paper concludes with a discussion of the key insights and findings, and of potential policy and support directions for the country to achieve a flourishing knowledge economy.


It has recently been proposed that the broad spectrum of interannual variability in the tropics with a peak around four years results from an interaction between the linear low-frequency oscillatory mode of the coupled system and the nonlinear higher-frequency modes of the system. In this study we determine the Lyapunov exponents of the conceptual model consisting of a nonlinear low-order model coupled to a linear oscillator for various values of the coupling constants.
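The abstract does not specify the model equations, so the sketch below is purely illustrative: it assumes the Lorenz-63 system as the nonlinear low-order model and a damped harmonic oscillator as the linear mode, coupled through a constant `c` (both the equations and the coupling form are assumptions, not the paper's actual conceptual model), and estimates the largest Lyapunov exponent with Benettin's two-trajectory renormalization method.

```python
import numpy as np

def rhs(s, c):
    # Illustrative stand-in (not the paper's model): Lorenz-63 (x, y, z)
    # linearly coupled with strength c to a damped oscillator (u, v).
    x, y, z, u, v = s
    return np.array([
        10.0 * (y - x) + c * u,
        x * (28.0 - z) - y,
        x * y - (8.0 / 3.0) * z,
        v,
        -u - 0.1 * v + c * x,
    ])

def rk4_step(s, dt, c):
    # Classical fourth-order Runge-Kutta step.
    k1 = rhs(s, c)
    k2 = rhs(s + 0.5 * dt * k1, c)
    k3 = rhs(s + 0.5 * dt * k2, c)
    k4 = rhs(s + dt * k3, c)
    return s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def largest_lyapunov(c, dt=0.01, n_steps=20000, d0=1e-8):
    # Benettin's method: evolve a reference and a perturbed trajectory,
    # renormalizing their separation back to d0 after every step.
    s = np.array([1.0, 1.0, 1.0, 0.1, 0.0])
    for _ in range(1000):                  # discard the transient
        s = rk4_step(s, dt, c)
    p = s + d0 * np.ones(5) / np.sqrt(5.0)
    log_sum = 0.0
    for _ in range(n_steps):
        s = rk4_step(s, dt, c)
        p = rk4_step(p, dt, c)
        d = np.linalg.norm(p - s)
        log_sum += np.log(d / d0)
        p = s + (p - s) * (d0 / d)         # rescale the separation
    return log_sum / (n_steps * dt)

lam = largest_lyapunov(c=0.05)             # weak coupling
```

For weak coupling this yields a positive exponent near the uncoupled Lorenz value of roughly 0.9; sweeping `c` reproduces the kind of coupling-constant dependence the study examines.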


Background: Spirituality is fundamental to all human beings, existing within a person and developing until death. This research sought to operationalise spirituality in a sample of individuals with chronic illness. A review of the conceptual literature identified three dimensions of spirituality: connectedness, transcendence, and meaning in life. A review of the empirical literature identified one instrument that measures the three dimensions together; however, recent appraisals of this instrument highlighted problems with item formulation and limited evidence of reliability and validity. Aim: The aim of this research was to develop a theoretically grounded instrument to measure spirituality, the Spirituality Instrument-27 (SpI-27). A secondary aim was to evaluate this instrument psychometrically in a sample of individuals with chronic illness (n=249). Methods: A two-phase design was adopted. Phase one consisted of the development of the SpI-27, with items generated from a concept analysis, a literature review, and an instrument appraisal. The second phase established the psychometric properties of the instrument and included: a qualitative descriptive design to establish content validity; a pilot study to evaluate the mode of administration; and a descriptive correlational design to assess the instrument's reliability and validity. Data were analysed using SPSS (Version 18). Results: Exploratory factor analysis yielded a final five-factor solution with 27 items. The five factors were labelled: Connectedness with Others, Self-Transcendence, Self-Cognisance, Conservationism, and Connectedness with a Higher Power. Cronbach's alpha coefficients ranged from 0.823 to 0.911 for the five factors, with 0.904 for the overall scale, indicating high internal consistency. Paired-sample t-tests, intra-class correlations, and weighted kappa values supported the temporal stability of the instrument over two weeks. A significant positive correlation was found between the SpI-27 and the Spirituality Index of Well-Being, providing evidence of convergent validity. Conclusion: This research answers the call for a theoretically grounded instrument to measure spirituality.
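For reference, the internal-consistency statistic reported above has a compact closed form. The sketch below is illustrative only (the study itself used SPSS Version 18, and the example scores are hypothetical, not data from the study):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)         # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Hypothetical 5-respondent, 4-item example (strongly correlated items):
scores = np.array([[3, 4, 3, 5],
                   [2, 2, 3, 2],
                   [4, 5, 4, 4],
                   [1, 2, 1, 2],
                   [5, 5, 4, 5]])
alpha = cronbach_alpha(scores)                    # ≈ 0.954 for this sample
```

Values above roughly 0.8, as reported for the SpI-27 factors, are conventionally read as high internal consistency.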


Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)


The influence of temperature and reaction time on the sulfation of a dolomite is investigated in this paper. Sulfation effectiveness was evaluated and correlated with changes in the physical characteristics of a Brazilian dolomite during the reactive process. Calcination and sulfation experiments were performed under isothermal conditions on dolomite samples with an average particle size of 545 µm, at temperatures of 750 °C, 850 °C and 950 °C and for different sulfation times. Thermogravimetric tests were used to establish the variation of dolomite reactivity as a function of time during the sulfation reaction and to evaluate the sample preparation methodology. Porosimetry tests were performed to study the pore blockage of the dolomite during sulfation. The highest BET surface areas of the calcined samples were 25.55 m²/g, 29.55 m²/g and 12.62 m²/g, and after sulfation, conversions of 51.5%, 61.9% and 42.8% were obtained at 750 °C, 850 °C and 950 °C, respectively. Considering the process as a whole, the best fit was provided by a first-order exponential decay equation. Moreover, the results show that it is possible to quantify the decrease in dolomite reactivity for sulfur dioxide sorption and to understand changes in the sulfation behaviour of limestones in technologies, such as fluidized bed combustors, in which sulfur dioxide is present.
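A first-order exponential decay fit of the kind mentioned above can be sketched as follows. The data are synthetic and the parameter values are assumptions for illustration (the paper's measured conversions are not reproduced here); the model is y(t) = y0 + A·exp(−t/τ):

```python
import numpy as np
from scipy.optimize import curve_fit

def first_order_decay(t, y0, a, tau):
    # Reactivity decaying from (y0 + a) toward a residual level y0
    # with characteristic time tau.
    return y0 + a * np.exp(-t / tau)

# Synthetic sulfation data, illustrative only (not the paper's measurements).
t = np.linspace(0.0, 60.0, 13)                       # sulfation time, min
rng = np.random.default_rng(1)
y = first_order_decay(t, 0.2, 0.8, 15.0) + rng.normal(0.0, 0.01, t.size)

popt, pcov = curve_fit(first_order_decay, t, y, p0=(0.1, 1.0, 10.0))
y0_fit, a_fit, tau_fit = popt                        # recovered parameters
```

The fitted characteristic time τ quantifies how quickly the sorbent's reactivity decays, which is the quantity the thermogravimetric analysis tracks.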


A methodology is developed for the experimental simulation of the state of spent nuclear fuel on the sea floor following accidents or dumping. Data from long-term (more than 2000-day) experiments estimating the release rates of 85Kr and 137Cs from spent nuclear fuel (fragments of irradiated UO2 pellets) were obtained for the first time; these estimates confirm a hypothesis we put forward in the early 1990s that, if the integrity of the fuel containment is lost through corrosion on the sea floor, 85Kr is released earlier than other fission products, at a rate one order of magnitude higher than that of 137Cs. A method and technique were developed for onboard sampling and extraction of 85Kr and 137Cs (as well as tritium, a product of ternary 235U fission) and for their radiometric analysis at coastal laboratories. First data on the 85Kr background in the bottom layers of the Barents and Kara Seas, and on 137Cs and 3H in these seas (as of 2003), are presented. Models are developed for estimating the dilution of the fission products of spent nuclear fuel and their transport along the sea floor in accident and dumping regions. An experimental method is proposed for monitoring the state of spent nuclear fuel on the sea floor (one expedition every 2-3 years) through the release of 85Kr into the environment as a leak tracer; this release indicates the destruction of the fuel containment and the release of spent fuel products as 235UO2 corrodes in sea water.


This paper presents the results of the latest steps in the development of the tidal energy converter GESMEY: the design, construction and sea trials of a 1/10-scale prototype, together with the construction, at the same scale, of the BOSCEM buoy, which anchors the device and keeps it in the correct working position and depth through the two flow directions of the daily tidal cycle. The paper describes the objectives and methodology of the experimental trials carried out last summer with the scale prototype. GESMEY is a new type of tidal energy converter (TEC) with the capability to exploit currents in waters more than forty metres deep by itself; using only its internal ballast system, it achieves the equilibrium between hydrostatic and hydrodynamic forces needed to carry out the emersion and immersion procedures without any external assistance. Finally, the paper describes the performance of the device during the immersion, emersion and floating transport manoeuvres, and then presents the results of the power generation tests that were carried out.


This thesis addresses correction methods that compensate for varying lighting conditions in colour image and video applications. Such variations frequently cause the failure of computer vision algorithms that use colour features to describe objects. Three research questions define the framework of the thesis. The first question addresses the similarities in photometric behaviour between images of adjacent surfaces. Based on an analysis of the image formation model in dynamic situations, the thesis proposes a model, called the Quotient Relational Model of Regions, that predicts the colour variations of one image region from the variations of the surrounding regions. The model is valid when the light sources illuminate all the surfaces included in it; when these surfaces are close to each other and have similar orientations; and when they are primarily Lambertian. Under these circumstances, the photometric response of a region can be related to the others by a linear combination. No previous work proposing such a relational model was found in the scientific literature. 
The second question goes a step further and asks whether these similarities can be used to correct unknown photometric variations in an unknown region from known adjacent regions. A method called Linear Correction Mapping is proposed that provides an affirmative answer under the circumstances characterised above. A prior training stage is required to estimate the parameters of the model. The method, initially formulated for a single camera, is extended to multi-camera architectures with non-overlapping fields of view; to this end, only a few image samples of the same object captured by all the cameras are needed. The method accounts for both illumination variations and changes in the cameras' exposure settings. Every image correction method fails when the image of the object to be corrected is overexposed or its signal-to-noise ratio is very low. The third question therefore asks whether the acquisition process can be controlled to obtain an optimal exposure under uncontrolled lighting conditions. A Camera Exposure Control method is proposed that maintains a suitable exposure provided the illumination variations can be captured within the dynamic range of the camera. 
Each proposed method was evaluated individually. The experimental methodology consisted of first selecting scenarios covering representative situations in which the methods are theoretically valid. Linear Correction Mapping was validated in three object re-identification applications (vehicles, faces and persons) that use colour distributions as features, while Camera Exposure Control was tested in an outdoor car park. In addition, several performance indicators were defined to compare the proposed methods objectively against relevant state-of-the-art correction and auto-exposure methods. The evaluation showed that the proposed methods outperform the compared ones in most situations. Based on these results, the answers to the research questions are affirmative in limited circumstances: the hypotheses concerning prediction, the correction based on it, and auto-exposure are feasible in the situations identified throughout the thesis, but cannot be guaranteed in general. Furthermore, the work raises new questions and scientific challenges, which are highlighted as future research.
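The core idea of a trained linear colour correction can be sketched with a toy example. This simplified 3×3 least-squares formulation, and all of the data in it, are assumptions for illustration, not the thesis' exact Linear Correction Mapping:

```python
import numpy as np

def fit_linear_map(X_cur, X_ref):
    # Training stage: solve X_cur @ A ≈ X_ref in the least-squares sense,
    # where each row is the mean RGB of a known adjacent region observed
    # under the current and the reference illumination.
    A, *_ = np.linalg.lstsq(X_cur, X_ref, rcond=None)
    return A

# Toy data: 20 region colours under a reference illumination.
rng = np.random.default_rng(0)
X_ref = rng.uniform(0.2, 0.8, size=(20, 3))
M_true = np.diag([0.7, 0.9, 1.2])      # hypothetical illumination change
X_cur = X_ref @ M_true                 # the same regions, new illumination

A = fit_linear_map(X_cur, X_ref)       # learned correction map
recovered = X_cur @ A                  # corrected colours
```

With a purely linear illumination change the learned map recovers the reference colours exactly; in practice, noise, overexposure and non-Lambertian effects limit the accuracy, which is what the thesis' evaluation quantifies.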


Cuando se trata de Rem Koolhaas, su espejo no refleja una sola imagen sino múltiples, es un prisma poliédrico. Su espejo nos devuelve el Rem mediático, el intelectual, el conceptualizador, el constructor, el analista, el periodista, el actor... En el caso de esta investigación, fijamos el punto de mira en el Rem COMUNICADOR. “Rem a los dos lados del espejo” se enmarca en una investigación sobre los medios de comunicación de arquitectura, su reflejo en la producción arquitectónica y viceversa. Se trata de llegar a discernir si comunicación y producción arquitectónica colisionan y confluyen en el caso de grandes comunicadores como Rem Koolhaas, si el mensaje y el medio transmisor adquieren las mismas cualidades. Centrándose en la figura de Rem Koolhaas, la tesis aborda la evolución de su faceta comunicativa y las transformaciones sucesivas en el campo de la comunicación arquitectónica, en paralelo a su evolución conceptual a lo largo de su trayectoria. La investigación, por tanto, no se centra tanto en su componente teórica o en la práctica arquitectónica de OMA, sino en la exposición de su producción al mundo, especialmente a través de sus ensayos y libros. “Delirious New York” y “SMLXL” son un reflejo del momento conceptual en que se inscriben, y contienen mucha información sobre los referentes gráficos que irremediablemente han influido en su composición. Especialmente, la aparición de “SMLXL” supuso un revulsivo para el mundo de la comunicación arquitectónica, porque puso el foco sobre la importancia de dejar atrás un discurso narrativo linea y unifocal, para afrontar la comunicación barajando múltiples variables, y aproximaciones, en un proceso similar al desarrollo de un proyecto de arquitectura. Presenta un diseño muy novedoso y una edición extremadamente cuidada, que atiende a parámetros mucho más ambiciosos que los meramente narrativos. 
Profundiza en la necesidad de una temática global, planteando cuál es la aproximación más apropiada para cada uno de los proyectos que describe, transmitiendo al lector una percepción más allá de lo estrictamente visual, más próximo a lo sensorial. Además, su enorme repercusión a nivel internacional y el gran interés que despertó (no solamente entre los arquitectos, sino también entre diseñadores gráficos, publicistas, personas provenientes de todo tipo de tendencias artísticas y público en general), provocó la globalización del fenómeno de las publicaciones arquitectónicas y puso de manifiesto la importancia de la comunicación como una disciplina en sí misma, dentro de la producción arquitectónica en la era actual. A pesar de la importancia de “SMLXL” a todos los niveles, la presente tesis plantea que, donde realmente se culmina esa experiencia comunicativa, es en “Content”, al incluir nuevos parámetros relacionados con la fusión conceptual de continente y contenido. Es en esta publicación donde el objeto de la comunicación y la expresión de la misma se convierten en un único elemento, que se rige por leyes similares. En este caso, la ley fundamental es la aplicación hasta sus máximas consecuencias de la “cultura de la congestión”, tanto en el mensaje como en el medio, generando lo que hemos convenido en denominar “comunicación congestiva”. Esta concepción deviene en que necesariamente se materialice como un producto efímero, desechable, casi virtual, porque responde a las condiciones de un momento muy concreto y específico y fuera de ese contexto pierde su significación.. La “cultura de la congestión” empieza a surgir en los planteamientos de Koolhaas en la Architectural Association School of Architecture de Londres, bajo la tutela de Elia Zenghelis. 
Posteriormente se desarrolla en su manifiesto retroactivo sobre Manhattan, “Delirious New York”, donde declara la guerra abierta al urbanismo del movimiento moderno y afirma que la ciudad realmente contemporánea es aquella que es fruto de un desarrollo no planificado, hiperdensa y posible gracias a los avances tecnológicos de su era. Finalmente comienza a materializarse en la Diploma Unit 9 de la AA, donde entra como profesor en 1975, dejando una huella indeleble en las generaciones posteriores de arquitectos que pasaron dicha unidad. Rem Koolhaas es ante todo un intelectual y por ello, todo el constructo teórico sobre la metrópolis comienza a reflejarse en su obra a través de OMA desde el comienzo de su producción. Podemos decir a grandes rasgos que su carrera está marcada por dos hitos históricos fundamentales que determinan tres etapas diferenciadas en su producción. En sus primeros años de profesión, Koolhaas sigue fascinado por la metrópolis urbana y la aplicación del método paranoico crítico a su producción arquitectónica. Es un arquitecto profundamente surrealista. Entiende este método como una estrategia de conocimiento y aproximación al mundo que le rodea: “dejar salir el inconsciente pero sostenerlo con las muletas de la racionalidad”. Pero lo que en realidad le interesa es su aplicación a la gran escala, el “Bigness”, y por ello, participa en proyectos muy ambiciosos de los que surgen conceptos que, más allá de resultar premiados o no, han dejado una huella ideológica en el devenir de la arquitectura. Entre estos proyectos, cabe destacar su propuesta para el Parque de la Villette o la Très Grande Bibliotèque de París. Sus proyectos de esta época destilan una gran carga conceptual, que devienen en unos interiores sorprendentes pero una apariencia exterior sobria o incluso podríamos decir "povera", por el uso de materiales efímeros, poco habituales en la macro-arquitectura hasta ese momento. 
Súbitamente, en 1997, explotó el denominado “Efecto Bilbao”, de la mano de Frank Gehry (1). El Museo Guggenheim de Bilbao, con su espectacularidad, sus formas pregnantes e imposibles, impacta al mundo. Nace la era de la “Arquitectura del Espectáculo”; la transformación de la ciudad a través de ICONOS que actúen como nodos de atracción y concentración en torno a los cuales supuestamente se revitaliza la actividad económica, cultural y sociopolítica de la ciudad, como si a través de un único gesto se pudieran regenerar todos los tejidos internos de la urbe. Rem Koolhaas comprende rápidamente que la aproximación a la ciudad ha cambiado y, sobre todo, el mercado. En el mundo de la globalización, la única manera de llegar a materializar el “Bigness”, es encerrando sus ejercicios intelectuales en formas pregnantes, bellas, icónicas, espectaculares. Koolhaas encuentra su marca personal en la estética “Stealth”, proveniente de los aviones de combate facetados para evitar los radares, elaborados en los años 80. De esta época surgen proyectos como la Casa da Música de Oporto o la Biblioteca de Seattle; ambos edificios son iconos facetados, de belleza pregnante, que dejan una huella indeleble en la ciudad y provocan, al igual que el Guggenheim, un cierto efecto de recuperación y revitalización en el entorno en que se asientan, al menos de manera temporal. En cualquier caso, Koolhaas nunca abandona los ejercicios meramente teóricos, pero segrega su actividad en dos: OMA produce aquello que tiene vocación de ser construido y se rige por los parámetros del mercado global y AMO, la otra cara del espejo de Rem, aplica el pensamiento arquitectónico a campos no explorados, sin la dependencia de agentes externos, pudiendo permitirse ser un laboratorio puramente experimental. 
En este escenario, llega el 11 de septiembre de 2001 y el ataque a las Torres Gemelas de Nueva York tiene efectos devastadores a todos los niveles, significando, en un período de tiempo sorprendentemente corto, un cambio en el orden mundial. Rem Koolhaas da entonces un giro de 180 grados, dirige su mirada hacia China, donde entiende que sus aportaciones tienen un beneficio social más directo que en occidente. (2) Para presentar al mundo su nuevo cambio de rumbo y la creación del “Think Tank” AMO, plantea una gran exposición en la NeueGallerie de Berlín bajo el título de “Content”, experiencia paralela a la edición del libro con el mismo título, que inicialmente nace como “catálogo de la exposición, pero que internamente siempre se concibió como el documento más trascendente en el estudio desde “SMLXL”. Sin embargo, en muchos aspectos se trata de su opuesto: una publicación con formato revista, de tapa blanda, con paginado muy fino, formato de "folleto de supermercado" y contenido hiperdenso. Es un experimento efímero, fugaz, ligero, barato, de “usar y tirar”. De hecho, está fuera de stock, ya no se edita. Probablemente Rem Koolhaas desaprobaría que se hiciera una investigación que pusiera el foco sobre el mismo, porque diez años después de su publicación seguramente opine que su vigencia ha caducado. Sin embargo, muestra con una claridad meridiana el estado conceptual y vital de OMA en el momento de su publicación y representa, además un verdadero hito en la comunicación arquitectónica, un punto de no retorno, el máximo exponente de lo que hemos denominado “comunicación congestiva”. La presente tesis plantea que “Content” contiene la esencia de la mayor aportación de Rem Koolhaas al mundo de la arquitectura: la transformación profunda y definitiva de la comunicación arquitectónica mediante la convergencia del estado conceptual y la transmisión del mismo. Su legado arquitectónico y conceptual ha marcado a todas las generaciones posteriores de manera indeleble. 
Sus ensayos, sus teorías, sus proyectos y sus edificaciones ya pertenecen a la historia de la arquitectura, sin ninguna duda. Pero es su revisión del concepto de la comunicación en arquitectura lo que ha tenido y tendrá un reflejo inmediato en las generaciones futuras, no solamente en la comunicación sino en su arquitectura, a través de un intercambio biyectivo. El planteamiento a futuro sería determinar qué sucede tras “Content”, tras la hiperdensidad máxima, tras la cultura de la congestión visual; qué es lo que propone Koolhaas y qué se va a plantear también en el mundo de la comunicación arquitectónica. Para ello, estudiaremos en profundidad sus últimos proyectos relacionados con la comunicación, como su propuesta para la Biennale de Arquitectura de Venecia de 2014, su intensa investigación sobre el “Metabolismo” en “Project Japan: Metabolism Talks...”, o la dirección de sus últimos planteamientos territoriales. En los últimos tiempos Rem Koolhaas habla de “Preservación”, de “Sobriedad”, de “Esencialismo”, de “Performance”... El autor intelectual de la cultura de la congestión habla ahora de la “low density”...como no podía ser de otra manera en la otra cara del espejo. En definitiva, el color blanco como suma de todos los colores, todas las longitudes de onda del espectro visible recibidas al tiempo. ABSTRACT When talking about Rem Koolhaas, the mirror does not only reflect one but numerous images: it is nothing but a polyhedral prism. His mirror gives us the image of Rem the media celebrity, the intellectual, the conceptualizer, the builder, the analyst, the journalist, the actor... This research sets the spotlight on Rem the COMMUNICATOR. "Rem on both sides of the mirror" belongs to a research on architectural media, its influence on the architectural production and vice versa. 
It is aimed at discerning whether communication and architectural production collide and converge in the case of great communicators such as Rem Koolhaas, and whether the message and the transmission media acquire the same features. Focusing on the figure of Rem Koolhaas, this thesis addresses the evolution of his communicative facet and the successive transformations in the field of architectural communication, parallel to the conceptual evolution he underwent throughout his career. This research is therefore focused not so much on his theoretical component or on OMA's architectural practice as on the exhibition of his production to the world, especially through his essays and books. "Delirious New York" and "SMLXL" hold up a mirror to the conceptual moment they are part of, and contain a great deal of information about the graphic references that have inevitably influenced his work. In particular, the launch of "SMLXL" was a salutary shock for the world of architectural communication, since it set the spotlight on the importance of leaving behind a linear, unifocal narrative in order to approach communication through multiple variables and perspectives, based on a process similar to the development of an architectural project. It offers a very innovative design and extremely careful editing, governed by parameters far more ambitious than the merely narrative. It explores the need for a global subject and suggests the most appropriate approach for each of the projects described, bringing the reader closer to a sensory experience that goes beyond the strictly visual.
In addition, its huge international impact and the great interest shown not only by architects but also by graphic designers, publishers, people from all kinds of artistic trends and the general public led to the globalization of the architectural-publication phenomenon and brought to light the importance of communication as a discipline in itself within the architectural production of the age. Despite the importance of "SMLXL" at all levels, this thesis suggests that the communication experience really culminates in "Content", for it includes new conceptual parameters associated with the container-content fusion. It is in this book that the purpose of communication and its expression become a single element, ruled by similar laws. In this particular case, the fundamental law is to take the "culture of congestion" to its extreme consequences in both the message and the media, leading to what we have agreed to call "congestive communication". This concept leads to its inevitable materialization in an ephemeral, disposable, almost virtual product, because it meets the conditions of a very concrete and specific time, and outside that context it loses its significance. The "culture of congestion" emerged in Koolhaas' approaches under the guidance of Elia Zenghelis at the Architectural Association School of Architecture in London. His retroactive manifesto on Manhattan, "Delirious New York", subsequently developed it, waging an all-out war against modern-movement urbanism and maintaining that the truly contemporary cities are the hyperdense ones that rise as a result of unplanned development and thanks to the technological advances typical of their time. Finally, it began to materialize in Diploma Unit 9 of the AA, where he started teaching in 1975, leaving an indelible mark on the subsequent generations of architects who passed through that unit.
First and foremost, Rem Koolhaas is an intellectual, and therefore all his theoretical construct on the metropolis began to be reflected in his work through OMA from the beginnings of his production. Broadly speaking, we can say that his career is shaped by two essential historic events, which mark out three different stages in his production. In the early years of his career, Koolhaas was still fascinated by the urban metropolis and the application of the paranoiac-critical method to his architectural production. He was then a deeply surrealist architect. He understood this method as a knowledge strategy and an approach to the world around him: "let the subconscious out but hold it with the crutches of reasonableness". However, he was actually interested in its application on a broad scale, the "Bigness", and he therefore took part in ambitious projects that led to an accumulation of concepts which, beyond winning awards, left an ideological imprint on the evolution of architecture. These projects included his proposals for the Parc de la Villette and the Très Grande Bibliothèque in Paris. The projects he carried out during this period showed a great conceptual background, which evolved into surprising interiors behind a sober, even "povera", exterior appearance, thanks to the use of ephemeral materials that had been atypical in the macro-architecture field until that moment. Suddenly, in 1997, the so-called "Bilbao effect" boomed thanks to Frank Gehry (1). The Guggenheim Museum Bilbao amazed the world with its spectacular nature and its striking, seemingly impossible shapes. It was the beginning of the era of "the architecture of spectacle": the transformation of the city through ICONS that would act as nodes of attraction and gathering, around which the economic, cultural and socio-political activity of the city was supposed to be revitalized, as if through a single gesture all the internal tissues of the city could be rebuilt.
Rem Koolhaas quickly realized that the approach to the city, and especially to the global market, had changed. In the world of globalization, the only way to materialize such "Bigness" was to cast his intellectual exercises in striking, beautiful, iconic and spectacular shapes. Koolhaas found his personal brand in the Stealth aesthetic, derived from the American combat aircraft of the eighties, whose faceted shapes were designed to evade radar. Projects such as the Casa da Música in Porto and the Seattle Library date from this period; both buildings are faceted icons of striking beauty that left an indelible mark on their cities and caused, like the Guggenheim, some degree of recovery and revitalization in the environment in which they were based, at least temporarily. In any case, Koolhaas never gave up the purely theoretical exercises; rather, he split his work in two: OMA produced what was destined to be built and was ruled by the parameters of the global market, while AMO, Rem's other side of the mirror, applied architectural thought in unexplored fields, free from external agents and able to work as a purely experimental laboratory. Against this backdrop came September 11th 2001, and the attacks on the Twin Towers in New York had devastating effects at all levels, leading, in a surprisingly short period of time, to a change in the world order. Rem Koolhaas made a 180° turn, directing his vision towards China, where he believed his contributions would have a more direct social benefit than in the Western world. (2) In order to introduce his new course of direction and the creation of the AMO "Think Tank", he planned a major exhibition at the Neue Nationalgalerie in Berlin under the title "Content", in parallel with the edition of the book of the same title, which was at first the "exhibition catalog" but, deep down, was always conceived as the most important document of the office since "SMLXL".
However, in many ways it was just the opposite: a publication characterized by its magazine format, soft cover, very thin pages, "supermarket brochure" form and hyperdense content. It was an ephemeral, brief, light, cheap and "disposable" experiment. In fact, it is currently out of stock and out of print. Rem Koolhaas would probably disapprove of a piece of research that sets the spotlight on it, for he would probably say that its validity has expired, given that ten years have passed since its publication. However, it shows OMA's conceptual and vital status at the time of its publication with crystalline clarity, and it is also a true milestone in architectural communication. A point of no return. The epitome of the so-called "congestive communication". This thesis suggests that "Content" contains the essence of Rem Koolhaas' greatest contribution to the world of architecture: the deep and definitive transformation of architectural communication through the convergence of the conceptual state and its transmission. His architectural and conceptual legacy has left an indelible mark on all subsequent generations. There is no doubt that his essays, theories, projects and buildings already belong to the history of architecture. But it is his review of the concept of communication in architecture that has had, and shall have, an immediate influence on future generations, not only in their communication but also in their architecture, through a bijective exchange. Future approaches should try to determine what happens after "Content", after the maximum hyperdensity, after the visual culture of congestion; what Koolhaas shall suggest, and what shall happen in the world of architectural communication.
To this end, we shall study in depth his latest communication-related projects, such as his design of the 2014 Venice Architecture Biennale, his intensive research on "Metabolism" in "Project Japan: Metabolism Talks...", and the course of his latest territorial approaches. Most recently, Rem Koolhaas has talked about "Preservation", "Sobriety", "Essentialism", "Performance", etc. The mastermind of the culture of congestion now speaks of "low density"... as it could not be otherwise, on the other side of the mirror. In short, the white color as the sum of all colors: all wavelengths of the visible spectrum received at the same time.

Relevância:

100.00%

Publicador:

Resumo:

Island County is located in the Puget Sound of Washington State and includes several islands, the largest of which is Whidbey Island. Central Whidbey Island was chosen as the project site, as residents use groundwater for their water supply and seawater intrusion near the coast is known to contaminate this resource. In 1989, Island County adopted a Saltwater Intrusion Policy and used chloride concentrations in existing wells to define and map “risk zones.” In 2005, this method of defining vulnerability was updated by using water level elevations in conjunction with chloride concentrations. The result of this work was a revised map of seawater intrusion vulnerability that is currently in use by Island County. This groundwater management strategy is defined as trigger-level management and is largely a reactive tool. In order to evaluate trends in the hydrogeologic processes at the site, including seawater intrusion under sea level rise scenarios, this report presents a workflow in which groundwater flow and discharge to the sea are quantified using a revised conceptual site model. The revised conceptual site model used several simplifying assumptions that allow first-order quantitative predictions of seawater intrusion using analytical methods. Data from water well reports included lithologic and well construction information, static water levels, and aquifer tests for specific capacity. Results from specific capacity tests define the relationship between discharge and drawdown and were input to a modified Theis equation to solve for transmissivity (Arihood, 2009). Components of the conceptual site model were created in ArcGIS and included interpolation of water level elevations, delineation of groundwater basins, and calculation of net recharge and groundwater discharge for each basin. The revised conceptual site model was then used to form hypotheses about hydrogeologic processes based on observed trends in groundwater flow.
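Solving for transmissivity from a specific capacity test is an iterative problem, since transmissivity appears on both sides of the governing relation. As a minimal sketch of the idea (not the study's actual implementation), the Cooper-Jacob approximation of the Theis equation can be iterated to a fixed point; all parameter names and values below are illustrative assumptions:

```python
import math

def transmissivity_from_specific_capacity(Q, s, t, r, S, tol=1e-6, max_iter=100):
    """Estimate transmissivity T from a specific-capacity test by
    fixed-point iteration on the Cooper-Jacob approximation of the
    Theis equation:

        T = Q / (4*pi*s) * ln(2.25 * T * t / (r**2 * S))

    Q : pumping rate [m^3/d]
    s : drawdown at the end of the test [m]
    t : test duration [d]
    r : well radius [m]
    S : storativity [-]
    """
    T = Q / s  # initial guess from the specific capacity itself
    for _ in range(max_iter):
        T_new = Q / (4.0 * math.pi * s) * math.log(2.25 * T * t / (r**2 * S))
        if abs(T_new - T) < tol:
            return T_new
        T = T_new
    return T

# Illustrative test: 500 m^3/d pumped for 1 day with 5 m of drawdown
T = transmissivity_from_specific_capacity(Q=500.0, s=5.0, t=1.0, r=0.1, S=1e-4)
```

In practice the iteration converges in a handful of steps because the logarithm is insensitive to moderate changes in T; well-loss and partial-penetration corrections (part of the "modified" treatment cited above) are omitted here for brevity.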
Three hypotheses were used to explain a reduction in aquifer thickness and hydraulic gradient: (1) a large increase in transmissivity occurs near the coast; (2) the reduced aquifer thickness and hydraulic gradient are the result of seawater intrusion; (3) the data used to create the conceptual site model were insufficient to resolve trends in groundwater flow. For Hypothesis 2, analytical solutions for groundwater flow under Dupuit assumptions were applied in order to evaluate seawater intrusion under projected sea level rise scenarios. Results indicated that a rise in sea level has little impact on the position of the saltwater wedge; a reduction in recharge, however, has significant consequences. Future work should evaluate groundwater flow using an expanded monitoring well network, and aquifer recharge should be promoted by reducing surface water runoff.
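The asymmetry reported above — sea level rise barely moving the wedge while reduced recharge moves it substantially — can be illustrated with a textbook sharp-interface (Ghyben-Herzberg) solution under Dupuit assumptions. This is a simplified stand-in for the report's analytical methods, and every parameter value is an illustrative assumption:

```python
def wedge_toe_distance(K, b, q, delta=0.025):
    """Inland distance of the saltwater wedge toe for a coastal aquifer
    under Dupuit assumptions with a sharp freshwater-saltwater interface.
    From z**2 = 2*q*x / (delta*K) (interface depth z at distance x),
    the toe sits where z equals the aquifer thickness b:

        x_toe = K * delta * b**2 / (2 * q)

    K     : hydraulic conductivity [m/d]
    b     : aquifer thickness below sea level [m]
    q     : fresh groundwater discharge to the sea per unit coastline [m^2/d]
    delta : relative density contrast (rho_s - rho_f) / rho_f, ~0.025
    """
    return K * delta * b**2 / (2.0 * q)

# Illustrative comparison of the two stresses:
base = wedge_toe_distance(K=10.0, b=30.0, q=0.5)     # 225.0 m inland
slr = wedge_toe_distance(K=10.0, b=30.5, q=0.5)      # +0.5 m sea level rise
half_q = wedge_toe_distance(K=10.0, b=30.0, q=0.25)  # recharge (discharge) halved
```

With these numbers, half a metre of sea level rise moves the toe by only a few percent, whereas halving the discharge doubles the intrusion distance — the same qualitative conclusion the report reaches.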

Relevância:

100.00%

Publicador:

Resumo:

The aim of the research is to develop an e-business selection framework for small and medium enterprises (SMEs) by integrating established planning techniques. The research is case based, comprising four case studies carried out in the printing industry for the purpose of evaluating the framework. Two of the companies are from Singapore, while the other two are from Guangzhou, China and Jinan, China respectively. To determine the need for an e-business selection framework for SMEs, extensive literature reviews were carried out in the areas of e-business, business planning frameworks, SMEs and the printing industry. An e-business selection framework is then proposed by integrating three established techniques: the Balanced Scorecard (BSC), Value Chain Analysis (VCA) and Quality Function Deployment (QFD). The newly developed selection framework was pilot-tested on a published case study before the actual evaluation was carried out in the four case study companies. The case study methodology was chosen because of its ability to integrate the diverse data collection techniques required to generate the BSC, VCA and QFD for the selection framework. The findings of the case studies revealed that the three techniques of BSC, VCA and QFD can be integrated seamlessly, complementing each other's strengths in e-business planning. The eight-step methodology of the selection framework can provide SMEs with a step-by-step approach to e-business through structured planning. The project has also provided better understanding of, and deeper insights into, SMEs in the printing industry.