324 results for Blueprint
Abstract:
This end-of-degree project comprises the design, study and implementation of a software application intended to simulate the various aspects and performance of a radar system. The blueprint (prototype) of this simulation environment has been built in the Matlab programming language, initially considered the most suitable for processing the signals that radar systems handle in their calculations, although it has proven less adequate for the other core processes the environment must provide to its users. The software developed by SAP for managing the E.R.P. (Enterprise Resource Planning) systems of large companies has been chosen as the design model, because its design and functionality are particularly well suited to handling a large quantity of very diverse data in an orderly, integrated fashion while protecting data integrity. Designing and implementing the complete environment is a task of enormous complexity that will require the effort of a substantial number of people; this project has therefore been limited to a basic prototype with a minimal set of features, while also indicating and preparing the path that future additions of functionality and improvements should follow.
Functionally, that is, independently of the specific implementation with which the simulation environment is built, the features and services offered by the system have been divided into blocks. Each block groups the components related to one specific aspect of the simulation; for example, block 1 is assigned to everything related to the target to be detected. The user interacts with the system by executing so-called transactions: logical groupings of related data to be entered into or queried from the system, which can be executed independently. One example is the transaction that maintains a target trajectory together with its parameters, but a transaction can also be something unrelated to the radar itself, such as the application that manages the users with access to the environment. Transactions are thus the minimum unit through which the user interacts with the system. The graphical interface offered to the user is based on modes, which can be regarded as mutually independent "windows" within which the user executes transactions. The user can work with as many modes in parallel as desired and switch between them at will.
The software has been written following the object-oriented paradigm, and an effort has been made to maximize code reuse as well as the configurability of its functionality, i.e. the same code performing different tasks depending on configuration data. An important feature incorporated to guarantee data integrity is a syntactic data dictionary. To provide data persistence between user sessions, a virtual database has been implemented (expected to be replaced by a real one in the future) that supports tables, key fields, etc. It stores all the data of the environment: the configuration data, which would be the responsibility of administrators and developers, as well as the master and transactional data managed by the end users of the simulation environment.
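The abstract describes this architecture only in prose. As a purely illustrative sketch, assuming nothing about the actual Matlab code, the fragment below mimics the concepts mentioned: a virtual table with key fields, a crude syntactic check, and a "transaction" that maintains a target trajectory. All class, table and field names are hypothetical.

```python
# Hypothetical sketch of the architecture described above; none of these names
# come from the project itself.

class VirtualTable:
    """In-memory stand-in for a database table with key fields."""
    def __init__(self, name, key_fields, fields):
        self.name = name
        self.key_fields = key_fields
        self.fields = key_fields + fields
        self.rows = {}                      # key tuple -> record dict

    def insert(self, record):
        missing = [f for f in self.fields if f not in record]
        if missing:                         # crude stand-in for the syntactic dictionary
            raise ValueError(f"missing fields: {missing}")
        key = tuple(record[k] for k in self.key_fields)
        self.rows[key] = record

    def get(self, *key):
        return self.rows.get(key)


class MaintainTrajectoryTransaction:
    """One 'transaction': maintain a target trajectory and its parameters."""
    def __init__(self, db):
        self.table = db["TARGET_TRAJECTORY"]

    def execute(self, target_id, waypoints, speed_mps):
        self.table.insert({"target_id": target_id,
                           "waypoints": waypoints,
                           "speed_mps": speed_mps})


# Usage: each "mode" (window) would run transactions like this independently.
db = {"TARGET_TRAJECTORY": VirtualTable("TARGET_TRAJECTORY",
                                        key_fields=["target_id"],
                                        fields=["waypoints", "speed_mps"])}
MaintainTrajectoryTransaction(db).execute("T-001", [(0, 0), (10, 5)], 240.0)
print(db["TARGET_TRAJECTORY"].get("T-001"))
```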
Abstract:
Conservation tillage and crop rotation have spread during the last decades because they promote several positive effects (increases in soil organic content, reduction of soil erosion, and enhancement of carbon sequestration) (Six et al., 2004). However, these benefits could be partly counterbalanced by negative effects on the release of nitrous oxide (N2O) (Linn and Doran, 1984). There is a lack of data from long-term tillage system studies, particularly in Mediterranean agro-ecosystems. The aim of this study was to evaluate the effects of long-term (>17 years) tillage systems (no tillage (NT), minimum tillage (MT) and conventional tillage (CT)) and of crop rotation (wheat (W)-vetch (V)-barley (B)) versus wheat monoculture (M) on N2O emissions. Additionally, yield-scaled N2O emissions (YSNE) and N uptake efficiency (NUpE) were assessed for each treatment.
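The abstract names YSNE and NUpE without defining them. The sketch below assumes the common formulations of these indicators (cumulative N2O-N per unit of grain yield, and crop N uptake per unit of N applied), which are not necessarily those used in the study; all figures are invented for illustration.

```python
# Hedged sketch, assuming common formulations of the two indicators named above.

def ysne(cumulative_n2o_n_kg_ha, grain_yield_mg_ha):
    """Yield-scaled N2O emissions, in g N2O-N per Mg of grain."""
    return cumulative_n2o_n_kg_ha * 1000.0 / grain_yield_mg_ha

def nupe(crop_n_uptake_kg_ha, n_applied_kg_ha):
    """N uptake efficiency, as the fraction of applied N recovered by the crop."""
    return crop_n_uptake_kg_ha / n_applied_kg_ha

print(ysne(1.2, 4.5))     # e.g. 1.2 kg N2O-N/ha and 4.5 Mg grain/ha -> ~267 g/Mg
print(nupe(95.0, 120.0))  # e.g. 95 kg N/ha taken up of 120 kg N/ha applied -> ~0.79
```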
Abstract:
Nitrous oxide (N2O) is the main greenhouse gas (GHG) produced by agricultural soils through microbial processes. The application of N fertilizers is associated with an increase in N2O losses. However, it is possible to mitigate these emissions by introducing adequate management practices (Snyder et al., 2009). Soil conservation practices (i.e. no tillage, NT) have recently become widespread because they promote several positive effects (increases in soil organic carbon and soil fertility, reduction of soil erosion, etc.). In terms of GHG emissions, there is no consensus in the literature on the effects of tillage on N2O. Several studies found that NT can produce greater (Baggs et al., 2003), lower (Malhi et al., 2006) or similar (Grandey et al., 2006) N2O emissions compared to traditional tillage (TT). This large uncertainty is associated with the duration of tillage practices and with climatic variability. Liming is widely used to solve problems of soil acidity (Al toxicity, yield penalties, etc.). Several studies show a decrease in N2O emissions with liming (Barton et al., 2013), whereas no significant effects or increases were observed in others (Galbally et al., 2010). The aim of this work was to evaluate the effects of tillage (NT vs TT) and liming (application or not of a Ca-amendment) on N2O emissions from an acid soil during a rainfed crop.
Abstract:
The DNDC (DeNitrification and DeComposition) model was first developed by Li et al. (1992) as a rain-event-driven, process-oriented simulation model for nitrous oxide, carbon dioxide and nitrogen gas emissions from agricultural soils in the U.S. Over the last 20 years, the model has been modified and adapted by various research groups around the world to suit specific purposes and circumstances. The Global Research Alliance Modelling Platform (GRAMP) is a UK-led initiative for the establishment of a purposeful and credible web-based platform initially aimed at users of the DNDC model. With the aim of improving predictions of soil C and N cycling in the context of climate change, the objectives of GRAMP are to: 1) document the existing versions of the DNDC model; 2) create a family tree of the individual DNDC versions; 3) provide information on model use and development; and 4) identify strengths, weaknesses and potential improvements for the model.
Abstract:
The micrometeorological mass-balance integrated horizontal flux (IHF) technique has been commonly employed for measuring ammonia (NH3) emissions in field experiments. However, inverse-dispersion modeling techniques, such as the backward Lagrangian stochastic (bLS) modeling approach, are currently highlighted as offering flexibility in plot design and requiring a minimum number of samplers (Ro et al., 2013). The objective of this study was to compare the bLS technique with the IHF technique for estimating NH3 emissions from flexible bag storage and the subsequent land spreading of dairy cattle slurry. Moreover, considering that NH3 emission from storage could have been non-uniform, the effect on bLS estimates of using a single point versus multiple downwind concentration measurements was tested, as proposed by Sanz et al. (2010).
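The abstract gives no computation details; for context only, the sketch below shows the textbook form of the mass-balance IHF calculation (the vertically integrated horizontal flux of NH3 above background at a downwind mast, divided by the fetch length). It is not the study's implementation, and all measurement values are invented for illustration.

```python
import numpy as np

def ihf_flux(heights_m, wind_ms, conc_ug_m3, background_ug_m3, fetch_m):
    """Return the NH3 emission flux in ug m-2 s-1 (textbook mass-balance IHF form)."""
    horizontal_flux = wind_ms * (conc_ug_m3 - background_ug_m3)   # per measurement height
    return np.trapz(horizontal_flux, heights_m) / fetch_m

z = np.array([0.25, 0.5, 1.0, 2.0, 3.0])          # measurement heights (m)
u = np.array([1.8, 2.2, 2.7, 3.1, 3.4])           # mean wind speed (m s-1)
c = np.array([120.0, 80.0, 45.0, 20.0, 8.0])      # NH3 concentration (ug m-3)
cb = np.full_like(c, 3.0)                          # background concentration (ug m-3)
print(ihf_flux(z, u, c, cb, fetch_m=20.0))
```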
Abstract:
Accumulation of large volumes of dilute slurries is considered one of the major problems related to intensive farming (Sommer et al., 2004). In the EU-27, more than half of the total N excretion is applied to croplands due to technical advantages for farmers (e.g. reuse of nutrients). However, the N use efficiency of slurries produced by livestock is low, i.e. only 20-52% of the excreted N is recovered by crops. Much of the remainder can be lost into the atmosphere as ammonia (NH3), nitrous oxide (N2O), dinitrogen (N2) and nitrogen oxides (NOx).
Abstract:
Adjusting N fertilizer application to crop requirements is a key issue for improving fertilizer efficiency, reducing unnecessary input costs for farmers and lessening the environmental impact of N. Among the many soil and crop tests developed, optical sensors that detect crop N nutritional status may have a large potential to adjust N fertilizer recommendations (Samborski et al. 2009). Optical readings are rapid to take and non-destructive, and they can be efficiently processed and combined to obtain indices or indicators of crop status. However, other physiological stress conditions may interfere with the readings, and identifying the best indicators of crop nutritional status is not always an easy task. Comparison of different equipment and technologies may help to identify strengths and weaknesses of optical sensors for N fertilizer recommendation. The aim of this study was to evaluate the potential of various ground-level optical sensors and of narrow-band indices obtained from airborne hyperspectral images as tools for maize N fertilizer recommendations. Specific objectives were i) to determine which indices could detect differences between maize plants treated with different N fertilizer rates, and ii) to evaluate their ability to distinguish N-responsive from non-responsive sites.
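The abstract does not list the specific indices evaluated. As an illustration only, the sketch below computes NDVI, a widely used index that can be derived both from ground-level sensors and from two narrow bands of a hyperspectral image; the reflectance values are hypothetical and not data from the study.

```python
# Illustrative only: the study's own indices are not specified in the abstract.

def ndvi(nir_reflectance, red_reflectance):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir_reflectance - red_reflectance) / (nir_reflectance + red_reflectance)

# e.g. reflectance near 800 nm (NIR) and 670 nm (red) for two maize canopies
print(ndvi(0.52, 0.06))   # dense, N-sufficient canopy -> higher NDVI
print(ndvi(0.38, 0.10))   # N-deficient canopy -> lower NDVI
```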
Abstract:
The CENTURY soil organic matter model was adapted to the DSSAT (Decision Support System for Agrotechnology Transfer) modular format in order to better simulate the dynamics of soil organic nutrient processes (Gijsman et al., 2002). The CENTURY model divides soil organic carbon (SOC) into three hypothetical pools: microbial or active material (SOC1), intermediate material (SOC2) and largely inert, stable material (SOC3) (Jones et al., 2003). At the beginning of the simulation, the CENTURY model needs a value of SOC3 per soil layer, which can be estimated by the model (based on soil texture and management history) or given as an input. The model then assigns about 5% and 95% of the remaining SOC to SOC1 and SOC2, respectively. The model's performance when simulating SOC and nitrogen (N) dynamics strongly depends on this initialization process. The common methods to initialize the SOC pools (e.g. Basso et al., 2011) deal mostly with carbon (C) mineralization processes and less with N. The dynamics of soil organic matter, SOC and soil organic N are linked in the CENTURY-DSSAT model through the C/N ratio of the decomposing material, which determines either mineralization or immobilization of N (Gijsman et al., 2002). The aim of this study was to evaluate an alternative method to initialize the SOC pools in the DSSAT-CENTURY model from apparent soil N mineralization (Napmin) field measurements by using automatic inverse calibration (simulated annealing). The results were compared with those obtained by the iterative initialization procedure developed by Basso et al. (2011).
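A minimal sketch of the pool split described above is shown below, using the figures quoted in the abstract (SOC3 supplied per layer, then roughly 5% and 95% of the remaining SOC assigned to SOC1 and SOC2). The inverse calibration is only indicated in a comment, and the function name and input numbers are illustrative.

```python
# Sketch of the DSSAT-CENTURY pool split quoted in the abstract. The inverse
# calibration mentioned there would, in essence, search for the SOC3 value that
# minimizes the mismatch between simulated and measured Napmin; that search is
# not implemented here. Figures are illustrative.

def initialize_soc_pools(total_soc_kg_ha, soc3_kg_ha):
    """Partition a layer's total SOC into the SOC1, SOC2 and SOC3 pools."""
    remaining = total_soc_kg_ha - soc3_kg_ha
    soc1 = 0.05 * remaining      # microbial / active pool
    soc2 = 0.95 * remaining      # intermediate pool
    return soc1, soc2, soc3_kg_ha

print(initialize_soc_pools(total_soc_kg_ha=42000.0, soc3_kg_ha=25000.0))
```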
Abstract:
In order to establish rational nitrogen (N) application and reduce groundwater contamination, a clearer understanding of N distribution through the growing season and of the N balance is crucial. Excessive doses of N and/or water applied to fertigated crops involve a substantial risk of aquifer contamination by nitrate, but knowledge of N cycling and availability within the soil can help to avoid this excess. In central Spain, the main fertigated horticultural crop is the 'piel de sapo' melon, and it is cultivated in zones vulnerable to nitrate pollution (Directive 91/676/EEC). However, until a few years ago there was no previous work on optimizing nitrogen fertilization together with irrigation. The water and N footprints are indicators that allow the impact generated by different agricultural practices to be assessed, so they can be used to improve management strategies in fertigated cropping systems. The water footprint distinguishes between blue water (sources of water applied to the crop, such as irrigation and precipitation) and green water (water used by the crop and stored in the soil); it is furthermore possible to quantify the impact of pollution by calculating the grey water, defined as the volume of polluted water created by the growing and production of crops. Likewise, the N footprint considers green N (nitrogen consumed by the crop and stored in the soil) and blue N (N available for the crop, such as N applied with mineral and/or organic fertilizers, N applied with irrigation water and N mineralized during the crop period), whereas grey N is the amount of N-NO3- washed from the soil to the aquifer. All these components are expressed as the ratio between each component of the water or N footprint and the yield (m3 t-1 or kg N t-1, respectively). The objective of this work was to evaluate the impact derived from different fertilizer practices in a melon crop using the water and N footprints.
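As a minimal sketch of the accounting described above (each footprint component divided by yield), the example below uses invented per-hectare figures; it is not data from the study.

```python
# Hedged sketch of the footprint-per-yield ratios described in the abstract.
# All input figures are illustrative.

def footprint_per_yield(components, yield_t_ha):
    """Express each footprint component per tonne of product."""
    return {name: value / yield_t_ha for name, value in components.items()}

water_m3_ha = {"blue": 3200.0, "green": 450.0, "grey": 600.0}    # m3/ha
nitrogen_kg_ha = {"blue": 140.0, "green": 60.0, "grey": 25.0}    # kg N/ha
melon_yield_t_ha = 38.0

print(footprint_per_yield(water_m3_ha, melon_yield_t_ha))        # m3 t-1
print(footprint_per_yield(nitrogen_kg_ha, melon_yield_t_ha))     # kg N t-1
```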
Abstract:
The existing residential building stock in Spain and Europe faces deep renovation if it is to meet the targets set in the European strategy for 2050, which, for the building sector, call for a 90% reduction in greenhouse gas (GHG) emissions with respect to 1990 levels. This long-term plan sets intermediate milestones, with partial targets for 2020 and 2030. The ultimate aim is to exploit the energy-demand reduction potential of the building sector, of which residential buildings represent an 85% share in Spain. Within these requirements for reducing building energy demand, ventilation in residential buildings becomes one of the challenges to solve, because it is directly linked to the health and comfort of the occupants and, at the same time, is proportional to the building's energy demand for thermal conditioning. A large share of the thermal losses of residential buildings is caused by air renewal and by infiltration through the envelope. The European Energy Performance of Buildings Directive (EPBD), which establishes the guidelines needed to reach the sector's targets for CO2 and GHG emissions, treats ventilation with clean air as a fundamental requirement for new construction and for the energy retrofit of existing buildings. Sick Building Syndrome, a set of complaints and symptoms associated with poor air quality in non-residential buildings that emerged after the 1973 oil crisis, originated in deficient ventilation and insufficient renewal of indoor air, a consequence of attempts to cut the energy bill. Given that, on average, we spend 58% of our time in our dwellings, it is essential to look after indoor air quality and not to worsen it by applying "energy efficiency" measures with unintended effects. Achieving this requires an in-depth understanding of how ventilation takes place in Spanish apartment blocks, both in terms of indoor air quality and of the energy demand associated with ventilation. The objective of this thesis is to establish a methodology for characterizing and optimizing the ventilation needs of existing residential spaces in Spain that combines the twofold goal of guaranteeing environmental quality and reducing energy demand.
The characterization of the Spanish residential building stock with respect to ventilation is conclusive: more than 80% of all dwellings were built in three main periods. These are the period before the basic building regulations (Normas Básicas de la Edificación, NBE), from 1960 to 1980; the period from 1980 to 2005, with the largest total number of dwellings built, guided by NTE ISV 75; and the period of buildings constructed after the Spanish Building Code (Código Técnico de la Edificación) came into force in 2006, whose basic document on health conditions (DB HS3) is the first mandatory standard for the design and sizing of residential ventilation in Spain. Selecting a reference blueprint of a housing block, representative of average conditions within these periods but with characteristics that extend beyond any single one of them, makes it possible to carry out an intensive comparative analysis of indoor air quality and the associated energy demand, applying the different ventilation configurations found in dwellings depending on the construction (or regulatory) period in which they were built. This analysis relies on a twofold approach: numerical modelling through simulation, and the analysis of experimental data collected in dwellings under real conditions, used to check and refine the models and to observe the actual situation of the dwellings in both respects.
Based on the conclusions of this analysis, a ventilation optimization strategy is defined, built essentially on two measures: 1) the introduction of a mechanical exhaust ventilation system with heat recovery, which reduces the energy demand due to air renewal while diluting indoor pollutants more effectively, thereby improving indoor environmental quality; and 2) the rationalization of the operating schedule of these systems, so that energy is not wasted during unoccupied periods, relying during those periods on a light background ventilation due to infiltration that does not cause significant energy losses. In addition to applying the previous analysis methodology for energy demand and air quality, this optimization is assessed with an integrated, comparative economic evaluation based on the cost-optimal methodology of Commission Delegated Regulation (EU) No 244/2012.
The main results of this thesis are: • A diagnosis of the indoor air quality of residential buildings in Spain and of its associated energy demand, essential for achieving deep energy retrofits that safeguard indoor air quality. • An indicator of the direct relationship between air quality and energy demand, used to evaluate the adequacy of ventilation systems with respect to the new energy-efficiency and ventilation regulations. • An optimization strategy that offers an intervention alternative, together with the application of a valuation method (return on investment, ROI) that allows the comparative payback of installing the systems to be evaluated.
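The abstract gives no equations. As a purely illustrative sketch of the energy logic behind the two optimization measures (heat recovery and schedule rationalization), the example below uses the standard sensible ventilation heat-loss expression Q = ρ·cp·q·ΔT·(1−η); all airflow rates, temperatures, hours and efficiencies are invented and are not results of the thesis.

```python
# Hedged sketch of ventilation heat demand with and without heat recovery and
# schedule rationalization. All figures are illustrative.

RHO_AIR = 1.2          # kg/m3
CP_AIR = 1005.0        # J/(kg K)

def ventilation_load_kwh(airflow_m3_h, delta_t_k, hours, recovery_eff=0.0):
    """Sensible heat demand due to air renewal over a period, in kWh."""
    airflow_m3_s = airflow_m3_h / 3600.0
    power_w = RHO_AIR * CP_AIR * airflow_m3_s * delta_t_k * (1.0 - recovery_eff)
    return power_w * hours / 1000.0

# Baseline: constant extract ventilation without heat recovery, all month long.
print(ventilation_load_kwh(120.0, 15.0, hours=24 * 30))

# Optimized: heat recovery during occupied hours only, light infiltration otherwise.
print(ventilation_load_kwh(120.0, 15.0, hours=14 * 30, recovery_eff=0.75)
      + ventilation_load_kwh(25.0, 15.0, hours=10 * 30))
```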
Abstract:
The rise and growth of large Jewish law firms in New York City during the second half of the twentieth century was nothing short of an astounding success story. As late as 1950, there was not a single large Jewish law firm in town. By the mid-1960s, six of the largest twenty law firms were Jewish, and by 1980, four of the largest ten prestigious law firms were Jewish firms. Moreover, the accomplishment of the Jewish firms is especially striking because, while the traditional large White Anglo-Saxon Protestant law firms grew at a fast rate during this period, the Jewish firms grew twice as fast, and they did so in spite of experiencing explicit discrimination. What happened? This book chapter is a revised, updated study of the rise and growth of large New York City Jewish law firms. It is based on the public record, with respect to both the law firms themselves and trends in the legal profession generally, and on over twenty in-depth interviews with lawyers who either founded and practiced at these successful Jewish firms, attempted and failed to establish such firms, or were in a position to join these firms but decided instead to join WASP firms. According to the informants interviewed in this chapter, while Jewish law firms benefited from general decline in anti-Semitism and increased demand for corporate legal services, a unique combination of factors explains the incredible rise of the Jewish firms. First, white-shoe ethos caused large WASP firms to stay out of undignified practice areas and effectively created pockets of Jewish practice areas, where the Jewish firms encountered little competition for their services. Second, hiring and promotion discriminatory practices by the large WASP firms helped create a large pool of talented Jewish lawyers from which the Jewish firms could easily recruit. Finally, the Jewish firms benefited from a flip side of bias phenomenon, that is, they benefited from the positive consequences of stereotyping. Paradoxically, the very success of the Jewish firms is reflected in their demise by the early twenty-first century: because systematic large law firm ethno-religious discrimination against Jewish lawyers has become a thing of the past, the very reason for the existence of Jewish law firms has been nullified. As other minority groups, however, continue to struggle for equality within the senior ranks of Big Law, can the experience of the Jewish firms serve as a “separate-but-equal” blueprint for overcoming contemporary forms of discrimination for women, racial, and other minority attorneys? Perhaps not. As this chapter establishes, the success of large Jewish law firms was the result of unique conditions and circumstances between 1945 and 1980, which are unlikely to be replicated. For example, large law firms have become hyper-competitive and are not likely to allow any newcomers the benefit of protected pockets of practice. While smaller “separate-but-equal” specialized firms, for instance, ones exclusively hiring lawyer-mothers occasionally appear, the rise of large “separate-but-equal” firms is improbable.
Abstract:
Sodium phosphates are a class of chemicals that have been widely employed in commercial and consumer applications. However, declining use of these chemicals due to environmental concerns has led to restructuring within the industry that has caused, and is likely to continue to cause, reductions in sodium phosphate production capacity. Closure of a sodium phosphate manufacturing plant necessitates decommissioning and decontamination activities that are subject to a variety of federal, state, and local regulations. A compliance plan was developed to provide a blueprint for ensuring that all federal regulatory requirements are met; site-dependent state and local requirements, however, were excluded. The compliance plan provides a framework that addresses project team formation and project planning, regulatory requirements, identification of affected processing equipment, plant pre-shutdown activities, waste stream identification and waste management facilities, safety, training, and emergency preparedness planning, and project decommissioning remedial actions. This regulatory compliance plan will enable sodium phosphate plant operators to complete decontamination and decommissioning work in a timely, efficient, compliant, and cost-effective manner.
Abstract:
When the act of 'drawing' became what can only be called formalised (a development that can be said to have blossomed during the Renaissance), a separation developed between the drawing and its procurement. Recently, David Ross Scheer, in his book ‘The Death of Drawing: Architecture in the Age of Simulation’, wrote: ‘…whereas architectural drawings exist to represent construction, architectural simulations exist to anticipate building performance.’ Meanwhile, Paolo Belardi, in his work ‘Why Architects Still Draw’, likens a drawing to an acorn: ‘It is the paradox of the acorn: a project emerges from a drawing – even from a sketch, rough and inchoate – just as an oak tree emerges from an acorn.’ He tells us that Giorgio Vasari would work late at night ‘seeking to solve the problems of perspective’, and he makes a passionate plea that this reflective process allows the concept to evolve, grow and/or develop. However, without belittling Belardi, the virtual model now needs this self-same treatment, in which it is nurtured, coaxed and encouraged to be the inchoate blueprint of the resultant oak tree. The model too can now embrace the creative process, going through a first phase of preparation in which it focuses on the problem; the manipulation of the available material can then be incubated so that it is reasoned about and generates feedback. This paper sets out to chart this shift in perception and methodologies and to assess whether the 2D paper abstraction still has a purpose and role in today's digital world.
Abstract:
Second copy of the previous blueprint. Also includes a 1934 letter from Edward Dana of the Boston Elevated Railway Public Trustees to Samuel Eliot Morison, Editor of The Tercentennial History of Harvard University, responding to Morison's request for additional information about the discovery of early Harvard building foundations during the subway construction excavation in Harvard Square.
Abstract:
This layer is a georeferenced raster image of the historic paper map entitled: City of Los Angeles, [by] Homer Hamlin, city engineer. Blueprint published in 1908. Scale [ca. 1:19,000]. The image inside the map neatline is georeferenced to the surface of the earth and fit to the California State Plane Zone V Coordinate System NAD83 (in Feet) (Fipszone 0405). All map collar and inset information is also available as part of the raster image, including any inset maps, profiles, statistical tables, directories, text, illustrations, index maps, legends, or other information associated with the principal map. This map shows features such as roads, drainage, parks, cemeteries, selected public buildings, and more. This layer is part of a selection of digitally scanned and georeferenced historic maps from The Harvard Map Collection as part of the Imaging the Urban Environment project. Maps selected for this project represent major urban areas and cities of the world, at various time periods. These maps typically portray both natural and manmade features at a large scale. The selection represents a range of regions, originators, ground condition dates, scales, and purposes.