965 results for Traditional enrichment method
Abstract:
Static distortions are a serious problem affecting the magnetotelluric method, and many attempts have been made to eliminate or minimize their effects. This work deals with a new technique for addressing the problem, the EMAP method, an adaptation of the magnetotelluric method in which the electric field measurements are made along a continuous line of dipoles connected to one another, and the data collected in this way are processed with a frequency-dependent spatial low-pass filter. The work consists of two main parts: the numerical simulation of data contaminated with static distortions, and the filtering of those data with the spatial low-pass filter. In the first part, we apply the finite element method to simulate the response of the electric dipoles that measure the electric field. In the second part, we apply the Hanning window as a low-pass filter suitable for processing the data.
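A minimal sketch in Python of the frequency-dependent spatial low-pass filtering step, for illustration only: the window-width rule (one skin depth computed from an assumed background resistivity) and all names are assumptions, not the thesis's EMAP implementation.

```python
import numpy as np

def hanning_lowpass(rho_app, dipole_len, freq, background_rho=100.0):
    """Smooth apparent resistivities measured along a continuous dipole
    line with a Hanning window whose width tracks the skin depth, so the
    spatial filter becomes frequency-dependent (wider at low frequency).
    """
    skin_depth = 503.0 * np.sqrt(background_rho / freq)    # metres
    n = max(3, int(np.ceil(skin_depth / dipole_len)) | 1)  # odd length >= 3
    w = np.hanning(n)
    w /= w.sum()                  # unit gain: preserves the regional level
    return np.convolve(rho_app, w, mode="same")
```

Applied per frequency to the line of connected dipoles, the window averages out the short-wavelength static distortions while leaving the longer-wavelength geological response intact.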
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Pós-graduação em Odontologia Restauradora - ICT
Abstract:
Dissatisfaction with certain aspects of traditional teaching has always existed, and new teaching methods have been routinely studied. The experimental investigative activity is one such alternative practice. In this type of activity, experimentation is introduced with an investigative approach, in which the student must build the concept through proposals that represent solutions to the problems raised. In the teaching of chemistry specifically, the need for and importance of experimentation are evident: beyond motivating students, it aids the understanding of chemical concepts by relating them to reality. Recognizing the contributions of this methodology to teaching and learning, this research was conducted to understand the difficulties teachers face in planning and implementing these activities in the teaching of chemistry, and hence the reasons for the dominance of the traditional teaching method. The subjects were undergraduate chemistry students who developed and implemented differentiated learning activities, and the teachers who accompanied the high school students taking part in the university extension project "Inclusion Science and University students and teachers from public schools: Teaching and Learning Chemistry with a focus on research and practice". The data made it possible to identify some factors that affect and hinder the implementation of experimental activities in general, not only investigative ones. Despite the difficulties they experienced, most of the undergraduates considered the activity an interesting and innovative alternative teaching method, able to produce interest, motivation and participation among students, with subsequent learning. Likewise, the teachers, for all the difficulties they declared facing when applying experimental activities, acknowledged the pedagogical... (Complete abstract: click electronic access below)
Abstract:
Pós-graduação em Agronomia (Genética e Melhoramento de Plantas) - FCAV
Abstract:
A new type of pavement has been gaining popularity over the last few years in Europe. It comprises a surface course of semi-flexible material that offers significant advantages over both concrete and conventional asphalt, combining rut resistance with a degree of flexibility. It also provides good protection against the ingress of water to the foundation, since its surface is impermeable. The semi-flexible material, generally known as grouted macadam, comprises an open-graded asphalt skeleton with 25% to 35% voids into which a cementitious slurry is grouted. This hybrid mixture provides good rut resistance and a surface highly resistant to fuel and oil spillage. Such properties allow it to be used in industrial areas, airports and harbours, where heavy, slow traffic is common. Grouted macadams constitute a poorly understood branch of pavement technology and have generally been relegated to a role in certain specialist pavements whose performance is predicated on purely empirical evidence. The main objectives of this project were therefore to understand the properties of this type of material better, in order to predict its performance more realistically and to design pavements incorporating grouted macadam more accurately. Based on a standard mix design, several variables were studied during this project in order to characterise the behaviour of grouted macadams in general, and the influence of those variables on the fundamental properties of the final mixture. In this research project, one approach was used for the design of pavements incorporating grouted macadams: a traditional design method, based on laboratory determination of the stiffness modulus and the compressive strength.
Abstract:
This thesis describes the development of an electrode modified with an insulating polymer for the indirect determination of the OH radical. The polymers tested were polyphenol, polypyrrole and overoxidized polypyrrole; the first proved to give the best performance. The modifying film was deposited by electropolymerization of phenol in acidic medium onto a glassy carbon (GC) electrode, and proved to be insulating and perfectly adherent to the GC, blocking charge transfer to the most common redox probes. Attack by OH radicals, generated by the Fenton reaction or by photolysis of H2O2, partially removes the polymer from the GC, partially restoring the conductive behaviour of the electrode. The extent of degradation of the polyphenol film was evaluated from the current of the Ru(NH3)6^(3+) redox probe, which constitutes the analytical signal for the determination of the OH radical. The electrode was used to assess the performance of photocatalysts based on TiO2 nanoparticles, giving results that correlated with those obtained by an HPLC method. It was also used to develop a new procedure for determining OH-radical scavenging capacity, which was applied to the analysis of pure compounds and real samples. The results were comparable with those obtained by the standardized methods commonly employed for determining antioxidant capacity. In addition, a study was carried out on the modification of a platinum electrode with a cobalt- and aluminium-based layered double hydroxide (LDH). In particular, the effects of different pretreatments of the Pt on the characteristics and electrocatalytic performance of the LDH film towards the oxidation of aniline, phenol and salicylic acid were evaluated. These compounds can be used as probe molecules for the determination of the OH radical and are of electroanalytical interest because they readily passivate the Pt surface.
Abstract:
As lightweight and slender structural elements are used more frequently in design, large-scale structures become more flexible and susceptible to excessive vibration. To ensure the functionality of the structure, the dynamic properties of the occupied structure need to be estimated during the design phase. The traditional analysis method models occupants simply as additional mass; however, research has shown that human occupants are better modeled as an additional degree of freedom. In the United Kingdom, active and passive crowd models have been proposed by the Joint Working Group (JWG) as a result of a series of analytical and experimental studies. The crowd models are expected to yield a more accurate estimate of the dynamic response of the occupied structure. However, experimental testing recently conducted through a graduate student project at Bucknell University indicated that the proposed passive crowd model might not accurately represent the occupants' effect on the structure. The objective of this study is to assess the validity of the crowd models proposed by the JWG by comparing the dynamic properties obtained from experimental testing data with analytical modeling results. The experimental data used in this study were collected by Firman in 2010. The analytical results were obtained by performing a time-history analysis on a finite element model of the occupied structure. The crowd models were created from the JWG recommendations combined with the physical properties of the occupants during the experimental study. SAP2000 was used to create the finite element models and run the analyses; Matlab and ME'scope were used to obtain the dynamic properties of the structure by processing the time-history results from SAP2000. The results indicate that the active crowd model can quite accurately represent the effect of occupants standing with bent knees, while the passive crowd model could not properly simulate the dynamic response of the structure when occupants were standing straight or sitting on it. Future work involves improving the passive crowd model and evaluating the crowd models with full-scale structure models and operating data.
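A minimal sketch in Python of the additional degree-of-freedom idea: the occupied structure is modeled as the empty structure (one modal DOF) plus the crowd as a second mass-spring-damper, and the coupled frequencies and damping ratios are read off the state-space eigenvalues. All numerical values are illustrative assumptions, not JWG parameters or properties of the Bucknell test structure.

```python
import numpy as np

# Empty structure as a single modal DOF (illustrative values).
m_s, k_s, zeta_s = 6000.0, 2.4e7, 0.01
# Crowd as an attached mass-spring-damper DOF (assumed values).
m_c, f_c, zeta_c = 800.0, 5.0, 0.30
k_c = m_c * (2 * np.pi * f_c) ** 2
c_s = 2 * zeta_s * np.sqrt(k_s * m_s)
c_c = 2 * zeta_c * np.sqrt(k_c * m_c)

M = np.diag([m_s, m_c])
K = np.array([[k_s + k_c, -k_c], [-k_c, k_c]])
C = np.array([[c_s + c_c, -c_c], [-c_c, c_c]])

# State-space eigenvalues of the coupled human-structure system.
A = np.block([[np.zeros((2, 2)), np.eye(2)],
              [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])
lam = np.linalg.eigvals(A)
lam = lam[np.imag(lam) > 0]          # keep one of each conjugate pair
for l in sorted(lam, key=abs):
    f = abs(l) / (2 * np.pi)         # modal frequency, Hz
    z = -l.real / abs(l)             # modal damping ratio
    print(f"mode: f = {f:5.2f} Hz, damping = {100 * z:4.1f} %")
```

Swapping the crowd parameters between "active" and "passive" values, or collapsing the crowd DOF to a rigidly attached mass, reproduces in miniature the comparison the study makes between the JWG crowd models and the traditional added-mass treatment.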
DIMENSION REDUCTION FOR POWER SYSTEM MODELING USING PCA METHODS CONSIDERING INCOMPLETE DATA READINGS
Abstract:
Principal Component Analysis (PCA) is a popular dimension reduction method used in many fields, including data compression, image processing and exploratory data analysis. However, the traditional PCA method has notable drawbacks: it is not efficient for high-dimensional data, and it cannot compute sufficiently accurate principal components when a relatively large portion of the data is missing. In this report, we propose using the EM-PCA method for dimension reduction of power system measurements with missing data, and we provide a comparative study of the traditional PCA and EM-PCA methods. Our extensive experimental results show that EM-PCA is more effective and more accurate than traditional PCA for dimension reduction of power system measurement data when a large portion of the data set is missing.
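A minimal sketch in Python of the EM-PCA idea, assuming missing readings are marked as NaN; this is the generic alternating impute-and-refit scheme, not the report's exact algorithm.

```python
import numpy as np

def em_pca(X, n_components, n_iter=200, tol=1e-8):
    """EM-style PCA for a data matrix with missing entries (NaN).

    M-step: refit the principal directions on the filled matrix via SVD.
    E-step: re-impute only the missing entries from the rank-k
    reconstruction. Repeat until the imputations stop changing.
    """
    X = np.asarray(X, dtype=float)
    miss = np.isnan(X)
    Xf = np.where(miss, np.nanmean(X, axis=0), X)  # start from column means
    for _ in range(n_iter):
        mu = Xf.mean(axis=0)
        _, _, Vt = np.linalg.svd(Xf - mu, full_matrices=False)
        W = Vt[:n_components]                      # principal directions
        Xhat = ((Xf - mu) @ W.T) @ W + mu          # rank-k reconstruction
        delta = np.max(np.abs(Xhat[miss] - Xf[miss])) if miss.any() else 0.0
        Xf[miss] = Xhat[miss]                      # impute missing entries
        if delta < tol:
            break
    return W, Xf
```

Because the components and the imputations improve together, the scheme degrades far more gracefully than plain PCA on mean-filled data when a large fraction of the measurements is missing.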
Abstract:
Introduction: Could livestream video, an emerging audiovisual medium, be used more effectively than the traditional demonstration method to teach or model for dental hygiene students how to provide chairside patient education? [See PDF for complete abstract]
Abstract:
The purpose of this study was to conduct a descriptive, exploratory analysis of the utilization of both traditional healing methods and western biomedical approaches to health care among members of the Vietnamese community in Houston, Texas. The first goal of the study was to identify the type(s) of health care that the Vietnamese use. The second goal was to highlight the numerous factors that may influence why certain health care choices are made. The third goal was to examine the issue of preference, to determine which practices would be used if limiting factors did not exist. There were 81 participants, male and female, all 18 years or older. The core groups of participants were Vietnamese students from the University of Houston-Downtown and volunteer staff members from VN TeamWork. The snowball method was used to recruit additional participants, asking the students and staff members to recommend others for the study. Surveys and informed consents were in English and Vietnamese. Participants were given the choice of taking the survey face-to-face or on their own. Surveys consisted of structured questions with predetermined choices, as well as open-ended questions to allow more detailed information. The quantitative and qualitative data were coded and entered into a database using SPSS software version 15.0. Results indicated that participants used both traditional (38.3%) and biomedical (59.3%) healing, with 44.4% stating that the treatment depended on the illness. Coining was the most used traditional healing method, clearly still used by all ages. Coining was also the method most used when fear and delayed western medical treatment were involved. Insurance status, more than household income, guided health care choices. A person's age, number of years spent in the United States, age at migration, and use of certain traditional healing methods like coining all played a role in the importance of the health care practitioner speaking Vietnamese. The most important finding was that 64.2% of participants preferred both traditional and western medicine, because both methods work.
Abstract:
Traditional comparison of standardized mortality ratios (SMRs) can be misleading if the age-specific mortality ratios are not homogeneous. For this reason, a regression model has been developed which expresses the mortality ratio as a function of age. The model is then applied to mortality data from an occupational cohort study. The nature of the occupational data necessitates the investigation of mortality ratios that increase with age; these data are used primarily to illustrate and develop the statistical methodology. The age-specific mortality ratio (MR) for the covariates of interest can be written as $\mathrm{MR}_{ij\ldots m} = \mu_{ij\ldots m}/\theta_{ij\ldots m} = r\,\exp(Z'_{ij\ldots m}\beta)$, where $\mu_{ij\ldots m}$ and $\theta_{ij\ldots m}$ denote the force of mortality in the study and chosen standard populations in the $ij\ldots m$-th stratum, respectively, $r$ is the intercept, $Z_{ij\ldots m}$ is the vector of covariables associated with the $i$-th age interval, and $\beta$ is a vector of regression coefficients associated with these covariables. A Newton-Raphson iterative procedure has been used to determine the maximum likelihood estimates of the regression coefficients. This model provides a statistical method for a logical and easily interpretable description of an occupational cohort's mortality experience, and since it gives a reasonable fit to the mortality data, the model can be considered fairly realistic. The traditional statistical method for analysing occupational cohort mortality data is to present a summary index such as the SMR under the assumption of constant (homogeneous) age-specific mortality ratios. Since the mortality ratios of occupational groups usually increase with age, the homogeneity assumption is often untenable. The traditional method of comparing SMRs under the homogeneity assumption is a special case of this model, without age as a covariate. The model also provides a statistical technique for evaluating the relative risk between two SMRs, or a dose-response relationship among several SMRs, and has applications in the medical, demographic and epidemiologic areas. The methods developed in this thesis are suitable for future analyses of mortality or morbidity data when the age-specific mortality/morbidity experience is a function of age, or when an interaction effect between confounding variables needs to be evaluated.
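A minimal sketch in Python of the Newton-Raphson fit for this mortality-ratio model, assuming (as is standard for SMR data, though not stated above) that observed deaths are Poisson with mean MR_i * E_i, where E_i is the expected count from the standard population; the example data are hypothetical.

```python
import numpy as np

def fit_mr_model(deaths, expected, Z, n_iter=50, tol=1e-10):
    """Maximum likelihood for MR_i = r * exp(z_i' beta) by Newton-Raphson,
    treating observed deaths as Poisson with mean MR_i * E_i (a Poisson
    regression with offset log E_i; the intercept estimates log r).
    """
    X = np.column_stack([np.ones(len(deaths)), Z])
    theta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = expected * np.exp(X @ theta)     # fitted deaths
        score = X.T @ (deaths - mu)           # gradient of the log-likelihood
        info = X.T @ (mu[:, None] * X)        # Fisher information
        step = np.linalg.solve(info, score)
        theta += step
        if np.max(np.abs(step)) < tol:
            break
    return np.exp(theta[0]), theta[1:]        # r and beta

# Hypothetical strata: mid-age of each band, expected and observed deaths.
age = np.array([30.0, 40.0, 50.0, 60.0, 70.0])
expected = np.array([4.1, 6.3, 9.8, 14.2, 18.5])
deaths = np.array([5, 9, 16, 27, 41])
z = (age - age.mean())[:, None]               # centred for numerical stability
r, beta = fit_mr_model(deaths, expected, z)
print(f"r (MR at mean age) = {r:.2f}, beta per year of age = {beta[0]:.4f}")
```

A positive age coefficient corresponds to the mortality ratios increasing with age described above; setting beta to zero recovers the homogeneous-MR special case, whose estimate of r is the traditional SMR.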
Abstract:
Standardization of test methods and laboratory accreditation are now common practice, since they allow the procedures carried out by professionals in a technological sector to be evaluated and ensure a minimum level of quality in the final results. For an acoustics laboratory to obtain and maintain accreditation, it must participate actively in intercomparison exercises, which are used to assure the quality of the methods employed. The drawback of these exercises is their high cost, which is sometimes unaffordable for laboratories, forcing them to give up accreditation. This Final Degree Project focuses on the development of a Virtual Laboratory, implemented as a software tool, for carrying out remote (non-attendance) intercomparison exercises, thereby extending the e-comparison concept and laying the groundwork for this type of remote exercise eventually to replace those currently carried out in person. The report first gives a short introduction presenting the evolution and present-day importance of acoustic quality procedures. It then discusses the international standard on which the project rests, ISO 145-5, as well as the mathematical methods used in its implementation: the statistical methods of uncertainty propagation specified by the JCGM (Joint Committee for Guides in Metrology). Next, it describes the structure of the project, covering both the type of programming used in its development and the calculation methodology adopted so that all the functionality required in this type of trial is correctly implemented. A statistical validation is then carried out, based on comparing data generated by the program, processed using Monte Carlo simulation, against analytical calculations, to verify that the program behaves as predicted in the theoretical study. The program is also tested in the way a laboratory technician would use it, evaluating the measurement uncertainty with the traditional calculation method and comparing the results obtained with those that should be obtained. Finally, the conclusions drawn from the development and testing of the Virtual Laboratory are discussed, and future lines of research related to the e-comparison concept and improvements to the Virtual Laboratory are proposed.
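A minimal sketch in Python of a JCGM 101-style Monte Carlo validation: assumed input distributions are propagated through an illustrative measurement function, and the Monte Carlo uncertainty is compared with the analytic GUM law of propagation. The model and every number are invented for illustration, not taken from the actual tool.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 200_000                               # Monte Carlo trials

# Illustrative measurement model: y = x1 - 10*log10(x2),
# with normally distributed inputs (assumed means and uncertainties).
x1 = rng.normal(63.0, 0.8, N)             # e.g. a level difference, dB
x2 = rng.normal(1.25, 0.05, N)            # e.g. a correction-term ratio
y = x1 - 10 * np.log10(x2)

u_mc = y.std(ddof=1)                      # Monte Carlo standard uncertainty
lo, hi = np.percentile(y, [2.5, 97.5])    # 95 % coverage interval

# Analytic check: u(y)^2 = u(x1)^2 + (df/dx2)^2 * u(x2)^2
dfdx2 = -10.0 / (1.25 * np.log(10))
u_gum = np.hypot(0.8, dfdx2 * 0.05)

print(f"MC : y = {y.mean():.2f} dB, u = {u_mc:.3f}, 95 % [{lo:.2f}, {hi:.2f}]")
print(f"GUM: u = {u_gum:.3f}")
```

Agreement between the two uncertainty figures, and between the coverage interval and y plus or minus 2u, is the kind of evidence the statistical validation phase relies on.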
Abstract:
The aim of this research is to develop a fast, efficient and accurate calculation model for estimating the final costs of construction during the preliminary stages of the architectural project. It is a tool to be used during the preliminary studies, draft project and basic project phases, so the full graphic and written definition of the project is not required in order to calculate the cost pre-dimensioning. The working hypothesis is that, in practical application, the model will not deviate by more than 10% from the final cost of the projected work. To that end, the pre-dimensioning model formulates five levels of cost estimation, from lower to higher conceptual and graphic definition of the architectural project. The five calculation levels are: two that take as reference the "exogenous" sale values of the dwellings (initial development and basic development), and three based on "endogenous" cost calculations of the projected work (preliminary studies, draft project and basic project). The first "exogenous" estimation level (level 1) is calculated from the market valuation of the real-estate development and the proportion that the cost of land represents in the sale value of the dwellings. The fifth valuation level, also "exogenous" (level 5), is calculated from the contrast between the basic external market value, the construction costs and the estimated development costs of the projected work. This contrast between the "repercussion of the construction cost" and the market value is an innovation with respect to existing cost pre-dimensioning models, serving as a methodological process of extrinsic verification and validation of the accuracy and validity of the estimates produced by practical application of the model, which is called Pcr.5n (reference cost pre-dimensioning with 5 calculation levels according to the stage of project definition / architectural conceptualization). The other three levels of "endogenous" construction cost pre-dimensioning are estimated by internal analytical calculations per unit of work and synthetic calculations by construction system and functional space, carried out in the initial stages of the project: preliminary studies (level 2), draft project (level 3) and basic project (level 4). These theoretical internal calculations are finally evaluated and validated through practical application of the model to residential building works whose real final settlement costs are known. As the definition and development of the project evolve, from preliminary studies to basic project, the calculation improves in efficiency and estimation accuracy, following the applied methodology of successive approximations in finite intervals, the basic hypothesis being, as stated above, a maximum deviation of one tenth in the estimated pre-dimensioned cost with respect to the real cost of the work. The cost of material execution of the works is calculated from "three-dimensional" functional cubic parameters of the projected space and "two-dimensional" constructive metric parameters of the roof/facade envelope and of the building's footprint on the plot. The functional and construction costs are weighted at each stage of the calculation process with the "thematic/specific" parameters of management (Pg), project (Pp) and execution (Pe) of the specific work being budgeted; finally, the contract construction cost is estimated by increasing the material execution cost by the percentage corresponding to the thematic/specific parameter of the projected work. The Pcr.5n construction cost pre-dimensioning model will be a tool of great interest and usefulness in professional practice for estimating the cost corresponding to the Basic Project required by the applicable technical and legal framework. According to Annex I of the Spanish Technical Building Code (CTE), the basic project must contain an "approximate valuation of the material execution of the projected work, by chapters", that is, it must contain at least an approximate budget by chapters, trades or technologies. This approximate budget must necessarily be produced with the cost pre-dimensioning technique, since at this stage of the architectural project no structural calculations, services layouts or constructive resolution of the envelope are yet available, the specifications of the later execution project not having been developed. This approximate estimate of the cost of the work is simple to calculate through practical application of the model, both for students and for professionals of the building sector. As this work shows and justifies, practical application of the model for cost calculation in the preliminary stages of the project is fast and accurate, and it is easy to apply both to single-family houses (detached and semi-detached) and to collective housing (blocks of flats and city blocks). The model is also applicable to real-estate valuation, appraisals, economic feasibility analysis of real-estate developments and cost estimation of completed works, and in general whenever the execution project is not available and the construction costs of the projected works must be calculated. In addition, the model can be used to check budgets calculated by the traditional analytical method (detailed measurements priced with unit prices and cost breakdowns), both in private works and in works promoted by public administrations. Finally, as open lines for future research, the "reference cost pre-dimensioning with 5 calculation levels" model could be adapted and applied to uses and typologies other than residential, such as public facility buildings, valuation of historic buildings, on-plot and off-plot urbanization works, park and garden projects, etc. These lines of research run parallel to the work developed here and, by way of partial preview, are reported in the papers presented at the international congresses Scieconf (June 2013) and RICS-COBRA (September 2013), and at the IV National Congress on Building Pathology, UCAM (April 2014).
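A minimal sketch in Python of the pre-dimensioning arithmetic described above. Every rate, weight and percentage is invented for illustration (the model's calibrated tables are not reproduced here): the material execution cost is built from a cubic functional parameter plus two metric envelope parameters, weighted by the Pg/Pp/Pe parameters, and the contract cost adds the thematic/specific percentage.

```python
def predimension_cost(volume_m3, envelope_m2, footprint_m2,
                      rate_vol=95.0,    # EUR/m3, functional cubic rate (assumed)
                      rate_env=210.0,   # EUR/m2, roof/facade envelope (assumed)
                      rate_foot=160.0,  # EUR/m2, footprint on plot (assumed)
                      Pg=1.04, Pp=1.03, Pe=1.05,  # management/project/execution
                      contract_pct=0.19):         # thematic/specific mark-up
    """Level 2-4 style estimate: cubic + metric parameters, weighted."""
    material = (volume_m3 * rate_vol
                + envelope_m2 * rate_env
                + footprint_m2 * rate_foot)
    material *= Pg * Pp * Pe          # weight by thematic/specific parameters
    contract = material * (1 + contract_pct)
    return material, contract

material, contract = predimension_cost(volume_m3=1450, envelope_m2=720,
                                       footprint_m2=180)
print(f"material execution ~ {material:,.0f} EUR, contract ~ {contract:,.0f} EUR")
```

At each successive project phase the rates and the Pg/Pp/Pe weights would be re-read from finer tables, which is how the successive-approximation scheme tightens the estimate toward the 10% target.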
Abstract:
Summary. On 11 March 2011, a devastating earthquake struck Japan and caused a major nuclear accident at the Fukushima Daiichi nuclear plant. The disaster confirmed that nuclear reactors must be protected even against accidents that have been assessed as highly unlikely. It also revealed a well-known catalogue of problems: faulty design, insufficient back-up systems, human error, inadequate contingency plans, and poor communications. The catastrophe triggered the rapid launch of a major re-examination of nuclear reactor security in Europe. It also stopped in its tracks what had appeared to be a 'nuclear renaissance', both in Europe and globally, especially in the emerging countries. Under the accumulated pressure of rising demand and climate warming, many new nuclear projects had been proposed. Since 2011 there has been more ambivalence, especially in Europe. Some Member States have even decided to abandon the nuclear sector altogether. This Egmont Paper aims to examine the reactions of the EU regarding nuclear safety since 2011. Firstly, a general description of the nuclear sector in Europe is provided. Nuclear electricity production currently employs around 500,000 people, including those working in the supply chain. It generates approximately €70 billion per year. It provides roughly 30% of the electricity consumed in the EU. At the end of 2013, there were 131 nuclear power reactors active in the EU, located in 14 countries. Four new reactors are under construction in France, Slovakia and Finland. Secondly, this paper presents the Euratom legal framework regarding nuclear safety. The European Atomic Energy Community (EAEC or Euratom) Treaty was signed in 1957 and was somewhat overshadowed by the European Economic Community (EEC) Treaty. It was a more classical treaty, establishing institutions with limited powers. Its development remained relatively modest until the Chernobyl catastrophe, which provoked many initiatives, the most important being the eventual adoption of the Nuclear Safety Directive 2009/71. Thirdly, the general symbiosis between Euratom and the International Atomic Energy Agency (IAEA) is explained. Fourthly, the paper analyses the initiatives taken by the EU in the wake of the Fukushima catastrophe, centred around the famous 'stress tests'. Fifthly, the most important legal change brought about by this event was the revision of Directive 2009/71. Directive 2014/87 was adopted quite rapidly and has deepened the role of the EU in nuclear safety in various ways. It has reinforced the role and effective independence of the national regulatory authorities. It has enhanced transparency on nuclear safety matters. It has strengthened principles and introduced new general nuclear safety objectives and requirements, addressing specific technical issues across the entire life cycle of nuclear installations, and in particular nuclear power plants. It has extended monitoring and the exchange of experience by establishing a European system of peer reviews. Finally, it has established a mechanism for developing EU-wide harmonized nuclear safety guidelines. In spite of these improvements, Directive 2014/87 Euratom still reflects the ambiguity of the Euratom system in general, and especially in the field of nuclear safety. The use of nuclear energy remains controversial among Member States: some remain adamantly in favour, others against or ambivalent. The intervention of the EAEC institutions remains sensitive.
The use of the traditional Community method remains limited. The peer review method remains a very peculiar mechanism that deserves more attention.