924 results for Environmental impact analysis
Abstract:
Nitrogen (N) and phosphorus (P) are essential elements for all living organisms. In excess, however, they contribute to several environmental problems, such as aquatic and terrestrial eutrophication. Globally, human action has multiplied the volume of N and P cycling since the onset of industrialization, as a result of intensified agriculture, increased energy consumption and population growth. Industrial ecology (IE) is a discipline in which human interaction with ecosystems is investigated using a systems-analytical approach. The main idea behind IE is that industrial systems resemble ecosystems and, like them, can be described in terms of material, energy and information flows and stocks. Industrial systems depend on the resources provided by the biosphere, and the two cannot be separated from each other. When studying substance flows, the aims of research from the viewpoint of IE can be, for instance, to elucidate how the cycles of a given substance could be made more closed and how its flows could be decreased per unit of production (dematerialization). In Finland, N and P are studied widely in different ecosystems and environmental emissions; a holistic picture comparing different societal systems is, however, lacking. In this thesis, flows of N and P in Finland were examined using substance flow analysis (SFA) in the following four subsystems: I) forest industry and use of wood fuels, II) food production and consumption, III) energy, and IV) municipal waste. A detailed analysis was performed for the end of the 1990s. Furthermore, the historical development of the N and P flows was investigated in the energy system (III) and the municipal waste system (IV). The main research sources were official statistics, literature, monitoring data and expert knowledge. The aim was to identify and quantify the main flows of N and P in Finland in the four subsystems studied. 
Furthermore, the aim was to elucidate whether the nutrient systems are cyclic or linear, and to identify how these systems could use and cycle N and P more efficiently. A final aim was to discuss how this type of analysis can be used to support decision-making on environmental problems and solutions. Of the four subsystems, the food production and consumption system and the energy system created the largest N flows in Finland. For P flows, the food production and consumption system (Paper II) was clearly the largest contributor, followed by the forest industry and use of wood fuels and the energy system. Finland's contribution to N and P flows on a global scale is low, but on a per capita basis Finns are among the largest producers of these flows, with relatively high energy and meat consumption being the main reasons. The analysis revealed that all four systems are open, owing to the highly international Finnish markets, the large-scale use of synthetic fertilizers and energy resources, and the low recycling rate of many waste fractions. Reduction in the use of fuels and synthetic fertilizers, reorganization of the structure of energy production, reduced human intake of nutrients and technological development are crucial in diminishing the N and P flows. To enhance nutrient recycling and replace inorganic fertilizers, recycling of wastes such as wood ash and sludge could be promoted. SFA is not usually sufficiently detailed to allow specific recommendations for decision-making, but it does yield useful information about the relative magnitude of the flows and may reveal unexpected losses. Sustainable development is a widely accepted target for all human action, and SFA is one method that can help to analyse how effective different efforts are in leading to a more sustainable society. SFA's strength is that it allows a holistic picture of different natural and societal systems to be drawn. 
Furthermore, when the environmental impact of a certain flow is known, the method can be used to prioritize environmental policy efforts.
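As an illustration of the SFA bookkeeping described above, the following Python sketch computes a subsystem's total inflow, outflow, stock change and recycling rate. All flow names and values are hypothetical placeholders for illustration, not figures from the thesis.

```python
# Illustrative substance flow analysis (SFA) bookkeeping for one subsystem.
# All flow values (kt N / year) are hypothetical placeholders.

def sfa_balance(inflows, outflows, recycled):
    """Summarize a subsystem's substance balance.

    inflows, outflows: dicts mapping flow name -> mass per year.
    recycled: mass per year returned into the subsystem's own cycle.
    Returns total in, total out, stock change, and recycling rate.
    """
    total_in = sum(inflows.values())
    total_out = sum(outflows.values())
    stock_change = total_in - total_out          # accumulation (+) or depletion (-)
    recycling_rate = recycled / total_in if total_in else 0.0
    return total_in, total_out, stock_change, recycling_rate

# Hypothetical nitrogen flows for a food production and consumption system.
inflows = {"synthetic fertilizer": 150.0, "feed imports": 40.0, "deposition": 20.0}
outflows = {"food exports": 30.0, "losses to water": 60.0, "losses to air": 90.0}
recycled = 25.0  # e.g. manure and sludge returned to fields

tin, tout, dstock, rrate = sfa_balance(inflows, outflows, recycled)
print(f"in={tin} kt/a, out={tout} kt/a, stock change={dstock} kt/a, "
      f"recycling rate={rrate:.1%}")
```

A low recycling rate combined with a large synthetic-fertilizer inflow is exactly the signature of the "open" systems the thesis describes.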
Abstract:
This article deals with a simulation-based study of the impact of projectiles on thin aluminium plates using LS-DYNA, modelling the plates with shell elements and the projectiles with solid elements. To establish the required modelling criterion in terms of element size for the aluminium plates, a convergence study of residual velocity was carried out by varying the mesh density in the impact zone. Using the preferred material and meshing criteria arrived at here, extremely good prediction of the test residual velocities and ballistic limits given by Gupta et al. (2001) for thin aluminium plates has been obtained. The simulation-based pattern of failure, with localized bulging and a jagged edge of perforation, is similar to the perforation with petalling seen in tests. A number of simulation-based parametric studies have been carried out, and results consistent with published test data have been obtained. Despite the robust correlation achieved against published experimental results, it was considered prudent to conduct the authors' own experiments for a final correlation with the present modelling procedure and analysis using the explicit LS-DYNA 970 solver. Hence, a sophisticated ballistic impact testing facility and a high-speed camera were used to conduct additional tests on grade 1100 aluminium plates of 1 mm thickness with projectiles of four different nose shapes. Finally, using the developed numerical simulation procedure, an excellent correlation of residual velocity and failure modes with the corresponding test results has been obtained.
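The mesh-convergence study described above can be sketched as a simple check that successive refinements change the predicted residual velocity by less than a tolerance. The element sizes and velocities below are hypothetical; in practice each value would come from a separate LS-DYNA run.

```python
# Minimal sketch of a mesh-convergence check on residual projectile velocity.
# Element sizes and velocities are hypothetical placeholders.

def converged(element_sizes, residual_velocities, tol=0.01):
    """Return (is_converged, relative_changes).

    Checks whether the finest mesh refinement changes the predicted
    residual velocity by less than `tol` (relative change).
    """
    # Sort coarse -> fine by element size (largest first).
    pairs = sorted(zip(element_sizes, residual_velocities), reverse=True)
    vels = [v for _, v in pairs]
    changes = [abs(vels[i + 1] - vels[i]) / abs(vels[i])
               for i in range(len(vels) - 1)]
    return changes[-1] < tol, changes

sizes = [2.0, 1.0, 0.5, 0.25]         # impact-zone element size, mm (hypothetical)
v_res = [162.0, 155.0, 151.5, 151.0]  # residual velocity, m/s (hypothetical)
ok, changes = converged(sizes, v_res)
print(ok, [f"{c:.3%}" for c in changes])
```

Once the change between the two finest meshes falls under the tolerance, the coarser of the two can be adopted as the modelling criterion to keep run times manageable.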
Abstract:
Multiple sclerosis (MS) is an immune-mediated demyelinating disorder of the central nervous system (CNS) affecting 0.1-0.2% of the population of Northern European descent. MS is considered a multifactorial disease, in which both environment and genetics play a role in pathogenesis. Despite several decades of intense research, the etiological and pathogenic mechanisms underlying MS remain largely unknown and no curative treatment exists. The genetic architecture underlying MS is complex, with multiple genes involved. The strongest and best characterized predisposing genetic factors for MS are located, as in other immune-mediated diseases, in the major histocompatibility complex (MHC) on chromosome 6; in humans the MHC is called the human leukocyte antigen (HLA) complex. Alleles of the HLA locus associate strongly with MS and for many years remained the only consistently replicable genetic associations. Recently, however, genes located outside the MHC region have been proposed as strong candidates for susceptibility to MS in several studies. In this thesis a new genetic locus on chromosome 7q32, interferon regulatory factor 5 (IRF5), was identified as contributing to susceptibility to MS. In particular, we found that common variation of the gene was associated with the disease in three different populations: Spanish, Swedish and Finnish. We also suggested a possible functional role for one of the risk alleles, with an impact on the expression of the IRF5 locus. Previous studies have pointed to a possible role for chromosome 2q33 in susceptibility to MS and other autoimmune disorders. The work described here also investigated the involvement of this chromosomal region in MS predisposition. After the detection of genetic association with 2q33 (article 1), we extended our analysis through fine-scale single nucleotide polymorphism (SNP) mapping to further define the contribution of this genomic area to disease pathogenesis (article 4). 
We found a trend (p = 0.04) for association with MS for an intronic SNP located in the inducible T-cell co-stimulator (ICOS) gene, an important player in the co-stimulatory pathway of the immune system. Expression analysis of ICOS revealed a novel, previously uncharacterized, alternatively spliced isoform lacking the extracellular domain needed for ligand binding. The stability of the newly identified transcript variant and its subcellular localization were analyzed; these studies indicated that the novel isoform is stable and shows a different subcellular localization compared with full-length ICOS. The novel isoform might have a regulatory function, but further studies are required to elucidate it. Chromosome 19q13 has previously been suggested as one of the genomic areas involved in MS predisposition, with suggestive linkage signals between MS predisposition and 19q13 obtained in several populations. Here, we analysed the role of allelic variation in 19q13 by family-based association analysis in 782 MS families collected from Finland. In this dataset, we were not able to detect any statistically significant associations, although several previously suggested markers were included in the analysis. Replication of previous findings based on linkage disequilibrium between a marker allele and the disease/risk allele is notoriously difficult because of limitations such as allelic heterogeneity; re-sequencing-based approaches may be required to elucidate the role of chromosome 19q13 in MS. This thesis has resulted in the identification of a new MS susceptibility locus (IRF5) previously associated with other inflammatory or autoimmune disorders, such as SLE. IRF5 is one of the mediators of the biological function of interferons. In addition to providing new insight into a possible pathogenetic pathway of the disease, this finding suggests that there might be common mechanisms between different immune-mediated disorders. 
Furthermore, the work presented here has uncovered a novel isoform of ICOS, which may play a role in the regulatory mechanisms of this important mediator of lymphocyte activation. Further work is required to uncover its functions and the possible involvement of the ICOS locus in MS susceptibility.
Abstract:
The paper presents a geometry-free approach to assess the variation of the covariance matrices of undifferenced triple-frequency GNSS measurements and its impact on positioning solutions. Four independent geometry-free/ionosphere-free (GFIF) models formed from the original triple-frequency code and phase signals allow effective computation of variance-covariance matrices using real data. Variance component estimation (VCE) algorithms are implemented to obtain the covariance matrices for the three pseudorange and three carrier-phase signals epoch by epoch. Covariance results from triple-frequency BeiDou System (BDS) and GPS data sets demonstrate that the estimated standard deviation varies consistently with the amplitude of the actual GFIF error time series. Single point positioning (SPP) results from BDS ionosphere-free measurements at four MGEX stations demonstrate an improvement of up to about 50% in the Up direction relative to results based on mean square statistics. Additionally, a more extensive SPP analysis at 95 global MGEX stations based on GPS ionosphere-free measurements shows an average improvement of about 10% relative to the traditional results. This finding provides preliminary confirmation that adequate consideration of the variation of covariance improves GNSS state solutions.
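The idea of letting the assumed measurement noise track the actual GFIF error amplitude can be illustrated with a much simpler device than the paper's VCE: a sliding-window standard deviation over a GFIF error series. The series below is synthetic and the window length is an arbitrary assumption; this sketch only shows the principle of epoch-wise noise estimation, not the paper's algorithm.

```python
# Simplified sketch of epoch-wise noise estimation for a geometry-free/
# ionosphere-free (GFIF) error series. NOT the paper's full VCE; it only
# illustrates letting the assumed standard deviation follow the actual
# error amplitude epoch by epoch.

import math
import random

def epoch_sigmas(gfif_series, window=30):
    """Sliding-window standard deviation around each epoch."""
    n = len(gfif_series)
    sigmas = []
    for i in range(n):
        lo, hi = max(0, i - window // 2), min(n, i + window // 2 + 1)
        seg = gfif_series[lo:hi]
        mean = sum(seg) / len(seg)
        var = sum((x - mean) ** 2 for x in seg) / max(len(seg) - 1, 1)
        sigmas.append(math.sqrt(var))
    return sigmas

# Synthetic GFIF residuals whose noise level doubles halfway through,
# mimicking an epoch-dependent measurement error.
random.seed(1)
series = [random.gauss(0, 0.3) for _ in range(200)] + \
         [random.gauss(0, 0.6) for _ in range(200)]
sig = epoch_sigmas(series)
print(f"early sigma ~ {sig[100]:.2f} m, late sigma ~ {sig[300]:.2f} m")
```

In a positioning filter, the inverse squares of these epoch-wise sigmas would replace the single long-term mean square statistic when weighting observations.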
Abstract:
This paper presents an experimental investigation on the lateral impact performance of axially loaded concrete-filled double-skin tube (CFDST) columns. These columns have desirable structural and constructional properties and have been used as columns in buildings, as legs of offshore platforms and as bridge piers. Since they could be vulnerable to impact from passing vessels or vehicles, it is necessary to understand their behaviour under lateral impact loads. With this in mind, an experimental method employing an innovative instrumented horizontal impact testing system (HITS) was developed to apply lateral impact loads while the column maintained a static axial pre-loading, in order to examine the failure mechanism and key response parameters of the column. These included the time histories of impact force, reaction forces, global lateral deflection and the permanent local buckling profile. Eight full-scale columns were tested for key parameters including the axial load level and impact location. Based on the test data, the failure mode, peak impact force, impact duration, peak reaction forces, reaction force duration, column maximum and residual global deflections, and column local buckling length, depth and width under varying conditions are analysed and discussed. It is evident that the innovative HITS can successfully test structural columns under the combination of axial pre-loading and impact loading. The findings on the lateral impact response of the CFDST columns can serve as a benchmark reference for their future analysis and design.
Abstract:
Purpose The purpose of this paper is to reduce the potential for litigation by improving valuers’ awareness of water risks. As part of a valuer’s due diligence, the paper provides guidance on how to identify such risks by explaining the different types and examining how online search tools can be used in conjunction with more traditional methods to evaluate the probability of these risks occurring. Design/methodology/approach The paper builds on prior research, which examined the impact of water on valuations. By means of legal/doctrinal analysis, this paper considers relevant issues from the perspective of managing client expectations and needs. In so doing it identifies online tools available to assist in identifying at-risk properties and better informing clients. Findings While the internet provides a variety of tools for gaining access to relevant information, this information is most commonly provided subject to disclaimer. Valuers need to ensure that blind reliance is not placed on these tools, but that the tools are used in conjunction with individual property inspections. Research limitations/implications Although the examples considered are primarily Australian, increasing water risks generally make the issues considered relevant for any jurisdiction. The research will be of particular interest to practitioners in coastal or riverine areas. Practical implications Valuation reports are sought for a variety of purposes by a variety of clients. These range from the experienced, knowledgeable developer looking to maximise available equity to the inexperienced, uneducated individual looking to acquire their home and thinking, more often than not, with their heart rather than their head. More informed practices by valuers will lead to valuation reports being more easily understood by clients, thus lessening the likelihood of litigation against the valuer for negligence. 
Originality/value The paper highlights the issue of water risks; the need for valuers to properly address potential and actual risks in their reports; and the corresponding need to undertake all appropriate searches and enquiries of the property to be valued. It reinforces the importance of access to the internet as a tool in the valuation process.
Abstract:
This research treats the lateral impact behaviour of composite columns, which find increasing use as bridge piers and building columns. It offers (1) innovative experimental methods for testing structural columns, (2) dynamic computer simulation techniques as a viable tool in the analysis and design of such columns and (3) significant new information on their performance which can be used in design. The research outcomes will help protect lives and property against the risk of vehicular impacts caused either accidentally or intentionally.
Abstract:
This paper presents an experimental investigation on the lateral impact response of axially loaded concrete-filled double-skin tube (CFDST) columns. A total of four test series are being conducted at Queensland University of Technology using a novel horizontal impact-testing rig. The test results reported in this paper are from the first test series, where the columns are pinned at both ends and impacted at mid-span. In the next three series, effects of support conditions, impact location and repeated impact will be treated. The main objectives of the current paper are to describe the innovative testing procedure and provide some insight into the lateral impact behavior and failure of simply supported axially pre-loaded CFDST columns. The results include time histories of impact forces, reaction forces, axial force and global lateral deflection. Based on the test data, the failure mode, peak impact force, peak reaction forces, maximum deflection and residual deflection, with and without axial load, are analyzed and discussed. The findings of this study will serve as a benchmark reference for future analysis and design of CFDST columns.
Abstract:
The concrete-filled double-skin tube (CFDST) is an innovative form of steel-concrete-steel composite construction, formed by two concentric steel tubes separated by a concrete filler. In recent years, this column form has been widely used as a new sustainable alternative to existing structural bridge piers and building columns. Since such columns could be vulnerable to impact from passing vessels or vehicles, it is necessary to understand their behaviour under lateral impact loads. With this in mind, physical tests on full-scale columns were performed using an innovative horizontal impact testing system to obtain the failure modes, the time histories of the impact force, reaction forces and global lateral deflection, as well as the permanent local buckling profile of the columns. The experimental testing was complemented and supplemented by developing and using an advanced finite element analysis model. The model was validated by comparing the numerical results against experimental data. The findings of this study will serve as a benchmark reference for future analysis and design of CFDST columns.
Abstract:
Cool roof coatings have a beneficial impact on reducing the heat load of a range of building types, resulting in reduced cooling energy loads. This study seeks to understand the extent to which cool roof coatings could be used as a residential demand side management (DSM) strategy for retrofitting existing housing in a constrained network area in tropical Australia, where peak electrical demand is heavily influenced by residential cooling loads. In particular, this study seeks to determine whether simulation software used for building regulation purposes can provide networks with the ‘impact certainty’ required by their DSM principles. The building simulation method is supported by a field experiment. Both numerical and experimental data confirm reductions in total consumption (kWh) and energy demand (kW). The nature of the regulated simulation software, combined with the diverse nature of residential buildings and their patterns of occupancy, however, means that simulated results cannot be extrapolated to quantify benefits to a broader distribution network. The study suggests that building data gained from regulatory simulations could be a useful guide to the potential impacts of widespread application of cool roof coatings in this region. The practical realization of these positive impacts, however, would require changes to the current business model for the evaluation of DSM strategies. The study provides seven key recommendations that encourage distribution networks to think beyond their infrastructure boundaries, recognising that the broader energy system also includes buildings, appliances and people.
Abstract:
Hedonic property price analysis tells us that property prices can be affected by natural hazards such as floods. This paper examines the impact of flood-related variables (among other factors) on property values, and examines the effect of the release of flood risk map information on property values by comparing it with the effect of an actual flood event. An examination of the temporal variation of flood impacts on property values is also made. The study is the first of its kind in which the impact of the release of flood risk map information to the public is compared with that of an actual flood event. We adopt a spatial quasi-experimental analysis using the release of flood risk maps by Brisbane City Council in Queensland, Australia, in 2009 and the actual floods of 2011. The results suggest that property buyers are more responsive to the actual incidence of floods than to the disclosure of information to the public on the risk of floods.
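The quasi-experimental comparison described above can be sketched as a hedonic regression with interaction terms for the map release and the actual flood. The data below are synthetic and the variable names and coefficients are assumptions for illustration, not the paper's model.

```python
# Illustrative hedonic price regression with flood-information effects.
# Synthetic data; variable names and effect sizes are assumptions.

import numpy as np

rng = np.random.default_rng(0)
n = 500
flood_risk = rng.integers(0, 2, n)   # 1 = property in mapped flood zone
post_map = rng.integers(0, 2, n)     # 1 = sold after risk-map release
post_flood = rng.integers(0, 2, n)   # 1 = sold after the actual flood
size = rng.normal(200, 40, n)        # floor area, m^2

# Synthetic log prices: buyers discount flood-zone homes more after the
# actual flood (-0.10) than after the map release (-0.03).
log_price = (11.0 + 0.004 * size
             - 0.03 * flood_risk * post_map
             - 0.10 * flood_risk * post_flood
             + rng.normal(0, 0.05, n))

X = np.column_stack([np.ones(n), size, flood_risk,
                     flood_risk * post_map, flood_risk * post_flood])
beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)
print("map-release effect:", round(beta[3], 3),
      "actual-flood effect:", round(beta[4], 3))
```

Recovering a larger (more negative) coefficient on the flood interaction than on the map-release interaction is the synthetic analogue of the paper's finding that buyers respond more to an actual flood than to disclosed risk information.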
Abstract:
The first quarter of the 20th century witnessed a rebirth of cosmology, the study of our Universe, as a field of scientific research with testable theoretical predictions. The amount of available cosmological data grew slowly from a few galaxy redshift measurements, rotation curves and local light element abundances to the first detection of the cosmic microwave background (CMB) in 1965. By the turn of the century the amount of data had exploded, incorporating new, exciting cosmological observables such as lensing, Lyman alpha forests, type Ia supernovae, baryon acoustic oscillations and Sunyaev-Zeldovich regions, to name a few.

The CMB, the ubiquitous afterglow of the Big Bang, carries with it a wealth of cosmological information. Unfortunately, that information, delicate intensity variations, turned out to be hard to extract from the overall temperature. After the first detection, it took nearly 30 years before the first evidence of fluctuations in the microwave background was presented. At present, high-precision cosmology is solidly based on precise measurements of the CMB anisotropy, making it possible to pinpoint cosmological parameters to one-in-a-hundred precision. This progress has made it possible to build and test models of the Universe that differ in how the cosmos evolved during some fraction of the first second after the Big Bang.

This thesis is concerned with high-precision CMB observations. It presents three selected topics along a CMB experiment analysis pipeline. Map-making and residual noise estimation are studied using an approach called destriping. The approximate methods studied are invaluable for the large datasets of any modern CMB experiment and will undoubtedly become even more so when the next generation of experiments reaches the operational stage.

We begin with a brief overview of cosmological observations and describe the general relativistic perturbation theory. 
Next we discuss the map-making problem of a CMB experiment and the characterization of the residual noise present in the maps. Finally, the use of modern cosmological data is presented in the study of an extended cosmological model with correlated isocurvature fluctuations. The currently available data indicate that future experiments are certainly needed to provide more information on these extra degrees of freedom. Any solid evidence of isocurvature modes would have a considerable impact owing to their power in model selection.
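The destriping approach mentioned above can be illustrated with a toy map-maker: time-ordered data modelled as a sky signal plus constant baseline offsets per chunk plus white noise, solved by alternating between binning a map and fitting per-chunk offsets. This is a simplified sketch on synthetic data, not a production destriper.

```python
# Toy destriping map-maker: TOD = sky signal + per-chunk baseline offsets
# + white noise. Alternating minimization recovers the map. All sizes and
# noise levels are arbitrary assumptions for illustration.

import numpy as np

rng = np.random.default_rng(2)
npix, nsamp, chunk = 50, 5000, 250
sky = rng.normal(0, 1.0, npix)                   # synthetic sky map
pointing = rng.integers(0, npix, nsamp)          # pixel seen by each sample
offsets_true = np.repeat(rng.normal(0, 5.0, nsamp // chunk), chunk)
tod = sky[pointing] + offsets_true + rng.normal(0, 0.1, nsamp)

def bin_map(tod, pointing, npix):
    """Naive map-maker: average all samples falling in each pixel."""
    hits = np.bincount(pointing, minlength=npix)
    summed = np.bincount(pointing, weights=tod, minlength=npix)
    return summed / np.maximum(hits, 1)

offsets = np.zeros(nsamp)
for _ in range(20):                              # alternate map / baseline fits
    m = bin_map(tod - offsets, pointing, npix)
    resid = tod - m[pointing]
    per_chunk = resid.reshape(-1, chunk).mean(axis=1)
    offsets = np.repeat(per_chunk, chunk)

# Compare mean-removed maps (the global offset is degenerate with baselines).
m = bin_map(tod - offsets, pointing, npix)
err = np.std((m - m.mean()) - (sky - sky.mean()))
naive = bin_map(tod, pointing, npix)
err_naive = np.std((naive - naive.mean()) - (sky - sky.mean()))
print("destriped map error:", round(err, 3), "naive map error:", round(err_naive, 3))
```

The destriped map should approach the white-noise floor, while the naive binned map retains the striping imprinted by the baseline offsets.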
Abstract:
Background Traffic offences have been considered an important predictor of crash involvement, and have often been used as a proxy safety variable for crashes. However, the association between crashes and offences has never been meta-analysed and the population effect size never established. Research is yet to determine the extent to which this relationship may be spuriously inflated through systematic measurement error, with obvious implications for researchers endeavouring to accurately identify salient factors predictive of crashes. Methodology and Principal Findings Studies yielding a correlation between crashes and traffic offences were collated, and a meta-analysis of 144 effects drawn from 99 road safety studies was conducted. The potential impact of factors such as age, time period, crash and offence rates, crash severity and data type, sourced from either self-report surveys or archival records, was considered and discussed. After weighting for sample size, an average correlation of r = .18 was observed over a mean time period of 3.2 years. Evidence emerged suggesting the strength of this correlation is decreasing over time. Stronger correlations between crashes and offences were generally found in studies involving younger drivers. Consistent with common method variance effects, a within-country analysis found stronger effect sizes in self-reported data, even after controlling for mean crash rate. Significance The effectiveness of traffic offences as a proxy for crashes may be limited. Inclusion of elements such as independently validated crash and offence histories, or accurate measures of exposure to the road, would facilitate a better understanding of the factors that influence crash involvement.
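Sample-size weighting of correlations, as used in the meta-analysis above, is conventionally done via Fisher's z transform. A minimal sketch with hypothetical study correlations:

```python
# Sample-size-weighted pooling of correlations via Fisher's z transform,
# the standard way to combine effects like crash-offence correlations.
# The input correlations and sample sizes below are hypothetical.

import math

def pooled_correlation(rs, ns):
    """Pool correlations r_i with sample sizes n_i via Fisher z.

    Each z_i = atanh(r_i) is weighted by its inverse variance (n_i - 3);
    the pooled z is transformed back with tanh.
    """
    num = sum((n - 3) * math.atanh(r) for r, n in zip(rs, ns))
    den = sum(n - 3 for n in ns)
    return math.tanh(num / den)

rs = [0.25, 0.15, 0.10, 0.30]   # hypothetical study correlations
ns = [200, 1500, 5000, 120]     # hypothetical sample sizes
print(round(pooled_correlation(rs, ns), 3))
```

Note how the pooled value is pulled toward the correlations from the largest studies, which is exactly the effect of weighting for sample size in the meta-analysis.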
Abstract:
Evaluation and design of shore protection works against tsunamis assumes considerable importance in view of the impact of the tsunami of 26 December 2004 in India and other countries in Asia. The absence of proper guidelines made matters worse and contributed to the magnitude of the damage that occurred. Damage surveys indicated that scour resulting from high flow velocities is one of the prime causes of damage to simple structures, and that sea walls in some cases helped to minimize the damage. The objective of this paper is to suggest that designing shoreline protection systems for the expected wave heights, and using flexible systems such as geocells, is likely to give better protection. The protection systems can be designed to withstand the wave forces corresponding to different probabilities of incidence. A design approach for a geocell protection system is suggested and illustrated with reference to wave height data from the east coast of India.
Abstract:
Nowadays any analysis of the Russian economy is incomplete without taking into account the phenomenon of oligarchy. Russian oligarchs appeared after the fall of the Soviet Union; they are wealthy businessmen who control a huge share of natural resource enterprises and wield considerable political influence. Oligarchs’ shares in some natural resource industries reach 70-80%. Their role in the Russian economy is undoubtedly large, yet very little economic analysis of it has been done. The aim of this work is to examine the Russian oligarchy at the micro and macro levels, its role in Russia’s transition, and the possible positive and negative outcomes of this phenomenon. For this purpose the work presents two theoretical models. The first part of the thesis examines the role of oligarchs at the micro level, concentrating on whether oligarchs can be more productive owners than other types of owners. To answer this question, it presents a model based on the article “Are oligarchs productive? Theory and evidence” by Y. Gorodnichenko and Y. Grygorenko, followed by an empirical test based on the works of S. Guriev and A. Rachinsky. The model predicts that oligarchs invest more in the productivity of their enterprises and have higher returns on capital, and are therefore more productive owners. According to the empirical test, oligarchs were found to outperform other types of owners; however, it is not determined whether the productivity gains offset losses in tax revenue. The second part of the work concentrates on the role of oligarchy at the macro level. More precisely, it examines the hypothesis that the depression after the 1998 crisis in Russia was caused by the oligarchs’ behavior. This part presents a theoretical model based on the article “A macroeconomic model of Russian transition: The role of oligarchic property rights” by S. Braguinsky and R. Myerson, in which a special type of property rights is introduced. 
After the 1998 crisis, oligarchs started to invest all their resources abroad to protect themselves from political risks, which resulted in a long depression phase. The macroeconomic model shows that better protection of property rights (smaller political risk) and/or higher outside investment could reduce the depression. Taking this result into account, government policy can change the oligarchs’ behavior to be more beneficial for the Russian economy and make the transition faster.