Abstract:
Extensive losses of coastal wetlands in the United States caused by sea-level rise, land subsidence, erosion, and coastal development have increased interest in the creation of salt marshes within estuaries. Smooth cordgrass Spartina alterniflora is the species utilized most for salt marsh creation and restoration throughout the Atlantic and Gulf coasts of the U.S., while S. foliosa and Salicornia virginica are often used in California. Salt marshes have many valuable functions such as protecting shorelines from erosion, stabilizing deposits of dredged material, dampening flood effects, trapping water-borne sediments, serving as nutrient reservoirs, acting as tertiary water treatment systems to rid coastal waters of contaminants, serving as nurseries for many juvenile fish and shellfish species, and serving as habitat for various wildlife species (Kusler and Kentula 1989). The establishment of vegetation in itself is generally sufficient to provide the functions of erosion control, substrate stabilization, and sediment trapping. The development of other salt marsh functions, however, is more difficult to assess. For example, natural estuarine salt marshes support a wide variety of fish and shellfish, and the abundance of coastal marshes has been correlated with fisheries landings (Turner 1977, Boesch and Turner 1984). Marshes function for aquatic species by providing breeding areas, refuges from predation, and rich feeding grounds (Zimmerman and Minello 1984, Boesch and Turner 1984, Kneib 1984, 1987, Minello and Zimmerman 1991). However, the relative value of created marshes versus that of natural marshes for estuarine animals has been questioned (Cammen 1976, Race and Christie 1982, Broome 1989, Pacific Estuarine Research Laboratory 1990, LaSalle et al. 1991, Minello and Zimmerman 1992, Zedler 1993). Restoration of all salt marsh functions is necessary to prevent habitat creation and restoration activities from having a negative impact on coastal ecosystems.
Abstract:
This report presents the findings of the first catch assessment survey (CAS) in the Ugandan waters following the agreed standard operating procedures (SOPs), carried out in July 2005. The findings indicate a total fish catch of 15,047.5 t for July 2005, contributed by Mukene/Dagaa (39.5%), Nile perch (33.1%), tilapias (17.1%), haplochromines (9.2%), and other fish species (1.2%). This information gives a new perspective on the estimates of fish production in the Ugandan waters of the lake, which are based on field observations. Continued support to the CAS programme will remove the uncertainties about the fish production levels of the lake, which have persisted for a long time. This information is vital for fisheries development and fisheries management endeavours.
Abstract:
In experiments, the characteristics of silicon microring/racetrack resonators in submicron rib waveguides have been systematically investigated. It is demonstrated that only a transverse-electric mode is guided for a ratio of slab height to rib height h/H = 0.5. Thus, these microring/racetrack resonators can function only for the quasi-transverse-electric mode, while they are rid of the transverse-magnetic polarization. Electron beam lithography and inductively coupled plasma etching were employed and improved to reduce sidewall roughness, for low propagation loss and high-performance resonators. The effects of waveguide dimensions, coupling region design, waveguide roughness, and oxide cladding on the resonators were then considered and analyzed. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
The Particle Swarm Optimization (PSO) algorithm is often used to find optimal solutions, but it is easily trapped in local extrema late in the evolution. Based on an improved chaos searching strategy, an enhanced particle swarm optimization algorithm is proposed in this study. When particles fall into a local extremum, they are reactivated by the chaos search strategy, in which the chaos search area is confined to the neighborhood of the current optimal solution by reducing the search range of the variables. The new algorithm not only escapes local extrema effectively but also significantly improves the precision of convergence. Experimental results show that the proposed algorithm outperforms the standard PSO algorithm in both precision and stability.
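A minimal Python sketch of the idea, with all parameters, the stagnation test, and the objective function invented for illustration (the paper's exact update rules may differ): standard PSO runs until the global best stagnates, then a logistic-map chaos search explores a shrinking neighborhood of the current best.

```python
import numpy as np

def sphere(x):
    """Example objective to minimize (hypothetical benchmark)."""
    return float(np.sum(x ** 2))

def chaos_search(f, best, radius, n_steps=50):
    """Logistic-map search confined to [best - radius, best + radius]."""
    z = np.random.uniform(0.01, 0.99, size=best.shape)
    improved, f_best = best.copy(), f(best)
    for _ in range(n_steps):
        z = 4.0 * z * (1.0 - z)              # logistic map, chaotic at r = 4
        cand = best + radius * (2.0 * z - 1.0)
        if f(cand) < f_best:
            improved, f_best = cand.copy(), f(cand)
    return improved

def chaos_pso(f, dim=5, n=20, iters=200, w=0.7, c1=1.5, c2=1.5, stall=10):
    x = np.random.uniform(-5, 5, (n, dim))   # positions
    v = np.zeros((n, dim))                   # velocities
    p = x.copy()                             # personal bests
    p_val = np.apply_along_axis(f, 1, x)
    g = p[np.argmin(p_val)].copy()           # global best
    since_improve, radius = 0, 1.0
    for _ in range(iters):
        r1, r2 = np.random.rand(n, dim), np.random.rand(n, dim)
        v = w * v + c1 * r1 * (p - x) + c2 * r2 * (g - x)
        x = x + v
        vals = np.apply_along_axis(f, 1, x)
        better = vals < p_val
        p[better], p_val[better] = x[better], vals[better]
        g_new = p[np.argmin(p_val)].copy()
        if f(g_new) < f(g):
            g, since_improve = g_new, 0
        else:
            since_improve += 1
        if since_improve >= stall:           # stagnation: activate chaos search
            g = chaos_search(f, g, radius)
            radius *= 0.8                    # shrink the chaos search area
            since_improve = 0
    return g

best = chaos_pso(sphere)
```

Shrinking the radius on each activation mirrors the idea of controlling the chaos search area by reducing the search range of the variables around the current optimum.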
Abstract:
Based on the current state of forests in western China, ecological and geographical methods are used to analyze how the hazards caused by clearing forests for farmland, including soil erosion, floods, dust storms, drought and scarce rainfall, the drying-up of rivers, and the loss of species, have affected national economic development, as well as the series of ecological and environmental problems that have arisen in recent years. The paper then discusses the role of returning farmland to forest (or grassland) in China's ecological and environmental rehabilitation, the sustainable development of the western region, river regulation, the application of comprehensive land management measures, and lifting farmers in western China out of poverty.
Abstract:
In research on wireless sensor networks for applications such as target tracking, cooperative task allocation mechanisms are an important topic. Cooperative task allocation based on a dynamic coalition mechanism is event-triggered and suits large-scale wireless sensor networks in which tasks occur relatively infrequently. Building on the dynamic coalition mechanism, this paper first introduces the concepts of coalition coverage range and dormant coalition members, further eliminating redundancy among the sensor nodes detecting the same task and reducing the system's energy consumption. It then presents an update mechanism for dynamic coalitions that preserves continuity while a coalition executes its task and, to a certain extent, guarantees the network's detection performance. Finally, simulations evaluate the algorithm's performance in terms of total system energy consumption, target capture rate, and the standard deviation of the detection error, and show how parameters such as the buffer-zone width affect energy consumption and detection performance.
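As a loose illustration of the coalition coverage range and dormant members, here is a hedged Python sketch; the node layout, thresholds, and nearest-first selection rule are invented, and the paper's actual mechanism (including the buffer zone and energy model) is richer.

```python
import math

def form_coalition(nodes, target, coverage_radius, k_active):
    """nodes: {node_id: (x, y)}. Nodes within the coalition coverage range
    join; only the k_active nodes nearest the target stay awake, while the
    rest become dormant members that can be woken on a coalition update."""
    dist = {nid: math.dist(pos, target) for nid, pos in nodes.items()}
    members = sorted((nid for nid in dist if dist[nid] <= coverage_radius),
                     key=dist.get)
    return members[:k_active], members[k_active:]

nodes = {1: (0, 0), 2: (3, 1), 3: (5, 5), 4: (1, 2), 5: (8, 0)}
active, dormant = form_coalition(nodes, target=(2.0, 1.0),
                                 coverage_radius=4.0, k_active=2)
# As the target moves, re-running form_coalition with the new position acts
# as a crude coalition update: dormant members near the new position wake up.
```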
Abstract:
Because spectrometric oil-analysis monitoring data for an engine contain too many types of wear particles, feeding this particle information directly into a neural network leads to problems such as too many input-layer neurons and an overly complex network structure. This paper introduces rough sets into engine fault diagnosis: exploiting the strength of rough sets in attribute reduction, redundant wear particles are deleted, the important wear particles are extracted and used as the inputs of a BP neural network, and an engine fault-diagnosis model is built. The method reduces the number of input-layer neurons, simplifies the network structure, and shortens network training time; moreover, because redundant wear particles are eliminated, the errors caused by inaccurate information about those particles are reduced, effectively improving the accuracy of fault diagnosis. Finally, a worked example verifies the correctness and effectiveness of the algorithms and the diagnosis model.
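A hedged sketch of the pipeline in Python, assuming scikit-learn is available; the greedy consistency-based reduction below is a crude stand-in for a proper rough-set attribute reduction, and the data are synthetic placeholders.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def is_consistent(X, y):
    """No two samples share identical attribute values but different labels."""
    seen = {}
    for row, label in zip(map(tuple, X), y):
        if seen.setdefault(row, label) != label:
            return False
    return True

def greedy_reduct(X, y):
    """Drop an attribute whenever the decision stays consistent without it."""
    keep = list(range(X.shape[1]))
    for j in range(X.shape[1]):
        trial = [k for k in keep if k != j]
        if trial and is_consistent(X[:, trial], y):
            keep = trial                     # attribute j is redundant
    return keep

# Hypothetical data: discretized concentrations of 12 wear-particle types,
# 4 fault classes. Real data would come from spectrometric oil analysis.
rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(200, 12))
y = rng.integers(0, 4, size=200)
keep = greedy_reduct(X, y)
model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000)
model.fit(X[:, keep], y)                     # BP network on the reduct only
```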
Abstract:
In modern signal processing, the objects analyzed and processed are usually non-linear, non-Gaussian, and non-stationary signals, especially non-stationary signals. The conventional ways to analyze and process non-stationary signals are the short-time Fourier transform, the Wigner-Ville distribution, the wavelet transform, and so on. But these three algorithms are all based on the Fourier transform, so they share the shortcomings of Fourier analysis and cannot escape its limitations. The Hilbert-Huang Transform (HHT) is a newer technique for processing non-stationary signals, proposed by N. E. Huang in 1998. It is composed of Empirical Mode Decomposition (EMD) and Hilbert Spectral Analysis (HSA). After EMD processing, any non-stationary signal is decomposed into a series of data sequences of different scales, each called an Intrinsic Mode Function (IMF). The energy distribution of the original non-stationary signal can then be found by summing the Hilbert spectra of all the IMFs. In essence, the algorithm renders non-stationary signals stationary, separates the fluctuations and trends of different scales step by step, and finally describes the frequency content with instantaneous frequency and energy instead of the global frequency and energy of Fourier spectral analysis. In this way, the shortcoming of Fourier transforms, describing non-linear, non-stationary signals with many spurious harmonics, can be avoided. This work proceeds in three parts. First, it introduces the history and development of the HHT, followed by its characteristics and main issues; it briefly presents the basic principles and algorithms of the Hilbert-Huang transform and confirms its validity by simulation. Second, it discusses some shortcomings of the HHT: FFT interpolation is used to solve the IMF instability and instantaneous-frequency fluctuations caused by an insufficient sampling rate, and a wave-characteristic matching method mitigates, with good results, the boundary effects caused by the limitations of the HHT's envelope algorithm. Third, it investigates the application of the HHT to electromagnetic signal processing. Based on the analysis of real data, it discusses applications to electromagnetic signal processing and noise suppression: the empirical mode decomposition method and its multi-scale filtering characteristics can effectively analyze the noise distribution of electromagnetic signals, suppress interference, and improve the interpretability of the data. Selecting electromagnetic signal segments using the Hilbert time-frequency energy spectrum has been found to help improve signal quality and the quality of the data.
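A minimal sketch of one EMD sifting pass plus Hilbert spectral analysis, assuming numpy/scipy; real implementations (e.g. the PyEMD package) add the stopping criteria and the boundary handling that this thesis discusses, both omitted here.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema, hilbert

def sift_once(x, t):
    """One sifting pass: subtract the mean of the extrema envelopes."""
    maxima = argrelextrema(x, np.greater)[0]
    minima = argrelextrema(x, np.less)[0]
    if len(maxima) < 4 or len(minima) < 4:   # residue: too few extrema
        return None
    upper = CubicSpline(t[maxima], x[maxima])(t)
    lower = CubicSpline(t[minima], x[minima])(t)
    return x - (upper + lower) / 2.0

def extract_imf(x, t, n_sifts=10):
    """Crude IMF extraction with a fixed number of sifting passes."""
    h = x
    for _ in range(n_sifts):
        h_new = sift_once(h, t)
        if h_new is None:
            break
        h = h_new
    return h

fs = 1000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
imf = extract_imf(x, t)                      # roughly the 40 Hz component
analytic = hilbert(imf)                      # Hilbert spectral analysis
amplitude = np.abs(analytic)
inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)
```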
Abstract:
Explanation-based Generalization requires that the learner obtain an explanation of why a precedent exemplifies a concept. It is, therefore, useless if the system fails to find this explanation. However, it is not necessary to give up and resort to purely empirical generalization methods. In fact, the system may already know almost everything it needs to explain the precedent. Learning by Failing to Explain is a method which is able to exploit current knowledge to prune complex precedents, isolating the mysterious parts of the precedent. The idea has two parts: the notion of partially analyzing a precedent to get rid of the parts which are already explainable, and the notion of re-analyzing old rules in terms of new ones, so that more general rules are obtained.
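As a loose illustration of the first part, pruning the already-explainable parts of a precedent, here is a toy Python sketch; the fact/rule representation is invented and far simpler than the method's actual structural analysis.

```python
def prune_explainable(precedent_facts, rules):
    """Partial analysis: drop the parts of a precedent that known rules
    already explain, leaving the 'mysterious' residue to learn from."""
    facts = set(precedent_facts)
    explained = {concl for premises, concl in rules
                 if concl in facts and set(premises) <= facts - {concl}}
    return facts - explained

# Toy example: the known rule explains 'can_cut', so it is pruned; the
# unexplained observation 'hums' remains in the mysterious residue.
rules = [({"has_motor", "has_blade"}, "can_cut")]
precedent = {"has_motor", "has_blade", "can_cut", "hums"}
residue = prune_explainable(precedent, rules)
# residue == {"has_motor", "has_blade", "hums"}
```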
Abstract:
74 Front Street is intended as a musical response to the work of American sculptor Fred Sandback (1943-2003), in particular his string sculptures which are intentionally uncomplicated, minimal and strikingly beautiful. The relationship between these string works and music is rather distinct; indeed, before formally studying sculpture Sandback made a number of string instruments. He has said that his string works evolved from a desire to rid art of excessive decoration and, while there are instances of activity in 74 Front Street, much of the piece aims to reflect the simplicity of Sandback’s work.
Abstract:
Case-Based Reasoning (CBR) uses past experiences to solve new problems. The quality of the past experiences, which are stored as cases in a case base, is a big factor in the performance of a CBR system. The system's competence may be improved by adding problems to the case base after they have been solved and their solutions verified to be correct. However, from time to time, the case base may have to be refined to reduce redundancy and to get rid of any noisy cases that may have been introduced. Many case base maintenance algorithms have been developed to delete noisy and redundant cases. However, different algorithms work well in different situations and it may be difficult for a knowledge engineer to know which one is the best to use for a particular case base. In this thesis, we investigate ways to combine algorithms to produce better deletion decisions than the decisions made by individual algorithms, and ways to choose which algorithm is best for a given case base at a given time. We analyse five of the most commonly-used maintenance algorithms in detail and show how the different algorithms perform better on different datasets. This motivates us to develop a new approach: maintenance by a committee of experts (MACE). MACE allows us to combine maintenance algorithms to produce a composite algorithm which exploits the merits of each of the algorithms that it contains. By combining different algorithms in different ways we can also define algorithms that have different trade-offs between accuracy and deletion. While MACE allows us to define an infinite number of new composite algorithms, we still face the problem of choosing which algorithm to use. To make this choice, we need to be able to identify properties of a case base that are predictive of which maintenance algorithm is best. We examine a number of measures of dataset complexity for this purpose. These provide a numerical way to describe a case base at a given time. We use the numerical description to develop a meta-case-based classification system. This system uses previous experience about which maintenance algorithm was best to use for other case bases to predict which algorithm to use for a new case base. Finally, we give the knowledge engineer more control over the deletion process by creating incremental versions of the maintenance algorithms. These incremental algorithms suggest one case at a time for deletion rather than a group of cases, which allows the knowledge engineer to decide whether or not each case in turn should be deleted or kept. We also develop incremental versions of the complexity measures, allowing us to create an incremental version of our meta-case-based classification system. Since the case base changes after each deletion, the best algorithm to use may also change. The incremental system allows us to choose which algorithm is the best to use at each point in the deletion process.
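A minimal Python sketch of the committee idea behind MACE, with invented toy experts standing in for real maintenance algorithms such as the five analysed in the thesis.

```python
from typing import Callable, Dict, List, Set

# An "expert" is any maintenance algorithm mapping a case base to the set of
# case indices it votes to delete (signature invented for this sketch).
Expert = Callable[[List[dict]], Set[int]]

def committee_delete(case_base: List[dict], experts: List[Expert],
                     quorum: int) -> Set[int]:
    """Delete a case only if at least `quorum` experts vote to delete it."""
    votes: Dict[int, int] = {}
    for expert in experts:
        for idx in expert(case_base):
            votes[idx] = votes.get(idx, 0) + 1
    return {idx for idx, n in votes.items() if n >= quorum}

# Toy experts: one flags even-indexed cases, one flags cases marked noisy.
expert_a = lambda cb: {i for i in range(len(cb)) if i % 2 == 0}
expert_b = lambda cb: {i for i, c in enumerate(cb) if c.get("noisy")}
case_base = [{"noisy": False}, {"noisy": True}, {"noisy": True}]
print(committee_delete(case_base, [expert_a, expert_b], quorum=2))  # {2}
```

Raising or lowering the quorum is one simple way to obtain composite algorithms with different trade-offs between accuracy and deletion.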
Abstract:
This article aims to investigate contemporary cultural representations of the Beothuk Indians in art, literature and museum displays in Newfoundland, Canada, focussing on ways these reimagine the past for the present, offering perspectives on contested histories, such as the circumstances leading to the demise of the Beothuk. Wiped out through the impact of colonialism, the Beothuk are the 'absent other' who continue to be remembered and made present through the creative arts, largely at the expense of other indigenous groups on the island. Rather than focussing on the 'non-absent past', according to Polish scholar Ewa Domańska, 'instead we turn to a past that is somehow still present, that will not go away or, rather, that of which we cannot rid ourselves' (2006, 346). Depictions of the last Beothuk are part of a cultural remembering where guilt and reconciliation are played out through media of the imagination.
Abstract:
Bacterial outer membrane vesicles (OMVs) are spherical buds of the outer membrane (OM) containing periplasmic lumenal components. OMVs have been demonstrated to play a critical part in the transmission of virulence factors and immunologically active compounds, and in bacterial survival; however, vesiculation also appears to be a ubiquitous physiological process for Gram-negative bacteria. Despite their characterized biological roles, especially for pathogens, very little is known about their importance for the originating organism, or about the regulation and mechanism of their production. Only when we have established their biogenesis can we fully uncover their roles in pathogenesis and bacterial physiology. The overall goal of this research was to characterize bacterial mutants which display altered vesiculation phenotypes using genetic and biochemical techniques, and thereby begin to elucidate the mechanism of vesicle production and its regulation. One part of this work elucidated a synthetic genetic growth defect for a strain combining reduced OMV production (ΔnlpA, an inner membrane lipoprotein with a minor role in methionine transport) and envelope stress (ΔdegP, a dual-function periplasmic chaperone/protease responsible for managing proteinaceous waste). This research showed that the growth defect of ΔnlpAΔdegP correlated with reduced OMV production with respect to the hypervesiculator ΔdegP, and with the accumulation of protein in the periplasm and of DegP substrates in the lumen of OMVs. We further demonstrated that OMVs act as a stress response pathway that rids the periplasm not only of otherwise damaging misfolded protein but also of accumulated peptidoglycan (PG) fragments and lipopolysaccharide (LPS), establishing OMVs as a general stress response pathway critical for bacterial well-being. The second part of this work focused on the role of PG structure, turnover, and covalent crosslinks to the OM in vesiculation. We established a direct link between PG degradation and vesiculation: mutations in the OM lipoprotein nlpI had previously been established as a very strong hypervesiculation phenotype, and in the literature NlpI had been associated with another OM lipoprotein, Spr, recently identified as a PG hydrolase. The data presented here suggest that NlpI acts as a negative regulator of Spr and that the ΔnlpI hypervesiculation phenotype is a result of rampant degradation of PG by Spr. Additionally, we found that changes in PG structure and turnover correlate with altered vesiculation levels, and that non-canonical D-amino acids, which are secreted by numerous bacteria at the onset of stationary phase, are a natural factor that increases OMV production. Furthermore, we discovered an inverse relationship between the concentration of Lpp-mediated covalent crosslinks and the level of OMV production under conditions of modulated PG metabolism and structure. In contrast, in situations that lead to periplasmic accumulation (of protein, PG fragments, and LPS) and consequent hypervesiculation, the overall OM-PG crosslink concentration appears to be unchanged. From this work, we conclude that multiple pathways lead to OMV production: one that is Lpp concentration-dependent and bulk-driven, and one that is Lpp concentration-independent.
Abstract:
Environmental radon ((222)Rn) exposure is a human health concern, and many studies demonstrate that very low doses of high-LET alpha-particle irradiation initiate deleterious genetic consequences in both irradiated and non-irradiated bystander cells. One consequence, radiation-induced genomic instability (RIGI), is a hallmark of tumorigenesis and is often assessed by measuring delayed chromosomal aberrations. We utilised a technique that facilitates transient immobilization of primary lymphocytes for targeted microbeam irradiation and have reported that environmentally relevant doses, e.g. a single He-3(2+) particle traversal of a single cell, are sufficient to induce RIGI. Herein we sought to determine differences in radiation response in lymphocytes isolated from five healthy male donors. Primary lymphocytes were irradiated with a single particle per cell nucleus. We found evidence for inter-individual variation in radiation response (RIGI, measured as delayed chromosome aberrations). Although this was not highly significant, it was possibly masked by high levels of intra-individual variation. While there are many studies showing a link between genetic predisposition and RIGI, there are few studies linking genetic background with bystander effects in normal human lymphocytes. In an attempt to investigate inter-individual variation in the induction of bystander effects, primary lymphocytes were irradiated with a single particle under conditions where fractions of the population were traversed. We showed a marked genotype-dependent bystander response in one donor after exposure of 15% of the population. The findings may also be regarded as a radiation-induced, genotype-dependent bystander effect triggering an instability phenotype. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
In [M. Herty, A. Klein, S. Moutari, V. Schleper, and G. Steinaur, IMA J. Appl. Math., 78(5), 1087–1108, 2013] and [M. Herty and V. Schleper, ZAMM J. Appl. Math. Mech., 91, 763–776, 2011], a macroscopic approach derived from fluid-dynamics models was introduced to infer traffic conditions prone to road traffic collisions along highway sections. In these studies, the governing equations are coupled within an Eulerian framework, which assumes fixed interfaces between the models. A coupling in Lagrangian coordinates would enable us to get rid of this (not very realistic) assumption. In this paper, we investigate the well-posedness and the suitability of coupling the governing equations within the Lagrangian framework, and we illustrate some features of the proposed approach through numerical simulations.
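For orientation, the coordinate change can be written out for a standard first-order (LWR-type) traffic model; this is a generic sketch in the usual notation, not the specific coupled system studied in the paper.

```latex
% Eulerian form: density \rho(x,t) with a velocity law v(\rho)
\partial_t \rho + \partial_x \bigl( \rho \, v(\rho) \bigr) = 0 .
% Lagrangian (mass) coordinates: y = \int^x \rho(\xi,t)\, d\xi ,\quad \tau = 1/\rho
\partial_t \tau - \partial_y v(1/\tau) = 0 .
```

In the mass coordinate y, the interfaces between the coupled models travel with the vehicles rather than staying fixed in space, which is what makes the Lagrangian coupling attractive.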