962 results for Direct digital detector images


Relevance:

30.00%

Publisher:

Abstract:

Images acquired from unmanned aerial vehicles (UAVs) can provide data with unprecedented spatial and temporal resolution for three-dimensional (3D) modeling. Solutions developed for this purpose mainly operate on photogrammetry concepts, namely UAV-Photogrammetry Systems (UAV-PS). Such systems are used in applications where both geospatial and visual information about the environment is required. These applications include, but are not limited to, natural resource management such as precision agriculture, military and police services such as traffic-law enforcement, precision engineering such as infrastructure inspection, and health services such as epidemic emergency management. UAV-photogrammetry systems can be differentiated by their spatial characteristics in terms of accuracy and resolution. That is, some applications, such as precision engineering, require high-resolution and high-accuracy information about the environment (e.g. 3D modeling with better than one-centimeter accuracy and resolution). In other applications, lower levels of accuracy might be sufficient (e.g. wildlife management, where a few decimeters of resolution can suffice). Even in those applications, however, the specific characteristics of UAV-PSs should be considered carefully during both system development and application in order to yield satisfactory results. In this regard, this thesis presents a comprehensive review of the applications of unmanned aerial imagery, with the objective of determining the challenges that remote-sensing applications of UAV systems currently face. This review also made it possible to identify the specific characteristics and requirements of UAV-PSs, which are mostly ignored or not thoroughly assessed in recent studies. Accordingly, the focus of the first part of this thesis is on exploring the methodological and experimental aspects of implementing a UAV-PS.
The developed system was extensively evaluated for precise modeling of an open-pit gravel mine and for performing volumetric-change measurements. This application was selected for two main reasons. Firstly, this case study provided a challenging environment for 3D modeling in terms of scale changes, terrain relief variations, and structure and texture diversity. Secondly, open-pit-mine monitoring demands high levels of accuracy, which justified our efforts to push the developed UAV-PS to its maximum capacity. The hardware of the system consisted of an electric-powered helicopter, a high-resolution digital camera, and an inertial navigation system. The software of the system included in-house programs designed for camera calibration, platform calibration, system integration, onboard data acquisition, flight planning and ground control point (GCP) detection. The detailed features of the system are discussed in the thesis, and solutions are proposed to enhance the system and its photogrammetric outputs. The accuracy of the results was evaluated under various mapping conditions, including direct georeferencing and indirect georeferencing with different numbers, distributions and types of ground control points. Additionally, the effects of imaging configuration and network stability on modeling accuracy were assessed. The second part of this thesis concentrates on improving the techniques of sparse and dense reconstruction. The proposed solutions are alternatives to traditional aerial photogrammetry techniques, adapted to the specific characteristics of unmanned, low-altitude imagery. Firstly, a method was developed for robust sparse matching and epipolar-geometry estimation. The main achievement of this method is its capacity to handle a very high percentage of outliers (errors among corresponding points) with remarkable computational efficiency compared with state-of-the-art techniques.
Secondly, a block bundle adjustment (BBA) strategy was proposed based on integrating intrinsic camera calibration parameters as pseudo-observations into a Gauss-Helmert model. The principal advantage of this strategy is that it controls the adverse effect of unstable imaging networks and noisy image observations on the accuracy of self-calibration. A sparse implementation of this strategy was also developed, which allows its application to data sets containing large numbers of tie points. Finally, the concept of intrinsic curves was revisited for dense stereo matching. The proposed technique achieves a high level of accuracy and efficiency by searching only a small fraction of the whole disparity search space and by internally handling occlusions and matching ambiguities. These photogrammetric solutions were extensively tested using synthetic data, close-range images and the images acquired from the gravel-pit mine. An absolute 3D mapping accuracy of 11±7 mm demonstrated the success of this system for high-precision modeling of the environment.
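The robust matching step is easiest to appreciate through the outlier problem it solves. Below is a generic RANSAC sketch on a toy 2-D line-fitting task with 80% outliers; all data, thresholds and iteration counts are illustrative assumptions, not the thesis's actual epipolar-geometry estimator:

```python
import numpy as np

# Toy data: 20% inliers on the line y = 2x + 1, 80% uniform outliers,
# mimicking a correspondence set dominated by mismatches.
rng = np.random.default_rng(42)
n_in, n_out = 40, 160
x = rng.uniform(0, 10, n_in)
inliers = np.column_stack([x, 2.0 * x + 1.0 + rng.normal(0, 0.05, n_in)])
outliers = rng.uniform(0, 20, (n_out, 2))
pts = np.vstack([inliers, outliers])

best, best_count = (0.0, 0.0), 0
for _ in range(500):
    i, j = rng.choice(len(pts), 2, replace=False)
    (x1, y1), (x2, y2) = pts[i], pts[j]
    if abs(x2 - x1) < 1e-9:
        continue
    a = (y2 - y1) / (x2 - x1)          # model from a minimal sample
    b = y1 - a * x1
    resid = np.abs(pts[:, 1] - (a * pts[:, 0] + b))
    count = int((resid < 0.2).sum())   # size of the consensus set
    if count > best_count:
        best, best_count = (a, b), count

# refit by least squares on the final consensus set
a, b = best
mask = np.abs(pts[:, 1] - (a * pts[:, 0] + b)) < 0.2
a, b = np.polyfit(pts[mask, 0], pts[mask, 1], 1)
```

The same sample-score-refit loop underlies robust fundamental-matrix estimation, with the two-point line model replaced by a minimal set of point correspondences.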


Mammography equipment must be evaluated to ensure that images will be of acceptable diagnostic quality at the lowest radiation dose. Quality Assurance (QA) aims to provide systematic and constant improvement through a feedback mechanism addressing technical, clinical and training aspects. Quality Control (QC), in relation to mammography equipment, comprises a series of tests to determine equipment performance characteristics. The introduction of digital technologies prompted changes in QC tests and protocols, and some tests are specific to each manufacturer. Within each country, specific QC tests should comply with regulatory requirements and guidance. Ideally, one mammography practitioner should take overarching responsibility for QC within a service, with all practitioners having responsibility for the actual QC testing. All QC results must be documented to facilitate troubleshooting, internal audit and external assessment. Generally speaking, the practitioner's role includes performing, interpreting and recording the QC tests as well as reporting any results outside action limits to their service lead. Practitioners must undertake additional continuous professional development to maintain their QC competencies. They are usually supported by technicians and medical physicists; in some countries the latter are mandatory. Technicians and/or medical physicists often perform many of the tests indicated within this chapter. It is important to recognise that this chapter is an attempt to encompass the main tests performed within European countries. You must familiarise yourself with, and adhere to, the specific tests related to the service in which you work.


Phase change problems arise in many practical applications such as air-conditioning and refrigeration, thermal energy storage systems and thermal management of electronic devices. The physical phenomena in such applications are complex and often difficult to study in detail with experimental techniques alone. Efforts to improve computational techniques for analyzing two-phase flow problems with phase change are therefore gaining momentum. The development of numerical methods for multiphase flow has generally been motivated by the need to account more accurately for (a) large topological changes such as phase breakup and merging, (b) sharp representation of the interface and its discontinuous properties and (c) accurate and mass-conserving motion of the interface. In addition to these considerations, numerical simulation of multiphase flow with phase change introduces additional challenges related to discontinuities in the velocity and temperature fields. Moreover, the velocity field is no longer divergence free. For phase change problems, the focus of developmental efforts has thus been on numerically attaining proper conservation of energy across the interface, in addition to the accurate treatment of mass and momentum fluxes and the associated interface advection. Among the initial efforts related to the simulation of bubble growth in film boiling applications, the work in \cite{Welch1995} was based on an interface tracking method using a moving unstructured mesh. That study considered moderate interfacial deformations. A similar problem was subsequently studied using moving, boundary-fitted grids \cite{Son1997}, again for regimes of relatively small topological changes.
A hybrid interface tracking method with a moving interface grid overlapping a static Eulerian grid was developed in \cite{Juric1998} for the computation of a range of phase change problems, including three-dimensional film boiling \cite{esmaeeli2004computations}, multimode two-dimensional pool boiling \cite{Esmaeeli2004} and film boiling on horizontal cylinders \cite{Esmaeeli2004a}. The handling of interface merging and pinch-off, however, remains a challenge for methods that explicitly track the interface. As large topological changes are crucial for phase change problems, attention has turned in recent years to front capturing methods utilizing implicit interfaces, which are more effective in treating complex interface deformations. The VOF (Volume of Fluid) method was adopted in \cite{Welch2000} to simulate the one-dimensional Stefan problem and the two-dimensional film boiling problem. The approach employed a specific model for mass transfer across the interface involving a mass source term within cells containing the interface. This VOF-based approach was further coupled with the level set method in \cite{Son1998}, employing a smeared-out Heaviside function to avoid the numerical instability related to the source term. The coupled level set and volume of fluid method and the diffuse-interface approach were used for film boiling with water and R134a at near-critical pressure conditions \cite{Tomar2005}. The effects of superheat and saturation pressure on the frequency of bubble formation were analyzed with this approach. The work in \cite{Gibou2007} used the ghost fluid and level set methods for phase change simulations. A similar approach was adopted in \cite{Son2008} to study various boiling problems including three-dimensional film boiling on a horizontal cylinder, nucleate boiling in a microcavity \cite{lee2010numerical} and flow boiling in a finned microchannel \cite{lee2012direct}.
The work in \cite{tanguy2007level} also used the ghost fluid method and proposed an improved algorithm based on enforcing continuity and a divergence-free condition for the extended velocity field. The work in \cite{sato2013sharp} employed a multiphase model based on volume fraction with an interface sharpening scheme and derived a phase change model based on the local interface area and mass flux. Among the front capturing methods, sharp interface methods have been found to be particularly effective both for implementing sharp jumps and for resolving the interfacial velocity field. However, sharp velocity jumps render the solution susceptible to erroneous oscillations in pressure and also lead to spurious interface velocities. To implement phase change, the work in \cite{Hardt2008} employed point mass source terms derived from a physical basis for the evaporating mass flux. To avoid numerical instability, the authors smeared the mass source by solving a pseudo-time-step diffusion equation. This measure, however, led to mass conservation issues due to non-symmetric integration over the distributed mass source region. The problem of spurious pressure oscillations related to point mass sources was also investigated in \cite{Schlottke2008}. Although that method is based on VOF, the large pressure peaks associated with a sharp mass source were observed to be similar to those for the interface tracking method. Such spurious fluctuations in pressure are particularly undesirable because their effect is globally transmitted in incompressible flow. Hence, the pressure field arising from phase change needs to be computed with greater accuracy than is reported in the current literature. The accuracy of interface advection in the presence of an interfacial mass flux (mass flux conservation) has been discussed in \cite{tanguy2007level,tanguy2014benchmarks}. The authors found that the method of extending one phase velocity to the entire domain suggested by Nguyen et al.
in \cite{nguyen2001boundary} suffers from a lack of mass flux conservation when the density difference is high. To improve the solution, the authors impose a divergence-free condition on the extended velocity field by solving a constant-coefficient Poisson equation. The approach has shown good results for enclosed bubbles or droplets but does not generalize to more complex flows and requires the additional solution of a linear system of equations. In the current thesis, an improved approach that addresses both the numerical pressure oscillations and the spurious interface velocity field is presented, featuring (i) continuous velocity and density fields within a thin interfacial region and (ii) temporal velocity correction steps to avoid an unphysical pressure source term. I also propose (iii) a general mass flux projection correction for improved mass flux conservation. The pressure and temperature-gradient jump conditions are treated sharply. A series of one-dimensional and two-dimensional problems is solved to verify the performance of the new algorithm. Two-dimensional and cylindrical film boiling problems are also demonstrated and show good qualitative agreement with experimental observations and heat transfer correlations. Finally, a study of Taylor bubble flow with heat transfer and phase change in a small vertical tube in axisymmetric coordinates is carried out using the new multiphase phase-change method.
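For context, the one-dimensional Stefan problem cited above as a verification case can be sketched with an explicit finite-difference scheme and crude front tracking. All parameters are nondimensional assumptions, and this baseline is an illustration only, not the sharp-interface algorithm proposed here:

```python
import numpy as np

alpha, St = 1.0, 1.0          # thermal diffusivity, Stefan number (assumed)
N = 200
dx = 1.0 / N
dt = 0.2 * dx**2 / alpha      # within the explicit-diffusion stability limit
T = np.zeros(N + 1)
T[0] = 1.0                    # superheated wall; T = 0 is the melting point
s = 5 * dx                    # initial interface position

for _ in range(20000):
    i = int(s / dx)           # index of the node holding the interface
    Tn = T.copy()
    # diffuse heat in the liquid region only
    T[1:i] = Tn[1:i] + alpha * dt / dx**2 * (Tn[2:i+1] - 2*Tn[1:i] + Tn[:i-1])
    # interface energy balance: ds/dt = -St * alpha * dT/dx at x = s
    grad = (0.0 - Tn[i-1]) / (s - (i - 1) * dx)
    s += -St * alpha * grad * dt
    T[i:] = 0.0               # the solid stays at the melting temperature
```

The computed front position can be compared against the similarity solution s(t) = 2λ√(αt) to check convergence of any phase-change scheme on this benchmark.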


Recent developments in the general equilibrium theory of multinationals emphasize the importance of multilateral considerations. Yet existing explanations, and the corresponding estimations of FDI patterns, have largely limited political and institutional investment impediments to a bilateral framework. Through the application of spatial econometric techniques, I demonstrate that the presence of both domestic and regional political uncertainty generates real-options effects that lead to the delay or redirection of foreign direct investment. The magnitude and direction of these effects are conditional upon the host country's regime type and the predominant multinational integration strategies in the region. Comparing these results with FDI of U.S. origin, I find evidence of divergent investment behavior by U.S. multinationals during regime changes in partner countries. Additionally, I find no evidence that multinationals from developing countries are more likely to complete cross-border deals in environments characterized by greater political risk or political uncertainty.


In this paper, we demonstrate a digital signal processing (DSP) algorithm for improving the spatial resolution of images captured by CMOS cameras. The basic approach is to reconstruct a high-resolution (HR) image from a shift-related low-resolution (LR) image sequence. The aliasing relationship between the Fourier transforms of discrete and continuous images in the frequency domain is used to map LR images to an HR image. The method of projection onto convex sets (POCS) is applied to trace the best estimate of pixel matching from the LR images to the reconstructed HR image. Computer simulations and preliminary experimental results have shown that the algorithm works effectively for post-capture processing in CMOS cameras. It can also be applied to HR digital image reconstruction whenever the shift information of the LR image sequence is known.
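A one-dimensional, integer-shift simplification conveys the POCS idea. The 2x downsampling factor and the two known shifts are assumptions made for illustration; the paper's formulation handles general subpixel shifts via the frequency-domain aliasing relationship:

```python
import numpy as np

hr_true = np.sin(np.linspace(0, 4 * np.pi, 64))   # ground-truth HR signal

def observe(hr, shift, factor=2):
    """LR observation model: shift the HR signal, then downsample."""
    return np.roll(hr, shift)[::factor]

shifts = [0, 1]                                   # known relative shifts
lrs = [observe(hr_true, s) for s in shifts]

hr = np.zeros_like(hr_true)                       # initial HR estimate
for _ in range(10):
    for s, lr in zip(shifts, lrs):
        err = lr - observe(hr, s)                 # residual in LR space
        up = np.zeros_like(hr)
        up[::2] = err                             # place residual on HR grid
        hr += np.roll(up, -s)                     # project onto this set
```

Because the two shifted LR sequences jointly observe every HR sample, the successive projections recover the HR signal exactly in this toy setting.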


The goal of image retrieval and matching is to find and locate object instances in images from a large-scale image database. While visual features are abundant, how to combine them to improve performance over individual features remains a challenging task. In this work, we focus on leveraging multiple features for accurate and efficient image retrieval and matching. We first propose two graph-based approaches to rerank initially retrieved images for generic image retrieval. In the graph, vertices are images and edges are similarities between image pairs. Our first approach employs a mixture Markov model, based on a random walk over multiple graphs, to fuse the graphs. We introduce a probabilistic model to compute the importance of each feature for graph fusion under a naive Bayesian formulation, which requires statistics of similarities from a manually labeled dataset containing irrelevant images. To reduce human labeling, we further propose a fully unsupervised reranking algorithm based on a submodular objective function that can be efficiently optimized by a greedy algorithm. By maximizing an information gain term over the graph, our submodular function favors a subset of database images that are similar to the query images and resemble each other. The function also exploits the rank relationships of images across multiple ranked lists obtained with different features. We then study a more narrowly defined application, person re-identification, where the database contains labeled images of human bodies captured by multiple cameras. Re-identification across multiple cameras is treated as a set of related tasks in order to exploit shared information. We apply a novel multi-task learning algorithm using both low-level features and attributes. A low-rank attribute embedding is jointly learned within the multi-task learning formulation to embed the original binary attributes into a continuous attribute space, where incorrect and incomplete attributes are rectified and recovered.
To locate objects in images, we design an object detector based on object proposals and deep convolutional neural networks (CNNs), in view of the emergence of deep networks. We improve the Fast R-CNN framework and investigate two new strategies to detect objects accurately and efficiently: scale-dependent pooling (SDP) and cascaded rejection classifiers (CRC). SDP improves detection accuracy by exploiting appropriate convolutional features depending on the scale of the input object proposals. CRC effectively utilizes convolutional features and eliminates most negative proposals in a cascaded manner while maintaining a high recall for true objects. Together, the two strategies improve detection accuracy and reduce the computational cost.
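The graph-fusion step of the first approach can be sketched as a damped random walk over a mixture of row-normalized similarity graphs. The similarity values, the equal feature weights, and the PageRank-style damping here are illustrative assumptions; the thesis learns feature importance with a naive Bayesian model:

```python
import numpy as np

def transition(W):
    """Row-normalize a similarity matrix into a transition matrix."""
    return W / W.sum(axis=1, keepdims=True)

# Two toy similarity graphs over three database images (values assumed)
W1 = np.array([[0, 3, 1], [3, 0, 1], [1, 1, 0]], float)
W2 = np.array([[0, 1, 1], [1, 0, 3], [1, 3, 0]], float)
P = 0.5 * transition(W1) + 0.5 * transition(W2)   # mixture Markov chain

n, d = 3, 0.85                                    # damping as in PageRank
pi = np.full(n, 1 / n)
for _ in range(200):                              # power iteration
    pi = d * (pi @ P) + (1 - d) / n
# pi is the stationary distribution: the fused relevance ranking
```

Ranking database images by their stationary probability aggregates the evidence from both feature graphs into a single reranked list.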


Context: Mobile applications support a set of user-interaction features that are independent of the application logic. Rotating the device, scrolling, and zooming are examples of such features. Some bugs in mobile applications can be attributed to user-interaction features. Objective: This paper proposes and evaluates a bug analyzer based on user-interaction features that uses digital image processing to find bugs. Method: Our bug analyzer detects bugs by comparing the similarity between images taken before and after a user interaction. SURF, an interest point detector and descriptor, is used to compare the images. To evaluate the bug analyzer, we conducted a case study with 15 randomly selected mobile applications. First, we identified user-interaction bugs by manually testing the applications. Images were captured before and after applying each user-interaction feature. Then, image pairs were processed with SURF to obtain interest points, from which a similarity percentage was computed, to finally decide whether there was a bug. Results: We performed a total of 49 user-interaction feature tests. When manually testing the applications, 17 bugs were found, whereas image processing detected 15 bugs. Conclusions: 8 out of the 15 mobile applications tested had bugs associated with user-interaction features. Our image-processing-based bug analyzer was able to detect 88% (15 out of 17) of the user-interaction bugs found with manual testing.
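The decision step can be sketched as follows, with normalized cross-correlation standing in for the SURF-based similarity percentage and an assumed 0.6 threshold; the paper's actual detector, descriptor and threshold differ:

```python
import numpy as np

def similarity(before, after):
    """Normalized cross-correlation between two grayscale screenshots."""
    a = (before - before.mean()) / (before.std() + 1e-9)
    b = (after - after.mean()) / (after.std() + 1e-9)
    return float(np.mean(a * b))            # in [-1, 1]; 1 means identical

def has_bug(before, after, threshold=0.6):
    """Flag a user-interaction bug when the screens diverge too much."""
    return similarity(before, after) < threshold

rng = np.random.default_rng(1)
screen_before = rng.random((64, 64))        # stand-in "screenshots"
screen_after_ok = screen_before.copy()      # content survives the rotation
screen_after_bad = rng.random((64, 64))     # layout destroyed by the bug
```

An interest-point method such as SURF serves the same role while tolerating the translations and scalings that a correct rotation or zoom legitimately introduces.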


Advances in digital photography and distribution technologies enable many people to produce and distribute images of their sex acts. When teenagers do this, the photos and videos they create can be legally classified as child pornography, since the law makes no exception for youth who create sexually explicit images of themselves. The dominant discussions about teenage girls producing sexually explicit media (including sexting) are profoundly unproductive: (1) they blame teenage girls for creating private images that another person later maliciously distributed, and (2) they fail to respect—or even discuss—teenagers’ rights to freedom of expression. Cell phones and the internet make producing and distributing images extremely easy, providing widely accessible venues both for consensual sexual expression between partners and for sexual harassment. Dominant understandings view sexting as a troubling teenage trend created through the combination of camera phones and adolescent hormones and impulsivity, but this view often conflates consensual sexting between partners with the malicious distribution of a person’s private image, treating the two as essentially equivalent behaviors. In this project, I ask: what is the role of assumptions about teen girls’ sexual agency in these problematic understandings of sexting that blame victims and deny teenagers’ rights? In contrast to the popular media panic about online predators and the familiar accusation that youth are wasting their leisure time by using digital media, some people champion the internet as a democratic space that offers young people the opportunity to explore identities and develop social and communication skills. Yet, when teen girls’ sexuality enters this conversation, all this debate and discussion narrows to a problematic consensus.
The optimists about adolescents and technology fall silent, and the argument that media production is inherently empowering for girls does not seem to apply to a girl who produces a sexually explicit image of herself. Instead, feminist, popular, and legal commentaries assert that she is necessarily a victim: of a “sexualized” mass media, pressure from her male peers, digital technology, her brain structures or hormones, or her own low self-esteem and misplaced desire for attention. Why and how are teenage girls’ sexual choices produced as evidence of their failure or success in achieving Western liberal ideals of self-esteem, resistance, and agency? Since mass media and policy reactions to sexting have so far been overwhelmingly sexist and counter-productive, it is crucial to interrogate the concepts and assumptions that characterize mainstream understandings of sexting. I argue that the common sense that is co-produced by law and mass media underlies the problematic legal and policy responses to sexting. Analyzing a range of nonfiction texts including newspaper articles, talk shows, press releases, public service announcements, websites, legislative debates, and legal documents, I investigate gendered, racialized, age-based, and technologically determinist common sense assumptions about teenage girls’ sexual agency. I examine the consensus and continuities that exist between news, nonfiction mass media, policy, institutions, and law, and describe the limits of their debates. I find that this early 21st century post-feminist girl-power moment not only demands that girls live up to gendered sexual ideals but also insists that actively choosing to follow these norms is the only way to exercise sexual agency. This is the first study to date examining the relationship of conventional wisdom about digital media and teenage girls’ sexuality to both policy and mass media.


Understanding and measuring the interaction of light with sub-wavelength structures and atomically thin materials is of critical importance for the development of next-generation photonic devices. One approach to achieving the desired optical properties in a material is to manipulate its mesoscopic structure or its composition in order to affect the properties of the light-matter interaction. There has been tremendous recent interest in so-called two-dimensional materials, consisting of only a single to a few layers of atoms arranged in a planar sheet. These materials have demonstrated great promise as a platform for studying unique phenomena arising from the low dimensionality of the material and for developing new types of devices based on these effects. A thorough investigation of the optical and electronic properties of these new materials is essential to realizing their potential. In this work we present studies that explore the nonlinear optical properties and carrier dynamics in nanoporous silicon waveguides, two-dimensional graphite (graphene), and atomically thin black phosphorus. We first present an investigation of the nonlinear response of nanoporous silicon optical waveguides using a novel pump-probe method. A two-frequency heterodyne technique is developed in order to measure the pump-induced transient change in phase and intensity in a single measurement. The experimental data reveal a characteristic material response time and temporally resolved intensity and phase behavior matching a physical model dominated by free-carrier effects that are significantly stronger and faster than those observed in traditional silicon-based waveguides. These results shed light on the large optical nonlinearity observed in nanoporous silicon and demonstrate a new measurement technique for heterodyne pump-probe spectroscopy.
Next we explore the optical properties of low-doped graphene in the terahertz spectral regime, where both intraband and interband effects play a significant role. Probing the graphene at intermediate photon energies enables the investigation of its nonlinear optical properties as the electron system is heated by the intense pump pulse. By simultaneously measuring the reflected and transmitted terahertz light, a precise determination of the pump-induced change in absorption can be made. We observe that as the intensity of the terahertz radiation is increased, the optical properties of the graphene change from interband, semiconductor-like absorption to a more metallic behavior with increased intraband processes. This transition reveals itself in our measurements as an increase in the terahertz transmission through the graphene at low fluence, followed by a decrease in transmission and the onset of a large, photo-induced reflection as the fluence is increased. A hybrid optical-thermodynamic model successfully describes our observations and predicts that this transition will persist across mid- and far-infrared frequencies. This study further demonstrates the important role that reflection plays, since the absorption saturation intensity (an important figure of merit for graphene-based saturable absorbers) can be underestimated if only the transmitted light is considered. These findings are expected to contribute to the development of new optoelectronic devices designed to operate in the mid- and far-infrared frequency range. Lastly we discuss recent work with black phosphorus, a two-dimensional material that has recently attracted interest due to its high mobility and direct, configurable band gap (300 meV to 2 eV, depending on the number of atomic layers comprising the sample). In this work we examine the pump-induced change in optical transmission of mechanically exfoliated black phosphorus flakes using a two-color optical pump-probe measurement.
The time-resolved data reveal a fast pump-induced transparency accompanied by a slower absorption, which we attribute to Pauli blocking and free-carrier absorption, respectively. Polarization studies show that these effects are also highly anisotropic, underscoring the importance of crystal orientation in the design of optical devices based on this material. We conclude our discussion of black phosphorus with a study that employs this material as the active element in a photoconductive detector capable of gigahertz-class detection of mid-infrared frequencies at room temperature.


Hand detection in images has important applications in recognizing human activities. This thesis focuses on the PASCAL Visual Object Classes (VOC) system for hand detection. VOC has become a popular framework for object detection, based on twenty common object classes, and a successful deformable parts model was released with VOC2007. A hand detection is counted when the system outputs a bounding box that overlaps at least 50% with some ground-truth hand bounding box in the image. The initial average precision of this detector is around 0.215, compared with a state-of-the-art value of 0.104; however, color and frequency features of detected bounding boxes contain important information for re-scoring, and the average precision can be improved to 0.218 with these features. Results show that these features help achieve higher precision at low recall, even though the average precision is similar.
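The 50% overlap criterion above is the standard VOC intersection-over-union test; a minimal sketch, with boxes represented as (x1, y1, x2, y2) corner tuples:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / union if union else 0.0

def is_detection(pred, ground_truths, thresh=0.5):
    """A predicted box counts as a detection if it overlaps any
    ground-truth box with IoU of at least the threshold."""
    return any(iou(pred, g) >= thresh for g in ground_truths)
```

Average precision is then computed by sweeping the detector's confidence threshold and scoring each box with this rule.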


We present the measurement of a rare Standard Model process, pp → W±γγ, for the leptonic decays of the W±. The measurement is made with 19.4 fb−1 of 8 TeV data collected in 2012 by the CMS experiment. The measured cross section is consistent with the Standard Model prediction and has a significance of 2.9σ. Limits are placed on dimension-8 Effective Field Theory operators describing anomalous quartic gauge couplings. The analysis is particularly sensitive to the fT,0 coupling, and a 95% confidence limit is placed at −35.9 < fT,0/Λ4 < 36.7 TeV−4. Studies of the pp → Zγγ process are also presented. The Zγγ signal is consistent with the Standard Model and has a significance of 5.9σ.


Photosynthesis, the conversion of sunlight to chemical energy, is fundamental for supporting life on our planet. Despite its importance, the physical principles that underpin the primary steps of photosynthesis, from photon absorption to electronic charge separation, remain to be understood in full. Electronic coherence within tightly packed light-harvesting (LH) units or within individual reaction centers (RCs) has been recognized as an important ingredient for a complete understanding of excitation energy transfer (EET) dynamics. However, electronic coherence across units (RC and LH, or LH and LH) has been consistently neglected, as it does not play a significant role during these relatively slow transfer processes. Here, we turn our attention to the absorption process, which, as we will show, has a much shorter built-in timescale. We demonstrate that the often-overlooked, spatially extended but short-lived excitonic delocalization plays a relevant role in general photosynthetic systems. Most strikingly, we find that absorption intensity is, quite generally, redistributed from the LH units to the RC, increasing the number of excitations that can effect charge separation without further transfer steps. A biomimetic nano-system is proposed which is predicted to funnel excitation to the RC analogue, and hence is a first step towards exploiting these design principles for efficient artificial light-harvesting.


A computer vision system that has to interact in natural language needs to understand the visual appearance of interactions between objects along with the appearance of the objects themselves. Relationships between objects are frequently mentioned in queries for tasks like semantic image retrieval, image captioning, visual question answering and natural language object detection. Hence, it is essential to model context between objects when solving these tasks. In the first part of this thesis, we present a technique for detecting an object mentioned in a natural language query. Specifically, we work with referring expressions, which are sentences that identify a particular object instance in an image. In many referring expressions, an object is described in relation to another object using prepositions, comparative adjectives, action verbs, etc. Our proposed technique can identify both the referred object and the context object mentioned in such expressions. Context is also useful for incrementally understanding scenes and videos. In the second part of this thesis, we propose techniques for searching for objects in an image and for events in a video. Our proposed incremental algorithms use the context from previously explored regions to prioritize the regions to explore next. The advantage of incremental understanding is that it restricts the computation time and/or resources spent on various detection tasks. Our first technique shows how to learn context in indoor scenes in an implicit manner and use it when searching for objects. The second shows how explicitly written context rules for one-on-one basketball can be used to sequentially detect events in a game.
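The context-driven prioritization in the incremental algorithms can be sketched as a best-first search whose queue is reweighted by scores from already-explored regions. The scoring function, neighborhood structure and bonus weight below are illustrative assumptions, not the learned context models of the thesis:

```python
import heapq

def incremental_search(seed, score, neighbors, budget):
    """Explore regions best-first; promising regions boost the priority
    of their neighbors (context), so computation focuses near likely hits."""
    heap = [(-score(seed), seed)]
    seen, found, bonus = set(), [], {}
    while heap and len(found) < budget:
        _, r = heapq.heappop(heap)
        if r in seen:
            continue
        seen.add(r)
        s = score(r) + bonus.get(r, 0.0)
        found.append((r, s))
        for n in neighbors(r):
            if n not in seen:
                bonus[n] = bonus.get(n, 0.0) + 0.5 * s   # context bonus
                heapq.heappush(heap, (-(score(n) + bonus[n]), n))
    return found

# toy 1-D "image": ten regions, an object at region 7
hits = incremental_search(
    seed=0,
    score=lambda r: 1.0 if r == 7 else 0.1,
    neighbors=lambda r: [n for n in (r - 1, r + 1) if 0 <= n <= 9],
    budget=10,
)
```

Stopping the loop early (a smaller budget) is exactly the resource-restriction benefit of incremental understanding: the search concentrates evaluations near regions that context has already flagged as promising.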

Resumo:

This research analyzed the environmental impact on the hydrographic microbasin of the Parafuso stream in Moju county, Pará State, Amazon (Brazil). Data were obtained using digital images, documentary research, questionnaires, semi-structured interviews, direct observation and participatory mapping. The results showed that anthropogenic actions and unplanned population growth, associated with the unplanned use of natural resources, have caused intense degradation of the physical, biological and anthropogenic environment. The identified springs of the Parafuso stream are diffuse, temporary and altered, and the Parafuso stream network was classified as second order. Most of the environmental impacts identified are of an adverse character, of great importance, high magnitude and long duration. The physical environment is the most impacted, and the main impacting activity is agriculture, which causes long-term damage of high magnitude and importance to the physical and biological environment.

Resumo:

The State of Paraíba is one of the most dynamic states of Brazil. Strategically located in the northeast, it is notable for its excellent potential for integrating different transportation modes with the neighbouring states of Rio Grande do Norte, Pernambuco and Alagoas. Port activity brings change to the space where the port is installed, and the elements of this space suffer increasing direct or indirect influences as port traffic expands. The region has therefore become subject to accidental oil spills, since it experiences heavy traffic of ships of various sizes that can run aground or collide, causing accidental spill events. The study of the geomorphological and sedimentological composition of the seafloor becomes more valuable as more is learned about the relationships between these parameters and the associated fauna, making it possible to identify preferred habitats. The seafloor database, acoustically collected along the proposed study area, holds a wealth of information, which was duly examined, catalogued and made available. Such information can serve as an important tool, providing a geomorphological and sedimentary survey of the area studied, and can flexibly support future decision making. Taking the Port of Cabedelo, Paraíba (Brazil) as the study area, this research aimed to evaluate the influence of surface and bottom tidal currents in shaping the seabed, including acquiring information on the location and depth of submerged rocky bodies, which may act as natural traps for oil in case of spills, and to establish the relationship between bed types and the hydrodynamic conditions present in the region.
In this context, bathymetric (depth), physical oceanographic (water-column height, water temperature, current intensity and direction, waves and turbidity) and meteorological (rainfall, air temperature, humidity, winds and barometric pressure) data were collected for the access channel to the Port of Cabedelo/PB and its turning basin (where cruise ships dock). The study also employed remote-sensing tools (Landsat 7 ETM+, 2001), so that the images and results could be integrated into Geographic Information Systems and used to elaborate measures aimed at the environmental protection of areas under the influence of the port facilities, serving as a basis for a contingency plan in case of oil spills in the region. The main findings highlight the value of combining hydroacoustic data acquisition with high- and low-frequency bathymetric surveys. From these data, five bathymetric charts were prepared following the standard of the Directorate of Hydrography and Navigation (DHN), with depths in meters, at a scale of 1:2500 (access channel and turning basin of the Port of Cabedelo); they reveal an extensive area of possible beachrocks that hinder the movement of vessels in the port area and can cause collisions, groundings and oil leaks. The scatter diagram of the current vectors shows that the tidal current undergoes a channeling effect, caused by the bidirectional tide (ebb and flood), in the turning basin of the Port of Cabedelo along the NW-SE direction, and that the highest current speeds occur at low tide. The meteorological characterization for the period from 28/02 to 04/07/2010 showed values within the averages expected for the study region. The multidisciplinary integration of products (digital maps and remote-sensing images) proved efficient for characterizing the underwater geomorphology of the study area, achieving the aim of discriminating and enhancing submerged structures previously not visible in the images.
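The dominant (channelized) flow axis read off a scatter diagram of current vectors can be estimated as the principal axis of the (u, v) velocity components; the sketch below uses synthetic currents aligned along a hypothetical NW-SE axis, not the survey measurements:

```python
import numpy as np

# Synthetic current samples: strong along-channel flow, weak cross-channel
# flow, rotated onto an assumed NW-SE axis (135 degrees, illustrative only).
rng = np.random.default_rng(0)
theta = np.deg2rad(135.0)
along = rng.normal(0.0, 0.40, 500)    # along-channel speed (m/s), synthetic
cross = rng.normal(0.0, 0.05, 500)    # cross-channel speed (m/s), synthetic
u = along * np.cos(theta) - cross * np.sin(theta)   # east component
v = along * np.sin(theta) + cross * np.cos(theta)   # north component

# Principal axis of the (u, v) scatter = dominant bidirectional flow axis.
cov = np.cov(u, v)
eigvals, eigvecs = np.linalg.eigh(cov)
principal = eigvecs[:, np.argmax(eigvals)]          # main-axis unit vector
axis_deg = np.degrees(np.arctan2(principal[1], principal[0])) % 180.0
print(f"dominant axis: {axis_deg:.0f} degrees (bidirectional)")
```

Reducing the angle modulo 180 degrees reflects that an ebb-and-flood channel defines an axis, not a single direction.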