951 results for Digital elevation model


Relevância:

30.00%

Publicador:

Resumo:

ABSTRACT: In recent years, geotechnologies such as remote and proximal sensing, together with attributes derived from digital elevation models, have proved very useful for describing soil variability. However, these information sources are rarely used together. Therefore, a methodology for assessing and spatializing soil classes using information obtained from remote/proximal sensing, GIS and expert knowledge was applied and evaluated. Two study areas in the State of São Paulo, Brazil, totaling approximately 28,000 ha, were used in this work. First, in one area (area 1), conventional pedological mapping was carried out, and from the soil classes found, patterns were derived from the following information: a) spectral information (shape of features and absorption intensity of spectral curves in the 350-2,500 nm wavelength range) of soil samples collected at specific points in the area (according to each soil type); b) equations for determining chemical and physical soil properties, obtained by relating the levels of chemical and physical attributes measured in the laboratory by conventional methods to the spectral data; c) supervised classification of Landsat 5 TM images to detect changes in soil particle size (soil texture); d) the relationship between soil classes and relief attributes. Subsequently, the patterns obtained were applied in area 2 to derive a pedological classification of its soils, but within a GIS (ArcGIS). Finally, a conventional pedological map of area 2 was produced and compared with the digital map, i.e., the one obtained only with the predetermined patterns. The proposed methodology achieved 79% accuracy at the first categorical level of the Soil Classification System and 60% at the second categorical level, and became less useful at categorical level 3 (37% accuracy).
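Step (b) above — relating laboratory-measured soil attributes to spectral data — can be sketched as an ordinary least-squares calibration. The spectra, attribute values and dimensions below are synthetic placeholders, not data from the study (which would also more likely use PLS or another regularized method given more wavelengths than samples):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic reflectance spectra: 60 samples x 200 wavelengths
# (placeholder for the 350-2,500 nm range).
spectra = rng.uniform(0.1, 0.6, size=(60, 200))

# Hypothetical linear relation between spectra and a soil attribute (e.g. clay content).
true_coef = rng.normal(0.0, 1.0, size=200)
clay = spectra @ true_coef + rng.normal(0.0, 0.05, size=60)

# Fit a linear calibration (design matrix with intercept) by least squares.
X = np.column_stack([np.ones(len(spectra)), spectra])
coef, *_ = np.linalg.lstsq(X, clay, rcond=None)

# Predict the attribute for a new sample from its spectrum alone.
new_spectrum = rng.uniform(0.1, 0.6, size=200)
prediction = coef[0] + new_spectrum @ coef[1:]
```

With more wavelengths than samples the system is underdetermined, which is why chemometric practice prefers partial least squares or ridge-type regularization over plain least squares.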

Relevância:

30.00%

Publicador:

Resumo:

The goal of this work is to develop a method to objectively compare the performance of a digital and a screen-film mammography system in terms of image quality. The method takes into account the dynamic range of the image detector, the detection of high and low contrast structures, the visualisation of the images and the observer response. A test object, designed to represent a compressed breast, was constructed from various tissue equivalent materials ranging from purely adipose to purely glandular composition. Different areas within the test object permitted the evaluation of low and high contrast detection, spatial resolution and image noise. All the images (digital and conventional) were captured using a CCD camera to include the visualisation process in the image quality assessment. A mathematical model observer (non-prewhitening matched filter), that calculates the detectability of high and low contrast structures using spatial resolution, noise and contrast, was used to compare the two technologies. Our results show that for a given patient dose, the detection of high and low contrast structures is significantly better for the digital system than for the conventional screen-film system studied. The method of using a test object with a large tissue composition range combined with a camera to compare conventional and digital imaging modalities can be applied to other radiological imaging techniques. In particular it could be used to optimise the process of radiographic reading of soft copy images.
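The non-prewhitening (NPW) matched-filter observer mentioned above scores detectability by using the expected signal itself as the detection template; for white noise the detectability index reduces to the signal energy over the noise level. A minimal sketch with a synthetic low-contrast disc (contrast, radius and noise level are all hypothetical, not values from the study):

```python
import numpy as np

# Synthetic expected signal: a low-contrast disc on a 64x64 patch (hypothetical values).
y, x = np.mgrid[:64, :64]
signal = 0.02 * ((x - 32) ** 2 + (y - 32) ** 2 < 10 ** 2)  # contrast 0.02, radius 10 px

sigma = 0.01  # pixel noise standard deviation (white-noise assumption)

# NPW observer: the template equals the expected signal (no noise prewhitening).
# d'^2 = (s.s)^2 / (s.K.s); for white noise K = sigma^2 * I, so d' = ||s|| / sigma.
s = signal.ravel()
d_prime = np.sqrt(s @ s) / sigma
```

The index grows with contrast and disc area and shrinks with noise, which is the trade-off behind the dose-dependent comparison of the two systems.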

Relevância:

30.00%

Publicador:

Resumo:

Mountain ecosystems will likely be affected by global warming during the 21st century, with substantial biodiversity loss predicted by species distribution models (SDMs). Depending on the geographic extent, elevation range and spatial resolution of the data used in building these models, different rates of habitat loss have been predicted, with an associated risk of species extinction. Few coordinated across-scale comparisons have been made using data of different resolution and geographic extent. Here, we assess whether climate-change-induced habitat losses predicted at the European scale (10x10' grid cells) are also predicted from local-scale data and modeling (25x25 m grid cells) in two regions of the Swiss Alps. We show that local-scale models predict persistence of suitable habitats for up to 100% of the species that were predicted by a European-scale model to lose all their suitable habitats in the area. The proportion of habitat loss depends on the climate change scenario and study area. We find good agreement between the mismatch in predictions across scales and the fine-grain elevation range within the 10x10' cells. The greatest prediction discrepancy for alpine species occurs in the area with the largest nival zone. Our results suggest elevation range as the main driver of the observed prediction discrepancies. Local-scale projections may better reflect the possibility for species to track their climatic requirements toward higher elevations.
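The scale effect described above can be illustrated with a toy example: a coarse cell whose mean elevation is unsuitable for a species can still contain fine-grain cells at suitable elevations. The suitability rule, elevation range and grid sizes below are invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical rule: under a warmed climate the species needs 2200-2800 m elevation.
suitable = lambda elev: (elev >= 2200) & (elev <= 2800)

# One coarse cell (think 10x10') summarised by its mean elevation,
# versus the fine elevation grid inside it (think 25x25 m cells).
fine_elevations = rng.uniform(1200, 3000, size=(40, 40))
coarse_mean = fine_elevations.mean()  # ~2100 m: unsuitable on average

coarse_prediction = bool(suitable(coarse_mean))          # coarse model: habitat lost
fine_prediction = bool(suitable(fine_elevations).any())  # fine model: some cells persist
```

Because the coarse cell averages away its internal elevation range, the coarse model predicts total loss while the fine model still finds suitable cells — the mechanism the abstract identifies as the main driver of the cross-scale mismatch.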

Relevância:

30.00%

Publicador:

Resumo:

BACKGROUND: Glutathione (GSH) is the major cellular redox regulator and antioxidant. Redox imbalance due to genetically impaired GSH synthesis is among the risk factors for schizophrenia. Here we used a mouse model with a chronic GSH deficit induced by knockout (KO) of the key GSH-synthesizing enzyme subunit, the glutamate-cysteine ligase modifier subunit (GCLM).

METHODS: With high-resolution magnetic resonance spectroscopy at 14.1 T, we determined the neurochemical profile of GCLM-KO, heterozygous, and wild-type mice in the anterior cortex throughout development in a longitudinal study design.

RESULTS: The chronic GSH deficit was accompanied by an elevation of glutamine (Gln), glutamate (Glu), Gln/Glu, N-acetylaspartate, myo-inositol, lactate, and alanine. Changes were predominantly present at prepubertal ages (postnatal days 20 and 30). Treatment with N-acetylcysteine from gestation onward normalized most neurochemical alterations to wild-type levels.

CONCLUSIONS: The changes observed in the GCLM-KO anterior cortex, notably the increases in Gln, Glu, and Gln/Glu, were similar to those reported in early schizophrenia, emphasizing the link between redox imbalance and the disease and validating the model. The data also highlight the prepubertal period as a sensitive time for redox-related neurochemical changes and demonstrate the beneficial effects of early N-acetylcysteine treatment. Moreover, they demonstrate the translational value of magnetic resonance spectroscopy for studying brain disease in preclinical models.

Relevância:

30.00%

Publicador:

Resumo:

Assessment of image quality for digital x-ray mammography systems used in European screening programs relies mainly on contrast-detail CDMAM phantom scoring and requires the acquisition and analysis of many images in order to reduce variability in threshold detectability. Part II of this study proposes an alternative method based on the detectability index (d') calculated for a non-prewhitened model observer with an eye filter (NPWE). The detectability index was calculated from the normalized noise power spectrum and image contrast, both measured from an image of a 5 cm poly(methyl methacrylate) phantom containing a 0.2 mm thick aluminium square, and from the pre-sampling modulation transfer function. This was performed as a function of air kerma at the detector for 11 different digital mammography systems. These calculated d' values were compared against threshold gold thickness (T) results measured with the CDMAM test object and against derived theoretical relationships. A simple relationship was found between T and d' as a function of detector air kerma; a linear relationship was found between d' and contrast-to-noise ratio. The threshold thickness values used to specify acceptable performance in the European Guidelines for 0.10 and 0.25 mm diameter discs were equivalent to calculated threshold detectability indices of 1.05 and 6.30, respectively. The NPWE method is a validated alternative to CDMAM scoring for use in image quality specification, quality control and optimization of digital x-ray systems for screening mammography.
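The NPWE detectability index is commonly written as d'² = (∫ S²·MTF²·VTF² df)² / ∫ S²·MTF²·VTF⁴·NNPS df, with S the task (object) spectrum and VTF the eye filter. The sketch below evaluates this on a 1-D radial frequency grid; the task spectrum, MTF, NNPS and eye-filter shapes are entirely synthetic stand-ins, not measurements from the study:

```python
import numpy as np

# 1-D radial spatial-frequency grid (cycles/mm); the 2-D integral uses 2*pi*f*df.
f = np.linspace(1e-3, 10.0, 4000)
df = f[1] - f[0]

# Entirely synthetic system/observer models (hypothetical shapes):
S = 0.10 * np.exp(-(f / 2.0) ** 2)     # task spectrum (contrast x blob)
MTF = np.exp(-f / 4.0)                 # pre-sampling MTF
NNPS = 1e-5 * (1.0 + 1.0 / f)          # normalized noise power spectrum
VTF = f * np.exp(-f / 1.5)             # eye (visual transfer) filter

w = 2 * np.pi * f * df                 # radial integration weight
num = np.sum(S**2 * MTF**2 * VTF**2 * w) ** 2
den = np.sum(S**2 * MTF**2 * VTF**4 * NNPS * w)
d_prime = np.sqrt(num / den)
```

Note that d' scales linearly with object contrast under this formula, which is consistent with the linear d'-versus-contrast-to-noise-ratio relationship reported above.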

Relevância:

30.00%

Publicador:

Resumo:

We have constructed a forward modelling code in Matlab capable of handling several commonly used electrical and electromagnetic methods in a 1D environment. We review the implemented electromagnetic field equations for grounded wires, frequency and transient soundings, and present new solutions for the case of a non-magnetic first layer. The CR1Dmod code evaluates the Hankel transforms occurring in the field equations using either the fast Hankel transform, based on digital filter theory, or a numerical integration scheme applied between the zeros of the Bessel function. A graphical user interface allows easy construction of 1D models and control of the parameters. Modelling results are in agreement with those of other authors, but the computation time is less efficient than that of other available codes. Nevertheless, the CR1Dmod routine handles complex resistivities and offers solutions based on the full EM equations as well as on the quasi-static approximation. Thus, modelling of effects based on changes in the magnetic permeability and the permittivity is also possible.
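The field expressions involve Hankel transforms of the form ∫₀^∞ K(λ) J₀(λr) dλ. As a minimal illustration of evaluating such an integral by direct quadrature (not CR1Dmod's digital-filter method), the known pair ∫₀^∞ e^(-aλ) J₀(λr) dλ = 1/√(a²+r²) can be checked numerically; J₀ is computed from its integral representation to keep the sketch dependency-free:

```python
import numpy as np

def j0(x):
    """Bessel J0 via its integral representation (1/pi) * int_0^pi cos(x sin t) dt."""
    t = np.linspace(0.0, np.pi, 401)
    vals = np.cos(np.outer(np.atleast_1d(x), np.sin(t)))
    dt = t[1] - t[0]
    # Trapezoidal rule on the uniform t grid.
    return dt * (vals.sum(axis=1) - 0.5 * (vals[:, 0] + vals[:, -1])) / np.pi

a, r = 1.0, 1.0
lam = np.linspace(0.0, 40.0, 8001)    # e^{-a*lam} makes truncation at lam=40 safe
kernel = np.exp(-a * lam) * j0(lam * r)
dlam = lam[1] - lam[0]
numeric = dlam * (kernel.sum() - 0.5 * (kernel[0] + kernel[-1]))

analytic = 1.0 / np.sqrt(a * a + r * r)   # known closed form
```

For slowly decaying kernels this brute-force truncation fails, which is why production codes use digital filters or integrate between Bessel-function zeros, as CR1Dmod does.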

Relevância:

30.00%

Publicador:

Resumo:

This paper is based on the hypothesis that the use of technology to support learning is not related to whether a student belongs to the Net Generation, but that it is mainly influenced by the teaching model. The study compares behaviour and preferences towards ICT use in two groups of university students: face-to-face students and online students. A questionnaire was applied to a sample of students from five universities with different characteristics (one offers online education and four offer face-to-face education with LMS teaching support).

Relevância:

30.00%

Publicador:

Resumo:

We analysed the relationship between changes in land cover patterns and Eurasian otter occurrence over the course of about 20 years (1985-2006) using multi-temporal Species Distribution Models (SDMs). The study area includes five river catchments covering most of the otter's Italian range. Land cover and topographic data were used as proxies of the ecological requirements of the otter within a 300-m buffer around river courses. We used species presence, pseudo-absence data, and environmental predictors to build past (1985) and current (2006) SDMs by applying an ensemble procedure through the BIOMOD modelling package. The performance of each model was evaluated by measuring the area under the curve (AUC) of the receiver-operating characteristic (ROC). Multi-temporal analyses of species distribution and land cover maps were performed by comparing the maps produced for 1985 and 2006. The ensemble procedure provided good overall modelling accuracy, revealing that elevation and slope affected the otter's distribution in the past; in contrast, land cover predictors, such as cultivation and forests, were more important in the present period. During the transition period, 20.5% of the area became suitable, with 76% of the new otter presence data located in these newly available areas. The multi-temporal analysis suggested that the quality of otter habitat improved over the last 20 years owing to the expansion of forests and the reduction of cultivated fields in riparian belts. The evidence presented here stresses the great potential of riverine habitat restoration and environmental management for the future expansion of the otter in Italy.
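The AUC used above to evaluate each model equals, for presence/pseudo-absence data, the probability that a randomly chosen presence site receives a higher suitability score than a randomly chosen absence site, which a rank-based (Mann-Whitney) computation gives directly. The scores below are invented model outputs, not values from the study:

```python
# Pure-Python AUC via the Mann-Whitney statistic: the probability that a random
# presence point outscores a random (pseudo-)absence point, with ties counting half.

def auc(presence_scores, absence_scores):
    wins = 0.0
    for p in presence_scores:
        for a in absence_scores:
            if p > a:
                wins += 1.0
            elif p == a:
                wins += 0.5
    return wins / (len(presence_scores) * len(absence_scores))

presence = [0.91, 0.85, 0.78, 0.66, 0.60]   # suitability at known otter sites (hypothetical)
absence  = [0.70, 0.52, 0.45, 0.31, 0.12]   # suitability at pseudo-absence sites (hypothetical)

score = auc(presence, absence)   # 1.0 = perfect ranking, 0.5 = no better than random
```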

Relevância:

30.00%

Publicador:

Resumo:

This paper presents a probabilistic approach to modelling the problem of power supply voltage fluctuations. Error probability calculations are shown for some 90-nm technology digital circuits. The analysis considered here treats the timing-violation error probability as a new design quality factor, in contrast to conventional techniques that assume the circuit is fully error-free. The evaluation of the error bound can be useful for new design paradigms where retry and self-recovering techniques are applied to the design of high-performance processors. The method described here makes it possible to evaluate the performance of these techniques by calculating the expected error probability in terms of power supply distribution quality.
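As a toy version of such a calculation (not the paper's actual model), suppose the critical-path delay grows as the supply voltage drops, alpha-power-law style, and the supply voltage fluctuates with a Gaussian distribution; the timing-violation probability is then the probability that the voltage falls below the value at which the path delay equals the clock period. All device parameters below are invented:

```python
import math

# Hypothetical alpha-power-law delay model: delay(V) = k * V / (V - Vth)**alpha.
k, vth, alpha = 1.0e-10, 0.35, 1.3

def delay(v):
    return k * v / (v - vth) ** alpha

t_clk = 1.9e-10           # clock period (s), hypothetical
mu, sigma = 1.0, 0.03     # nominal 1.0 V supply with Gaussian fluctuation, hypothetical

# Find the critical voltage where delay(V) == t_clk by bisection
# (delay decreases monotonically in V above Vth for these parameters).
lo, hi = vth + 1e-6, mu + 1.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if delay(mid) > t_clk:
        lo = mid
    else:
        hi = mid
v_crit = 0.5 * (lo + hi)

# Timing-violation probability: P(V < v_crit) under the Gaussian supply model.
p_error = 0.5 * (1.0 + math.erf((v_crit - mu) / (sigma * math.sqrt(2.0))))
```

This is the flavour of result the abstract describes: an expected error probability expressed directly in terms of power supply distribution quality (here, sigma).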

Relevância:

30.00%

Publicador:

Resumo:

This thesis investigates techniques for embedding a watermark into a spectral image, and methods for recognizing and detecting watermarks in spectral images. Using the PCA (Principal Component Analysis) algorithm, the spectral dimensionality of the original images was reduced. Watermark embedding into the spectral image was performed in the transform space. According to the proposed model, a component of the transform space was replaced by a linear combination of the watermark and another transform-space component. The set of parameters used in embedding was studied. The quality of the watermarked images was measured and analysed. Recommendations for watermark embedding were presented. Several methods were used for watermark recognition, and the recognition results were analysed. The robustness of the watermarks against various attacks was tested. A set of detection experiments was performed in the thesis, taking into account the parameters used in watermark embedding. The ICA (Independent Component Analysis) method is considered one possible alternative for watermark detection.
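The embedding scheme described above — replace one transform-space component with a linear combination of a watermark and another component — can be sketched with a PCA computed via the SVD. The image, watermark, mixing weights and component indices below are synthetic stand-ins, not the thesis parameters:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "spectral image": 1024 pixels x 16 bands of correlated placeholder data.
pixels = rng.normal(0.0, 1.0, size=(1024, 16)) @ rng.normal(size=(16, 16))
mean = pixels.mean(axis=0)

# PCA via SVD of the centred data: rows of vt are the principal axes.
u, s, vt = np.linalg.svd(pixels - mean, full_matrices=False)
scores = (pixels - mean) @ vt.T          # transform-space components

# Embed: replace a low-variance component k with a linear combination of the
# watermark and carrier component j (alpha, beta, k, j are hypothetical choices).
watermark = rng.choice([-1.0, 1.0], size=1024)
alpha, beta, k, j = 0.05, 1.0, 15, 14
marked_scores = scores.copy()
marked_scores[:, k] = alpha * watermark + beta * scores[:, j]

# Back to the image domain.
marked_pixels = marked_scores @ vt + mean

# Detection (with knowledge of beta, j, k): re-project, subtract the carrier,
# and correlate the residual with the known watermark.
re_scores = (marked_pixels - mean) @ vt.T
residual = re_scores[:, k] - beta * re_scores[:, j]
corr = float(residual @ watermark) / (np.linalg.norm(residual) * np.linalg.norm(watermark))
```

Without attacks the residual is exactly alpha times the watermark, so the correlation is essentially 1; robustness testing, as in the thesis, would re-run this detection after compression, noise, or filtering.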

Relevância:

30.00%

Publicador:

Resumo:

This project consists of designing the control algorithm for an unmanned autogyro. Its main application is to carry out tasks that are routine or dangerous for the pilot, such as fire extinguishing, chemical risk assessment, or surveillance of restricted-access areas. A study of the vehicle's motion is carried out to obtain its dynamic model. From the equations describing its motion, a numerical simulation of the vehicle is performed. The designed controller is incorporated and its performance is evaluated. Finally, the system is implemented on a microcontroller.

Relevância:

30.00%

Publicador:

Resumo:

In order to improve the management of copyright on the Internet, known as Digital Rights Management, there is a need for a shared language for copyright representation. Current approaches are based on purely syntactic solutions, i.e. a grammar that defines a rights expression language. These languages are difficult to put into practice due to the lack of explicit semantics that would facilitate their implementation. Moreover, they are simplistic from the legal point of view because they are intended just to model the usage licenses granted by content providers to end-users. Thus, they ignore the copyright framework that lies behind them and the whole value chain from creators to end-users. Our proposal is to use a semantic approach based on Semantic Web ontologies. We detail the development of a copyright ontology in order to put this approach into practice. It models the copyright core concepts of creation, rights, and the basic kinds of actions that operate on content. Altogether, it allows building a copyright framework for the complete value chain. The set of actions operating on content are our smallest building blocks, used to cope with the complexity of copyright value chains and statements while, at the same time, guaranteeing a high level of interoperability and evolvability. The resulting copyright modelling framework is flexible and complete enough to model many copyright scenarios, not just those related to the economic exploitation of content. The ontology also includes moral rights, so it is possible to model such situations, as shown in the included example model of a withdrawal scenario. Finally, the ontology design and the selection of tools result in a straightforward implementation. Description Logic reasoners are used for license checking and retrieval. Rights are modelled as classes of actions, action patterns are also modelled as classes, and the same is done for concrete actions. Checking whether a right or license grants an action then reduces to checking class subsumption, which is a direct functionality of these reasoners.
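The "rights as classes of actions, license checking as subsumption" idea can be loosely mimicked with an ordinary class hierarchy, where the grant check becomes a subclass test. This Python analogy is only illustrative: the actual system uses OWL classes and a Description Logic reasoner, not Python types, and the class names below are hypothetical:

```python
# Illustrative analogy only: rights and licenses as classes of actions, so that
# "does this license cover that action?" becomes a subsumption (subclass) test.

class Action: ...
class Reproduce(Action): ...            # an economic right: reproduction
class PrintCopy(Reproduce): ...         # a concrete kind of reproduction
class Distribute(Action): ...

class License:
    granted = (Reproduce,)              # this license grants reproduction only

def permits(license_cls, action_cls):
    """An action is permitted if it is subsumed by some granted class of actions."""
    return any(issubclass(action_cls, g) for g in license_cls.granted)

printing_ok = permits(License, PrintCopy)       # subsumed by Reproduce
distribution_ok = permits(License, Distribute)  # never granted
```

A DL reasoner performs the same inference over OWL class expressions, but with open-world semantics and far richer class constructors than Python inheritance.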

Relevância:

30.00%

Publicador:

Resumo:

In the last decade, an important debate has arisen about the characteristics of today's students due to their intensive experience as users of ICT. The main belief is that frequent use of technologies in everyday life implies that competent users are able to transfer their digital skills to learning activities. However, empirical studies developed in different countries reveal similar results suggesting that the "digital native" label does not provide evidence of a better use of technology to support learning. The debate has to go beyond the characteristics of the new generation and focus on the implications of being a learner in a digitalised world. This paper is based on the hypothesis that the use of technology to support learning is not related to whether a student belongs to the Net Generation, but that it is mainly influenced by the teaching model. The study compares behaviour and preferences towards ICT use in two groups of university students: face-to-face students and online students. A questionnaire was applied to a sample of students from five universities with different characteristics (one offers online education and four offer face-to-face education with LMS teaching support). Findings suggest that although access to and use of ICT is widespread, the influence of teaching methodology is very decisive. For academic purposes, students seem to respond to the requirements of their courses, programmes, and universities. There is a clear relationship between students' perception of usefulness regarding certain ICT resources and their teachers' suggested uses of technologies. The most highly rated technologies correspond with those proposed by teachers. The study shows that the educational model (face-to-face or online) has a stronger influence on students' perception of usefulness regarding ICT support for learning than the fact of being a digital native.

Relevância:

30.00%

Publicador:

Resumo:

AIMS: The aims of the study were to compare outcomes with and without major bleeding and to identify the independent correlates of major bleeding complications and of mortality in the patients enrolled in the ATOLL study. METHODS: The ATOLL study included 910 patients randomly assigned to either 0.5 mg/kg intravenous enoxaparin or unfractionated heparin before primary percutaneous coronary intervention. The incidence of major bleeding and ischemic end points was assessed at 1 month, and mortality at 1 and 6 months. Patients with and without major bleeding complications were compared. A multivariate model of bleeding complications at 1 month and of mortality at 6 months was constructed. Intention-to-treat and per-protocol analyses were performed. RESULTS: The most frequent bleeding site was the gastrointestinal tract. Age >75 years, cardiac arrest, and the use of insulin or of more than one heparin emerged as independent correlates of major bleeding at 1 month. Patients presenting with major bleeding had significantly higher rates of adverse ischemic complications. Mortality at 6 months was higher in patients who bled, and major bleeding was found to be one of the independent correlates of 6-month mortality. The addition or mixing of several anticoagulant drugs was an independent factor of major bleeding despite the predominant use of radial access. CONCLUSIONS: This study shows that major bleeding is independently associated with poor outcome, increasing ischemic events and mortality, in primary percutaneous coronary intervention performed mostly via radial access.

Relevância:

30.00%

Publicador:

Resumo:

BACKGROUND: Lymphedema is an underdiagnosed pathology which, in industrialized countries, mainly affects cancer patients who underwent lymph node dissection and/or radiation. Currently, no effective therapy is available, so that patients' quality of life is compromised by swelling of the affected body region. This unfortunate condition is associated with body imbalance, subsequent osteochondral deformations and impaired function, as well as with an increased risk of potentially life-threatening soft tissue infections. METHODS: The effects of PRP and ASC on angiogenesis (anti-CD31 staining), microcirculation (laser Doppler imaging), lymphangiogenesis (anti-LYVE1 staining), microvascular architecture (corrosion casting) and wound healing (digital planimetry) were studied in a murine tail lymphedema model. RESULTS: Wounds treated with PRP and ASC healed faster and showed significantly increased epithelialization, mainly from the proximal wound margin. The application of PRP induced significantly increased lymphangiogenesis, while the application of ASC did not induce any significant change in this regard. CONCLUSIONS: PRP and ASC affect lymphangiogenesis and lymphedema development and might represent a promising approach to improve regeneration of lymphatic vessels, restore disrupted lymphatic circulation, and treat or prevent lymphedema, alone or in combination with currently available lymphedema therapies.