873 results for Data compression (Electronic computers)
Abstract:
The Tyne Digital Library (TDL) provides access to scholarly materials (e.g. papers, book chapters, bibliographic reference lists), databases of hydrological and physical information, maps of key physiographic and environmental data, and electronic journal articles, for students undertaking GEOG3023 River Basin Management. In addition, the TDL utilises technological innovations that enhance services for accessing this information.
Abstract:
"Student’s Watcher” is a small Web application which wants to show in a visual, simple and fast way, the evolution of the students. The main project table displays such things as marks and comments about students. We can add a comment for each mark to explain why this mark. The objective is to be able to know if some student has a problem, how is going his year, marks in other courses, or even, to know if he has a bad week in a different subjects. We can see the evolution of students in past years to do an objective comparison. It also allows inserting global comments of student, we have a list of these, and all professors can add new ones, where we can see more general valuations. “Student’s Watcher” was begun in ASP.net, but finally my project would be developed in PHP, HTML and CSS. This project wants to be a comparison between two of most important languages used nowadays, ASPX and PHP
Abstract:
The resolution of remotely sensed data is becoming increasingly fine, and there are now many sources of data with a pixel size of 1 m x 1 m. This produces huge amounts of data that have to be stored, processed and transmitted. For environmental applications this resolution possibly provides far more data than are needed: data overload. This poses the question: how much is too much? We have explored two resolutions of data, 20 m pixel SPOT data and 1 m pixel Computerized Airborne Multispectral Imaging System (CAMIS) data from Fort A. P. Hill (Virginia, USA), using the variogram of geostatistics. For both we used the normalized difference vegetation index (NDVI). Three scales of spatial variation were identified in both the SPOT and 1 m data: there was some overlap at the intermediate spatial scales of about 150 m and of 500 m-600 m. We subsampled the 1 m data, and scales of variation of about 30 m and of 300 m were identified consistently until the separation between pixel centroids was 15 m (or 1 in 225 pixels). At this stage, spatial scales of about 100 m and 600 m were described, which suggested that only now was there a real difference in the amount of spatial information available from an environmental perspective. These latter were similar spatial scales to those identified from the SPOT image. We have also analysed 1 m CAMIS data from Fort Story (Virginia, USA) for comparison, and the outcome is similar. From these analyses it seems that a pixel size of 20 m is adequate for many environmental applications, and that if more detail is required the higher resolution data could be sub-sampled to a 10 m separation between pixel centroids without any serious loss of information. This reduces significantly the amount of data that needs to be stored, transmitted and analysed and has important implications for data compression.
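A minimal sketch, in Python, of the empirical semivariogram computation this kind of analysis rests on; the NDVI array, lags and subsampling factor below are hypothetical placeholders, not the authors' data or code.

import numpy as np

def semivariogram_1d(values, max_lag):
    """Empirical semivariance gamma(h) along one axis of a 2-D grid:
    gamma(h) = 0.5 * mean((z(x) - z(x + h))**2) over all pairs at lag h."""
    gammas = []
    for h in range(1, max_lag + 1):
        diffs = values[:, h:] - values[:, :-h]  # all pixel pairs at lag h
        gammas.append(0.5 * np.mean(diffs ** 2))
    return np.array(gammas)

# Hypothetical usage: ndvi stands in for a 1 m resolution NDVI image;
# taking every 10th centroid mimics the 10 m spacing suggested above.
ndvi = np.random.rand(200, 200)
gamma_full = semivariogram_1d(ndvi, 50)
gamma_sub = semivariogram_1d(ndvi[::10, ::10], 5)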
Abstract:
Empirical orthogonal function (EOF) analysis is a powerful tool for data compression and dimensionality reduction used broadly in meteorology and oceanography. Often in the literature, EOF modes are interpreted individually, independent of other modes. In fact, it can be shown that no such attribution can generally be made. This review demonstrates that in general individual EOF modes (i) will not correspond to individual dynamical modes, (ii) will not correspond to individual kinematic degrees of freedom, (iii) will not be statistically independent of other EOF modes, and (iv) will be strongly influenced by the nonlocal requirement that modes maximize variance over the entire domain. The goal of this review is not to argue against the use of EOF analysis in meteorology and oceanography; rather, it is to demonstrate the care that must be taken in the interpretation of individual modes in order to distinguish the medium from the message.
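As a generic illustration of the technique under review (not the review's own code), EOF modes can be computed from the singular value decomposition of an anomaly data matrix; the field below is a random placeholder for real meteorological or oceanographic data.

import numpy as np

def eof_analysis(field):
    """EOF analysis of a (time, space) data matrix via SVD. Returns the
    spatial EOF modes, the principal-component time series, and the
    fraction of total variance explained by each mode."""
    anomalies = field - field.mean(axis=0)      # remove the time mean
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    eofs = vt                                   # rows: spatial patterns
    pcs = u * s                                 # columns: PC time series
    variance_fraction = s**2 / np.sum(s**2)
    return eofs, pcs, variance_fraction

# Hypothetical usage: 120 monthly maps on a grid flattened to 200 points.
field = np.random.randn(120, 200)
eofs, pcs, var_frac = eof_analysis(field)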
Abstract:
Objectives: To clarify the role of growth monitoring in primary school children, including obesity, and to examine issues that might impact on the effectiveness and cost-effectiveness of such programmes. Data sources: Electronic databases were searched up to July 2005. Experts in the field were also consulted. Review methods: Data extraction and quality assessment were performed on studies meeting the review's inclusion criteria. The performance of growth monitoring to detect disorders of stature and obesity was evaluated against National Screening Committee (NSC) criteria. Results: In the 31 studies that were included in the review, there were no controlled trials of the impact of growth monitoring and no studies of the diagnostic accuracy of different methods for growth monitoring. Analysis of the studies that presented a 'diagnostic yield' of growth monitoring suggested that one-off screening might identify between 1:545 and 1:1793 new cases of potentially treatable conditions. Economic modelling suggested that growth monitoring is associated with health improvements [incremental cost per quality-adjusted life-year (QALY) of £9500] and indicated that monitoring was cost-effective 100% of the time over the given probability distributions for a willingness-to-pay threshold of £30,000 per QALY. Studies of obesity focused on the performance of body mass index against measures of body fat. A number of issues relating to human resources required for growth monitoring were identified, but data on attitudes to growth monitoring were extremely sparse. Preliminary findings from economic modelling suggested that primary prevention may be the most cost-effective approach to obesity management, but the model incorporated a great deal of uncertainty. Conclusions: This review has indicated the potential utility and cost-effectiveness of growth monitoring in terms of increased detection of stature-related disorders. It has also pointed strongly to the need for further research. Growth monitoring does not currently meet all NSC criteria. However, it is questionable whether some of these criteria can be meaningfully applied to growth monitoring given that short stature is not a disease in itself, but is used as a marker for a range of pathologies and as an indicator of general health status. Identification of effective interventions for the treatment of obesity is likely to be considered a prerequisite to any move from monitoring to a screening programme designed to identify individual overweight and obese children. Similarly, further long-term studies of the predictors of obesity-related co-morbidities in adulthood are warranted. A cluster randomised trial comparing growth monitoring strategies with no growth monitoring in the general population would most reliably determine the clinical effectiveness of growth monitoring. Studies of diagnostic accuracy, alongside evidence of effective treatment strategies, could provide an alternative approach. In this context, careful consideration would need to be given to target conditions and intervention thresholds. Diagnostic accuracy studies would require long-term follow-up of both short and normal children to determine sensitivity and specificity of growth monitoring.
Abstract:
Image compression consists in representing an image with a small amount of data without loss of visual quality. Data compression is important when large images are used, for example satellite images. Full-color digital images typically use 24 bits to specify the color of each pixel, with 8 bits for each of the primary components: red, green and blue (RGB). Compressing an image with three or more bands (multispectral) is fundamental to reducing transmission, processing and storage time. Because many applications need images (medical imaging, satellite imaging, sensors etc.), image data compression is important. In this work a new method for compressing color images is proposed. The method is based on a measure of the information in each band. The technique is called Self-Adaptive Compression (SAC), and each band of the image is compressed with a different threshold in order to preserve information and obtain better results. SAC applies strong compression to bands with large redundancy, that is, low information, and soft compression to bands with a larger amount of information. Two image transforms are used in this technique: the Discrete Cosine Transform (DCT) and Principal Component Analysis (PCA). The first step converts the data into decorrelated bands with PCA; the DCT is then applied to each band. Data loss occurs when a threshold discards coefficients. This threshold is calculated from two elements: the PCA result and a user parameter. The user parameter defines a compression rate. The system produces three different thresholds, one for each band of the image, proportional to its amount of information. For image reconstruction, the inverse DCT and PCA are applied. SAC was compared with the JPEG (Joint Photographic Experts Group) standard and with YIQ compression, and better results were obtained in terms of MSE (mean squared error). Tests showed that SAC gives better quality at high compression rates, with two advantages: (a) since it is adaptive, it is sensitive to the image type, that is, it presents good results for diverse kinds of images (synthetic, landscapes, people etc.); and (b) it needs only one user parameter, that is, very little human intervention is required.
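A minimal sketch of the pipeline the abstract describes (PCA to decorrelate the bands, a DCT per band, and a per-band threshold scaled by the band's information); the threshold rule and the user_tax parameter below are illustrative assumptions, not the paper's exact formula.

import numpy as np
from scipy.fft import dctn, idctn

def sac_like_compress(image, user_tax=0.1):
    """PCA-decorrelate the RGB bands, DCT each band, then threshold each
    band separately: low-information (redundant) bands get a harsher
    threshold. The rule here is illustrative, not the paper's."""
    h, w, _ = image.shape
    pixels = image.reshape(-1, 3).astype(float)
    mean = pixels.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(pixels - mean, rowvar=False))
    bands = ((pixels - mean) @ eigvecs).reshape(h, w, 3)

    info = eigvals / eigvals.sum()              # information share per band
    recon_bands = np.empty_like(bands)
    for k in range(3):
        coeffs = dctn(bands[:, :, k], norm='ortho')
        thresh = user_tax * np.abs(coeffs).max() * (1 - info[k])
        coeffs[np.abs(coeffs) < thresh] = 0.0   # discard small coefficients
        recon_bands[:, :, k] = idctn(coeffs, norm='ortho')

    recon = recon_bands.reshape(-1, 3) @ eigvecs.T + mean
    return recon.reshape(h, w, 3)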
Abstract:
Nonogram is a logical puzzle whose associated decision problem is NP-complete. It has applications in pattern recognition problems and data compression, among others. The puzzle consists in determining an assignment of colors to pixels distributed in an N x M matrix that satisfies line and column constraints. A Nonogram is encoded by a vector whose elements specify the number of pixels in each row and column of a figure without specifying their coordinates. This work presents exact and heuristic approaches to solve Nonograms. Depth-first search was one of the chosen exact approaches because it is a typical example of a brute-force search algorithm that is easy to implement. Another implemented exact approach was based on the Las Vegas algorithm, with which we intended to investigate whether the randomness introduced by the Las Vegas-based algorithm would be an advantage over depth-first search. The Nonogram is also transformed into a Constraint Satisfaction Problem. Three heuristic approaches are proposed: a Tabu Search and two memetic algorithms. A new way of calculating the objective function is also proposed. The approaches are applied to 234 instances, with sizes ranging from 5 x 5 to 100 x 100 and including both logical and random Nonograms.
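A minimal sketch of the encoding described above, assuming binary (black-and-white) Nonograms: a candidate grid is checked against its clues by extracting the runs of filled pixels in each row and column. The 3 x 3 instance is a hypothetical example, not one of the work's 234 instances.

from itertools import groupby

def runs(line):
    """Lengths of consecutive runs of filled (1) cells in a row or column."""
    return [sum(group) for value, group in groupby(line) if value == 1]

def satisfies(grid, row_clues, col_clues):
    """True if a full 0/1 grid matches every row and column clue."""
    rows_ok = all(runs(row) == clue for row, clue in zip(grid, row_clues))
    cols = zip(*grid)                  # transpose to iterate over columns
    cols_ok = all(runs(list(col)) == clue for col, clue in zip(cols, col_clues))
    return rows_ok and cols_ok

# Hypothetical 3 x 3 instance: a plus sign.
grid = [[0, 1, 0],
        [1, 1, 1],
        [0, 1, 0]]
print(satisfies(grid, [[1], [3], [1]], [[1], [3], [1]]))   # True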
Abstract:
This work reports on the optical properties of Cr3+ ions in the pseudoternary system InF3-GdF3-GaF3. Linear properties, investigated through absorption and emission spectra, provide information on the crystal field, the frequency, and number of phonons emitted during the absorption to the 4T2 band and the emission to the 4A2 ground state, and the Fano antiresonance line shape in the vicinity of the 4A2→2E transition. A study of the nonlinear refractive index as a function of the wavelength, carried out with the Z-scan technique, provides spectroscopic data about electronic transitions starting from the excited state.
Abstract:
Structural health monitoring (SHM) is related to the ability to monitor the state of aerospace, civil and mechanical systems and to decide the level of damage or deterioration within them. In this sense, this paper deals with the application of a two-step auto-regressive and auto-regressive with exogenous inputs (AR-ARX) model for linear prediction in damage diagnosis of structural systems. This damage detection algorithm is based on the monitoring of residual errors as damage-sensitive indexes, obtained through vibration response measurements. In complex structures there are many positions under observation and a large amount of data to be handled, which makes visualization of the signals difficult. This paper also investigates data compression using principal component analysis. In order to establish a threshold value, fuzzy c-means clustering is used to quantify the damage-sensitive index in an unsupervised learning mode. Tests are made on a benchmark problem proposed by IASC-ASCE with different damage patterns. The diagnosis obtained showed high correlation with the actual integrity state of the structure. Copyright © 2007 by ABCM.
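A minimal sketch of the residual-error idea the abstract builds on, using a plain least-squares AR fit; the signals below are random placeholders for measured vibration responses, and the variance-ratio index is an illustrative choice, not the paper's exact AR-ARX formulation.

import numpy as np

def ar_residuals(signal, order, coeffs=None):
    """Fit an AR(order) model by least squares (or reuse baseline
    coefficients) and return the one-step prediction residuals."""
    n = len(signal)
    # Column k holds the samples lagged by k + 1.
    X = np.column_stack([signal[order - k - 1:n - k - 1] for k in range(order)])
    y = signal[order:]
    if coeffs is None:
        coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ coeffs, coeffs

# Hypothetical usage: fit on the undamaged baseline, reuse the model on a
# new measurement; growth in residual variance serves as the damage index.
baseline = np.random.randn(1000)
res_base, ar_coeffs = ar_residuals(baseline, order=10)
test = np.random.randn(1000)
res_test, _ = ar_residuals(test, order=10, coeffs=ar_coeffs)
damage_index = np.std(res_test) / np.std(res_base)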
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)