897 results for Mesh generation from image data
Abstract:
Obtaining wind vectors over the ocean is important for weather forecasting and ocean modelling. Several satellite systems used operationally by meteorological agencies utilise scatterometers to infer wind vectors over the oceans. In this paper we present the results of using novel neural network based techniques to estimate wind vectors from such data. The problem is partitioned into estimating wind speed and wind direction. Wind speed is modelled using a multi-layer perceptron (MLP) and a sum of squares error function. Wind direction is a periodic variable and a multi-valued function for a given set of inputs; a conventional MLP fails at this task, and so we model the full periodic probability density of direction conditioned on the satellite derived inputs using a Mixture Density Network (MDN) with periodic kernel functions. A committee of the resulting MDNs is shown to improve the results.
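The periodic mixture density described above can be illustrated with a minimal sketch. This is not the paper's implementation; the choice of von Mises (circular normal) kernels and all parameter names are assumptions made for illustration only:

```python
import numpy as np

def von_mises_mixture_pdf(theta, weights, means, kappas):
    """Evaluate a mixture of von Mises (circular normal) densities at angle theta.

    Periodic kernels of this kind let a mixture model represent a
    multi-valued, periodic target such as wind direction: each mixture
    component wraps correctly around the circle.
    """
    density = 0.0
    for w, mu, kappa in zip(weights, means, kappas):
        # np.i0 is the modified Bessel function of order 0 (the normaliser).
        density = density + w * np.exp(kappa * np.cos(theta - mu)) / (2 * np.pi * np.i0(kappa))
    return density
```

In an MDN the weights, means and concentrations would be outputs of a neural network conditioned on the scatterometer inputs; here they are fixed numbers purely to show the density's shape.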
Abstract:
The analysis and prediction of the dynamic behaviour of structural components plays an important role in modern engineering design. In this work, the so-called "mixed" finite element models based on Reissner's variational principle are applied to the solution of free and forced vibration problems, for beam and plate structures. The mixed beam models are obtained by using elements of various shape functions ranging from simple linear to complex cubic and quadratic functions. The elements were in general capable of predicting the natural frequencies and dynamic responses with good accuracy. An isoparametric quadrilateral element with 8-nodes was developed for application to thin plate problems. The element has 32 degrees of freedom (one deflection, two bending and one twisting moment per node) which is suitable for discretization of plates with arbitrary geometry. A linear isoparametric element and two non-conforming displacement elements (4-node and 8-node quadrilateral) were extended to the solution of dynamic problems. An auto-mesh generation program was used to facilitate the preparation of input data required by the 8-node quadrilateral elements of mixed and displacement type. Numerical examples were solved using both the mixed beam and plate elements for predicting a structure's natural frequencies and dynamic response to a variety of forcing functions. The solutions were compared with the available analytical and displacement model solutions. The mixed elements developed have been found to have significant advantages over the conventional displacement elements in the solution of plate type problems. A dramatic saving in computational time is possible without any loss in solution accuracy. With beam type problems, there appears to be no significant advantages in using mixed models.
Abstract:
Image segmentation is one of the most computationally intensive operations in image processing and computer vision. This is because a large volume of data is involved and many different features have to be extracted from the image data. This thesis is concerned with the investigation of practical issues related to the implementation of several classes of image segmentation algorithms on parallel architectures. The Transputer is used as the basic building block of hardware architectures and Occam is used as the programming language. The segmentation methods chosen for implementation are convolution, for edge-based segmentation; the Split and Merge algorithm, for segmenting non-textured regions; and the Granlund method, for segmentation of textured images. Three different convolution methods have been implemented. The direct method of convolution, carried out in the spatial domain, uses the array architecture. The other two methods, based on convolution in the frequency domain, require the use of the two-dimensional Fourier transform. Parallel implementations of two different Fast Fourier Transform algorithms have been developed, incorporating original solutions. For the Row-Column method the array architecture has been adopted, and for the Vector-Radix method, the pyramid architecture. The texture segmentation algorithm, for which a system-level design is given, demonstrates a further application of the Vector-Radix Fourier transform. A novel concurrent version of the quad-tree based Split and Merge algorithm has been implemented on the pyramid architecture. The performance of the developed parallel implementations is analysed. Many of the obtained speed-up and efficiency measures show values close to their respective theoretical maxima. Where appropriate, comparisons are drawn between different implementations.
The thesis concludes with comments on general issues related to the use of the Transputer system as a development tool for image processing applications; and on the issues related to the engineering of concurrent image processing applications.
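The spatial-domain versus frequency-domain distinction above rests on the convolution theorem: multiplying 2-D FFTs is equivalent to circular convolution, and the 2-D FFT itself can be computed by the Row-Column method (1-D FFTs along rows, then columns). The sketch below illustrates only this mathematical equivalence, not the thesis's Occam/Transputer implementation; NumPy is used purely for clarity:

```python
import numpy as np

def conv2d_fft(image, kernel):
    """Circular 2-D convolution computed in the frequency domain.

    The kernel is zero-padded to the image size; np.fft.fft2 internally
    applies 1-D FFTs along each axis (the row-column decomposition).
    """
    padded = np.zeros_like(image, dtype=float)
    kh, kw = kernel.shape
    padded[:kh, :kw] = kernel
    # Convolution theorem: pointwise product of spectra, then inverse FFT.
    return np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(padded)))
```

For small kernels the direct spatial method needs fewer operations, while for large kernels the O(N^2 log N) frequency-domain route wins; that trade-off is what motivates implementing both on parallel hardware.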
Abstract:
Satellite information, in combination with conventional point source measurements, can be a valuable source of information. This thesis is devoted to the spatial estimation of areal rainfall over a region using both the measurements from a dense and sparse network of rain-gauges and images from the meteorological satellites. A primary concern is to study the effects of such satellite assisted rainfall estimates on the performance of rainfall-runoff models. Low-cost image processing systems and peripherals are used to process and manipulate the data. Both secondary as well as primary satellite images were used for analysis. The secondary data was obtained from the in-house satellite receiver and the primary data was obtained from an outside source. Ground truth data was obtained from the local Water Authority. A number of algorithms are presented that combine the satellite and conventional data sources to produce areal rainfall estimates and the results are compared with some of the more traditional methodologies. The results indicate that the satellite cloud information is valuable in the assessment of the spatial distribution of areal rainfall, for both half-hourly as well as daily estimates of rainfall. It is also demonstrated how the performance of the simple multiple regression rainfall-runoff model is improved when satellite cloud information is used as a separate input in addition to rainfall estimates from conventional means. The use of low-cost equipment, from image processing systems to satellite imagery, makes it possible for developing countries to introduce such systems in areas where the benefits are greatest.
Abstract:
This thesis investigates the cost of electricity generation using bio-oil produced by the fast pyrolysis of UK energy crops. The study covers cost from the farm to the generator’s terminals. The use of short rotation coppice willow and miscanthus as feedstocks was investigated. All costs and performance data have been taken from published papers, reports or web sites. Generation technologies are compared at scales where they have proved economic burning other fuels, rather than at a given size. A pyrolysis yield model was developed for a bubbling fluidised bed fast pyrolysis reactor from published data to predict bio-oil yields and pyrolysis plant energy demands. Generation using diesel engines, gas turbines in open and combined cycle (CCGT) operation and steam cycle plants was considered. The use of bio-oil storage to allow the pyrolysis and generation plants to operate independently of each other was investigated. The option of using diesel generators and open cycle gas turbines for combined heat and power was examined. The possible cost reductions that could be expected through learning if the technology is widely implemented were considered. It was found that none of the systems analysed would be viable without subsidy, but with the current Renewable Obligation Scheme CCGT plants in the 200 to 350 MWe range, super-critical coal fired boilers co-fired with bio-oil, and groups of diesel engine based CHP schemes supplied by a central pyrolysis plant would be viable. It was found that the cost would reduce with implementation and the planting of more energy crops but some subsidy would still be needed to make the plants viable.
Abstract:
Purpose: To develop a non-invasive method for quantification of blood and pigment distributions across the posterior pole of the fundus from multispectral images using a computer-generated reflectance model of the fundus. Methods: A computer model was developed to simulate light interaction with the fundus at different wavelengths. The distribution of macular pigment (MP) and retinal haemoglobins in the fundus was obtained by comparing the model predictions with multispectral image data at each pixel. Fundus images were acquired from 16 healthy subjects from various ethnic backgrounds, and parametric maps showing the distribution of MP and of retinal haemoglobins throughout the posterior pole were computed. Results: The relative distributions of MP and retinal haemoglobins in the subjects were successfully derived from multispectral images acquired at wavelengths 507, 525, 552, 585, 596, and 611 nm, provided certain conditions were met and eye movement between exposures was minimal. Recovery of other fundus pigments was not feasible, and further development of the imaging technique and refinement of the software are necessary to realise the full potential of multispectral retinal image analysis. Conclusion: The distributions of MP and retinal haemoglobins obtained in this preliminary investigation are in good agreement with published data on normal subjects. The ongoing development of the imaging system should allow absolute parameter values to be computed. A further study will investigate subjects with known pathologies to determine the effectiveness of the method as a screening and diagnostic tool.
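The per-pixel matching step described above can be sketched as a search over candidate parameter sets whose modelled reflectances are compared with the measured multispectral values by least squares. The grid-search formulation, array shapes and names below are illustrative assumptions, not the authors' software:

```python
import numpy as np

def fit_pixel(measured, param_grid, model_predictions):
    """Pick the candidate parameter set (e.g. MP and haemoglobin levels)
    whose modelled multispectral reflectance best matches one pixel.

    measured:          (n_wavelengths,) reflectance measured at one pixel.
    param_grid:        (n_candidates, n_params) candidate parameter sets.
    model_predictions: (n_candidates, n_wavelengths) reflectances predicted
                       by the forward model for each candidate set.
    """
    # Sum-of-squares mismatch between each candidate prediction and the pixel.
    errors = np.sum((model_predictions - measured) ** 2, axis=1)
    return param_grid[np.argmin(errors)]
```

Running this at every pixel yields parametric maps of the kind described in the abstract; a real system would use a finer grid or a continuous optimiser.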
Abstract:
Natural language understanding (NLU) aims to map sentences to their semantic meaning representations. Statistical approaches to NLU normally require fully-annotated training data where each sentence is paired with its word-level semantic annotations. In this paper, we propose a novel learning framework which trains Hidden Markov Support Vector Machines (HM-SVMs) without the use of expensive fully-annotated data. In particular, our learning approach takes as input a training set of sentences labeled with abstract semantic annotations encoding the underlying embedded structural relations, and automatically induces derivation rules that map sentences to their semantic meaning representations. The proposed approach has been tested on the DARPA Communicator data and achieved an F-measure of 93.18%, outperforming the previously proposed approaches of training the hidden vector state model or conditional random fields from unaligned data, with relative error reductions of 43.3% and 10.6% respectively.
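The reported figures can be related by simple arithmetic: taking error as 100 minus F-measure, an error of 100 − 93.18 = 6.82 combined with a 43.3% relative error reduction implies a baseline error of about 12.03 (F ≈ 87.97). The baseline value is back-computed here for illustration, not a figure quoted from the paper:

```python
def relative_error_reduction(f_old, f_new):
    """Relative error reduction when F-measure rises from f_old to f_new,
    with error defined as 100 - F (all values in percent)."""
    return ((100 - f_old) - (100 - f_new)) / (100 - f_old)

# Baseline error implied by the paper's 93.18% F and 43.3% reduction.
implied_baseline_error = (100 - 93.18) / (1 - 0.433)  # about 12.03
```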
Abstract:
An iterative procedure for determining temperature fields from Cauchy data given on a part of the boundary is presented. At each iteration step, a series of mixed well-posed boundary value problems are solved for the heat operator and its adjoint. A convergence proof of this method in a weighted L2-space is included, as well as a stopping criterion for the case of noisy data. Moreover, a solvability result in a weighted Sobolev space for a parabolic initial boundary value problem of second order with mixed boundary conditions is presented. Regularity of the solution is proved. (© 2007 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
Abstract:
As the volume of image data and the need to use it in various applications have grown significantly in recent years, retrieval efficiency and effectiveness have become a necessity. Unfortunately, existing indexing methods are not applicable to a wide range of problem-oriented fields due to their operating-time limitations and strong dependency on the traditional descriptors extracted from the image. To meet these higher requirements, a novel distance-based indexing method for region-based image retrieval has been proposed and investigated. The method makes it possible to consider embedded partitions of images, to carry out the search at different levels of refinement or coarsening, and thus to seek out the meaningful content of an image.
Abstract:
Prices of U.S. Treasury securities vary over time and across maturities. When the market in Treasurys is sufficiently complete and frictionless, these prices may be modeled by a function of time and maturity. A cross-section of this function for time held fixed is called the yield curve; the aggregate of these sections is the evolution of the yield curve. This dissertation studies aspects of this evolution.
There are two complementary approaches to the study of yield curve evolution here. The first is principal components analysis; the second is wavelet analysis. In both approaches both the time and maturity variables are discretized. In principal components analysis the vectors of yield curve shifts are viewed as observations of a multivariate normal distribution. The resulting covariance matrix is diagonalized; the resulting eigenvalues and eigenvectors (the principal components) are used to draw inferences about the yield curve evolution.
In wavelet analysis, the vectors of shifts are resolved into hierarchies of localized fundamental shifts (wavelets) that leave specified global properties invariant (average change and duration change). The hierarchies relate to the degree of localization, with movements restricted to a single maturity at the base and general movements at the apex. Second-generation wavelet techniques allow better adaptation of the model to economic observables. Statistically, the wavelet approach is inherently nonparametric, while the wavelets themselves are better adapted to describing a complete market.
Principal components analysis provides information on the dimension of the yield curve process. While there is no clear demarcation between operative factors and noise, the top six principal components pick up 99% of total interest rate variation 95% of the time.
An economically justified basis of this process is hard to find; for example, a simple linear model will not suffice for the first principal component, and the shape of this component is nonstationary.
Wavelet analysis works more directly with yield curve observations than principal components analysis. In fact the complete process from bond data to multiresolution is presented, including the dedicated Perl programs and the details of the portfolio metrics and specially adapted wavelet construction. The result is more robust statistics which provide balance to the more fragile principal components analysis.
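The principal components step described above has a compact numerical core: diagonalize the sample covariance of the yield-shift vectors and measure how much variance the leading components capture. A minimal sketch on synthetic data (not the dissertation's bond data):

```python
import numpy as np

def principal_components(shifts):
    """PCA of yield-curve shift vectors.

    shifts: (n_observations, n_maturities) array of observed yield changes.
    Returns eigenvalues (descending) and eigenvectors (as columns) of the
    sample covariance matrix.
    """
    cov = np.cov(shifts, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigh: covariance is symmetric
    order = np.argsort(eigvals)[::-1]       # sort largest-first
    return eigvals[order], eigvecs[:, order]

def explained_variation(eigvals, k):
    """Fraction of total variance captured by the top-k components."""
    return eigvals[:k].sum() / eigvals.sum()
```

On real yield data the leading components are conventionally interpreted as level, slope and curvature shifts; the abstract's "top six components pick up 99% of variation" statement corresponds to `explained_variation(eigvals, 6) >= 0.99` on its data.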
Abstract:
Context: Due to a unique combination of factors, outdoor athletes in the Southeastern United States are at high risk of lightning deaths and injuries. Lightning detection methods are available to reduce the number of lightning strike victims. Objective: Becoming aware of the risk factors that predispose athletes to lightning strikes, and determining the most reliable method of detecting hazardous weather, will enable Certified Athletic Trainers to develop protocols that protect athletes from injury. Data Sources: A comprehensive literature review of Medline and PubMed was completed using the key words: lightning, lightning risk factors, lightning safety, lightning detection, and athletic trainers and lightning. Data Synthesis: Factors predisposing athletes to lightning death or injury include: time of year, time of day, the athlete's age, geographical location, physical location, sex, perspiration level, and lack of education and preparedness by athletes and staff. Although handheld lightning detectors have become widely accessible, their performance has not been independently or objectively confirmed. There is evidence that these detectors inaccurately detect strike locations, recording false strikes and failing to record actual strikes. Conclusions: Lightning education and preparation are two factors that can be controlled. Measures need to be taken by Certified Athletic Trainers to ensure the safety of athletes during outdoor athletics. It is critical for athletic trainers and supervising staff members to become fully aware of the risks of lightning strikes in order to most effectively protect everyone under their supervision. Even though lightning detectors have been manufactured in an attempt to minimize deaths and injuries due to lightning strikes, none of the detectors have been proven to be 100% effective.
Educating coaches, athletes, and parents on the risks of lightning and the detection methods available, while implementing an emergency action plan for lightning safety, is crucial to ensure the well-being of the student-athlete population.
Abstract:
Our understanding of employee attitudes and their impact on business outcomes has been further complicated in recent years by the newest cohort of service workers. Known as Generation Y (Gen Y), they appear to approach employment in a manner different to that of their predecessors. A review of the academic literature reveals little empirical evidence to support an appropriate understanding of the impact of such difference. This paper provides an overview of a large-scale study into generational differences in employee attitudes and reports on the preliminary data analysis of a survey of over 900 hospitality employees. The most important initial finding from the data analysis is that, on the whole, Gen Y employees have lower scores on those constructs that an organization should be attempting to maximize. Non-Gen Y employees are more satisfied with their jobs, more engaged and more affectively committed to the organization they work for than their Gen Y counterparts, amongst a range of other important constructs. Conversely, Gen Y employees display higher scores on the constructs that an organization would want to minimize in its staff. Gen Y employees are more likely to be planning to quit their jobs, are more likely to perform poorly if their co-workers are doing so, and are also more likely to switch jobs for no particular reason. The discussion covers implications for management as well as directions for future research.
Abstract:
Immigrants from Jamaica represent the largest number of migrants to the United States from the English-speaking Caribbean. Research indicates that of all Caribbean immigrants they are the most likely to retain the ethnic identity of their home country for the longest period of time. This dissertation explored the nature of ethnic identity and sought to determine its impact upon the additional variables of self-esteem and academic factors. A secondary analysis was carried out using data collected in the Spring of 1992 by Portes and Rumbaut on the children of immigrants attending the eighth and ninth grades in local schools in San Diego and southern Florida. A sample of 151 second-generation Jamaican immigrants was selected from the data set.
Six hypotheses yielded mixed results. Having both parents with a Jamaican ethnic identity present in the household is the best predictor of whether Jamaican youth retain a Jamaican ethnic identity. It was expected that ethnic identity would be a predictor of positive academic factors; however, the study showed that ethnic identity was not associated with one of the academic factors examined: help given with homework.
Neither family economic status nor parents' level of education played a significant role in the retention of Jamaican identity. Other findings were that there was no mean difference in the self-esteem scores of respondents who had ethnic identities similar to their parents' and those who did not. There was also no difference found in the academic factors of either group. The study also showed that there was a small correlation between parent-child conflict and self-esteem: specifically, the higher the conflict between youth and their parents, the lower the self-esteem of the youth. Finally, it found that time lived in the U.S. was the best predictor of a higher GPA and was also related to lower self-esteem.
Surprisingly, the study found that the relationship between ethnic identity and SES was the opposite of what was expected: SES was higher when there was no Jamaican identity.
Abstract:
This research studies the use of greeting cards, here understood as a literacy practice widely used in American society. In American culture, these cards become sources of information and memory about people's cycles of life, their experiences, and the bonds of sociability enabled by the meanings that image and word carry. The main purpose of this work is to describe how this literacy practice occurs in American society. Theoretically, this research is based on studies of literacy (BARTON; HAMILTON, 1998; BAYHAM, 1995; HAMILTON, 2000; STREET, 1981, 1984, 1985, 1993, 2003), the contributions of social semiotics associated with systemic-functional grammar (HALLIDAY; HASAN, 1978, 1985; HALLIDAY, 1994; HALLIDAY; MATTHIESSEN, 2004), and the grammar of visual design (KRESS; LEITE-GARCIA; VAN LEEUWEN, 1997, 2004, 2006; KRESS; MATTHIESSEN, 2004). Methodologically, it is a study within the qualitative, interpretative paradigm, which adopts ethnographic tools in data generation. From this perspective, it makes use of "looking and asking" techniques (ERICKSON, 1986, p. 119), complemented by the technique of "registering" proposed by Paz (2008). The corpus comprises 104 printed cards, provided by users of this cultural artifact, from which 24 were selected, and 11 e-cards extracted from the internet, as well as verbalizations obtained through a questionnaire of open questions designed to gather information about the perceptions and actions of card users with respect to this literacy practice. Data analysis reveals cultural, economic and social aspects of this practice and supports the belief that the literacy practice of using printed greeting cards, despite the existence of virtual alternatives, is still very fruitful in American society.
The study also shows that card users position themselves and construct identities expressed in verbal and visual interaction in order to achieve the desired effect. As a result, it is understood that greeting cards are not unintentional, but loaded with ideology and power relations, among other aspects that are constitutive of them.