863 results for Gaussian Plume model for multiple sources for Cochin
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
Objective: To quantify the burden of disease and injury for the Aboriginal and non-Aboriginal populations in the Northern Territory. Design and setting: Analysis of Northern Territory data for 1 January 1994 to 30 December 1998 from multiple sources. Main outcome measures: Disability-adjusted life-years (DALYs), by age, sex, cause and Aboriginality. Results: Cardiovascular disease was the leading contributor (14.9%) to the total burden of disease and injury in the NT, followed by mental disorders (14.5%) and malignant neoplasms (11.2%). There was also a substantial contribution from unintentional injury (10.4%) and intentional injury (4.9%). Overall, the NT Aboriginal population had a rate of burden of disease 2.5 times higher than the non-Aboriginal population; in the 35-54-year age group their DALY rate was 4.1 times higher. The leading causes of disease burden were cardiovascular disease for both Aboriginal men (19.1%) and women (15.7%) and mental disorders for both non-Aboriginal men (16.7%) and women (22.3%). Conclusions: A comprehensive assessment of fatal and non-fatal conditions is important in describing differentials in health status of the NT population. Our study provides comparative data to identify health priorities and facilitate a more equitable distribution of health funding.
Abstract:
Automatic signature verification is a well-established and active area of research with numerous applications, such as bank check verification, ATM access, etc. This paper proposes a novel approach to the problem of automatic off-line signature verification and forgery detection. The proposed approach is based on fuzzy modeling employing the Takagi-Sugeno (TS) model. Signature verification and forgery detection are carried out using angle features extracted with a box approach. Each feature corresponds to a fuzzy set. The features are fuzzified by an exponential membership function involved in the TS model, which is modified to include structural parameters. The structural parameters are devised to account for possible variations due to handwriting styles and to reflect moods. The membership functions constitute weights in the TS model. Optimizing the output of the TS model with respect to the structural parameters yields the solution for the parameters. We have also derived two TS models, by considering a rule for each input feature in the first formulation (multiple rules) and a single rule for all input features in the second formulation. In this work, we found that the TS model with multiple rules is better than the TS model with a single rule for detecting three types of forgeries (random, skilled and unskilled) from a large database of sample signatures, in addition to verifying genuine signatures. We have also devised three approaches, viz., an innovative approach and two intuitive approaches, using the TS model with multiple rules for improved performance. (C) 2004 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
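The TS construction described in the abstract above can be made concrete with a small sketch. This is a minimal illustration, not the paper's implementation: the exact form of the exponential membership function, the structural parameter `b`, and the linear consequents are all assumptions, showing only the "multiple rules" formulation in its simplest one-rule-per-feature form.

```python
import numpy as np

def exp_membership(x, c, b):
    # Exponential membership of feature values x in fuzzy sets centred at c.
    # b is a hypothetical structural parameter absorbing handwriting variation.
    return np.exp(-b * (x - c) ** 2)

def ts_output_multiple_rules(x, centres, b, coeffs):
    # Takagi-Sugeno output with one rule per feature:
    # y = sum_i mu_i(x_i) * (a_i0 + a_i1 * x_i) / sum_i mu_i(x_i)
    mu = exp_membership(x, centres, b)
    consequents = coeffs[:, 0] + coeffs[:, 1] * x
    return float(np.sum(mu * consequents) / np.sum(mu))
```

In a verification setting, the memberships would be fitted to a writer's genuine samples and a test signature scored by how well its angle features fit the learned model.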
Abstract:
Multilevel theories integrate individual-level processes with those occurring at the level of the firm and above to generate richer and more complete explanations of IB phenomena than the traditional specification of IB relationships as single-level and parsimonious allows. Case study methods permit the timely collection of multiple sources of data, in context, from multiple individuals and multiple organizational units. Further, because the definitions for each level emerge from case data rather than being imposed a priori, case analysis promotes an understanding of deeper structures and cross-level processes. This paper considers the example of sport as an internationalized service to illustrate how the case method might be used to illuminate the multilevel phenomena of knowledge.
Abstract:
Among the Solar System's bodies, the Moon, Mercury and Mars are at present, or have been in recent years, the object of space missions aimed, among other goals, at improving our knowledge of surface composition. Among the techniques for detecting a planet's mineralogical composition, from both remote and close-range platforms, visible and near-infrared reflectance (VNIR) spectroscopy is a powerful tool, because crystal-field absorption bands are related to particular transition metals in well-defined crystal structures, e.g., Fe2+ in the M1 and M2 sites of olivine or pyroxene (Burns, 1993). Thanks to improvements in the spectrometers onboard recent missions, a more detailed interpretation of planetary surfaces can now be delineated. However, quantitative interpretation of planetary surface mineralogy is not always a simple task. In fact, several factors, such as mineral chemistry, the presence of different minerals that absorb in a narrow spectral range, regolith with a variable particle-size range, space weathering, atmospheric composition, etc., act in unpredictable ways on the reflectance spectra of a planetary surface (Serventi et al., 2014). One method for the interpretation of reflectance spectra of unknown materials involves the study of a number of spectra acquired in the laboratory under different conditions, such as different mineral abundances or different particle sizes, in order to derive empirical trends. This is the methodology followed in this PhD thesis: the factors listed above have been analyzed by creating, in the laboratory, a set of terrestrial analogues with well-defined composition and size. The aim of this work is to provide new tools and criteria to improve our knowledge of the composition of planetary surfaces.
In particular, mixtures composed of different contents and chemistries of plagioclase and mafic minerals have been spectroscopically analyzed at different particle sizes and with different relative mineral percentages. The reflectance spectra of each mixture have been analyzed both qualitatively (using the software ORIGIN®) and quantitatively, applying the Modified Gaussian Model (MGM, Sunshine et al., 1990) algorithm. In particular, the variations of the spectral parameters of each absorption band have been evaluated against the volumetric FeO% content in the plagioclase (PL) phase and against the PL modal abundance. This delineates calibration curves of composition versus spectral parameters and allows the implementation of spectral libraries. Furthermore, the trends derived from the terrestrial analogues analyzed here and from analogues in the literature have been applied to the interpretation of hyperspectral images of both plagioclase-rich (Moon) and plagioclase-poor (Mars) bodies.
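As a rough illustration of the kind of forward model the MGM fits, the sketch below writes log reflectance as a straight-line continuum plus Gaussian absorption bands expressed in energy (inverse wavelength). This is a simplified sketch only: the exact band parameterisation and units are assumptions, and a real MGM analysis would optimise the band parameters against a measured spectrum.

```python
import numpy as np

def mgm_reflectance(wavelength_um, continuum, bands):
    # MGM-style forward model: ln R(lambda) = c0 + c1*lambda plus a sum of
    # Gaussians in energy (1/wavelength). Each band is (strength, centre_um,
    # width); negative strengths produce absorption features.
    c0, c1 = continuum
    ln_r = c0 + c1 * wavelength_um
    energy = 1.0 / wavelength_um
    for strength, centre_um, width in bands:
        ln_r = ln_r + strength * np.exp(-(energy - 1.0 / centre_um) ** 2
                                        / (2.0 * width ** 2))
    return np.exp(ln_r)
```

Fitting the band strengths, centres and widths of such a model to laboratory spectra at several plagioclase abundances is what yields calibration curves of the kind described above.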
Abstract:
This research proposes a reflection on the processes of reception of cultural products by child television viewers. The problem investigated, drawing on the theoretical frameworks of the communicative mediations of culture and the integral approach to audiences, is how children aged 7 to 12 interact with audiovisual narratives, especially animated television series, in order to understand from this perspective how the appropriation and production of meaning occur in everyday life, based on their interpretation of the cartoon Doug Funnie. The study's methodology combines a literature review with a discussion group guided by the theoretical-methodological model of multiple mediation formulated by Guillermo Orozco Gómez, grounded in Jesús Martín-Barbero's paradigm of mediations. The reception research, carried out in a school environment, found that media communication is dialogical in nature and implies the recognition and projection of the interlocutor in the universe of fiction; thus, the production of meaning in relation to the cartoon is not contained in the audiovisual material itself but in the sociocultural context in which interlocutors relate to one another and to the media. From this interaction emerge the comprehension and re-creation of cultural products, indicating that child viewers' interpretations reveal their way of seeing the world.
Abstract:
A Bayesian procedure for the retrieval of wind vectors over the ocean using satellite-borne scatterometers requires realistic prior near-surface wind field models over the oceans. We have implemented carefully chosen vector Gaussian Process models; however, in some cases these models are too smooth to reproduce real atmospheric features, such as fronts. At the scale of the scatterometer observations, fronts appear as discontinuities in wind direction. Due to the nature of the retrieval problem a simple discontinuity model is not feasible, and hence we have developed a constrained discontinuity vector Gaussian Process model which ensures realistic fronts. We describe the generative model and show how to compute the data likelihood given the model. We show the results of inference using the model with Markov Chain Monte Carlo methods on both synthetic and real data.
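For readers unfamiliar with Gaussian Process priors, the sketch below draws smooth 1-D fields from a squared-exponential kernel. It is a toy sketch only: the u and v wind components are drawn independently here, whereas the models described above use a correlated vector GP, and the constrained discontinuity construction is not shown.

```python
import numpy as np

def squared_exp_kernel(x, lengthscale=2.0, variance=1.0):
    # Covariance between all pairs of points in x under a squared-exponential
    # (RBF) kernel; nearby points are strongly correlated, giving smooth draws.
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
K = squared_exp_kernel(x) + 1e-8 * np.eye(len(x))  # jitter for stability
L = np.linalg.cholesky(K)
u = L @ rng.standard_normal(len(x))  # one smooth draw per wind component
v = L @ rng.standard_normal(len(x))
```

A front-aware model would replace the stationary kernel with one whose correlation is cut (or constrained) across a discontinuity curve, which is the spirit of the construction in the abstract.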
Abstract:
Mixture Density Networks are a principled method to model conditional probability density functions which are non-Gaussian. This is achieved by modelling the conditional distribution for each pattern with a Gaussian Mixture Model for which the parameters are generated by a neural network. This thesis presents a novel method to introduce regularisation in this context for the special case where the mean and variance of the spherical Gaussian Kernels in the mixtures are fixed to predetermined values. Guidelines for how these parameters can be initialised are given, and it is shown how to apply the evidence framework to mixture density networks to achieve regularisation. This also provides an objective stopping criterion that can replace the `early stopping' methods that have previously been used. If the neural network used is an RBF network with fixed centres this opens up new opportunities for improved initialisation of the network weights, which are exploited to start training relatively close to the optimum. The new method is demonstrated on two data sets. The first is a simple synthetic data set while the second is a real-life data set, namely satellite scatterometer data used to infer the wind speed and wind direction near the ocean surface. For both data sets the regularisation method performs well in comparison with earlier published results. Ideas on how the constraint on the kernels may be relaxed to allow fully adaptable kernels are presented.
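The fixed-kernel special case described above can be sketched as follows. This is an illustrative sketch under stated assumptions, not the thesis code: the kernel centres and common width are placeholders, and the "network output" is stood in for by a plain vector that would in practice come from an RBF network.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax for the mixing coefficients.
    z = z - np.max(z)
    e = np.exp(z)
    return e / e.sum()

def mdn_density(t, network_out, centres, sigma):
    # Conditional density p(t|x) as a mixture of spherical Gaussians whose
    # centres and width are FIXED in advance; the network output only sets
    # the mixing coefficients through a softmax.
    pi = softmax(network_out)
    phi = (np.exp(-(t - centres) ** 2 / (2.0 * sigma ** 2))
           / np.sqrt(2.0 * np.pi * sigma ** 2))
    return float(np.sum(pi * phi))
```

With the kernels fixed, only the inputs to the softmax are learned, which is what makes the evidence-framework regularisation described in the abstract tractable.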
Abstract:
The ERS-1 satellite was launched in July 1991 by the European Space Agency into a polar orbit at about 800 km, carrying a C-band scatterometer. A scatterometer measures the amount of radar backscatter generated by small ripples on the ocean surface induced by instantaneous local winds. Operational methods that extract wind vectors from satellite scatterometer data are based on the local inversion of a forward model, mapping scatterometer observations to wind vectors, by the minimisation of a cost function in the scatterometer measurement space.

This report uses mixture density networks, a principled method for modelling conditional probability density functions, to model the joint probability distribution of the wind vectors given the satellite scatterometer measurements in a single cell (the `inverse' problem). The complexity of the mapping and the structure of the conditional probability density function are investigated by varying the number of units in the hidden layer of the multi-layer perceptron and the number of kernels in the Gaussian mixture model of the mixture density network respectively. The optimal model for networks trained per trace has twenty hidden units and four kernels. Further investigation shows that models trained with incidence angle as an input have results comparable to those of models trained by trace. A hybrid mixture density network that incorporates geophysical knowledge of the problem confirms other results that the conditional probability distribution is predominantly bimodal.

The wind retrieval results improve on previous work at Aston, but do not match other neural network techniques that use spatial information in the inputs, which is to be expected given the ambiguity of the inverse problem. Current work uses the local inverse model for autonomous ambiguity removal in a principled Bayesian framework. Future directions in which these models may be improved are given.
Abstract:
Analysing investments in ISs in order to maximise benefits has become a prime concern, especially for private corporations. No formula of equilibrium exists that could link the injected amounts and accrued returns. The relationship is simply not straightforward. This thesis is based upon empirical work which involved sketching organisational ethnographies (four organographies and a sectography) into the role and value of information systems in Jordanian financial organisations. Besides deciphering the map of impacts, it explains the attributions of the variations in the impacts of ISs, which were found to be related to the internal organisational processes: culturally and politically specific considerations, economically or technically rooted factors and environmental factors. The research serves as an empirical attempt to test out the applicability of adopting the interpretive paradigm to researching organisations in a developing country. The fieldwork comprised an exploratory stage, a detailed investigation of four case studies and a survey stage encompassing 16 organisations. Primary and secondary data were collected from multiple sources using a range of instruments. The evidence highlights the fact that little long-term strategic planning was pursued; the emphasis was more focused on short-term planning. There was no noticeable adoption of any strategic-fit principle linking IS strategy to the corporate strategy. In addition, the benefits obtained were mostly intangible. Although ISs were central to the work of the organisations surveyed as the core technology, they were considered as tools or work enablers rather than weapons for competitive rivalry. The cultural specificity of IS impacts was evident, and the cultural and political considerations were key factors in explaining the attributions of the variations in the impacts of ISs in JFOs. The thesis confirms that measuring the benefits of ISs is problematic.
However, in order to gain more insight, the phenomenon of "the use of ISs" has to be studied within its context.
Abstract:
The sources of ideas embodied within successful technological innovations have been a subject of interest in many studies since the 1950s. This research suggests that sources external to the innovating organisation account for between one-third and two-thirds of the inputs important to the innovation process. In addition, studies have long highlighted the important role played by the personal boundary-spanning relationships of engineers and scientists as a channel for the transference of such inputs. However, research concerning the role and nature of personal boundary-spanning links in the innovation process has either been primarily structurally orientated, seeking to map out the informal networks of scientists and engineers, or, more typically, anecdotal. The objective of this research was to reveal and build upon our knowledge of the role, nature and importance of informal exchange activity in the innovation process. In order to achieve this, an empirical study was undertaken to determine the informal sources, channels and mechanisms employed in the development of thirty-five award-winning innovations. Through the adoption of the network perspective, the multiple sources and pluralistic patterns of collaboration and communication in the innovation process were systematically explored. This approach provided a framework that allowed for the detailed study of both the individual dyadic links and the morphology of the innovation action-sets in which these dyads were embedded. The research found, for example, that the mobilisation of boundary-spanning links and networks was an important or critical factor in nineteen (54%) of the development projects. Of these, informal boundary-spanning exchange activity was considered to be important or critical in eight (23%).