97 results for Hough-Radon transform
Abstract:
A fundamental question in developmental biology is how tissues are patterned to give rise to differentiated body structures with distinct morphologies. The Drosophila wing disc offers an accessible model to understand epithelial spatial patterning. It has been studied extensively using genetic and molecular approaches. Bristle patterns on the thorax, which arise from the medial part of the wing disc, are a classical model of pattern formation, dependent on a pre-pattern of trans-activators and -repressors. Despite decades of molecular studies, we still only know a subset of the factors that determine the pre-pattern. We are applying a novel and interdisciplinary approach to predict regulatory interactions in this system. It is based on the description of expression patterns by simple logical relations (addition, subtraction, intersection and union) between simple shapes (graphical primitives). Similarities and relations between primitives have been shown to be predictive of regulatory relationships between the corresponding regulatory factors in other systems, such as the Drosophila egg. Furthermore, they provide the basis for dynamical models of the bristle-patterning network, which enable us to make even more detailed predictions on gene regulation and expression dynamics. We have obtained a data-set of wing disc expression patterns which we are now processing to obtain average expression patterns for each gene. Through triangulation of the images we can transform the expression patterns into vectors which can easily be analysed by standard clustering methods. These analyses will allow us to identify primitives and regulatory interactions. We expect to identify new regulatory interactions and to understand the basic dynamics of the regulatory network responsible for thorax patterning.
These results will provide us with a better understanding of the rules governing gene regulatory networks in general, and provide the basis for future studies of the evolution of the thorax-patterning network in particular.
Abstract:
The pseudo-spectral time-domain (PSTD) method is an alternative time-marching method to classical leapfrog finite difference schemes in the simulation of wave-like propagating phenomena. It is based on the fundamentals of the Fourier transform to compute the spatial derivatives of hyperbolic differential equations. Therefore, it results in an isotropic operator that can be implemented in an efficient way for room acoustics simulations. However, one of the first issues to be solved consists of modelling wall absorption. Unfortunately, there are no references in the technical literature concerning that problem. In this paper, assuming real and constant locally reacting impedances, several proposals to overcome this problem are presented, validated and compared to analytical solutions in different scenarios.
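The Fourier-based spatial derivative at the core of PSTD can be sketched in a few lines of pure Python (a toy illustration under our own choices of grid and normalization, not the paper's implementation):

```python
import cmath
import math

def dft(f):
    # naive discrete Fourier transform (O(N^2)); an FFT would be used in practice
    N = len(f)
    return [sum(f[j] * cmath.exp(-2j * math.pi * j * k / N) for j in range(N))
            for k in range(N)]

def idft(F):
    N = len(F)
    return [sum(F[k] * cmath.exp(2j * math.pi * j * k / N) for k in range(N)) / N
            for j in range(N)]

def spectral_derivative(f, L):
    # differentiate a periodic signal by multiplying each Fourier mode by i*k
    N = len(f)
    F = dft(f)
    wavenumbers = [k if k < N // 2 else k - N for k in range(N)]  # signed k
    dF = [1j * (2 * math.pi * k / L) * F[i] for i, k in enumerate(wavenumbers)]
    return [z.real for z in idft(dF)]

# d/dx sin(x) = cos(x), recovered to round-off (spectral accuracy)
N, L = 16, 2 * math.pi
x = [L * j / N for j in range(N)]
df = spectral_derivative([math.sin(xi) for xi in x], L)
err = max(abs(d - math.cos(xi)) for d, xi in zip(df, x))
```

This accuracy with very few points per wavelength, and the directional isotropy of the operator, are what make PSTD attractive for room acoustics; the wall-absorption boundary treatment discussed in the paper is not modelled here.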
Abstract:
Error-correcting codes and matroids have been widely used in the study of ordinary secret sharing schemes. In this paper, the connections between codes, matroids, and a special class of secret sharing schemes, namely, multiplicative linear secret sharing schemes (LSSSs), are studied. Such schemes are known to enable multiparty computation protocols secure against general (non-threshold) adversaries. Two open problems related to the complexity of multiplicative LSSSs are considered in this paper. The first one deals with strongly multiplicative LSSSs. As opposed to the case of multiplicative LSSSs, it is not known whether there is an efficient method to transform an LSSS into a strongly multiplicative LSSS for the same access structure with a polynomial increase of the complexity. A property of strongly multiplicative LSSSs that could be useful in solving this problem is proved. Namely, using a suitable generalization of the well-known Berlekamp–Welch decoder, it is shown that all strongly multiplicative LSSSs enable efficient reconstruction of a shared secret in the presence of malicious faults. The second one is to characterize the access structures of ideal multiplicative LSSSs. Specifically, the considered open problem is to determine whether all self-dual vector space access structures are in this situation. By the aforementioned connection, this in fact constitutes an open problem about matroid theory, since it can be restated in terms of representability of identically self-dual matroids by self-dual codes. A new concept, the flat-partition, is introduced, which provides a useful classification of identically self-dual matroids. Uniform identically self-dual matroids, which are known to be representable by self-dual codes, form one of the classes. It is proved that this property also holds for the family of matroids that, in a natural way, is the next class in the above classification: the identically self-dual bipartite matroids.
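As a concrete toy instance of a multiplicative LSSS, Shamir's scheme over a prime field has the property the abstract builds on: the pointwise product of two share vectors is itself a (degree-2t) sharing of the product of the secrets. A minimal sketch, with our own illustrative parameters:

```python
import random

P = 2087  # a prime field for the toy example

def share(secret, t, n):
    # Shamir: random degree-t polynomial with f(0) = secret, shares f(1..n)
    coeffs = [secret] + [random.randrange(P) for _ in range(t)]
    return [sum(c * pow(i, e, P) for e, c in enumerate(coeffs)) % P
            for i in range(1, n + 1)]

def reconstruct(points):
    # Lagrange interpolation at x = 0 over GF(P)
    secret = 0
    for xi, yi in points:
        num = den = 1
        for xj, _ in points:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

# multiplicativity: share-wise products form a degree-2t sharing of a*b,
# so 2t + 1 of them suffice to recover the product of the secrets
a, b, t, n = 123, 456, 2, 6
sa, sb = share(a, t, n), share(b, t, n)
prod_shares = [(i + 1, (sa[i] * sb[i]) % P) for i in range(2 * t + 1)]
recovered = reconstruct(prod_shares)
```

Reconstruction from the product shares needs 2t + 1 points, which is why strong multiplicativity and error correction under malicious faults (the Berlekamp–Welch generalization in the paper) are the interesting questions; none of that machinery is sketched here.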
Abstract:
This article introduces a model of rationality that combines procedural utility over actions with consequential utility over payoffs. It applies the model to the Prisoner's Dilemma and shows that empirically observed cooperative behaviors can be rationally explained by a procedural utility for cooperation. The model characterizes the situations in which cooperation emerges as a Nash equilibrium. When rational individuals are not solely concerned with the consequences of their behavior but also care about the process by which these consequences are obtained, there is no single rational solution to a Prisoner's Dilemma. Rational behavior depends on the payoffs at stake and on the procedural utility of individuals. In this manner, this model of procedural utility reflects how ethical considerations, social norms or emotions can transform a game of consequences.
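A minimal sketch of the idea, assuming an additive procedural bonus m for the act of cooperating (the functional form is our illustrative choice, not necessarily the article's):

```python
def is_cooperation_nash(R, T, m):
    # total utility = material payoff + procedural utility of the chosen action.
    # With both players at (C, C), a deviator trades the reward R plus the
    # procedural bonus m for the temptation payoff T, so (C, C) is a Nash
    # equilibrium iff R + m >= T.
    return R + m >= T
```

With standard Prisoner's Dilemma payoffs T = 5, R = 3, cooperation is not an equilibrium for m = 0 but becomes one once m >= 2, illustrating how the rational solution depends on both the payoffs at stake and the procedural utility of the individuals.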
Abstract:
The Aitchison vector space structure for the simplex is generalized to a Hilbert space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information or likelihood, can be identified in the algebraic structure of A2(P) with their corresponding notions in compositional data analysis, such as the Aitchison distance or the centered log-ratio transform. In this way, very elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector space structure and of compositional data analysis. For example, combinations of statistical information such as Bayesian updating, combination of likelihoods, and robust M-estimation functions are simple additions/perturbations in A2(Pprior). Weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turns out to have a particularly easy interpretation in terms of A2(P). Regular exponential families form finite-dimensional linear subspaces of A2(P), and they correspond to finite-dimensional subspaces formed by their posteriors in the dual information space A2(Pprior). The Aitchison norm can be identified with mean Fisher information. The closing constant itself is identified with a generalization of the cumulant function and shown to be the Kullback-Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback-Leibler information, and the space A2(P) can therefore be seen as the tangential geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimation functions or likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence and a scale-free understanding of unbiased reasoning.
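The centered log-ratio transform and Aitchison distance that the abstract generalizes can be sketched for the finite simplex (standard definitions, pure Python):

```python
import math

def clr(x):
    # centered log-ratio: log of each part minus the log of the geometric mean
    g = sum(math.log(v) for v in x) / len(x)
    return [math.log(v) - g for v in x]

def aitchison_distance(x, y):
    # Euclidean distance between clr-transformed compositions
    return math.dist(clr(x), clr(y))
```

Scale invariance is the point: [1, 2, 3] and [2, 4, 6] are the same composition, so their Aitchison distance is zero. In the Hilbert-space view of the abstract, it is this perturbation/addition structure that extends from the simplex to A2(P).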
Abstract:
A general formalism on stochastic choice is presented. The rationalizability and recoverability (identification) problems are discussed. For the identification issue, parametric examples are analyzed by means of techniques of mathematical tomography (Radon transforms).
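For readers unfamiliar with the tomographic machinery, the Radon transform evaluates line integrals of a function; a crude pure-Python sketch over a pixel image (the sampling scheme and the name radon_line_sum are our own illustrative choices):

```python
import math

def radon_line_sum(img, theta, t, step=0.5):
    # approximate the integral of img along the line
    # x*cos(theta) + y*sin(theta) = t, with the image centred at the origin
    n = len(img)
    c, s = math.cos(theta), math.sin(theta)
    total, pos = 0.0, -float(n)
    while pos <= n:
        x = t * c - pos * s          # point = t*(normal) + pos*(direction)
        y = t * s + pos * c
        i, j = int(round(y)) + n // 2, int(round(x)) + n // 2
        if 0 <= i < n and 0 <= j < n:
            total += img[i][j] * step
        pos += step
    return total

# a single bright column shows up only in the matching projection
n = 21
img = [[1.0 if j == 15 else 0.0 for j in range(n)] for i in range(n)]
on_line = radon_line_sum(img, 0.0, 5.0)   # column 15 sits at x = 5
off_line = radon_line_sum(img, 0.0, 0.0)
```

Recovering the underlying distribution from collections of such projections (e.g. by filtered back-projection) is the inverse problem behind identification in this tomographic setting.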
Abstract:
Subcompositional coherence is a fundamental property of Aitchison's approach to compositional data analysis, and is the principal justification for using ratios of components. We maintain, however, that lack of subcompositional coherence, that is, incoherence, can be measured in an attempt to evaluate whether any given technique is close enough, for all practical purposes, to being subcompositionally coherent. This opens up the field to alternative methods, which might be better suited to cope with problems such as data zeros and outliers, while being only slightly incoherent. The measure that we propose is based on the distance measure between components. We show that the two-part subcompositions, which appear to be the most sensitive to subcompositional incoherence, can be used to establish a distance matrix which can be directly compared with the pairwise distances in the full composition. The closeness of these two matrices can be quantified using a stress measure that is common in multidimensional scaling, providing a measure of subcompositional incoherence. The approach is illustrated using power-transformed correspondence analysis, which has already been shown to converge to log-ratio analysis as the power transform tends to zero.
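The closeness of the two distance matrices can be quantified with a stress measure of the multidimensional-scaling kind; a minimal sketch (a plain normalized residual, not necessarily the exact formula used in the paper):

```python
import math

def stress(D_sub, D_full):
    # sqrt( sum (d_sub - d_full)^2 / sum d_full^2 ) over unordered pairs;
    # zero iff the subcompositional distances match the full-composition ones
    num = den = 0.0
    n = len(D_full)
    for i in range(n):
        for j in range(i + 1, n):
            num += (D_sub[i][j] - D_full[i][j]) ** 2
            den += D_full[i][j] ** 2
    return math.sqrt(num / den)

D_full = [[0, 1, 2], [1, 0, 1], [2, 1, 0]]
D_sub = [[0, 1.5, 2], [1.5, 0, 1], [2, 1, 0]]
```

A perfectly coherent technique would give stress(D_full, D_full) == 0; the size of stress(D_sub, D_full) is then a measure of subcompositional incoherence.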
Abstract:
Some natural resources, oil and minerals in particular, exert a negative and nonlinear impact on growth via their deleterious impact on institutional quality. We show this result to be very robust. The Nigerian experience provides telling confirmation of this aspect of natural resources. Waste and corruption from oil, rather than Dutch disease, has been responsible for its poor long-run economic performance. We propose a solution for addressing this resource curse which involves directly distributing the oil revenues to the public. Even with all the difficulties of corruption and inefficiency that will no doubt plague its actual implementation, our proposal will, at the least, be vastly superior to the status quo. At best, however, it could fundamentally improve the quality of public institutions and, as a result, transform economics and politics in Nigeria.
Abstract:
Initiatives in electronic conveyancing and registration show the potential of new technologies to transform such systems, reducing costs and enhancing legal security. However, they also incur substantial risks of transferring costs and risks among registries, conveyancers and rightholders, instead of reducing them; entrenching the private interests of conveyancers, instead of increasing competition and disintermediating them; modifying the allocation of tasks in a way that leads in the long term to the debasement of registries of rights with indefeasible title into mere recordings of deeds; and empowering conveyancers instead of transactors and rightholders, which increases costs and reduces security. Fulfilling the promise of new technologies in both costs and security requires strengthening registries' incentives and empowering rightholders in their interaction with registries.
Abstract:
Several features that can be extracted from digital images of the sky and that can be useful for cloud-type classification of such images are presented. Some features are statistical measurements of image texture, some are based on the Fourier transform of the image and, finally, others are computed from the image where cloudy pixels are distinguished from clear-sky pixels. The use of the most suitable features in an automatic classification algorithm is also shown and discussed. Both the features and the classifier are developed over images taken by two different camera devices, namely, a total sky imager (TSI) and a whole sky imager (WSC), which are placed in two different areas of the world (Toowoomba, Australia; and Girona, Spain, respectively). The performance of the classifier is assessed by comparing its image classification with an a priori classification carried out by visual inspection of more than 200 images from each camera. The index of agreement is 76% when five different sky conditions are considered: clear, low cumuliform clouds, stratiform clouds (overcast), cirriform clouds, and mottled clouds (altocumulus, cirrocumulus). Discussion of the future directions of this research is also presented, regarding both the use of other features and the use of other classification techniques.
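The statistical texture measurements mentioned can be as simple as first-order moments of the pixel-intensity distribution; a sketch with our own illustrative feature set (the paper's actual features may differ):

```python
import math

def texture_features(img):
    # first-order statistics of the grey-level distribution
    pixels = [p for row in img for p in row]
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    return {
        "mean": mean,
        "std": math.sqrt(var),
        "smoothness": 1 - 1 / (1 + var),  # 0 for a flat image, -> 1 for rough
        "skew": sum((p - mean) ** 3 for p in pixels) / n,
    }
```

A clear-sky image is smooth (near-zero variance) while a cumuliform scene is rough; feeding such features, together with Fourier-based and cloud-mask ones, to a classifier is the kind of pipeline the abstract describes.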
Abstract:
We present an environment for analysing signals of any kind with LDB (Local Discriminant Bases) and MLDB (Modified Local Discriminant Bases). This environment uses functions developed within the framework of a thesis currently in progress. Understanding some of these functions requires an advanced level of signal-processing knowledge. They were extracted from the work of Naoki Saito [3], which was taken as the starting point for the algorithm of the unfinished doctoral thesis of Jose Antonio Soria. The interface we developed accepts the incorporation of new packages and functions. We have left a menu prepared to integrate the Sinus IV packet transform and the Cosine IV packet transform, although others can also be incorporated. The application consists of two interfaces, a Wizard and a main interface. We have also created a window to import and export the desired variables to different environments. To build this application, all the window elements were programmed by hand, instead of using MATLAB's GUIDE (Graphical User Interface Development Environment), so that it remains compatible across the different versions of that program. In total we wrote 73 functions in the main interface (10 of which belong to the import/export window) and 23 in the Wizard. In this work we explain only 6 functions, plus the 3 that create these interfaces, so as not to make it excessively long. The functions we explain are the most important ones, either because they are used often, because they are the most complex according to McCabe complexity, or because they are necessary for signal processing. Every user data entry is passed through functions that detect input errors, such as removing zeros or non-numeric characters, and checking that values are integers and within the maximum and minimum limits that apply.
Abstract:
One of the main problems in contour analysis is the large amount of data involved in describing a shape. To address this, parameterization is applied: obtaining representative data for a contour with as few coefficients as possible, from which the contour can be reconstructed without any obvious loss of information. For closed contours, the most widely studied parameterization is the discrete Fourier transform (DFT), applied to the sequence of values describing the behaviour of the x and y coordinates along all the points of the trace. In contrast, the DFT cannot be applied directly to open contours, because it requires the values of x and y to be the same at the first and last points of the contour. This is because the DFT represents periodic signals without error; if the signals do not end at the same point, there is a discontinuity and oscillations appear in the reconstruction. The goal of this work is to parameterize open contours as efficiently as closed ones. To do so, a program was designed that applies the DFT to open contours by modifying the x and y sequences. In addition, other applications developed in Matlab made it possible to examine different aspects of parameterization and the behaviour of Elliptic Fourier Descriptors (EFD). The results show that the application allows the parameterization of open contours with optimal compression, which will facilitate the quantitative analysis of shapes in fields such as ecology, medicine and geography, among others.
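The core idea, DFT parameterization of a contour and reconstruction from a few low-frequency coefficients, can be sketched in pure Python. The even-mirroring trick shown for open contours is one common way to modify the x and y sequences so that they become periodic; the actual modification used in this work may differ:

```python
import cmath

def dft(z):
    # contour points as complex numbers z = x + i*y; normalized DFT
    N = len(z)
    return [sum(z[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N)) / N
            for k in range(N)]

def reconstruct(Z, keep):
    # rebuild the contour keeping only frequencies with |k| <= keep
    N = len(Z)
    out = []
    for n in range(N):
        val = 0j
        for k in range(N):
            freq = k if k <= N // 2 else k - N
            if abs(freq) <= keep:
                val += Z[k] * cmath.exp(2j * cmath.pi * k * n / N)
        out.append(val)
    return out

def mirror(seq):
    # even reflection turns an open sequence into a periodic one with no jump
    return seq + seq[-2:0:-1]

# a closed contour (unit circle) is compressed losslessly by a single harmonic
N = 16
circle = [cmath.exp(2j * cmath.pi * n / N) for n in range(N)]
err = max(abs(a - b) for a, b in zip(reconstruct(dft(circle), 1), circle))
```

Elliptic Fourier Descriptors carry the same low-frequency information as separate harmonic coefficients for the x and y coordinate sequences.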
Abstract:
Wind energy is considered an indirect form of solar energy. Between 1 and 2% of the energy coming from the sun is converted into wind, owing to the movement of air caused by the uneven heating of the Earth's surface. The kinetic energy of the wind can be transformed into useful energy, both mechanical and electrical. Wind energy transformed into mechanical energy has been exploited historically, but its use for electricity generation is more recent, a response to the oil crisis and to the environmental impacts derived from the use of fuels. The main goal of this work is a feasibility analysis, from a technical and economic point of view, of a wind farm located in the municipality of Barasoain (Navarre). From the technical point of view, the constructive aspects of the farm were studied, considering its civil-works and electrical infrastructure as well as the levels of wind resource. In the economic and financial sphere, the most relevant aspects and ratios defining a project of this kind were analysed, along with the chosen financing model, based on project finance. Among the most notable conclusions of this project are the contribution of the farm's construction to the social and economic development of the area where it is located, creating jobs both in the construction and in the operation phases, in full harmony with the environmental constraints of the area. The technical analysis indicates the technical viability of the farm, both in terms of wind resource and in its suitability for evacuating the energy produced. Moreover, the results obtained fully meet the standards required by wind-farm financiers and prove very attractive to shareholders.