948 results for Laplace transforms


Relevance: 10.00%

Publisher:

Abstract:

Accurately and reliably identifying the actual number of clusters present within a dataset of gene expression profiles, when no additional information on cluster structure is available, is a problem addressed by few algorithms. GeneMCL transforms microarray analysis data into a graph consisting of nodes connected by edges, where the nodes represent genes and the edges represent the similarity in expression of those genes, as given by a proximity measurement. This measurement is taken to be the Pearson correlation coefficient combined with a local non-linear rescaling step. The resulting graph is input to the Markov Cluster (MCL) algorithm, an elegant, deterministic, non-specific and scalable method that models stochastic flow through the graph. The algorithm is inherently affected by any cluster structure present and rapidly decomposes a graph into cohesive clusters. The potential of the GeneMCL algorithm is demonstrated with a 5730-gene subset (IGS) of the Van't Veer breast cancer database, for which the clusterings are shown to reflect underlying biological mechanisms.
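The core MCL iteration the abstract refers to — expansion by a matrix power, inflation by an elementwise power followed by column renormalisation — can be sketched on a toy adjacency matrix as follows. This is a minimal illustration of plain MCL, not the GeneMCL pipeline; the `expansion`, `inflation` and tolerance values, and the simplified cluster extraction, are illustrative assumptions.

```python
import numpy as np

def mcl(adjacency, expansion=2, inflation=2.0, iterations=50, tol=1e-6):
    """Minimal Markov Cluster (MCL) sketch: alternate expansion (a
    matrix power, spreading flow) with inflation (an elementwise power
    plus column renormalisation, strengthening strong flows) until the
    flow matrix converges.  Parameter values are illustrative."""
    n = adjacency.shape[0]
    M = adjacency + np.eye(n)          # add self-loops
    M = M / M.sum(axis=0)              # column-stochastic flow matrix
    for _ in range(iterations):
        expanded = np.linalg.matrix_power(M, expansion)
        inflated = expanded ** inflation
        inflated = inflated / inflated.sum(axis=0)
        converged = np.abs(inflated - M).max() < tol
        M = inflated
        if converged:
            break
    # Read clusters from the rows of the limit matrix (a simplified
    # extraction; full MCL interprets attractor systems more carefully).
    raw = [set(np.flatnonzero(row > 1e-8)) for row in M if row.max() > 1e-8]
    clusters = []
    for c in raw:
        for m in clusters:
            if c & m:
                m |= c
                break
        else:
            clusters.append(set(c))
    return clusters
```

On a graph made of two disconnected cliques the iteration converges immediately, and the two cliques are recovered as the clusters.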

Relevance: 10.00%

Publisher:

Abstract:

This paper presents a hybrid control strategy integrating dynamic neural networks and feedback linearization into a predictive control scheme. Feedback linearization is an important nonlinear control technique which transforms a nonlinear system into a linear system using nonlinear transformations and a model of the plant. In this work, empirical models based on dynamic neural networks have been employed. Dynamic neural networks are mathematical structures described by differential equations, which can be trained to approximate general nonlinear systems. A case study based on a mixing process is presented.
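As a minimal sketch of the feedback-linearization idea (not the paper's mixing-process case study), consider a scalar control-affine plant x' = f(x) + g(x)·u: choosing u = (v − f(x))/g(x) cancels the nonlinearity so the closed loop behaves exactly as the linear system x' = v. The plant f, g and the gains below are made-up illustrations; in the paper the model would instead come from a trained dynamic neural network.

```python
import numpy as np

# Hypothetical scalar control-affine plant  x' = f(x) + g(x) * u
# (made-up for illustration; not the paper's mixing process model).
f = lambda x: -x + x ** 3         # nonlinear drift
g = lambda x: 1.0 + 0.1 * x ** 2  # input gain, nonzero for all x

def linearizing_control(x, v):
    """Feedback linearization: cancel f and g so that the closed
    loop obeys x' = v exactly (a pure integrator in v)."""
    return (v - f(x)) / g(x)

def simulate(x0, v_gain=2.0, dt=1e-3, steps=5000):
    """Drive x toward 0 with the outer linear law v = -v_gain * x;
    the combined closed loop is then the linear system x' = -v_gain * x."""
    x = x0
    for _ in range(steps):
        v = -v_gain * x                  # linear outer-loop controller
        u = linearizing_control(x, v)    # nonlinearity-cancelling input
        x = x + dt * (f(x) + g(x) * u)   # Euler step of the true plant
    return x
```

Because the inner loop cancels the plant exactly, the outer controller can be designed with purely linear tools — the motivation for combining feedback linearization with a (neural-network) plant model.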

Relevance: 10.00%

Publisher:

Abstract:

This paper presents an approach for automatic classification of pulsed Terahertz (THz), or T-ray, signals, highlighting their potential in biomedical, pharmaceutical and security applications. T-ray classification systems supply a wealth of information about test samples and make possible the discrimination of heterogeneous layers within an object. In this paper, a novel technique involving the use of Auto Regressive (AR) and Auto Regressive Moving Average (ARMA) models on the wavelet transforms of measured T-ray pulse data is presented. Two example applications are examined: the classification of normal human bone (NHB) osteoblasts against human osteosarcoma (HOS) cells, and the identification of six different powder samples. A variety of model types and orders are used to generate descriptive features for subsequent classification. Wavelet-based de-noising with soft threshold shrinkage is applied to the measured T-ray signals prior to modeling. For classification, a simple Mahalanobis distance classifier is used. After feature extraction, classification accuracy for cancerous and normal cell types is 93%, whereas for powders it is 98%.
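Two of the generic building blocks named above — soft-threshold shrinkage and a Mahalanobis distance classifier — can be sketched as follows. The wavelet decomposition and the AR/ARMA feature extraction are omitted, and the regularisation constant is an illustrative assumption rather than the paper's setting.

```python
import numpy as np

def soft_threshold(coeffs, thr):
    """Soft-threshold shrinkage used in wavelet de-noising: pull every
    coefficient toward zero by thr, zeroing those smaller than thr."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - thr, 0.0)

class MahalanobisClassifier:
    """Assign a feature vector to the class whose training distribution
    gives the smallest Mahalanobis distance."""

    def fit(self, class_samples):
        self.means = [X.mean(axis=0) for X in class_samples]
        # Slightly regularised inverse covariance per class.
        self.inv_covs = [
            np.linalg.inv(np.cov(X.T) + 1e-6 * np.eye(X.shape[1]))
            for X in class_samples
        ]
        return self

    def predict(self, x):
        dists = [float((x - m) @ ic @ (x - m))
                 for m, ic in zip(self.means, self.inv_covs)]
        return int(np.argmin(dists))
```

In the paper's pipeline, the classifier would be fed AR/ARMA model coefficients computed from the de-noised wavelet representation of each T-ray pulse.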

Relevance: 10.00%

Publisher:

Abstract:

The role of users is an often-overlooked aspect of studies of innovation and diffusion. Using an actor-network theory (ANT) approach, four case studies examine the processes of implementing a piece of CAD (computer aided design) software, BSLink, in different organisations and describe the tailoring done by users to embed the software into working practices. This not only results in different practices of use at different locations, but also transforms BSLink itself into a proliferation of BSLinks-in-use. A focus group for BSLink users further reveals the gaps between different users' expectations and ways of using the software, and between different BSLinks-in-use. It also demonstrates the contradictory demands this places on its further development. The ANT-informed approach used treats both innovation and diffusion as processes of translation within networks. It also emphasises the political nature of innovation and implementation, and the efforts of various actors to delegate manoeuvres for increased influence onto technological artefacts.

Relevance: 10.00%

Publisher:

Abstract:

We consider the classical coupled, combined-field integral equation formulations for time-harmonic acoustic scattering by a sound soft bounded obstacle. In recent work, we have proved lower and upper bounds on the $L^2$ condition numbers for these formulations, and also on the norms of the classical acoustic single- and double-layer potential operators. These bounds to some extent make explicit the dependence of condition numbers on the wave number $k$, the geometry of the scatterer, and the coupling parameter. For example, with the usual choice of coupling parameter they show that, while the condition number grows like $k^{1/3}$ as $k\to\infty$, when the scatterer is a circle or sphere, it can grow as fast as $k^{7/5}$ for a class of `trapping' obstacles. In this paper we prove further bounds, sharpening and extending our previous results. In particular we show that there exist trapping obstacles for which the condition numbers grow as fast as $\exp(\gamma k)$, for some $\gamma>0$, as $k\to\infty$ through some sequence. This result depends on exponential localisation bounds on Laplace eigenfunctions in an ellipse that we prove in the appendix. We also clarify the correct choice of coupling parameter in 2D for low $k$. In the second part of the paper we focus on the boundary element discretisation of these operators. We discuss the extent to which the bounds on the continuous operators are also satisfied by their discrete counterparts and, via numerical experiments, we provide supporting evidence for some of the theoretical results, both quantitative and asymptotic, indicating further which of the upper and lower bounds may be sharper.
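For orientation, one standard form of the combined-field operator this literature works with (a sketch of the conventional definitions, not reproduced from this paper) is

$$ A_{k,\eta} := \tfrac{1}{2}I + D_k - \mathrm{i}\eta S_k, $$

where $\eta$ is the coupling parameter and the single- and double-layer operators are

$$ S_k\phi(x) = \int_\Gamma \Phi_k(x,y)\,\phi(y)\,\mathrm{d}s(y), \qquad D_k\phi(x) = \int_\Gamma \frac{\partial\Phi_k(x,y)}{\partial n(y)}\,\phi(y)\,\mathrm{d}s(y), $$

with $\Phi_k$ the fundamental solution of the Helmholtz equation. The "usual choice of coupling parameter" mentioned above is $\eta = k$.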

Relevance: 10.00%

Publisher:

Abstract:

For individuals with upper-extremity motor disabilities, the head-stick is a simple and intuitive means of performing manipulations because it provides direct proprioceptive information to the user. Through practice and use of inherent proprioceptive cues, users may become quite adept at using the head-stick for a number of different tasks. The traditional head-stick is limited, however, to the user's achievable range of head motion and force generation, which may be insufficient for many tasks. The authors describe an interface to a robot system which emulates the proprioceptive qualities of a traditional head-stick while also allowing for augmented end-effector ranges of force and motion. The design and implementation of the system in terms of coordinate transforms, bilateral telemanipulator architecture, safety systems, and system identification of the master is described, in addition to preliminary evaluation results.
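The coordinate transforms involved in mapping head-frame motion to end-effector commands can be illustrated with planar homogeneous transforms. This is a generic sketch under assumed 2-D kinematics, not the master/slave kinematics or system identification described by the authors.

```python
import numpy as np

def transform(theta, translation):
    """Planar homogeneous transform: rotation by theta followed by a
    translation.  Generic sketch of mapping between head, base and
    end-effector frames; the paper's actual kinematics are not given."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, translation[0]],
                     [s,  c, translation[1]],
                     [0.0, 0.0, 1.0]])

def apply_transform(T, point):
    """Apply a homogeneous transform to a 2-D point."""
    p = np.array([point[0], point[1], 1.0])
    return (T @ p)[:2]

# Frames compose by matrix multiplication (rightmost applied first),
# e.g. head frame -> base frame -> end-effector frame.
```

Chaining such transforms is what lets the interface scale or offset head motion while preserving the proprioceptive correspondence between head and end-effector.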

Relevance: 10.00%

Publisher:

Abstract:

This paper will present a conceptual framework for the examination of land redevelopment based on a complex systems/networks approach. As Alvin Toffler insightfully noted, modern scientific enquiry has become exceptionally good at splitting problems into pieces but has forgotten how to put the pieces back together. Twenty-five years after his remarks, governments and corporations faced with the requirements of sustainability are struggling to promote an ‘integrated’ or ‘holistic’ approach to tackling problems. Despite the talk, both practice and research provide few platforms that allow for ‘joined up’ thinking and action. With socio-economic phenomena such as land redevelopment, promising prospects open up when we assume that their constituents can make up complex systems whose emergent properties are more than the sum of the parts and whose behaviour is inherently difficult to predict. A review of previous research shows that it has mainly focused on idealised, ‘mechanical’ views of property development processes that fail to recognise in full the relationships between actors, the structures created and their emergent qualities. When reality failed to live up to the expectations of these theoretical constructs, somebody had to be blamed for it: planners, developers, politicians. From a ‘synthetic’ point of view, however, the agents and networks involved in property development can be seen as constituents of structures that perform complex processes. These structures interact, forming new, more complex structures and networks. Redevelopment can then be conceptualised as a process of transformation: a complex system, a ‘dissipative’ structure involving developers, planners, landowners, state agencies etc., unlocks the potential of previously used sites, transforms space towards a higher order of complexity and ‘consumes’ but also ‘creates’ different forms of capital in the process.
Analysis of network relations points toward the ‘dualism’ of structure and agency in these processes of system transformation and change. Insights from actor network theory can be conjoined with notions of complexity and chaos to build an understanding of the ways in which actors actively seek to shape these structures and systems, whilst at the same time being recursively shaped by them in their strategies and actions. This approach transcends the blame game and allows for inter-disciplinary inputs to be placed within a broader explanatory framework that does away with many past dichotomies. A better understanding of the interactions between actors and the emergent qualities of the networks they form can improve our comprehension of the complex socio-spatial phenomena that redevelopment comprises. The insights that this framework provides when applied to UK institutional investment in redevelopment are considered to be significant.

Relevance: 10.00%

Publisher:

Abstract:

We give an asymptotic expansion for the Taylor coefficients of L(P(z)), where L(z) is analytic in the open unit disc with Taylor coefficients that vary ‘smoothly’, and P(z) is a probability generating function. We show how this result applies to a variety of problems, amongst them obtaining the asymptotics of Bernoulli transforms and weighted renewal sequences.

Relevance: 10.00%

Publisher:

Abstract:

An X-ray amorphous mixture of carbamazepine and indomethacin transforms upon annealing to produce a novel 1:1 cocrystal, whose structure has been determined from laboratory powder X-ray diffraction (PXRD) data.

Relevance: 10.00%

Publisher:

Abstract:

Vekua operators map harmonic functions defined on a domain in $\mathbb{R}^2$ to solutions of elliptic partial differential equations on the same domain, and vice versa. In this paper, following the original work of I. Vekua (Ilja Vekua (1907–1977), Soviet-Georgian mathematician), we define Vekua operators in the case of the Helmholtz equation in a completely explicit fashion, in any space dimension N ≥ 2. We prove (i) that they actually transform harmonic functions and Helmholtz solutions into each other; (ii) that they are inverse to each other; and (iii) that they are continuous in any Sobolev norm in star-shaped Lipschitz domains. Finally, we define and compute the generalized harmonic polynomials as the Vekua transforms of harmonic polynomials. These results are instrumental in proving approximation estimates for solutions of the Helmholtz equation in spaces of circular, spherical, and plane waves.

Relevance: 10.00%

Publisher:

Abstract:

In this study, we compare two different cyclone-tracking algorithms to detect North Atlantic polar lows, which are very intense mesoscale cyclones. Both approaches include spatial filtering, detection, tracking and constraints specific to polar lows. The first method uses digital bandpass-filtered mean sea level pressure (MSLP) fields in the spatial range of 200–600 km and is especially designed for polar lows. The second method also uses a bandpass filter but is based on the discrete cosine transform (DCT) and can be applied to MSLP and vorticity fields. The latter was originally designed for cyclones in general and has been adapted to polar lows for this study. Both algorithms are applied to the same regional climate model output fields from October 1993 to September 1995, produced by dynamical downscaling of the NCEP/NCAR reanalysis data. Comparisons between these two methods show that different filters lead to different numbers and locations of tracks. The DCT is more precise in scale separation than the digital filter, and the results of this study suggest that it is more suited to the bandpass filtering of MSLP fields. The detection and tracking parts also influence the numbers of tracks, although less critically. After a selection process that applies criteria to identify tracks of potential polar lows, differences between the two methods are still visible, though the major systems are identified by both.
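The DCT-based scale separation described above can be sketched as follows: transform the field, zero the coefficients whose wavelength falls outside the target band, and transform back. The grid spacing and band edges below are illustrative placeholders, not the paper's 200–600 km configuration, and the sketch assumes SciPy's `dctn`/`idctn`.

```python
import numpy as np
from scipy.fft import dctn, idctn

def dct_bandpass(field, dx, lmin, lmax):
    """Band-pass a 2-D field by zeroing DCT coefficients whose
    wavelength lies outside [lmin, lmax] (same length units as dx).
    Illustrative sketch of DCT-based scale separation."""
    ny, nx = field.shape
    coeffs = dctn(field, norm='ortho')
    # Spatial frequency (cycles per unit length) of each DCT-II mode.
    ky = np.arange(ny) / (2.0 * ny * dx)
    kx = np.arange(nx) / (2.0 * nx * dx)
    K = np.sqrt(ky[:, None] ** 2 + kx[None, :] ** 2)
    with np.errstate(divide='ignore'):
        wavelength = np.where(K > 0, 1.0 / K, np.inf)  # K = 0 is the mean
    mask = (wavelength >= lmin) & (wavelength <= lmax)
    return idctn(coeffs * mask, norm='ortho')
```

Applied to an MSLP field with grid spacing `dx_km`, the hypothetical call `dct_bandpass(mslp, dx_km, 200.0, 600.0)` would retain only the mesoscale band relevant to polar lows.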

Relevance: 10.00%

Publisher:

Abstract:

The discourse surrounding the virtual has moved away from the utopian thinking accompanying the rise of the Internet in the 1990s. The Cyber-gurus of the last decades promised a technotopia removed from materiality and the confines of the flesh and the built environment, a liberation from old institutions and power structures. But since then, the virtual has grown into a distinct yet related sphere of cultural and political production that both parallels and occasionally flows over into the old world of material objects. The strict dichotomy of matter and digital purity has been replaced more recently with a more complex model where both the world of stuff and the world of knowledge support, resist and at the same time contain each other. Online social networks amplify and extend existing ones; other cultural interfaces like YouTube have not replaced the communal experience of watching moving images in a semi-public space (the cinema) or the semi-private space (the family living room). Rather, the experience of viewing is very much about sharing and communicating, offering interpretations and comments. Many of the web’s strongest entities (Amazon, eBay, Gumtree etc.) sit exactly at this juncture of applying tools taken from the knowledge management industry to organize the chaos of the material world along (post-)Fordist rationality. Since the early 1990s there have been many artistic and curatorial attempts to use the Internet as a platform of producing and exhibiting art, but a lot of these were reluctant to let go of the fantasy of digital freedom. Storage Room collapses the binary opposition of real and virtual space by using online data storage as a conduit for IRL art production.
The artworks here will not be available for viewing online in a 'screen' environment but only as part of a downloadable package, with the intention that the exhibition could be displayed (in a physical space) by any interested party and realised as ambitiously or minimally as the downloader wishes, based on their means. The artists will therefore also supply a set of instructions for the physical installation of the work alongside the digital files. In response to this curatorial initiative, File Transfer Protocol invites seven UK-based artists to produce digital art for a physical environment, addressing the intersection between the virtual and the material. The files range from sound, video, digital prints and net art, blueprints for an action to take place, something to be made, a conceptual text piece, etc. About the works and artists: Polly Fibre is the pseudonym of London-based artist Christine Ellison. Ellison creates live music using domestic devices such as sewing machines, irons and slide projectors. Her costumes and stage sets propose a physical manifestation of the virtual space that is created inside software like Photoshop. For this exhibition, Polly Fibre invites the audience to create a musical composition using a pair of amplified scissors and a turntable. http://www.pollyfibre.com John Russell, a founding member of 1990s art group Bank, is an artist, curator and writer who explores in his work the contemporary political conditions of the work of art. In his digital print, Russell collages together visual representations of abstract philosophical ideas and transforms them into a post-apocalyptic landscape that is complex and banal at the same time. www.john-russell.org The work of Bristol-based artist Jem Nobel opens up a dialogue between the contemporary and the legacy of 20th century conceptual art, around questions of collectivism and participation, authorship and individualism.
His print SPACE concretizes the representation of the most common piece of Unicode: the vacant space between words. In this way, the gap itself turns from invisible cipher to sign. www.jemnoble.com Annabel Frearson is rewriting Mary Shelley's Frankenstein using all and only the words from the original text. Frankenstein 2, or the Monster of Main Stream, is read in parts by different performers, embodying the psychotic character of the protagonist, a mongrel hybrid of used language. www.annabelfrearson.com Darren Banks uses fragments of effect-laden Hollywood films to create an impossible space. The fictitious parts don't add up to a convincing material reality, leaving the viewer with a failed amalgamation of simulations of sophisticated technologies. www.darrenbanks.co.uk FIELDCLUB is a collaboration between artist Paul Chaney and researcher Kenna Hernly. Chaney and Hernly developed together a project that critically examines various proposals for the management of sustainable ecological systems. Their FIELDMACHINE invites the public to design an ideal agricultural field. By playing with different types of crops that are found in the south west of England, it is possible for the user, for example, to create a balanced, but protein-poor, diet or to simply decide to 'get rid' of half the population. The meeting point of the Platonic field and its physical consequences generates a geometric abstraction that investigates the relationship between modernist utopianism and contemporary actuality. www.fieldclub.co.uk Pil and Galia Kollectiv, who have also curated the exhibition, are London-based artists and run the xero, kline & coma gallery. Here they present a dialogue between two computers. The conversation opens with a simple textbook problem in business studies. But gradually the language, mimicking the application of game theory in the business sector, becomes more abstract. The two interlocutors become adversaries trapped forever in a competition without winners.
www.kollectiv.co.uk

Relevance: 10.00%

Publisher:

Abstract:

In order to evaluate the future potential benefits of emission regulation on regional air quality, while taking into account the effects of climate change, off-line air quality projection simulations are driven using weather forcing taken from regional climate models. These regional models are themselves driven by simulations carried out using global climate models (GCM) and economic scenarios. Uncertainties and biases in climate models introduce an additional “climate modeling” source of uncertainty that is added to all other types of uncertainties in air quality modeling for policy evaluation. In this article we evaluate the changes in air quality-related weather variables induced by replacing reanalyses-forced regional climate simulations with GCM-forced ones. As an example we use GCM simulations carried out in the framework of the ERA-Interim programme and of the CMIP5 project using the Institut Pierre-Simon Laplace climate model (IPSLcm), driving regional simulations performed in the framework of the EURO-CORDEX programme. In summer, we found compensating deficiencies acting on photochemistry: an overestimation by GCM-driven weather due to a positive bias in short-wave radiation, a negative bias in wind speed, too many stagnant episodes, and a negative temperature bias. In winter, air quality is mostly driven by dispersion, and we could not identify significant differences in either wind or planetary boundary layer height statistics between GCM-driven and reanalyses-driven regional simulations. However, precipitation appears largely overestimated in GCM-driven simulations, which could significantly affect the simulation of aerosol concentrations. The identification of these biases will help in interpreting the results of future air quality simulations using these data. Despite these biases, we conclude that the identified differences should not lead to major difficulties in using GCM-driven regional climate simulations for air quality projections.

Relevance: 10.00%

Publisher:

Abstract:

New representations and efficient calculation methods are derived for the problem of propagation from an infinite regularly spaced array of coherent line sources above a homogeneous impedance plane, and for the Green's function for sound propagation in the canyon formed by two infinitely high, parallel rigid or sound soft walls and an impedance ground surface. The infinite sum of source contributions is replaced by a finite sum and the remainder is expressed as a Laplace-type integral. A pole subtraction technique is used to remove poles in the integrand which lie near the path of integration, obtaining a smooth integrand, more suitable for numerical integration, and a specific numerical integration method is proposed. Numerical experiments show highly accurate results across the frequency spectrum for a range of ground surface types. It is expected that the methods proposed will prove useful in boundary element modeling of noise propagation in canyon streets and in ducts, and for problems of scattering by periodic surfaces.
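The pole-subtraction idea — splitting off the singular part of the integrand and integrating it in closed form — can be sketched for a model integral ∫_a^b f(t)/(t − p) dt with a pole p just off the path of integration. The quadrature rule and example below are illustrative, not the specific numerical integration method proposed in the paper.

```python
import numpy as np

def integral_with_pole(f, a, b, p, n=2001):
    """Evaluate I = integral over [a, b] of f(t) / (t - p) dt when the
    pole p lies just off the real interval.  Pole subtraction rewrites
    the integrand as (f(t) - f(p)) / (t - p)  +  f(p) / (t - p): the
    first part is smooth and handled by the trapezoidal rule, while the
    second integrates in closed form to f(p) * log((b - p) / (a - p))."""
    t = np.linspace(a, b, n)
    smooth = (f(t) - f(p)) / (t - p)   # bounded: p is off the path
    h = (b - a) / (n - 1)
    part_smooth = h * (np.sum(smooth) - 0.5 * (smooth[0] + smooth[-1]))
    part_pole = f(p) * np.log((b - p) / (a - p))
    return part_smooth + part_pole
```

For f(t) = t² the subtracted integrand is linear, so the quadrature part is captured exactly and the near-singular behaviour is carried entirely by the logarithm.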

Relevance: 10.00%

Publisher:

Abstract:

This paper is concerned with the problem of propagation from a monofrequency coherent line source above a plane of homogeneous surface impedance. The solution of this problem occurs in the kernel of certain boundary integral equation formulations of acoustic propagation above an impedance boundary, and the discussion of the paper is motivated by this application. The paper starts by deriving representations, as Laplace-type integrals, of the solution and its first partial derivatives. The evaluation of these integral representations by Gauss-Laguerre quadrature is discussed, and theoretical bounds on the truncation error are obtained. Specific approximations are proposed which are shown to be accurate except in the very near field, for all angles of incidence and a wide range of values of surface impedance. The paper finishes with derivations of partial results and analogous Laplace-type integral representations for the case of a point source.
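Gauss-Laguerre quadrature approximates Laplace-type integrals of the form ∫₀^∞ e^(−t) f(t) dt by a weighted sum over the Laguerre nodes; a minimal sketch using NumPy's `laggauss` follows (the truncation-error bounds and the specific kernels of the paper are not reproduced here).

```python
import numpy as np

def laplace_type_integral(f, n=20):
    """Approximate I = integral over [0, inf) of e^(-t) f(t) dt by
    n-point Gauss-Laguerre quadrature, I ~ sum_i w_i f(t_i); the rule
    is exact whenever f is a polynomial of degree <= 2n - 1."""
    nodes, weights = np.polynomial.laguerre.laggauss(n)
    return float(np.dot(weights, f(nodes)))
```

For example, ∫₀^∞ e^(−t) t² dt = 2 is reproduced exactly, while smooth non-polynomial integrands such as cos(t) converge rapidly with n.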