912 results for scalar curvature


Relevance:

10.00%

Publisher:

Abstract:

This thesis, although framed within the theory of Quantum Molecular Similarity Measures (MQSM), branches into three clearly defined areas: the generation of Molecular IsoDensity COntours (MIDCOs) from fitted electron densities; the development of a molecular superposition method as an alternative to the maximum-similarity rule; and Quantitative Structure-Activity Relationships (QSAR). The aim in the MIDCO field is to apply fitted density functions, originally devised to cheapen MQSM calculations, to the generation of MIDCOs. A graphical comparison is carried out between density functions fitted to different basis sets and densities obtained from ab initio calculations. The visual agreement between the fitted and ab initio functions across the range of density representations obtained, together with the previously computed and fully comparable similarity measures, justifies the use of these fitted functions. Beyond this initial purpose, two studies complementary to the simple representation of densities were carried out: a curvature analysis and an extension to macromolecules. The first verifies not only the similarity of the MIDCOs but also the coherence of their behaviour at the curvature level, making it possible to locate inflection points in the density representation and to see graphically the regions where the density is concave or convex. This study reveals that fitted and ab initio densities behave in a fully analogous way. In the second part, the method was extended to larger molecules, of up to about 2500 atoms, applying part of the MEDLA philosophy.
Since the electron density decays rapidly away from the nuclei, its calculation can be omitted at large distances from them. Space is therefore partitioned, and the fitted functions of each atom are evaluated only in a small region surrounding that atom. This reduces the computation time, which becomes linear in the number of atoms in the molecule. The chapter on molecular superposition describes the design of an algorithm, and its implementation as a program named the Topo-Geometrical Superposition Algorithm (TGSA), intended to provide alignments that match chemical intuition. The result is a program, coded in Fortran 90, that aligns molecules pairwise considering only atomic numbers and interatomic distances. The complete absence of theoretical parameters yields a general molecular superposition method that provides an intuitive alignment and, just as importantly, does so quickly and with little user intervention. TGSA has mainly been used to compute similarities for later use in QSAR; these values mostly do not coincide with those the maximum-similarity rule would give, especially when heavy atoms are involved. Finally, the last chapter, devoted to Quantum Similarity in the QSAR framework, covers three different aspects. First, the use of similarity matrices: the so-called similarity matrix, built from the pairwise similarities within a set of molecules, is subsequently used, after suitable treatment, as a source of molecular descriptors for QSAR studies. Several correlation studies of pharmacological and toxicological interest, as well as of various physical properties, have been carried out in this area. Second, the use of the electron-electron interaction energy, treated as a form of self-similarity.
This modest contribution briefly consists of taking the value of this quantity and, by analogy with the notation of quantum molecular self-similarity, treating it as a particular case of that measure. This interaction energy is easily obtained from quantum-chemistry software and is ideal for a preliminary correlation study in which it serves as the sole descriptor. Third, the calculation of self-similarities in which the density has been modified to enhance the role of a substituent. Previous work with fragment densities, although it gave very good results, lacked some conceptual rigour in isolating a fragment, supposedly responsible for the molecular activity, from the rest of the molecular structure, even though the densities associated with that fragment already differ because they belong to skeletons with different substitutions. A procedure that fills the gap left by simple fragment separation, thus considering the whole molecule (computing its self-similarity) while avoiding unwanted self-similarity values caused by heavy atoms, is the use of Fermi-hole densities defined around the fragment of interest. This procedure modifies the density so that it is mostly concentrated in the region of interest, while still yielding a density function that behaves mathematically like the regular electron density and can therefore be incorporated into the molecular-similarity framework. Self-similarities computed with this methodology have led to good correlations for substituted aromatic acids, thereby providing an explanation for their behaviour. Conceptual contributions have also been made.
A new similarity measure, based on kinetic energy, has been implemented: it takes the recently developed kinetic-energy density function which, behaving mathematically like regular electron densities, has been incorporated into the similarity framework. Satisfactory QSAR models have been obtained from this measure for several molecular sets. In the treatment of similarity matrices, the so-called stochastic transformation has been implemented as an alternative to the use of the Carbó index. This transformation of the similarity matrix yields a new, non-symmetric matrix that can subsequently be treated to build QSAR models.
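For reference, the pairwise similarity measures and the Carbó index referred to above have the standard MQSM definitions (restated here from the general literature, not quoted from this abstract):

```latex
Z_{AB} = \int \rho_A(\mathbf{r})\,\rho_B(\mathbf{r})\,\mathrm{d}\mathbf{r},
\qquad
C_{AB} = \frac{Z_{AB}}{\sqrt{Z_{AA}\,Z_{BB}}}
```

Here $Z_{AB}$ is the overlap-like similarity between the electron densities $\rho_A$ and $\rho_B$ of two molecules, $Z_{AA}$ is a self-similarity, and the Carbó index $C_{AB}$ is the cosine-like normalisation taking values in $(0,1]$.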

Relevance:

10.00%

Publisher:

Abstract:

A predictive model for the final geometry of a sheet-metal part (final bend radius and bend angle) produced by an air-bending process.

Relevance:

10.00%

Publisher:

Abstract:

The Argentine ant (Linepithema humile) is among the most invasive species: native to South America, it has now invaded numerous areas around the world. This doctoral thesis attempts a first integrated, multiscale analysis of the distribution of the Argentine ant using ecological niche models. According to the results obtained, the Argentine ant is expected to reach a wider distribution than its current one. The model predictions agree with the currently known distribution and, in addition, indicate areas near the coast and the main rivers as highly favourable for the species. These results support the idea that the Argentine ant is not currently in equilibrium with its environment. Furthermore, under climate change, the distribution of the Argentine ant is expected to expand towards higher latitudes in both hemispheres and to contract in the tropics at global scales.

Relevance:

10.00%

Publisher:

Abstract:

A primary interest of this thesis is to obtain a powerful tool for determining the structural, electrical, and reactivity properties of molecules. A second interest is the study of the basis set superposition error (BSSE) in hydrogen-bonded complexes. One way to correct this error is the counterpoise (CP) correction proposed by Boys and Bernardi. Usually the CP correction is applied a posteriori to previously optimized geometries. Our goal was to obtain potential energy surfaces in which every point is CP-corrected. Such a surface has a minimum that differs from that of the uncorrected surface, i.e., the geometric parameters are different. The curvature at this minimum is also different, so the vibrational frequencies also change when they are corrected for BSSE. Once these surfaces were constructed, several complexes were studied. We also investigated how the calculation method and basis set influence the superposition error.
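The Boys–Bernardi counterpoise correction mentioned above has the standard textbook form (restated here, not quoted from the abstract), in which each monomer energy is evaluated in the full dimer basis set:

```latex
\Delta E_{\mathrm{int}}^{\mathrm{CP}}
  = E_{AB}(AB) \;-\; E_{A}(AB) \;-\; E_{B}(AB)
```

Here $E_{X}(AB)$ denotes the energy of fragment $X$ computed in the complete dimer basis $AB$; using the same basis for the dimer and both monomers removes the artificial stabilisation that arises when each monomer borrows basis functions from its partner.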

Relevance:

10.00%

Publisher:

Abstract:

The and RT0 finite element schemes are among the most promising low-order elements for use in unstructured-mesh marine and lake models. They are both free of spurious elevation modes, have good dispersive properties and have a relatively low computational cost. In this paper, we derive both finite element schemes in the same unified framework and discuss their respective qualities in terms of conservation, consistency, propagation factor and convergence rate. We also highlight the impact that the placement of the local variables can have on the model solution. The main conclusion we can draw is that the choice between elements is highly application dependent. We suggest that the element is better suited to purely hydrodynamical applications, while the RT0 element might perform better for hydrological applications that require scalar transport calculations.

Relevance:

10.00%

Publisher:

Abstract:

Separation of stratified flow over a two-dimensional hill is inhibited or facilitated by acceleration or deceleration of the flow just outside the attached boundary layer. In this note, an expression is derived for this acceleration or deceleration in terms of streamline curvature and stratification. The expression is valid for linear as well as nonlinear deformation of the flow. For hills of vanishing aspect ratio a linear theory can be derived and a full regime diagram for separation can be constructed. For hills of finite aspect ratio scaling relationships can be derived that indicate the presence of a critical aspect ratio, proportional to the stratification, above which separation will occur as well as a second critical aspect ratio above which separation will always occur irrespective of stratification.

Relevance:

10.00%

Publisher:

Abstract:

We study ordinary nonlinear singular differential equations which arise from steady conservation laws with source terms. An example of steady conservation laws which leads to those scalar equations is the Saint–Venant equations. The numerical solution of these scalar equations is sought by using the ideas of upwinding and discretisation of source terms. Both the Engquist–Osher scheme and the Roe scheme are used with different strategies for discretising the source terms.
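The upwinding ideas described can be illustrated with a minimal sketch: a first-order Engquist–Osher discretisation of Burgers' equation with a source term. The function names, the choice of Burgers' flux, and the pointwise (centred) source treatment are illustrative assumptions; they are not the paper's actual schemes or source-term strategies.

```python
import numpy as np

def eo_flux(ul, ur):
    # Engquist-Osher numerical flux for Burgers' flux f(u) = u**2 / 2:
    # split f into its increasing and decreasing parts about the sonic
    # point u = 0, taking the increasing part from the left state and
    # the decreasing part from the right state.
    fplus = 0.5 * max(ul, 0.0) ** 2
    fminus = 0.5 * min(ur, 0.0) ** 2
    return fplus + fminus

def step(u, s, dx, dt):
    # One forward-Euler step of the first-order EO scheme
    #   u_t + f(u)_x = s(x)
    # with a pointwise source term; boundary cells are left unchanged.
    un = u.copy()
    for i in range(1, len(u) - 1):
        f_right = eo_flux(u[i], u[i + 1])
        f_left = eo_flux(u[i - 1], u[i])
        un[i] = u[i] - dt / dx * (f_right - f_left) + dt * s[i]
    return un
```

A constant state with zero source is an exact steady solution, which this discretisation preserves; the paper's point is that for nontrivial source terms the discretisation of `s` itself should also be upwinded to capture steady states accurately.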

Relevance:

10.00%

Publisher:

Abstract:

This paper considers the relationship between the mean temperature and humidity profiles and the fluxes of heat and moisture at cloud base and the base of the inversion in the cumulus-capped boundary layer. The relationships derived are based on an approximate form of the scalar-flux budget and the scaling properties of the turbulent kinetic energy (TKE) budget. The scalar-flux budget gives a relationship between the change in the virtual potential temperature across either the cloud base transition zone or the inversion and the flux at the base of the layer. The scaling properties of the TKE budget lead to a relationship between the heat and moisture fluxes and the mean subsaturation through the liquid-water flux. The 'jump relation' for the virtual potential temperature at cloud base shows the close connection between the cumulus mass flux in the cumulus-capped boundary layer and the entrainment velocity in the dry-convective boundary layer. Gravity waves are shown to be an important feature of the inversion.

Relevance:

10.00%

Publisher:

Abstract:

Scalar-flux budgets have been obtained from large-eddy simulations (LESs) of the cumulus-capped boundary layer. Parametrizations of the terms in the budgets are discussed, and two parametrizations for the transport term in the cloud layer are proposed. It is shown that these lead to two models for scalar transports by shallow cumulus convection. One is equivalent to the subsidence detrainment form of convective tendencies obtained from mass-flux parametrizations of cumulus convection. The second is a flux-gradient relationship that is similar in form to the non-local parametrizations of turbulent transports in the dry-convective boundary layer. Using the fluxes of liquid-water potential temperature and total water content from the LES, it is shown that both models are reasonable diagnostic relations between fluxes and the vertical gradients of the mean fields. The LESs used in this study are for steady-state convection and it is possible to treat the fluxes of conserved thermodynamic variables as independent, and ignore the effects of condensation. It is argued that a parametrization of cumulus transports in a model of the cumulus-capped boundary layer should also include an explicit representation of condensation. A simple parametrization of the liquid-water flux in terms of conserved variables is also derived.
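The flux-gradient relationship referred to above is generically of the non-local (counter-gradient) form long used for the dry-convective boundary layer. The schematic form below is restated from that general literature; the paper's actual closure and coefficients are not reproduced here:

```latex
\overline{w'\phi'} \;=\; -K_{\phi}
  \left( \frac{\partial \overline{\phi}}{\partial z} \;-\; \gamma_{\phi} \right)
```

Here $\overline{w'\phi'}$ is the turbulent flux of a conserved scalar $\phi$, $K_{\phi}$ an eddy diffusivity, and $\gamma_{\phi}$ a counter-gradient term that allows an upward flux even where the mean gradient vanishes or is weakly stable.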

Relevance:

10.00%

Publisher:

Abstract:

Airborne scanning laser altimetry (LiDAR) is an important new data source for river flood modelling. LiDAR can give dense and accurate DTMs of floodplains for use as model bathymetry. Spatial resolutions of 0.5m or less are possible, with a height accuracy of 0.15m. LiDAR gives a Digital Surface Model (DSM), so vegetation removal software (e.g. TERRASCAN) must be used to obtain a DTM. An example used to illustrate the current state of the art will be the LiDAR data provided by the EA, which has been processed by their in-house software to convert the raw data to a ground DTM and separate vegetation height map. Their method distinguishes trees from buildings on the basis of object size. EA data products include the DTM with or without buildings removed, a vegetation height map, a DTM with bridges removed, etc. Most vegetation removal software ignores short vegetation less than say 1m high. We have attempted to extend vegetation height measurement to short vegetation using local height texture. Typically most of a floodplain may be covered in such vegetation. The idea is to assign friction coefficients depending on local vegetation height, so that friction is spatially varying. This obviates the need to calibrate a global floodplain friction coefficient. It’s not clear at present if the method is useful, but it’s worth testing further. The LiDAR DTM is usually determined by looking for local minima in the raw data, then interpolating between these to form a space-filling height surface. This is a low pass filtering operation, in which objects of high spatial frequency such as buildings, river embankments and walls may be incorrectly classed as vegetation. The problem is particularly acute in urban areas. A solution may be to apply pattern recognition techniques to LiDAR height data fused with other data types such as LiDAR intensity or multispectral CASI data. 
We are attempting to use digital map data (Mastermap structured topography data) to help to distinguish buildings from trees, and roads from areas of short vegetation. The problems involved in doing this will be discussed. A related problem, how best to merge historic river cross-section data with a LiDAR DTM, will also be considered. LiDAR data may also be used to help generate a finite element mesh. In rural areas we have decomposed a floodplain mesh according to taller vegetation features such as hedges and trees, so that e.g. hedge elements can be assigned higher friction coefficients than those in adjacent fields. We are attempting to extend this approach to urban areas, so that the mesh is decomposed in the vicinity of buildings, roads, etc., as well as trees and hedges. A dominant-points algorithm is used to identify points of high curvature on a building or road, which act as initial nodes in the meshing process. A difficulty is that the resulting mesh may contain a very large number of nodes. However, the mesh generated may be useful to allow a high-resolution FE model to act as a benchmark for a more practical lower-resolution model. A further problem discussed will be how best to exploit data redundancy due to the high resolution of the LiDAR compared to that of a typical flood model. Problems occur if features have dimensions smaller than the model cell size: e.g. for a 5m-wide embankment within a raster grid model with 15m cell size, the maximum height of the embankment locally could be assigned to each cell covering the embankment. But how could a 5m-wide ditch be represented? Again, this redundancy has been exploited to improve wetting/drying algorithms using the sub-grid-scale LiDAR heights within finite elements at the waterline.
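The idea of assigning spatially varying friction from a LiDAR vegetation-height map can be sketched as a simple per-cell lookup. The height thresholds and Manning's n values below are purely illustrative assumptions, not calibrated values from any EA product or from the work described:

```python
import numpy as np

def friction_map(veg_height, n_ground=0.03, n_short=0.05, n_tall=0.10):
    # Map a gridded vegetation-height field (metres) to a grid of
    # Manning's n friction coefficients. Cells default to bare-ground
    # friction; taller vegetation classes overwrite it in turn.
    n = np.full_like(veg_height, n_ground, dtype=float)
    n[veg_height > 0.1] = n_short   # short vegetation (grass, crops)
    n[veg_height > 1.0] = n_tall    # hedges, shrubs, trees
    return n
```

A per-cell friction grid of this kind replaces the single global floodplain friction coefficient that would otherwise have to be calibrated.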

Relevance:

10.00%

Publisher:

Abstract:

Locality to other nodes on a peer-to-peer overlay network can be established by means of a set of landmarks shared among the participating nodes. Each node independently collects a set of latency measures to landmark nodes, which are used as a multi-dimensional feature vector. Each peer node uses the feature vector to generate a unique scalar index which is correlated to its topological locality. A popular dimensionality reduction technique is the space filling Hilbert’s curve, as it possesses good locality preserving properties. However, there exists little comparison between Hilbert’s curve and other techniques for dimensionality reduction. This work carries out a quantitative analysis of their properties. Linear and non-linear techniques for scaling the landmark vectors to a single dimension are investigated. Hilbert’s curve, Sammon’s mapping and Principal Component Analysis have been used to generate a 1d space with locality preserving properties. This work provides empirical evidence to support the use of Hilbert’s curve in the context of locality preservation when generating peer identifiers by means of landmark vector analysis. A comparative analysis is carried out with an artificial 2d network model and with a realistic network topology model with a typical power-law distribution of node connectivity in the Internet. Nearest neighbour analysis confirms Hilbert’s curve to be very effective in both artificial and realistic network topologies. Nevertheless, the results in the realistic network model show that there is scope for improvements and better techniques to preserve locality information are required.
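The Hilbert-curve dimensionality reduction can be sketched with the classical integer algorithm that maps a 2-d grid coordinate to its index along the curve. Treating a landmark vector as two quantised latency coordinates is an illustrative assumption (the paper's feature vectors are multi-dimensional), but the locality-preserving behaviour is the same idea:

```python
def xy2d(n, x, y):
    # Map a point (x, y) on an n-by-n grid (n a power of two) to its
    # distance d along the Hilbert curve. Nearby grid points tend to
    # receive nearby indices, which is the locality property exploited
    # when deriving scalar peer identifiers from landmark vectors.
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if (x & s) > 0 else 0
        ry = 1 if (y & s) > 0 else 0
        d += s * s * ((3 * rx) ^ ry)
        # Rotate the quadrant so the curve's orientation is consistent.
        if ry == 0:
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        s //= 2
    return d
```

For example, the four cells of a 2x2 grid are visited in the order (0,0), (0,1), (1,1), (1,0), receiving indices 0 to 3, so neighbours along the curve are neighbours in the grid.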

Relevance:

10.00%

Publisher:

Abstract:

This article examines the politics of place in relation to legal mobilization by the anti-nuclear movement. It examines two case examples - citizens' weapons inspections and civil disobedience strategies - which have involved the movement drawing upon the law in particular spatial contexts. The article begins by examining a number of factors which have been employed in recent social movement literature to explain strategy choice, including ideology, resources, political and legal opportunity, and framing. It then proceeds to argue that the issues of scale, space, and place play an important role in relation to framing by the movement in the two case examples. Both can be seen to involve scalar reframing, with the movement attempting to resist localizing tendencies and to replace them with a global frame. Both also involve an attempt to reframe the issue of nuclear weapons away from the contested frame of the past (unilateral disarmament) towards the more universal and widely accepted frame of international law.

Relevance:

10.00%

Publisher:

Abstract:

A new spectral method for solving initial boundary value problems for linear and integrable nonlinear partial differential equations in two independent variables is applied to the nonlinear Schrödinger equation and to its linearized version in the domain {x≥l(t), t≥0}. We show that there exist two cases: (a) if l″(t)<0, then the solution of the linear or nonlinear equations can be obtained by solving the respective scalar or matrix Riemann-Hilbert problem, which is defined on a time-dependent contour; (b) if l″(t)>0, then the Riemann-Hilbert problem is replaced by a respective scalar or matrix problem on a time-independent domain. In both cases, the solution is expressed in a spectrally decomposed form.
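For reference, the nonlinear Schrödinger equation treated on the moving domain, written in a standard normalisation (the abstract does not fix signs or coefficients, so this form is indicative only; the linearized version is obtained by dropping the cubic term):

```latex
i q_t + q_{xx} - 2\lambda |q|^2 q = 0,
\qquad x \ge l(t),\; t \ge 0,\; \lambda = \pm 1
```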

Relevance:

10.00%

Publisher:

Abstract:

In this paper we estimate a Translog output distance function for a balanced panel of state level data for the Australian dairy processing sector. We estimate a fixed effects specification employing Bayesian methods, with and without the imposition of monotonicity and curvature restrictions. Our results indicate that Tasmania and Victoria are the most technically efficient states with New South Wales being the least efficient. The imposition of theoretical restrictions marginally affects the results especially with respect to estimates of technical change and industry deregulation. Importantly, our bias estimates show changes in both input use and output mix that result from deregulation. Specifically, we find that deregulation has positively biased the production of butter, cheese and powders.

Relevance:

10.00%

Publisher:

Abstract:

Twenty-eight field experiments on sandy-loam soils in the UK (1982-2003) are reviewed by relating the extension of the green area duration of the flag leaf (GLADF) by fungicides to effects on yield and quality of winter wheat. Over all experiments the mean grain yield was 8.85 t ha(-1) at 85% DM. With regard to quality, mean values were: thousand grain weight (TGW) = 44.5 g; specific weight (SWT) = 76.9 kg hl(-1); crude protein concentration (CP (N x 5.7)) = 12.5% DM; Hagberg falling number (HFN) = 285 s; and sodium dodecyl sulphate (SDS)-sedimentation volume = 69 ml. For each day (d) that fungicides increased GLADF there were associated average increases in yield (0.144 t ha(-1) d(-1), se = 0.0049, df = 333), TGW (0.56 g d(-1), se = 0.017) and SWT (0.22 kg hl(-1) d(-1), se = 0.011). Some curvature was evident in all these relationships. When GLADF was delayed beyond 700 degrees Cd after anthesis, as was possible in cool wet seasons, responses were curtailed, or less reliable. Despite this apparent terminal sink limitation, fungicide effects on sink size, e.g. endosperm cell numbers or maximum water mass per grain, were not prerequisites for large effects on grain yield, TGW or SWT. Fungicide effects on CP were variable. Although the average response of CP was negative (-0.029% DM d(-1); se = 0.00338), this depended on cultivar and disease controlled. Controlling biotrophs such as rusts (Puccinia spp.) tended to increase CP, whereas controlling a more necrotrophic pathogen (Septoria tritici) usually reduced CP. Irrespective of the pathogen controlled, delaying senescence of the flag leaf was associated with increased nitrogen yields in the grain (averaging 2.24 kg N ha(-1) d(-1), se = 0.0848), due both to increased N uptake into the above-ground crop and to more efficient remobilisation of N from leaf laminas. When sulphur availability appeared to be adequate, fungicide x cultivar interactions were similar for S as for CP, although N:S ratios tended to decline (i.e. improve for bread making) when S. tritici was controlled. On average, SDS-sedimentation volume declined (-0.18 ml d(-1), se = 0.027) with increased GLADF, broadly commensurate with the average effect on CP. Hagberg falling number decreased as fungicide increased GLADF (-2.73 s d(-1), se = 0.178), indicating an increase in alpha-amylase activity.