Abstract:
Synthesis of new ruthenium complexes with achiral ligands of general formula [Ru(phen)([9]aneS3)X] (where X = H2O, py and MeCN). Spectroscopic, electrochemical and structural characterisation of this family of complexes. Study of their catalytic properties towards the oxidation of organic substrates, such as benzyl alcohol, in electrocatalytic reactions. Kinetic evaluation of the substitution mechanisms between the Ru-py and Ru-MeCN complexes. Generation of a photo-induced molecular switch. Synthesis of new atropisomerically pure chiral Ru complexes with oxazoline ligands of general formula [Ru(trpy)(Ph-box-R)X] (where X = Cl, H2O, py, MeCN, 2-OH-py). Exhaustive structural characterisation in the solid state (X-ray), in solution (NMR) and in the gas phase (DFT calculations). Evaluation of their catalytic activity in asymmetric epoxidation reactions of prochiral substrates. Synthesis of new chiral polypyridylic ligands with C3 symmetry. Study of their coordination chemistry and evaluation of their catalytic activity in asymmetric oxidation and reduction reactions.
Abstract:
Recent interest in the validation of general circulation models (GCMs) has been devoted to objective methods. A small number of authors have used the direct synoptic identification of phenomena together with a statistical analysis to perform an objective comparison between various datasets. This paper describes a general method for performing the synoptic identification of phenomena that can be used for an objective analysis of atmospheric, or oceanographic, datasets obtained from numerical models and remote sensing. Methods usually associated with image processing have been used to segment the scene and to identify suitable feature points to represent the phenomena of interest. This is performed for each time level. A technique from dynamic scene analysis is then used to link the feature points to form trajectories. The method is fully automatic and should be applicable to a wide range of geophysical fields. An example is shown of results obtained by applying this method to data from a run of the Universities Global Atmospheric Modelling Project GCM.
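To make the segment-then-track procedure concrete, the following is a minimal Python sketch of the two stages the abstract describes, assuming a time-ordered stack of 2-D scalar fields (e.g., vorticity); the threshold segmentation, centroid feature points and greedy nearest-neighbour linking are illustrative stand-ins, not the paper's actual image-processing or dynamic scene analysis algorithms.

```python
# Illustrative sketch: segment each time level, reduce each feature to a
# centroid, then link centroids across time levels into trajectories.
# The threshold and linking rule are assumptions, not taken from the paper.
import numpy as np
from scipy import ndimage

def feature_points(field, threshold):
    """Segment one time level and return one centroid per connected feature."""
    mask = field > threshold                     # crude scene segmentation
    labels, n = ndimage.label(mask)              # connected-component labelling
    return np.array(ndimage.center_of_mass(field, labels, range(1, n + 1)))

def link_trajectories(frames, threshold, max_step):
    """Greedily link feature points across time levels into trajectories."""
    trajectories = [[p] for p in feature_points(frames[0], threshold)]
    for frame in frames[1:]:
        points = feature_points(frame, threshold)
        for traj in trajectories:
            if len(points) == 0:
                break
            dist = np.linalg.norm(points - traj[-1], axis=1)
            j = dist.argmin()
            if dist[j] <= max_step:              # accept plausible displacements only
                traj.append(points[j])
                points = np.delete(points, j, axis=0)
    return trajectories
```

A full implementation would also handle feature birth and death (starting new trajectories for unmatched points), which the dynamic scene analysis literature treats explicitly.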
Abstract:
Data assimilation is a sophisticated mathematical technique for combining observational data with model predictions to produce state and parameter estimates that most accurately approximate the current and future states of the true system. The technique is commonly used in atmospheric and oceanic modelling, combining empirical observations with model predictions to produce more accurate and well-calibrated forecasts. Here, we consider a novel application within a coastal environment and describe how the method can also be used to deliver improved estimates of uncertain morphodynamic model parameters. This is achieved using a technique known as state augmentation. Earlier applications of state augmentation have typically employed the 4D-Var, Kalman filter or ensemble Kalman filter assimilation schemes. Our new method is based on a computationally inexpensive 3D-Var scheme, where the specification of the error covariance matrices is crucial for success. A simple 1D model of bed-form propagation is used to demonstrate the method. The scheme is capable of recovering near-perfect parameter values and, therefore, improves the capability of our model to predict future bathymetry. Such positive results suggest the potential for application to more complex morphodynamic models.
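To illustrate the state augmentation idea, here is a minimal 3D-Var sketch with a linear observation operator; the augmented state stacks the model state with the uncertain parameter, and it is the state-parameter cross-covariances in B that let observations of the state correct the parameter. All names and numbers are illustrative assumptions, not the authors' scheme.

```python
# Sketch of one linear 3D-Var analysis step over an augmented state
# w = [x, a]: model state x plus uncertain parameter a. Illustrative only.
import numpy as np

def analysis_3dvar(w_b, B, H, R, y):
    """Return the analysis w_a = w_b + K(y - H w_b), K = B H^T (H B H^T + R)^-1."""
    d = y - H @ w_b                      # innovation: observations minus background
    S = H @ B @ H.T + R                  # innovation covariance
    return w_b + B @ H.T @ np.linalg.solve(S, d)

# Toy usage: three state points plus one augmented parameter (hypothetical values).
w_b = np.array([1.0, 2.0, 3.0, 0.5])        # background state + parameter guess
B = np.eye(4)
B[3, :3] = B[:3, 3] = 0.2                   # state-parameter cross-covariances
H = np.zeros((1, 4)); H[0, 1] = 1.0         # observe the second state point only
R = np.array([[0.1]])                       # observation error covariance
y = np.array([2.4])                         # the observation
w_a = analysis_3dvar(w_b, B, H, R, y)       # w_a[3]: parameter nudged by the data
```

Because only the error covariances couple the parameter to the observed state, their specification is, as the abstract notes, crucial for success.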
Abstract:
Bioturbation at all scales, which tends to replace the primary fabric of a sediment by the ichnofabric (the overall fabric of a sediment that has been bioturbated), is now recognised as playing a major role in facies interpretation. The manner in which the substrate may be colonized, and the physical, chemical and ecological controls (grain size, sedimentation rate, oxygenation, nutrition, salinity, ethology, community structure and succession), together with the several ways in which the substrate is tiered by bioturbators, are the factors and processes that determine the nature of the ichnofabric. Eleven main styles of substrate tiering are described, ranging from single, pioneer colonization to complex tiering under equilibria, their modification under environmental deterioration and amelioration, and diagenetic enhancement or obscuration. Ichnofabrics may be assessed by four attributes: primary sedimentary factors, Bioturbation Index (BI), burrow size and frequency, and ichnological diversity. Construction of tier and ichnofabric constituent diagrams aids visualization and comparison. The breaks or changes in colonization and style of tiering at key stratal surfaces accentuate the surfaces, and many reflect a major environmental shift of the trace-forming biota, due to change in hydrodynamic regime (leading to non-deposition and/or erosion and/or lithification), change in salinity regime, or subaerial exposure. The succession of gradational or abrupt changes in ichnofabric through genetically related successions, together with changes in colonization and tiering across event beds, may also be interpreted in terms of changes in environmental parameters. It is not the ichnotaxa per se that are important in discriminating between ichnofabrics, but rather the environmental conditions that determine the overall style of colonization. Fabrics composed of different ichnotaxa (and different taphonomies) but with similar tier structure and ichnoguild may form in similar environments of different age or different latitude. Appreciation of colonization and tiering styles places ancient ichnofabrics on a sound process-related basis for environmental interpretation.
Abstract:
Research in construction management is diverse in content and in quality. There is much to be learned from more fundamental disciplines. Construction is a sub-set of human experience rather than a completely separate phenomenon. Therefore, it is likely that there are few problems in construction requiring the invention of a completely new theory. If construction researchers base their work only on that of other construction researchers, our academic community will become less relevant to the world at large. The theories that we develop or test must be of wider applicability to be of any real interest. In undertaking research, researchers learn a lot about themselves. Perhaps the only difference between research and education is that if we are learning about something which no-one else knows, then it is research, otherwise it is education. Self-awareness of this will help to reduce the chances of publishing work which only reveals a researcher’s own learning curve. Scientific method is not as simplistic as non-scientists claim and is the only real way of overcoming methodological weaknesses in our work. The reporting of research may convey the false impression that it is undertaken in the sequence in which it is written. Construction is not so unique and special as to require a completely different set of methods from other fields of enquiry. Until our research is reported in mainstream journals and conferences, there is little chance that we will influence the wider academic community and a concomitant danger that it will become irrelevant. The most useful insights will come from research which challenges the current orthodoxy rather than research which merely reports it.
Abstract:
The Web's link structure (termed the Web Graph) is a richly connected set of Web pages. Current applications use this graph for indexing and information retrieval purposes. In contrast, this work reverses the relationship between Web Graph and application, letting the structure of the Web Graph influence the behaviour of an application. Presents a novel Web crawling agent, AlienBot, the output of which is orthogonally coupled to the enemy generation strategy of a computer game. The Web Graph guides AlienBot, causing it to generate a stochastic process. Shows the effectiveness of such unorthodox coupling to both the playability of the game and the heuristics of the Web crawler. In addition, presents the results of the sample of Web pages collected by the crawling process. In particular, shows: how AlienBot was able to identify the power law inherent in the link structure of the Web; that 61.74 per cent of Web pages use some form of scripting technology; that the size of the Web can be estimated at just over 5.2 billion pages; and that less than 7 per cent of Web pages fully comply with some variant of (X)HTML.
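As a hedged illustration of how a crawler's degree sample can expose such a power law, the sketch below applies a standard maximum-likelihood estimator for the exponent alpha in P(k) ∝ k^(-alpha) (the discrete approximation popularised by Clauset, Shalizi and Newman); this is a generic estimator, not AlienBot's own analysis.

```python
# Generic power-law exponent estimate from a sample of page in-degrees.
# The estimator, alpha = 1 + n / sum(ln(k_i / (k_min - 0.5))), is the usual
# discrete MLE approximation; k_min is the smallest degree the fit covers.
import numpy as np

def power_law_alpha(degrees, k_min=1):
    k = np.asarray([d for d in degrees if d >= k_min], dtype=float)
    if k.size == 0:
        raise ValueError("no degrees at or above k_min")
    return 1.0 + k.size / np.sum(np.log(k / (k_min - 0.5)))
```

For Web link structure, published estimates of this exponent typically fall between 2 and 3.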
Abstract:
This article expresses the price of a spread option as the sum of the prices of two compound options. One compound option is to exchange vanilla call options on the two underlying assets and the other is to exchange the corresponding put options. This way we derive a new closed form approximation for the price of a European spread option and a corresponding approximation for each of its price, volatility and correlation hedge ratios. Our approach has many advantages over existing analytical approximations, which have limited validity and an indeterminacy that renders them of little practical use. The compound exchange option approximation for European spread options is then extended to American spread options on assets that pay dividends or incur costs. Simulations quantify the accuracy of our approach; we also present an empirical application to the American crack spread options that are traded on NYMEX. For illustration, we compare our results with those obtained using the approximation attributed to Kirk (1996, Correlation in energy markets. In: V. Kaminski (Ed.), Managing Energy Price Risk, pp. 71–78 (London: Risk Publications)), which is commonly used by traders.
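For reference, a minimal sketch of Kirk's approximation, the benchmark mentioned above: a European call spread option on futures prices F1 and F2 with payoff max(F1 - F2 - K, 0) is priced by treating F1/(F2 + K) as approximately lognormal with an effective volatility. Parameter names here are illustrative.

```python
# Kirk's approximation for a European call spread option on two futures.
from math import exp, log, sqrt
from scipy.stats import norm

def kirk_spread_call(F1, F2, K, sigma1, sigma2, rho, r, T):
    """Price of max(F1 - F2 - K, 0) at expiry T, discounted at rate r."""
    z = F2 / (F2 + K)                    # weight of the second asset
    sigma = sqrt(sigma1**2 - 2*rho*sigma1*sigma2*z + (sigma2*z)**2)  # effective vol
    d1 = (log(F1 / (F2 + K)) + 0.5 * sigma**2 * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return exp(-r * T) * (F1 * norm.cdf(d1) - (F2 + K) * norm.cdf(d2))
```

A useful sanity check: with K = 0 the effective volatility reduces to sqrt(sigma1^2 - 2*rho*sigma1*sigma2 + sigma2^2) and the formula collapses to Margrabe's exchange-option price, consistent with the exchange-option view taken in the article.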