913 results for coral reef complexity
Abstract:
The author studies the error and complexity of the discrete random walk Monte Carlo technique for radiosity, using both the shooting and gathering methods. The shooting method is shown to have lower complexity than the gathering method and, under some constraints, linear complexity, improving on a previous result that pointed to an O(n log n) complexity. Three unbiased estimators are given and compared for each method, with closed forms and bounds obtained for their variances, and the expected value of the mean square error (MSE) is also bounded. Some of the results obtained are also shown
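The shooting approach can be illustrated with a minimal sketch (a toy collision estimator, not one of the paper's three estimators): assuming a hypothetical closed scene of unit-area patches with a symmetric, row-stochastic form-factor matrix F, the radiosity system B = E + ρ∘(FB) is estimated by particles shot from the emitters.

```python
import random

def shooting_radiosity(E, rho, F, n_walks=200_000, seed=0):
    """Collision estimator for B = E + diag(rho) @ F @ B via shooting walks.

    Assumes unit-area patches and a symmetric, row-stochastic form-factor
    matrix F (a closed scene). Each particle starts on a patch chosen in
    proportion to emitted power; every arrival at a patch scores the
    reflected share of the particle's weight, and the particle survives
    each bounce with the local reflectivity.
    """
    rng = random.Random(seed)
    n = len(E)
    total_power = sum(E)
    B = list(E)                    # direct emission term, kept exactly
    w = total_power / n_walks      # power carried by one particle
    patches = range(n)
    for _ in range(n_walks):
        i = rng.choices(patches, weights=E)[0]        # source ~ emitted power
        while True:
            i = rng.choices(patches, weights=F[i])[0] # next patch ~ form factors
            B[i] += rho[i] * w     # score reflected radiosity on arrival
            if rng.random() >= rho[i]:                # absorbed with prob 1 - rho
                break
    return B

# Made-up scene: three mutually visible unit patches, one emitter.
F = [[0.0, 0.5, 0.5], [0.5, 0.0, 0.5], [0.5, 0.5, 0.0]]
B = shooting_radiosity(E=[1.0, 0.0, 0.0], rho=[0.5, 0.5, 0.5], F=F)
# Exact solution of B = E + 0.5*F*B is (1.2, 0.4, 0.4).
```

Note the shooting character of the sketch: work is driven by emitted power, so patches that receive little light cost nothing, which is the intuition behind the lower complexity of shooting.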
Abstract:
This doctoral thesis is the result of the experience, continuous practice and day-to-day work as head of infant and youth choirs that the candidate has developed over more than thirty years. The motivation, concerns and questions formulated with regard to choral singing have underpinned this research. The thesis has focused on the learning of values, habits and competences in order to demonstrate that singing in an infant or youth choir is educational and goes far beyond the mere act of singing in a choir. We can therefore state that education is continuous, never comes to an end and, furthermore, unfolds in divergent ways. In summary, the thesis aims to show that singing in an infant or youth choir is educational in the widest possible sense.
Abstract:
This paper reviews speechreading and the effect of sentence length and linguistic complexity on deaf children.
Abstract:
Deposits of coral-bearing, marine shell conglomerate exposed at elevations higher than 20 m above present-day mean sea level (MSL) in Bermuda and the Bahamas have previously been interpreted as relict intertidal deposits formed during marine isotope stage (MIS) 11, ca. 360-420 ka before present. On the strength of this evidence, a sea level highstand more than 20 m higher than present-day MSL was inferred for the MIS 11 interglacial, despite a lack of clear supporting evidence in the oxygen-isotope records of deep-sea sediment cores. We have critically re-examined the elevated marine deposits in Bermuda, and find their geological setting, sedimentary relations, and microfaunal assemblages to be inconsistent with intertidal deposition over an extended period. Rather, these deposits, which comprise a poorly sorted mixture of reef, lagoon and shoreline sediments, appear to have been carried tens of meters inside karst caves, presumably by large waves, at some time earlier than ca. 310-360 ka before present (MIS 9-11). We hypothesize that these deposits are the result of a large tsunami during the mid-Pleistocene, in which Bermuda was impacted by a wave set that carried sediments from the surrounding reef platform and nearshore waters over the eolianite atoll. Likely causes for such a megatsunami are the flank collapse of an Atlantic island volcano, such as the roughly synchronous Julan or Orotava submarine landslides in the Canary Islands, or a giant submarine landslide on the Atlantic continental margin.
Abstract:
With the rapid development in technology over recent years, construction, in common with many areas of industry, has become increasingly complex. It is therefore important to develop and extend the understanding of complexity so that industry in general, and the construction industry in particular, can work with greater accuracy and efficiency to provide clients with a better service. This paper aims to generate a definition of complexity and a method for its measurement in order to assess its influence upon the accuracy of the quantity surveying profession in UK new-build office construction. Quantitative data came from an analysis of twenty projects of varying size and value, and qualitative data came from interviews with professional quantity surveyors. The findings highlight the difficulty in defining and measuring project complexity. The correlation between accuracy and complexity was not straightforward, being subject to many extraneous variables, particularly the impact of project size. Further research is required to develop a better measure of complexity, in order to improve the response of quantity surveyors so that an appropriate level of effort can be applied to individual projects, permitting greater accuracy and enabling better resource planning within the profession.
Abstract:
Accuracy and mesh generation are key issues for the high-resolution hydrodynamic modelling of the whole Great Barrier Reef. Our objective is to generate suitable unstructured grids that can resolve topological and dynamical features like tidal jets and recirculation eddies in the wake of islands. A new strategy is suggested to refine the mesh in areas of interest, taking into account the bathymetric field and an approximated distance to islands and reefs. Such a distance is obtained by solving an elliptic differential operator with specific boundary conditions. The meshes produced illustrate both the validity and the efficiency of the adaptive strategy. The selection of refinement and geometrical parameters is discussed.
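One common construction of this kind — assumed here, since the paper's exact operator and boundary conditions are not reproduced — solves the screened Poisson problem u'' = u/L² with u = 1 on the obstacle and u = 0 far away; the field d = -L ln u then approximates the distance to the obstacle for small L. A minimal 1D sketch with a tridiagonal (Thomas) solve:

```python
import math

def approx_distance(n=200, L=0.05):
    """Distance-to-boundary estimate from a screened Poisson solve on [0, 1].

    Discretises u'' = u / L**2 with u(0) = 1 (the obstacle) and u(1) = 0,
    then recovers d(x) ~= -L * ln u(x), which tends to the true distance x
    as L -> 0 (away from the far boundary).
    """
    h = 1.0 / n
    # Tridiagonal system for interior unknowns u_1 .. u_{n-1}:
    #   u_{i-1} - (2 + h^2/L^2) u_i + u_{i+1} = 0
    diag = -(2.0 + (h / L) ** 2)
    m = n - 1
    c = [0.0] * m            # modified super-diagonal (Thomas forward sweep)
    g = [0.0] * m            # modified right-hand side
    c[0] = 1.0 / diag
    g[0] = -1.0 / diag       # u_0 = 1 moved to the RHS of the first equation
    for i in range(1, m):
        denom = diag - c[i - 1]
        c[i] = 1.0 / denom
        g[i] = -g[i - 1] / denom
    u = [0.0] * m            # back substitution
    u[m - 1] = g[m - 1]
    for i in range(m - 2, -1, -1):
        u[i] = g[i] - c[i] * u[i + 1]
    return [-L * math.log(ui) for ui in u]

d = approx_distance()
# d[i] approximates the distance of x = (i + 1)/n from the obstacle at x = 0.
```

The appeal of the elliptic formulation over an exact distance computation is that it reuses the model's existing solver machinery and smooths the field near complicated obstacle shapes, which is exactly what a mesh-size field needs.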
Abstract:
1. Although the importance of plant community assemblages in structuring invertebrate assemblages is well known, the role that architectural complexity plays is less well understood. In particular, direct empirical data for a range of invertebrate taxa showing how functional groups respond to plant architecture are largely absent from the literature. 2. The significance of sward architectural complexity in determining the species richness of predatory and phytophagous functional groups of spiders, beetles, and true bugs, sampled from 135 field margin plots over 2 years, was tested. The present study compares the relative importance of sward architectural complexity to that of plant community assemblage. 3. Sward architectural complexity was found to be a determinant of species richness for all phytophagous and predatory functional groups. When individual species responses were investigated, 62.5% of the spider and beetle species, and 50.0% of the true bugs, responded to sward architectural complexity. 4. Interactions between sward architectural complexity and plant community assemblage indicate that the number of invertebrate species supported by the plant community alone could be increased by modification of sward architecture. Management practices could therefore play a key role in diversifying the architectural structure of existing floral assemblages for the benefit of invertebrate assemblages. 5. The contrasting effects of sward architecture on invertebrate functional groups characterised by either direct (phytophagous species) or indirect (predatory species) dependence on plant communities are discussed. It is suggested that for phytophagous taxa, plant community assemblage alone is likely to be insufficient to ensure successful species colonisation or persistence without appropriate development of sward architecture.
Abstract:
In models of complicated physical-chemical processes, operator splitting is very often applied in order to achieve sufficient accuracy as well as efficiency of the numerical solution. The recently rediscovered weighted splitting schemes have the great advantage of being parallelizable at the operator level, which allows the computational time to be reduced when parallel computers are used. In this paper, the computational times needed for the weighted splitting methods are studied in comparison with the sequential (S) splitting and the Marchuk-Strang (MSt) splitting, and are illustrated by numerical experiments performed with simplified versions of the Danish Eulerian Model (DEM).
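The accuracy orders behind these comparisons can be checked on a toy problem. A minimal sketch, assuming a made-up 2x2 linear system du/dt = (A+B)u rather than the DEM: sequential splitting is first-order accurate, while the Marchuk-Strang and the symmetrically weighted splittings are second-order, because averaging the two sub-operator orderings (or symmetrising them) cancels the leading commutator error.

```python
def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_scale(X, s):
    return [[x * s for x in row] for row in X]

def mat_add(X, Y):
    return [[X[i][j] + Y[i][j] for j in range(2)] for i in range(2)]

def expm(M, terms=30):
    """Matrix exponential by truncated Taylor series (fine for small 2x2)."""
    result = [[1.0, 0.0], [0.0, 1.0]]
    term = [[1.0, 0.0], [0.0, 1.0]]
    for k in range(1, terms):
        term = mat_scale(mat_mul(term, M), 1.0 / k)
        result = mat_add(result, term)
    return result

A = [[0.0, 1.0], [-1.0, 0.0]]    # "transport": a rotation
B = [[-1.0, 0.0], [0.0, -2.0]]   # "chemistry": anisotropic decay (A, B do not commute)

def sequential(tau):             # S splitting: first order
    return mat_mul(expm(mat_scale(B, tau)), expm(mat_scale(A, tau)))

def marchuk_strang(tau):         # MSt splitting: second order
    half = expm(mat_scale(A, tau / 2))
    return mat_mul(half, mat_mul(expm(mat_scale(B, tau)), half))

def weighted(tau):               # symmetrically weighted splitting: second order
    AB = mat_mul(expm(mat_scale(B, tau)), expm(mat_scale(A, tau)))
    BA = mat_mul(expm(mat_scale(A, tau)), expm(mat_scale(B, tau)))
    return mat_scale(mat_add(AB, BA), 0.5)

def global_error(stepper, tau, T=1.0):
    """Max-entry error of the split propagator against exp((A+B)T)."""
    exact = expm(mat_scale(mat_add(A, B), T))
    step = stepper(tau)
    num = [[1.0, 0.0], [0.0, 1.0]]
    for _ in range(round(T / tau)):
        num = mat_mul(step, num)
    return max(abs(num[i][j] - exact[i][j]) for i in range(2) for j in range(2))

# Halving the step should roughly halve the S error and quarter the others.
err_seq = [global_error(sequential, t) for t in (0.05, 0.025)]
err_mst = [global_error(marchuk_strang, t) for t in (0.05, 0.025)]
err_w = [global_error(weighted, t) for t in (0.05, 0.025)]
```

The two sub-steps inside `weighted` are independent of each other, which is the operator-level parallelism the abstract refers to: each ordering can run on its own processor and the results are averaged at the end of the step.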
Abstract:
In this work we study the computational complexity of a class of grid Monte Carlo algorithms for integral equations. The idea of the algorithms is to approximate the integral equation by a system of algebraic equations, which is then solved by Markov chain iterative Monte Carlo. The assumption here is that the corresponding Neumann series for the iteration matrix does not necessarily converge, or converges slowly, so we use a special technique to accelerate the convergence. An estimate of the computational complexity of the Monte Carlo algorithm using the considered approach is obtained and compared with the corresponding complexity of the grid-free Monte Carlo algorithm. The conditions under which the class of grid Monte Carlo algorithms is more efficient are given.
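The Markov chain building block of such grid algorithms — solving the algebraic system x = Ax + b by absorbing random walks (the classical Ulam-von Neumann scheme) — can be sketched as follows; the paper's acceleration technique for slowly converging Neumann series is not reproduced here, and the matrix and vector below are made-up illustrative data.

```python
import random

def mc_solve_component(A, b, i0, n_walks=100_000, seed=1):
    """Estimate x[i0] of x = A x + b by absorbing random walks.

    Unbiased when the Neumann series sum_k A^k b converges. From state i
    the walk survives with probability sum_j |A[i][j]| (assumed <= 1),
    moves to j in proportion to |A[i][j]|, and carries a sign-corrected
    weight; every visited state scores w * b[state] (collision estimator).
    """
    rng = random.Random(seed)
    n = len(b)
    row_sums = [sum(abs(a) for a in row) for row in A]
    total = 0.0
    for _ in range(n_walks):
        i, w, score = i0, 1.0, 0.0
        while True:
            score += w * b[i]
            if rng.random() >= row_sums[i]:      # absorbed
                break
            j = rng.choices(range(n), weights=[abs(a) for a in A[i]])[0]
            w *= 1.0 if A[i][j] >= 0 else -1.0   # weight correction (signs only)
            i = j
        total += score
    return total / n_walks

A = [[0.1, 0.3], [0.2, 0.2]]   # spectral radius < 1: Neumann series converges
b = [1.0, 1.0]
x0 = mc_solve_component(A, b, 0)
# Exact solution of (I - A) x = b is x = (5/3, 5/3).
```

A single component of the solution is estimated without ever forming or factorising the full system, which is what makes the complexity analysis of such algorithms interesting in the first place.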