841 results for sub-solutions and super-solutions
Abstract:
The middle-late Campanian was marked by an increase in the bioprovinciality of calcareous microfossil assemblages into distinct Tethyan, Transitional, and Austral Provinces that persisted to the end of the Maastrichtian. The northwestern Australian margin belonged to the Transitional Province and the absence of key Tethyan marker species such as Radotruncana calcarata and Gansserina gansseri has led petroleum companies operating in the area to use the locally developed KCCM integrated calcareous microfossil zonation scheme. The KCCM zonation is a composite scheme comprising calcareous nannofossil (KCN), planktonic foraminiferal (KPF) and benthonic foraminiferal (KBF) zones. This paper presents the definitions and revisions of Zones KCCM8-19, from the highest occurrence (HO) of Aspidolithus parcus constrictus to the lowest occurrence (LO) of Ceratolithoides aculeus, and builds on our previous early-late Maastrichtian study. The presence of a middle-upper Campanian disconformity is confirmed by microfossil evidence from the Vulcan Sub-basin, Exmouth and Wombat plateaus, and the Southern Carnarvon Platform. In the Vulcan Sub-basin and on the Exmouth Plateau (ODP Hole 762C) the hiatus extends from slightly above the LO of common Rugoglobigerina rugosa to above the LO of Quadrum gothicum. On the Wombat Plateau (ODP Hole 761B) it spans from above the LO of Heterohelix semicostata to above the LO of Quadrum gothicum; and in the Southern Carnarvon Platform the disconformity has its longest duration from above the HO of Heterohelix semicostata to above the LO of Quadrum sissinghii. A significant revision of the events which define Zones KCCM18 and 19 was necessary owing to the observation that the LO of Ceratolithoides aculeus occurs below the HOs of Archaeoglobigerina cretacea and Stensioeina granulata incondita and the LO of common Rugoglobigerina rugosa. In the original zonation these events were considered to be coincident.
Abstract:
The ocean plays an important role in modulating the mass balance of the polar ice sheets by interacting with the ice shelves in Antarctica and with the marine-terminating outlet glaciers in Greenland. Given that the flux of warm water onto the continental shelf and into the sub-ice cavities is steered by complex bathymetry, a detailed topography data set is an essential ingredient for models that address ice-ocean interaction. We followed the spirit of the global RTopo-1 data set and compiled consistent maps of global ocean bathymetry, upper and lower ice surface topographies and global surface height on a spherical grid, now with 30 arc-second resolution. We used the General Bathymetric Chart of the Oceans (GEBCO, 2014) as the backbone and added the International Bathymetric Chart of the Arctic Ocean version 3 (IBCAOv3) and the International Bathymetric Chart of the Southern Ocean (IBCSO) version 1. While RTopo-1 primarily aimed at a good and consistent representation of the Antarctic ice sheet, ice shelves and sub-ice cavities, RTopo-2 now also contains ice topographies of the Greenland ice sheet and outlet glaciers. In particular, we aimed at a good representation of the fjord and shelf bathymetry surrounding the Greenland continent. We corrected data from earlier gridded products in the areas of Petermann Glacier, Hagen Bræ and Sermilik Fjord assuming that sub-ice and fjord bathymetries roughly follow plausible Last Glacial Maximum ice flow patterns. For the continental shelf off northeast Greenland and the floating ice tongue of Nioghalvfjerdsfjorden Glacier at about 79°N, we incorporated a high-resolution digital bathymetry model considering original multibeam survey data for the region. Radar data for surface topographies of the floating ice tongues of Nioghalvfjerdsfjorden Glacier and Zachariæ Isstrøm have been obtained from the data centers of the Technical University of Denmark (DTU), Operation IceBridge (NASA/NSF) and the Alfred Wegener Institute (AWI). For the Antarctic ice sheet/ice shelves, RTopo-2 largely relies on the Bedmap-2 product but applies corrections for the geometry of the Getz, Abbot and Fimbul ice shelf cavities.
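As a rough illustration of how such a 30 arc-second gridded product might be subset for a regional model, the sketch below uses xarray; the file name, coordinate names (lat/lon) and variable names ("bedrock_topography", "surface_elevation") are assumptions for illustration only, not the actual RTopo-2 conventions.

```python
# A minimal sketch, not the authors' code: subset a hypothetical local copy of
# a 30-arc-second RTopo-2 grid around northeast Greenland with xarray.
# File, coordinate and variable names below are assumed for illustration.
import xarray as xr

ds = xr.open_dataset("RTopo-2_30sec.nc")  # hypothetical file name

# Window around Nioghalvfjerdsfjorden Glacier (~79 N, ~20 W)
subset = ds.sel(lat=slice(78.0, 80.5), lon=slice(-25.0, -15.0))

bed = subset["bedrock_topography"]    # assumed variable name
surf = subset["surface_elevation"]    # assumed variable name
thickness = surf - bed                # meaningful only where ice is grounded

print(float(thickness.mean()))
```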
Abstract:
The Belgica Trough and the adjacent Belgica Trough Mouth Fan in the southern Bellingshausen Sea (Pacific sector of the Southern Ocean) mark the location of a major outlet for the West Antarctic Ice Sheet during the Late Quaternary. The drainage basin of an ice stream that advanced through Belgica Trough across the shelf during the last glacial period comprised an area exceeding 200,000 km² in the West Antarctic hinterland. Previous studies, mainly based on marine-geophysical data from the continental shelf and slope, focused on the bathymetry and seafloor bedforms, and the reconstruction of associated depositional processes and ice-drainage patterns. In contrast, there was only sparse information from seabed sediments recovered by coring. In this paper, we present lithological and clay mineralogical data of 21 sediment cores collected from the shelf and slope of the southern Bellingshausen Sea. Most cores recovered three lithological units, which can be attributed to facies types deposited under glacial, transitional and seasonally open-marine conditions. The clay mineral assemblages document coinciding changes in provenance. The relationship between the clay mineral assemblages in the subglacial and proglacial sediments on the shelf and the glacial diamictons on the slope confirms that a grounded ice stream advanced through Belgica Trough to the shelf break during the past, thereby depositing detritus eroded in the West Antarctic hinterland as soft till on the shelf and as glaciogenic debris flows on the slope. The thinness of the transitional and seasonally open-marine sediments in the cores suggests that this ice advance occurred during the last glacial period. Clay mineralogical, acoustic sub-bottom and seismic data furthermore demonstrate that the palaeo-ice stream probably reworked old sedimentary strata, including older tills, on the shelf and incorporated this debris into its till bed. The geographical heterogeneity of the clay mineral assemblages in the sub- and proglacial diamictons and gravelly deposits indicates that they were eroded from underlying sedimentary strata of different ages. These strata may have been deposited during either different phases of the last glacial period or different glacial and interglacial periods. Additionally, the clay mineralogical heterogeneity of the soft tills recovered on the shelf suggests that the drainage area of the palaeo-ice stream flowing through Belgica Trough changed through time.
Abstract:
The cold upwelling 'tongue' of the eastern equatorial Pacific is a central energetic feature of the ocean, dominating both the mean state and temporal variability of climate in the tropics and beyond. Recent evidence for the development of the modern cold tongue during the Pliocene-Pleistocene transition has been explained as the result of extratropical cooling that drove a shoaling of the thermocline. We have found that the sub-Antarctic and sub-Arctic regions underwent substantial cooling nearly synchronously with the development of the cold tongue, thereby providing support for this hypothesis. In addition, we show that sub-Antarctic climate changed in its response to Earth's orbital variations, from a subtropical to a subpolar pattern, as expected if cooling shrank the warm-water sphere of the ocean and thus contracted the subtropical gyres.
Abstract:
Projected air and ground temperatures are expected to be higher in Arctic and sub-Arctic latitudes, and with temperatures already close to the limit where permafrost can exist, resistance against degradation is low. With thawing permafrost, the landscape is modified with depressions in which thermokarst lakes emerge. Permafrost soils store a considerable amount of soil organic carbon, with the potential of altering climate even further if existing thermokarst lakes expand or new ones form, as decay releases greenhouse gases (CO2 and CH4) to the atmosphere. Analyzing the spatial distribution and morphometry of thermokarst lakes and other water bodies over time is important for accurately predicting carbon budgets and feedback mechanisms, as well as for assessing future landscape layout and the interaction of these features. Different types of high-spatial-resolution aerial and satellite imagery from 1963, 1975, 2003, 2010 and 2015 were used in both pre- and post-classification change detection analyses. Object-oriented segmentation in eCognition, combined with manual adjustments, resulted in digitalized water bodies >28 m2 from which direction of change and morphometric values were extracted. The quantity of thermokarst lakes and other water bodies was n=92 in 1963 and, as a trend, decreased in succeeding years until 2010-2015, when eleven water bodies were added in 2015 (n=74 to n=85). In 1963-2003 the area of these water bodies decreased by 50,651 m2 (189,446 to 138,795 m2) and continued to decrease in 2003-2015, ending at 129,337 m2. Limnicity decreased from 19.9% in 1963 to 14.6% in 2003 (-5.3%), and was 13.7-13.6% in 2010 and 2015. The late increase in water bodies differs from an earlier hypothesis that sporadic permafrost regions experience decreases in both area and quantity of thermokarst lakes and water bodies. During 1963-2015, land gain dominated the ratio between the two competing processes of expansion and drainage: 55/45% in 1963-1975, followed by 90/10% in 1975-2003. After major drainage events, land loss increased to 62/38% in 2010-2015. Drainage and infilling rates, calculated for 15 shorelines, varied across both the landscape and parts of shorelines, averaging 0.17/0.15/0.14 m/yr, except for 1963-1975 when the average rate of change was in the opposite direction (-0.09 m/yr), likely due to evident expansion of a large thermokarst lake. Using a square grid, the distribution of water bodies was determined, with an indistinct cluster located in the NE and central parts, especially for water bodies <250 m2, which is the dominant area class throughout 1963-2015, ranging from n=39-51. With a heterogeneous composition of both small and large thermokarst lakes, and with both expansion and drainage altering the landscape in Tavvavuoma, both positive and negative climate feedback mechanisms are in play, given that sporadic permafrost still exists.
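The area and rate figures quoted above can be cross-checked with simple arithmetic. The sketch below (not part of the thesis workflow) derives the study-area extent implied by the 1963 limnicity and the mean annual rate of area change between survey years, using only the numbers reported in the abstract.

```python
# Back-of-envelope check of the reported figures: limnicity = water area /
# study area, and mean rate of area change between survey years.
areas = {1963: 189_446, 2003: 138_795, 2015: 129_337}  # m^2, from the abstract

# Study-area extent implied by 19.9% limnicity in 1963 (not stated explicitly)
study_area = areas[1963] / 0.199
print(f"implied study area ~ {study_area:,.0f} m^2")

pairs = list(areas.items())
for (y0, a0), (y1, a1) in zip(pairs[:-1], pairs[1:]):
    rate = (a1 - a0) / (y1 - y0)
    print(f"{y0}-{y1}: {a1 - a0:+,} m^2 total ({rate:+,.0f} m^2/yr)")
```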
Abstract:
Sustainable management of coastal and coral reef environments requires regular collection of accurate information on recognized ecosystem health indicators. Satellite image data and derived maps of water column and substrate biophysical properties provide an opportunity to develop baseline mapping and monitoring programs for coastal and coral reef ecosystem health indicators. A significant challenge for satellite image data in coastal and coral reef water bodies is the mixture of both clear and turbid waters. A new approach is presented in this paper to enable production of water quality and substrate cover type maps, linked to a field based coastal ecosystem health indicator monitoring program, for use in turbid to clear coastal and coral reef waters. An optimized optical domain method was applied to map selected water quality (Secchi depth, Kd PAR, tripton, CDOM) and substrate cover type (seagrass, algae, sand) parameters. The approach is demonstrated using commercially available Landsat 7 Enhanced Thematic Mapper image data over a coastal embayment exhibiting the range of substrate cover types and water quality conditions commonly found in sub-tropical and tropical coastal environments. Spatially extensive and quantitative maps of selected water quality and substrate cover parameters were produced for the study site. These map products were refined by interactions with management agencies to suit the information requirements of their monitoring and management programs.
Abstract:
We prove upper and lower bounds relating the quantum gate complexity of a unitary operation, U, to the optimal control cost associated to the synthesis of U. These bounds apply for any optimal control problem, and can be used to show that the quantum gate complexity is essentially equivalent to the optimal control cost for a wide range of problems, including time-optimal control and finding minimal distances on certain Riemannian, sub-Riemannian, and Finslerian manifolds. These results generalize the results of [Nielsen, Dowling, Gu, and Doherty, Science 311, 1133 (2006)], which showed that the gate complexity can be related to distances on a Riemannian manifold.
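Schematically, and without reproducing the paper's precise constants or exponents (which depend on the chosen metric and the synthesis accuracy), bounds of this kind take the form

\[
  c_1\, d(I,U) \;\le\; G(U) \;\le\; \mathrm{poly}\!\big(n,\, d(I,U)\big),
\]

where G(U) denotes the minimal number of gates needed to synthesize (or approximate) the n-qubit unitary U and d(I,U) is the optimal control cost, e.g. a Riemannian or sub-Riemannian distance on SU(2^n). Up to such polynomial overheads, minimizing gate count and minimizing control cost are then equivalent problems.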
Abstract:
This research is situated in the field of the Sociology of Religion and aims to present, through bibliographic, historical and field surveys, the transformation of the relationship between the Brazilian Protestant sub-field and sport, especially football. It seeks to demonstrate that the gradual acceptance of sport in general, and of football in particular, within Brazilian Protestant circles was a consequence of the deepening of the secularization process that took place in Brazilian society during the twentieth century. This acceptance also resulted from the gradual disenchantment of the Protestant sub-field. The desacralization of time, namely Sunday, contributed to this process and reflects a transformation of the Protestant worldview. The growing professionalization of football, increasingly understood by Protestants as a legitimate activity, also contributed to its acceptance. Through a case study, this research presents and analyzes the group that calls itself Atletas de Cristo (Athletes of Christ), an example of the transformation both of the Protestant religious sub-field in its relation to sport and of the sporting field itself. To understand the religion of the Atletas de Cristo, an ethnography of the group is developed, together with an analysis of its own literature, its relations with the "athletes not of Christ", and its sacralization of sport.
Abstract:
Editorial: The contributions to this special issue of the International Journal of Technology Management are all based on selected papers presented at the European Conference on Management of Technology held at Aston University, Birmingham, UK in June 1995. This conference was held on behalf of the International Association for Management of Technology (IAMOT) and was the first of the association's major conferences to be held outside North America. The overall theme of the conference was 'Technological Innovation and Global Challenges'. Altogether more than 130 papers were presented within four sub-themes and twenty-seven topic sessions. This special issue draws on papers from five different topic sessions: 'Small firm linkages'; 'The global company'; 'New technology based firms'; 'Financing innovation'; 'Technology and development'. Together they cover a wide range of issues around the common question of accessing resources for innovation in small and medium sized enterprises. They present a global perspective on this important subject, with authors from The Netherlands, Canada, USA, Ireland, France, Finland, Brazil and UK. A wide range of subjects is covered, including the move away from public support for innovation, the role of alliances and networks, linkages to larger enterprises and the social implications associated with small enterprise innovation in developing countries.
Abstract:
The subject of this thesis is the n-tuple network (RAMnet). The major advantage of RAMnets is their speed and the simplicity with which they can be implemented in parallel hardware. On the other hand, this method is not a universal approximator and the training procedure does not involve the minimisation of a cost function. Hence RAMnets are potentially sub-optimal. It is important to understand the source of this sub-optimality and to develop the analytical tools that allow us to quantify the generalisation cost of using this model for any given data. We view RAMnets as classifiers and function approximators and try to determine how critical their lack of universality and optimality is. In order to understand better the inherent restrictions of the model, we review RAMnets, showing their relationship to a number of well-established general models such as Associative Memories, Kanerva's Sparse Distributed Memory, Radial Basis Functions, General Regression Networks and Bayesian Classifiers. We then benchmark the binary RAMnet model against 23 other algorithms using real-world data from the StatLog Project. This large-scale experimental study indicates that RAMnets are often capable of delivering results which are competitive with those obtained by more sophisticated, computationally expensive models. The Frequency Weighted version is also benchmarked and shown to perform worse than the binary RAMnet for large values of the tuple size n. We demonstrate that the main issue in Frequency Weighted RAMnets is adequate probability estimation and propose Good-Turing estimates in place of the more commonly used Maximum Likelihood estimates. Having established the viability of the method numerically, we focus on providing an analytical framework that allows us to quantify the generalisation cost of RAMnets for a given dataset. For the classification network we provide a semi-quantitative argument which is based on the notion of Tuple distance. It gives a good indication of whether the network will fail for the given data. A rigorous Bayesian framework with Gaussian process prior assumptions is given for the regression n-tuple net. We show how to calculate the generalisation cost of this net and verify the results numerically for one-dimensional noisy interpolation problems. We conclude that the n-tuple method of classification based on memorisation of random features can be a powerful alternative to slower cost-driven models. The speed of the method is at the expense of its optimality. RAMnets will fail for certain datasets, but the cases when they do so are relatively easy to determine with the analytical tools we provide.
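For readers unfamiliar with the model, a binary n-tuple classifier can be sketched in a few lines: each randomly chosen tuple of input bits addresses a RAM per class, training marks the addresses seen for each class, and classification counts matching RAMs. This is a toy illustration under arbitrary parameter choices, not the thesis implementation or configuration.

```python
# Minimal binary n-tuple (RAM-based) classifier sketch; tuple size, number of
# tuples and the random address maps are illustrative choices only.
import numpy as np

rng = np.random.default_rng(0)

class NTupleClassifier:
    def __init__(self, n_bits, n_classes, tuple_size=4, n_tuples=50):
        # fixed random bit-tuples shared by all classes
        self.tuples = [rng.choice(n_bits, size=tuple_size, replace=False)
                       for _ in range(n_tuples)]
        # one RAM (set of seen addresses) per tuple per class
        self.rams = [[set() for _ in range(n_tuples)] for _ in range(n_classes)]

    def _addresses(self, x):
        # interpret the bits selected by each tuple as an integer address
        return [int("".join(str(int(b)) for b in x[t]), 2) for t in self.tuples]

    def fit(self, X, y):
        for x, c in zip(X, y):
            for ram, addr in zip(self.rams[c], self._addresses(x)):
                ram.add(addr)

    def predict(self, X):
        preds = []
        for x in X:
            addrs = self._addresses(x)
            scores = [sum(a in ram for ram, a in zip(class_rams, addrs))
                      for class_rams in self.rams]
            preds.append(int(np.argmax(scores)))
        return preds

# toy usage: two noisy binary prototypes
proto = rng.integers(0, 2, size=(2, 64)).astype(bool)
X = np.array([p ^ (rng.random(64) < 0.05) for p in proto for _ in range(20)])
y = np.repeat([0, 1], 20)
clf = NTupleClassifier(n_bits=64, n_classes=2)
clf.fit(X, y)
print("training accuracy:", np.mean(np.array(clf.predict(X)) == y))
```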
Abstract:
The aim of this study is to determine whether nonlinearities have affected purchasing power parity (PPP) since 1885. Using recent advances in the econometrics of structural change, we also segment the sample space according to the identified breaks and examine whether the PPP condition holds in each sub-sample and whether any adjustment is linear or non-linear. Our results suggest that PPP holds during some sub-periods, although whether it holds, and whether the adjustment is linear or non-linear, depends primarily on the type of exchange rate regime in operation at any point in time.
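As a stylized illustration of the sub-sample idea (not the paper's actual econometric procedure, which involves structural-break estimation and nonlinear adjustment models), one can test whether the real exchange rate is mean-reverting, a necessary condition for PPP, within each regime delimited by assumed break dates. The series and break positions below are synthetic.

```python
# Sketch: unit-root (ADF) test on a real exchange rate series within each
# sub-sample defined by assumed break points; data are synthetic.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(42)
q = np.cumsum(rng.normal(size=140)) * 0.02 + 0.3 * np.sin(np.arange(140) / 8)

breaks = [0, 45, 90, 140]  # assumed structural break positions
for start, end in zip(breaks[:-1], breaks[1:]):
    stat, pvalue, *_ = adfuller(q[start:end], regression="c", autolag="AIC")
    verdict = "mean reversion (PPP-consistent)" if pvalue < 0.05 else "no rejection"
    print(f"obs {start}-{end}: ADF p-value = {pvalue:.3f} -> {verdict}")
```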
Abstract:
Magnesian limestone is a key construction component of many historic buildings that is under constant attack from environmental pollutants, notably by oxides of sulfur via acid rain, particulate matter sulfate and gaseous SO2 emissions. Hydrophobic surface coatings offer a potential route to protect existing stonework in cultural heritage sites; however, many available coatings act by blocking the stone microstructure, preventing it from 'breathing' and promoting mould growth and salt efflorescence. Here we report on a conformal surface modification method using self-assembled monolayers of naturally sourced free fatty acids combined with sub-monolayer fluorinated alkyl silanes to generate hydrophobic (HP) and super hydrophobic (SHP) coatings on calcite. We demonstrate the efficacy of these HP and SHP surface coatings for increasing limestone resistance to sulfation, and thus retarding gypsum formation under SO2/H2O and model acid rain environments. SHP treatment of 19th century stone from York Minster suppresses sulfuric acid permeation.
Abstract:
This research aims at a study of the hybrid flow shop problem which has parallel batch-processing machines in one stage and discrete-processing machines in other stages to process jobs of arbitrary sizes. The objective is to minimize the makespan for a set of jobs. The problem is denoted as: FF: batch1,sj:Cmax. The problem is formulated as a mixed-integer linear program. The commercial solver, AMPL/CPLEX, is used to solve problem instances to their optimality. Experimental results show that AMPL/CPLEX requires considerable time to find the optimal solution for even a small-size problem, e.g., a 6-job instance requires 2 hours on average. A bottleneck-first-decomposition (BFD) heuristic is proposed in this study to overcome the computational (time) problem encountered while using the commercial solver. The proposed BFD heuristic is inspired by the shifting bottleneck heuristic. It decomposes the entire problem into three sub-problems and schedules the sub-problems one by one. The proposed BFD heuristic consists of four major steps: formulating sub-problems, prioritizing sub-problems, solving sub-problems and re-scheduling. For solving the sub-problems, two heuristic algorithms are proposed: one for scheduling a hybrid flow shop with discrete processing machines, and the other for scheduling parallel batching machines (single stage). Both consider job arrival and delivery times. An experimental design is used to evaluate the effectiveness of the proposed BFD, which is further evaluated against a set of common heuristics including a randomized greedy heuristic and five dispatching rules. The results show that the proposed BFD heuristic outperforms all these algorithms. To evaluate the quality of the heuristic solution, a procedure is developed to calculate a lower bound on the makespan for the problem under study. The lower bound obtained is tighter than other bounds developed for related problems in the literature. A meta-search approach based on the Genetic Algorithm concept is developed to evaluate the significance of further improving the solution obtained from the proposed BFD heuristic. The experiment indicates that it reduces the makespan by 1.93% on average within negligible time when the problem size is less than 50 jobs.
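To make the batch-stage sub-problem concrete, the sketch below packs jobs of arbitrary sizes into capacity-limited batches by first-fit decreasing and then assigns batches to identical parallel machines longest-processing-time first. This is a generic illustration under made-up data, not the BFD heuristic or the lower-bound procedure proposed in the thesis.

```python
# Sketch of a single-stage parallel batch-machine heuristic: first-fit
# decreasing batching followed by LPT assignment of batches to machines.
import heapq

def schedule_batches(jobs, capacity, n_machines):
    """jobs: list of (size, processing_time); a batch runs as long as its longest job."""
    batches = []  # each batch: [remaining_capacity, batch_processing_time]
    for size, p in sorted(jobs, reverse=True):          # first-fit decreasing on size
        for b in batches:
            if b[0] >= size:
                b[0] -= size
                b[1] = max(b[1], p)
                break
        else:
            batches.append([capacity - size, p])
    # LPT assignment of batches to identical parallel machines
    machines = [0.0] * n_machines
    heapq.heapify(machines)
    for _, p in sorted(batches, key=lambda b: -b[1]):
        heapq.heappush(machines, heapq.heappop(machines) + p)
    return max(machines)  # makespan of the batching stage

jobs = [(3, 5.0), (2, 4.0), (4, 7.0), (1, 2.0), (2, 6.0), (3, 3.0)]
print(schedule_batches(jobs, capacity=5, n_machines=2))
```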
Abstract:
One in five adults 65 years and older has diabetes. Coping with diabetes is a lifelong task, and much of the responsibility for managing the disease falls upon the individual. Reports of non-adherence to recommended treatments are high. Understanding the additive impact of diabetes on quality of life issues is important. The purpose of this study was to investigate the quality of life and diabetes self-management behaviors in ethnically diverse older adults with type 2 diabetes. The SF-12v2 was used to measure physical and mental health quality of life. Scores were compared to general, age sub-group, and diabetes-specific norms. The Transtheoretical Model (TTM) was applied to assess perceived versus actual behavior for three diabetes self-management tasks: dietary management, medication management, and blood glucose self-monitoring. Dietary intake and hemoglobin A1c values were measured as outcome variables. Utilizing a cross-sectional research design, participants were recruited from Elderly Nutrition Program congregate meal sites (n = 148, mean age 75). Results showed that mean scores of the SF-12v2 were significantly lower in the study sample than the general norms for physical health (p < .001), mental health (p < .01), age sub-group norms (p < .05), and diabetes-specific norms for physical health (p < .001). A multiple regression analysis found that adherence to an exercise plan was significantly associated with better physical health (p < .001). Transtheoretical Model multiple regression analyses explained 68% of the variance for % Kcal from fat, 41% for fiber, 70% for % Kcal from carbohydrate, and 7% for hemoglobin A1c values. Significant associations were found between TTM stage of change and dietary fiber intake (p < .01). Other significant associations related to diet included gender (p < .01), ethnicity (p < .05), employment (p < .05), type of insurance (p < .05), adherence to an exercise plan (p < .05), number of doctor visits/year (p < .01), and physical health (p < .05). Significant associations were found between hemoglobin A1c values and age (p < .05), being non-Hispanic Black (p < .01), income (p < .01), and eye problems (p < .05). The study highlights the importance of the beneficial effects of exercise on quality of life issues. Furthermore, application of the Transtheoretical Model in conjunction with an assessment of dietary intake may be valuable in helping individuals make lifestyle changes.
Abstract:
Compressional- and shear-wave velocity logs (Vp and Vs, respectively) that were run to a sub-basement depth of 1013 m (1287.5 m sub-bottom) in Hole 504B suggest the presence of Layer 2A and document the presence of layers 2B and 2C on the Costa Rica Rift. Layer 2A extends from the mudline to 225 m sub-basement and is characterized by compressional-wave velocities of 4.0 km/s or less. Layer 2B extends from 225 to 900 m and may be divided into two intervals: an upper level from 225 to 600 m in which Vp decreases slowly from 5.0 to 4.8 km/s and a lower level from 600 to about 900 m in which Vp increases slowly to 6.0 km/s. In Layer 2C, which was logged for about 100 m to a depth of 1 km, Vp and Vs appear to be constant at 6.0 and 3.2 km/s, respectively. This velocity structure is consistent with, but more detailed than, the structure determined by the oblique seismic experiment in the same hole. Since laboratory measurements of the compressional- and shear-wave velocity of samples from Hole 504B at Pconfining = Pdifferential average 6.0 and 3.2 km/s, respectively, and show only slight increases with depth, we conclude that the velocity structure of Layer 2 is controlled almost entirely by variations in porosity and that the crack porosity of Layer 2C approaches zero. A comparison between the compressional-wave velocities determined by logging and the formation porosities calculated from the results of the large-scale resistivity experiment using Archie's Law suggests that the velocity-porosity relation derived by Hyndman et al. (1984) for laboratory samples serves as an upper bound for Vp, and the noninteractive relation derived by Toksöz et al. (1976) for cracks with an aspect ratio a = 1/32 serves as a lower bound.
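For reference, the porosity estimate from the large-scale resistivity experiment follows the standard form of Archie's Law; the abstract does not state the coefficients actually used, so a and m are left symbolic here:

\[
  F \;=\; \frac{R_0}{R_w} \;=\; a\,\phi^{-m}
  \quad\Longrightarrow\quad
  \phi \;=\; \left(\frac{a\,R_w}{R_0}\right)^{1/m},
\]

where R_0 is the measured formation resistivity, R_w the pore-water resistivity, and φ the porosity entering the velocity-porosity comparison.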