932 results for large-scale structure of the universe


Relevance: 100.00%

Abstract:

The Southern Ocean ecosystem at the Antarctic Peninsula features steep natural environmental gradients, e.g. in water masses and ice cover, and experiences climate change well above the global average. An ecological macroepibenthic survey was conducted in three ecoregions defined by their environmental envelopes: the north-western Weddell Sea, the continental shelf of the Antarctic Peninsula in the Bransfield Strait, and the shelf of the South Shetland Islands in the Drake Passage. The aim was to improve the hitherto poor knowledge of the structure of this component of the Southern Ocean ecosystem and of its ecological driving forces; the survey can also provide a baseline for assessing the impact of ongoing climate change on benthic diversity, functioning and ecosystem services. Intermediate-scale topographic features such as canyon systems were sampled, including the corresponding topographically defined habitats 'bank', 'upper slope', 'slope' and 'canyon/deep'. In addition, physical and biological environmental factors such as sea-ice cover, chlorophyll-a concentration, small-scale bottom topography and water masses were analysed. Agassiz trawl catches showed high among-station variability in the biomass of 96 higher systematic groups, including ecological key taxa. Large-scale patterns separating the three ecoregions from each other correlated with two environmental factors, sea ice and depth. Attribution to habitats explained benthic composition only poorly, and small-scale bottom topography did not explain such patterns at all. The large-scale factors, sea ice and depth, might have caused large-scale differences in pelagic-benthic coupling, whilst small-scale variability, which also affects larger scales, seemed to be driven predominantly by unknown physical drivers or biological interactions.

Relevance: 100.00%

Abstract:

The Dickson Land peninsula is located in central West Spitsbergen between the NNE branches of Isfjorden. The climatic firn line, lying at 500 m, gives rise to the plateau glaciers with outlet tongues that are characteristic of southern Dickson Land. The distribution of valley glaciers and the variations of the orographic firn line depend on wind direction. Comparing the firn lines established by the methods of LICHTENECKER (1938) and VISSER (1938) with the values calculated by the method of v. HÖFER (1879) reveals differences of up to 107 m. These differences may depend on the inclination and distance relationships of the glaciers above and below the real firn lines. During the latest glacial advance, Dickson Land lay on the peripheries of two local glaciation centres; at that time an inland glaciation of West Spitsbergen did not exist. The formation of a subglacial channel system dates back to the maximum extent of the late glacial phase before 17,500 B.P. (+2000/−1375 years). A correlation of postglacial stadia with 14C-dated marine terraces (FEYLING-HANSSEN & OLSSON, 1960; FEYLING-HANSSEN, 1965) is possible. Considering isostatic movement and the difference between calculated and real firn lines, a postglacial stadium at about 10,400 B.P. can be reconstructed with a firn line 265 m above former sea level; on average, the absolute depression below the recent firn line amounted to 246 m. A stagnation at 9,650 B.P. coincided with a firn line 315 m above former sea level and a depression of 173 m. Around 1890 A.D., glacial fluctuations corresponded to a firn line at 415 m (depression: 64 m). To some extent the morphology of the main valleys appears to depend on structure and petrography; their value as indicators of former glaciations is therefore questionable. The periglacial forms are shown on a large-scale map. During the "Holocene warm interval" between 7000 and 2000 B.P. (FEYLING-HANSSEN, 1955a, 1965), an increase in periglacial activity seems likely, which can be explained by a simultaneous increase in the depth of the active layer in both soil and bedrock.
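
For reference, the v. HÖFER (1879) construction referred to above is, in its commonly cited form (an assumption here, since the abstract does not restate it), the arithmetic mean of the mean ridge-crest elevation around the accumulation area and the elevation of the glacier terminus:

```latex
% Höfer (1879) firn-line estimate, commonly cited form (assumed here):
%   h_f : firn-line altitude
%   h_r : mean altitude of the ridge crests enclosing the accumulation area
%   h_t : altitude of the glacier terminus
h_f = \frac{h_r + h_t}{2}
```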

Relevance: 100.00%

Abstract:

Previous studies of lithospheric strength in central Iberia fail to resolve the depth distribution of earthquakes because of rheological uncertainties. Therefore, new constraints are considered here (the crustal structure from a density model) and several parameters (tectonic regime, mantle rheology, strain rate) are tested in order to examine properly the role of lithospheric strength in intraplate seismicity and Cenozoic evolution. The strength distribution with depth, the integrated strength, the effective elastic thickness and the seismogenic thickness have been calculated by finite element modelling of the lithosphere across the Central System mountain range and the bordering Duero and Madrid sedimentary basins. Only a dry mantle under a strike-slip/extensional regime at a strain rate of 10⁻¹⁵ s⁻¹, or under extension at 10⁻¹⁶ s⁻¹, yields a strong lithosphere. The integrated strength and the elastic thickness are lower in the mountain chain than in the basins. These anisotropies have been maintained since the Cenozoic and determine the mountain uplift and the biharmonic folding of the Iberian lithosphere during the Alpine deformations. The seismogenic thickness bounds the seismic activity to the upper-middle crust, and the decreasing crustal strength from the Duero Basin towards the Madrid Basin parallels an increase in Plio-Quaternary deformation and seismicity. However, elasto-plastic modelling shows that the current African-Eurasian convergence is accommodated elastically or ductilely, which accounts for the low seismicity recorded in this region.
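
Strength-depth profiles of the kind described are typically constructed by taking, at each depth, the minimum of a frictional (Byerlee-type) brittle strength and a power-law creep strength. The following is a minimal sketch under assumed, illustrative parameters (not the paper's values):

```python
import numpy as np

# Sketch of a yield strength envelope (YSE); all parameter values are
# illustrative assumptions, not those used in the paper.
g, rho = 9.81, 2800.0            # gravity (m s^-2), mean crustal density (kg m^-3)
R = 8.314                        # gas constant (J mol^-1 K^-1)
strain_rate = 1e-15              # reference strain rate (s^-1)
A, n, Q = 1e-4, 3.0, 190e3       # creep parameters: A (MPa^-n s^-1), n (-), Q (J mol^-1)

z = np.linspace(1.0, 40e3, 400)  # depth (m)
T = 283.0 + 0.020 * z            # assumed linear geotherm (K), 20 K km^-1

# Brittle branch: frictional strength grows with lithostatic pressure;
# the prefactor ~0.75 loosely represents an extensional faulting regime.
sigma_brittle = 0.75 * rho * g * z                                   # Pa

# Ductile branch: power-law creep at fixed strain rate (MPa converted to Pa).
sigma_ductile = 1e6 * (strain_rate / A) ** (1.0 / n) * np.exp(Q / (n * R * T))

# The envelope takes the weaker mechanism at each depth; integrating over
# depth gives the integrated lithospheric strength.
yse = np.minimum(sigma_brittle, sigma_ductile)
integrated = np.sum(0.5 * (yse[1:] + yse[:-1]) * np.diff(z))         # N m^-1
print(f"integrated strength ~ {integrated:.2e} N/m")
```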

Relevance: 100.00%

Abstract:

The overwhelming amount and unprecedented speed of publication in the biomedical domain make it difficult for life science researchers to acquire and maintain a broad view of the field and to gather all information relevant to their research. In response to this problem, the BioNLP (Biomedical Natural Language Processing) community of researchers has emerged, striving to assist life science researchers by developing modern natural language processing (NLP), information extraction (IE) and information retrieval (IR) methods that can be applied at large scale to scan the whole publicly available biomedical literature, extract and aggregate the information found within, and automatically normalize the variability of natural language statements. Among these tasks, biomedical event extraction has recently received much attention within the BioNLP community. Biomedical event extraction is the identification of biological processes and interactions described in biomedical literature and their representation as a set of recursive event structures. The 2009-2013 series of BioNLP Shared Tasks on Event Extraction has given rise to a number of event extraction systems, several of which have been applied at large scale (the full set of PubMed abstracts and PubMed Central Open Access full-text articles), leading to the creation of massive biomedical event databases, each containing millions of events. Since top-ranking event extraction systems are based on machine-learning approaches and are trained on narrow-domain, carefully selected Shared Task training data, their performance drops when faced with the topically highly varied PubMed and PubMed Central documents. In particular, false-positive predictions by these systems generate incorrect biomolecular events that are readily spotted by end-users. This thesis proposes a novel post-processing approach, combining supervised and unsupervised learning techniques, that can automatically identify and filter out a considerable proportion of incorrect events from large-scale event databases, thus increasing the overall credibility of those databases. The second part of the thesis is dedicated to a system we developed for hypothesis generation from large-scale event databases, which is able to discover novel biomolecular interactions among genes and gene products. We cast the hypothesis generation problem as supervised network topology prediction, i.e., predicting new edges in the network, as well as the types and directions of these edges, using a set of features that can be extracted from large biomedical event networks. Routine machine learning evaluation results, as well as manual evaluation results, suggest that the problem is indeed learnable. This work won the Best Paper Award at the 5th International Symposium on Languages in Biology and Medicine (LBM 2013).
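
As an illustration of the post-processing idea (a sketch only, not the thesis's actual feature set, model, or pipeline), one could train a supervised classifier on features of extracted events and discard events predicted to be incorrect; the features and labels below are hypothetical:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Minimal sketch of post-hoc event filtering. Features and labels are
# hypothetical placeholders, not the thesis's actual feature set.
rng = np.random.default_rng(0)
X_train = rng.random((1000, 4))     # e.g. extractor confidence, trigger frequency,
                                    # argument path length, sentence length (assumed)
y_train = rng.integers(0, 2, 1000)  # 1 = event judged correct, 0 = incorrect

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

# Filter a batch of candidate events: keep only those the classifier
# considers sufficiently likely to be correct.
X_events = rng.random((10, 4))
keep = clf.predict_proba(X_events)[:, 1] >= 0.5
print(f"kept {keep.sum()} of {len(keep)} candidate events")
```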

Relevance: 100.00%

Abstract:

The time-mean Argo float displacements and the World Ocean Atlas 2009 temperature-salinity climatology are used to obtain the total, top-to-bottom mass transports. Outside of an equatorial band, the total transports are the sum of the vertical integrals of the geostrophic and wind-driven Ekman currents. However, these transports are generally divergent, and to obtain a mass-conserving circulation a Poisson equation is solved for the streamfunction, with Dirichlet boundary conditions at solid boundaries; the value of the streamfunction on islands is also part of the unknowns. This study presents and discusses an energetic circulation in three basins: the North Atlantic, the North Pacific, and the Southern Ocean. This global method leads to new estimates of the time-mean Eulerian western boundary current transport maxima: 97 Sverdrups (Sv; 1 Sv ≡ 10⁶ m³ s⁻¹) at 60°W for the Gulf Stream, 84 Sv at 157°E for the Kuroshio, 80 Sv for the Agulhas Current between 32° and 36°S, and 175 Sv for the Antarctic Circumpolar Current at Drake Passage. Although the large-scale structure and boundaries of the interior gyres are well predicted by the Sverdrup relation, the transports derived from the wind stress curl are lower than the observed interior transports by roughly a factor of 2, suggesting an important contribution from bottom torques. With additional Argo displacement data, the errors caused by the remaining transient terms at the 1000-dbar reference level will continue to decrease, allowing this method to produce increasingly accurate results in the future.
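
The mass-conserving step can be sketched schematically: with transports (U, V) and the convention U = −∂ψ/∂y, V = ∂ψ/∂x, the streamfunction satisfies ∇²ψ = ∂V/∂x − ∂U/∂y, solved here on a toy uniform grid with ψ = 0 on the boundary (real basins need land masks and unknown island values, as the abstract notes):

```python
import numpy as np

# Toy sketch: recover a nondivergent streamfunction from gridded transports
# by solving the Poisson equation with Dirichlet (psi = 0) boundaries.
# Uniform grid spacing is assumed; this is not the paper's solver.
n, d = 64, 1.0
U = np.random.default_rng(1).random((n, n))    # placeholder zonal transport
V = np.random.default_rng(2).random((n, n))    # placeholder meridional transport

# Right-hand side: curl of the transport field, dV/dx - dU/dy
# (axis 0 = y, axis 1 = x).
rhs = np.gradient(V, d, axis=1) - np.gradient(U, d, axis=0)

psi = np.zeros((n, n))
for _ in range(5000):                          # Jacobi iteration (slow but simple)
    psi[1:-1, 1:-1] = 0.25 * (psi[2:, 1:-1] + psi[:-2, 1:-1]
                              + psi[1:-1, 2:] + psi[1:-1, :-2]
                              - d * d * rhs[1:-1, 1:-1])

# Nondivergent transports implied by the streamfunction.
U_rot = -np.gradient(psi, d, axis=0)           # U = -dpsi/dy
V_rot = np.gradient(psi, d, axis=1)            # V =  dpsi/dx
```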

Relevance: 100.00%

Abstract:

People go through life making all kinds of decisions, and some of these decisions affect their demand for transportation, for example their choices of where to live and where to work, how and when to travel, and which route to take. Transport-related choices are typically time dependent and characterized by a large number of alternatives that can be spatially correlated. This thesis deals with models that can be used to analyze and predict discrete choices in large-scale networks. The proposed models and methods are highly relevant for, but not limited to, transport applications. We model decisions as sequences of choices within the dynamic discrete choice framework, also known as parametric Markov decision processes. Such models are known to be difficult to estimate and to apply for prediction, because dynamic programming problems must be solved in order to compute choice probabilities. In this thesis we show that it is possible to exploit the network structure and the flexibility of dynamic programming so that the dynamic discrete choice modeling approach is not only useful for modeling time-dependent choices, but also makes it easier to model large-scale static choices. The thesis consists of seven articles containing a number of models and methods for estimating, applying and testing large-scale discrete choice models. In the following we group the contributions under three themes: route choice modeling, large-scale multivariate extreme value (MEV) model estimation, and nonlinear optimization algorithms. Five articles are related to route choice modeling. We propose different dynamic discrete choice models that allow paths to be correlated, based on the MEV and mixed logit models. The resulting route choice models become expensive to estimate, and we address this challenge with innovative methods that reduce the estimation cost. For example, we propose a decomposition method that not only opens up the possibility of mixing, but also speeds up estimation for simple logit models, which also has implications for traffic simulation. Moreover, we compare the utility maximization and regret minimization decision rules, and we propose a misspecification test for logit-based route choice models. The second theme concerns the estimation of static discrete choice models with large choice sets. We establish that a class of MEV models can be reformulated as dynamic discrete choice models on the networks of correlation structures. These dynamic models can then be estimated quickly using dynamic programming techniques and an efficient nonlinear optimization algorithm. Finally, the third theme focuses on structured quasi-Newton techniques for estimating discrete choice models by maximum likelihood. We examine and adapt switching methods that can easily be integrated into standard optimization algorithms (line search and trust region) to accelerate the estimation process. The proposed dynamic discrete choice models and estimation methods can be used in various discrete choice applications. In the area of big data analytics, models that can deal with large choice sets and sequential choices are important. Our research can therefore be of interest in various demand analysis applications (predictive analytics) or can be integrated with optimization models (prescriptive analytics). Furthermore, our studies indicate the potential of dynamic programming techniques in this context, even for static models, which opens up a variety of future research directions.
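
A minimal sketch of the dynamic-programming step common to such models (a generic logit formulation, not the thesis's exact models): in a logit-based dynamic discrete choice model the expected maximum utility V(s) of state s satisfies a logsum Bellman equation, solvable by fixed-point iteration, after which choice probabilities take the usual logit form over instantaneous utility plus downstream value:

```python
import numpy as np
from scipy.special import logsumexp

# Sketch: logsum Bellman recursion on a small absorbing toy network
# (assumed utilities; states 0..2 transient, state 3 the destination, V = 0).
n_states = 4
# actions[s] = list of (instantaneous utility, next state).
actions = {
    0: [(-1.0, 1), (-2.5, 2)],
    1: [(-1.0, 2), (-2.0, 3)],
    2: [(-1.0, 3)],
}

V = np.zeros(n_states)
for _ in range(100):                      # fixed-point iteration on V
    V_new = V.copy()
    for s, acts in actions.items():
        V_new[s] = logsumexp([u + V[nxt] for u, nxt in acts])
    if np.max(np.abs(V_new - V)) < 1e-10:
        break
    V = V_new

# Choice probabilities at a state follow the logit form over (u + value).
s = 0
vals = np.array([u + V[nxt] for u, nxt in actions[s]])
probs = np.exp(vals - logsumexp(vals))
print(V[:3], probs)
```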

Relevance: 100.00%

Abstract:

Background: The Flexibility of Responses to Self-Critical Thoughts Scale (FoReST) is a questionnaire developed to assess whether people can remain psychologically flexible when experiencing critical thoughts about themselves. The measure could have important applications in evaluating third-wave therapies such as Acceptance and Commitment Therapy (ACT) and Compassion Focused Therapy (CFT). This study investigated the validity (concurrent, predictive and incremental), internal consistency and factor structure of the FoReST in a sample of people experiencing mental health difficulties. Method: A total of 132 individuals attending Primary Care and Community Mental Health Teams within NHS Greater Glasgow and Clyde (NHS GGC) and Psychological Therapy Teams within NHS Lanarkshire participated in this study. Participants completed a battery of assessments that included the FoReST, related measures of similar constructs (psychological flexibility, self-compassion and self-criticism) and measures of mental health and well-being. A cross-sectional correlational design was used. Results: An exploratory factor analysis revealed an interpretable two-factor structure within the items of the FoReST: unworkable action and experiential avoidance. The FoReST demonstrated good internal consistency (α = .89). Concurrent validity was supported through moderate to strong correlations with similar measures and moderate correlations with other mental health and well-being outcomes. Conclusions: The FoReST appears to be a valid assessment measure for use with individuals experiencing mental health difficulties. This new measure will be of use to practitioners using ACT, CFT or an integration of both, to help monitor change in flexibility and self-critical thinking across therapy. Further longitudinal studies are required to assess the test-retest reliability of the FoReST.
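
For reference, the internal-consistency figure reported here is Cronbach's alpha, computed from item variances and the total-score variance; a minimal sketch on hypothetical item responses (illustrative data, not the study's):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical responses: 132 participants, 10 Likert-type items (illustrative).
rng = np.random.default_rng(42)
latent = rng.normal(size=(132, 1))              # shared trait driving all items
scores = np.clip(np.round(3 + latent + 0.8 * rng.normal(size=(132, 10))), 1, 5)
print(f"alpha = {cronbach_alpha(scores):.2f}")
```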

Relevance: 100.00%

Abstract:

Performance and economic indicators of a large-scale fish farm producing round fish in Mato Grosso State, Brazil, were evaluated. Its 130.8 ha of water surface area were distributed across 30 ponds. Average total production costs and the following economic indicators were calculated for the farm as a whole and for ten ponds individually: gross income (GI), gross margin (GM), gross margin index (GMI), profitability index (PI) and profit (P). Production performance indicators were also obtained: production cycle (PC), apparent feed conversion (FC), average biomass storage (ABS), survival index (SI) and final average weight (FAW). To produce an average of 2,971 kg ha⁻¹ per year, the average variable, fixed and total costs were R$ 2.43, R$ 0.72 and R$ 3.15 per kilogram, respectively. Gross margin and profit per year per hectare of water surface were R$ 2,316.91 and R$ 180.98, respectively. The individual evaluation showed that the best-performing pond achieved PI 38%, FC 1.7, ABS 0.980 kg m⁻², SI 56% and FAW 1.873 kg, with a PC of 12.3 months. The worst-performing pond showed a loss (PI of −138%), FC 2.6, ABS 0.110 kg m⁻², SI 16% and FAW 1.811 kg. Nevertheless, large-scale production of round fish is economically feasible. The studied farm displays favorable conditions for improving its performance and economic indicators, but the breeding techniques and performance levels achieved in a few ponds need to be reproduced across the entire farm.
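
To make the acronyms concrete, the sketch below uses the standard farm-economics definitions these indicators usually carry (assumed, since the abstract does not spell out the formulas), with illustrative per-kilogram values on the same scale as those reported:

```python
# Sketch of the economic indicators (standard definitions assumed;
# illustrative price/quantity, not the paper's raw data).
def indicators(price, variable_cost, total_cost, quantity):
    gi = price * quantity                  # gross income (GI)
    gm = gi - variable_cost * quantity     # gross margin (GM)
    p = gi - total_cost * quantity         # profit (P)
    gmi = 100 * gm / gi                    # gross margin index (GMI), %
    pi = 100 * p / gi                      # profitability index (PI), %
    return gi, gm, p, gmi, pi

# Hypothetical price of R$ 3.21/kg; costs and yield on the abstract's scale.
gi, gm, p, gmi, pi = indicators(price=3.21, variable_cost=2.43,
                                total_cost=3.15, quantity=2971)
print(f"GI={gi:.2f}  GM={gm:.2f}  P={p:.2f}  GMI={gmi:.1f}%  PI={pi:.1f}%")
```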

Relevance: 100.00%

Abstract:

This thesis investigates, for the first time, the statistical distributions of atmospheric surface variables and heat fluxes over the Mediterranean Sea. Using a 30-year atmospheric analysis dataset, we capture the spatial patterns of the probability distributions of the atmospheric variables relevant to ocean forcing: the wind components (U, V), wind amplitude, air temperature (T2M), dewpoint temperature (D2M) and mean sea-level pressure (MSL-P). The study reveals that a two-parameter PDF is not a good fit for T2M, D2M, MSL-P or the wind components (U, V); a three-parameter skew-normal PDF is better suited, as it properly captures the asymmetric tails (skewness) of the data. After removing the large seasonal cycle, we show the quality of the fit and the geographic structure of the PDF parameters. The parameters vary between regions; in particular the shape parameter (connected to the asymmetric tails) and the scale parameter (connected to the spread of the distribution) cluster around two or more values, probably reflecting the different dynamics that produce the surface atmospheric fields in the Mediterranean basin. Using these atmospheric variables, we also compute the air-sea heat fluxes over a 20-year period and estimate the net heat budget of the Mediterranean Sea. Interestingly, the higher-resolution analysis dataset yields a negative heat budget of −3 W m⁻², which is within the acceptable range for Mediterranean Sea heat budget closure. The lower-resolution atmospheric reanalysis dataset (ERA5) does not close the heat budget, indicating that a minimum resolution of the atmospheric forcing is crucial for Mediterranean Sea dynamics. The PDF framework developed in this thesis will form the basis of a future ensemble forecasting system that will use the fitted statistical distributions to create perturbations of the atmospheric forcing of the ocean.
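
As a pointer to how such a three-parameter fit is done in practice (a sketch, not the thesis's code), SciPy's skew-normal distribution exposes the shape parameter (controlling the asymmetric tail) alongside location and scale:

```python
import numpy as np
from scipy import stats

# Sketch: fit a three-parameter skew-normal PDF to a synthetic deseasonalized
# surface variable, e.g. 2-m temperature anomalies. Illustrative data only.
rng = np.random.default_rng(0)
t2m_anom = stats.skewnorm.rvs(a=4.0, loc=-1.0, scale=2.0, size=5000,
                              random_state=rng)

shape, loc, scale = stats.skewnorm.fit(t2m_anom)
print(f"shape={shape:.2f}  loc={loc:.2f}  scale={scale:.2f}")

# A two-parameter Gaussian fit ignores the skew; comparing log-likelihoods
# shows why the skew-normal handles asymmetric tails better.
mu, sigma = stats.norm.fit(t2m_anom)
ll_skew = stats.skewnorm.logpdf(t2m_anom, shape, loc, scale).sum()
ll_norm = stats.norm.logpdf(t2m_anom, mu, sigma).sum()
print(f"logL skew-normal: {ll_skew:.1f}  vs  Gaussian: {ll_norm:.1f}")
```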

Relevance: 100.00%

Abstract:

The fungus Metarhizium anisopliae is used on a large scale in Brazil as a microbial control agent against the sugar cane spittlebugs Mahanarva posticata and M. fimbriolata (Hemiptera: Cercopidae). We applied strain E9 of M. anisopliae in a soil bioassay, at field doses of conidia, to determine whether it can cause infection, disease and mortality in immature stages of Anastrepha fraterculus, the South American fruit fly. All events during the disease cycle were studied histologically and at the molecular level, using a novel histological technique, light green staining, associated with light microscopy, and by PCR with a specific DNA primer developed for M. anisopliae that is capable of identifying Brazilian strains such as E9. The entire infection cycle, which starts with conidial adhesion to the host cuticle and proceeds through germination with or without the formation of an appressorium, penetration through the cuticle and colonisation, with development of a dimorphic phase, hyphal bodies in the hemocoel, and death of the host, lasted 96 hours under the bioassay conditions, similar to what occurs under field conditions. During the disease cycle, propagules of the entomopathogenic fungus were detected by identifying their DNA with the specific primer ITSMet (5' TCTGAATTTTTTATAAGTAT 3') and ITS4 (5' TCCTCCGCTTATTGATATGC 3') as the reverse primer. This simple methodology permits in situ studies of the infective process, contributing to our understanding of the host-pathogen relationship and allowing the efficacy and survival of this entomopathogenic fungus to be monitored in large-scale field applications. It also facilitates monitoring of the environmental impact of M. anisopliae on non-target insects.

Relevance: 100.00%

Abstract:

We discuss the dynamics of the Universe within the framework of the massive graviton cold dark matter scenario (MGCDM), in which gravitons are treated geometrically as massive particles. In this modified gravity theory, the main effect of the gravitons is to alter the density evolution of the cold dark matter component in such a way that the Universe evolves to an accelerating expanding regime, as presently observed. Tight constraints on the main cosmological parameters of the MGCDM model are derived from a joint likelihood analysis involving recent type Ia supernovae data, the cosmic microwave background shift parameter, and the baryonic acoustic oscillations as traced by the Sloan Digital Sky Survey red luminous galaxies. The linear evolution of small density fluctuations is also analyzed in detail. It is found that the growth factor of the MGCDM model differs only slightly (~1-4%) from the one provided by the conventional flat ΛCDM cosmology. The growth rates of clustering predicted by the MGCDM and ΛCDM models are confronted with observations, and the corresponding best-fit values of the growth index (γ) are determined. Using the expectations of realistic future X-ray and Sunyaev-Zel'dovich cluster surveys, we derive the dark matter halo mass function and the corresponding redshift distribution of cluster-size halos for the MGCDM model. Finally, we show that the Hubble flow differences between the MGCDM and ΛCDM models yield a halo redshift distribution departing significantly from those predicted by other dark energy models. These results suggest that the MGCDM model can be observationally distinguished from ΛCDM and from a large number of dark energy models recently proposed in the literature.
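
For context on the growth-index comparison, a generic sketch (the standard γ-parametrization, not the paper's MGCDM equations): the growth rate is approximated by f(z) ≈ Ω_m(z)^γ, with γ ≈ 0.55 for flat ΛCDM, and the growth factor follows by integrating f over ln a:

```python
import numpy as np

# Generic growth-index sketch (standard gamma-parametrization; assumed
# illustrative cosmology, not the paper's MGCDM model).
omega_m0, gamma = 0.27, 0.55
a = np.linspace(1e-3, 1.0, 2000)            # scale factor grid

def omega_m(a):
    ez2 = omega_m0 / a**3 + (1.0 - omega_m0)   # flat LCDM E(a)^2
    return (omega_m0 / a**3) / ez2

f = omega_m(a) ** gamma                     # growth rate f = dlnD/dlna
# Growth factor D(a) = exp(integral of f dln a), normalized so D(a=1) = 1.
lnD = np.cumsum(0.5 * (f[1:] + f[:-1]) * np.diff(np.log(a)))
D = np.exp(lnD - lnD[-1])
print(f"f(z=0) = {f[-1]:.3f},  D(a=0.5) = {D[np.searchsorted(a[1:], 0.5)]:.3f}")
```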

Relevance: 100.00%

Abstract:

We discuss the properties of homogeneous and isotropic flat cosmologies in which the present accelerating stage is powered only by the gravitationally induced creation of cold dark matter (CCDM) particles (Ω_m = 1). For some matter creation rates proposed in the literature, we show that the main cosmological functions, such as the scale factor of the universe, the Hubble expansion rate, the growth factor, and the cluster formation rate, are analytically defined. The best CCDM scenario has only one free parameter, and our joint analysis involving baryonic acoustic oscillations + cosmic microwave background (CMB) + SNe Ia data yields Ω̃_m = 0.28 ± 0.01 (1σ), where Ω̃_m is the observed matter density parameter. In particular, this implies that the model has no dark energy, but the part of the matter that is effectively clustering is in good agreement with the latest determinations from the large-scale structure. The growth of perturbations and the formation of galaxy clusters in such scenarios are also investigated. Despite the fact that both scenarios may share the same Hubble expansion, we find that matter creation cosmologies predict stronger small-scale dynamics, which implies a faster growth rate of perturbations with respect to the usual ΛCDM cosmology. Such results point to the possibility of a crucial observational test confronting CCDM with ΛCDM scenarios through a more detailed analysis involving the CMB, weak lensing, and the large-scale structure.

Relevance: 100.00%

Abstract:

We develop an automated spectral synthesis technique, autoMOOG, for the estimation of metallicities ([Fe/H]) and carbon abundances ([C/Fe]) of metal-poor stars, including carbon-enhanced metal-poor stars, for which other methods may prove insufficient. The technique is designed to operate on relatively strong features visible even in low- to medium-resolution spectra, yielding results comparable to much more telescope-intensive high-resolution studies. We validate the method by comparison with 913 stars that have both existing high-resolution and low- to medium-resolution spectra and that cover a wide range of stellar parameters. We find that at low metallicities ([Fe/H] ≲ −2.0) we successfully recover both the metallicity and the carbon abundance, where possible, with an accuracy of ~0.20 dex. At higher metallicities, because of continuum placement issues in the spectral normalization performed before autoMOOG is run, the overall metallicity of a star is generally underestimated, although the carbon abundance is still successfully recovered. As a result, the method is recommended only for samples of stars already known to be of sufficiently low metallicity. For such low-metallicity stars, however, autoMOOG performs much more consistently and quickly than similar existing techniques, which should allow analyses of large samples of metal-poor stars in the near future. Steps to improve and correct the continuum placement difficulties are being pursued.
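
For readers outside stellar spectroscopy, the bracket notation used throughout is the standard logarithmic abundance ratio relative to the Sun:

```latex
% Standard bracket abundance notation (shown for iron):
[\mathrm{Fe}/\mathrm{H}] =
  \log_{10}\!\left(\frac{N_\mathrm{Fe}}{N_\mathrm{H}}\right)_{\!\star}
  - \log_{10}\!\left(\frac{N_\mathrm{Fe}}{N_\mathrm{H}}\right)_{\!\odot}
% so [Fe/H] = -2.0 means 1/100 of the solar iron-to-hydrogen ratio;
% [C/Fe] is defined analogously, with iron in the denominator.
```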

Relevance: 100.00%

Abstract:

The Amazon Basin provides an excellent environment for studying the sources, transformations, and properties of natural aerosol particles and the resulting links between biological processes and climate. With this framework in mind, the Amazonian Aerosol Characterization Experiment (AMAZE-08), carried out from 7 February to 14 March 2008 during the wet season in the central Amazon Basin, sought to understand the formation, transformations, and cloud-forming properties of fine- and coarse-mode biogenic aerosol particles, especially as related to their effects on cloud activation and regional climate. Special foci included (1) the production mechanisms of secondary organic components at a pristine continental site, including the factors regulating their temporal variability, and (2) predicting and understanding the cloud-forming properties of biogenic particles at such a site. In this overview paper, the field site and the instrumentation employed during the campaign are introduced. Observations and findings are reported, including the large-scale context for the campaign, especially as provided by satellite observations. New findings presented include: (i) a particle number-diameter distribution from 10 nm to 10 μm that is representative of the pristine tropical rain forest and recommended for model use; (ii) the absence of substantial quantities of primary biological particles in the submicron mode, as evidenced by mass spectral characterization; (iii) the large-scale production of secondary organic material; (iv) insights into the chemical and physical properties of the particles, as revealed by thermodenuder-induced changes in the particle number-diameter distributions and mass spectra; and (v) comparisons of ground-based predictions and satellite-based observations of hydrometeor phase in clouds. A main finding of AMAZE-08 is the dominance of secondary organic material as particle components. The results presented here provide mechanistic insight and quantitative parameters that can serve to increase the accuracy of models of the formation, transformations, and cloud-forming properties of biogenic natural aerosol particles, especially as related to their effects on cloud activation and regional climate.
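
Number-diameter distributions of this kind are conventionally reported for model use as a sum of lognormal modes; the general form below is the standard parameterization (the campaign's fitted mode parameters are in the paper and not reproduced here):

```latex
% Standard multimodal lognormal parameterization of a number-size distribution:
%   N_i      : number concentration of mode i
%   D_{g,i}  : geometric median diameter of mode i
%   \sigma_i : geometric standard deviation of mode i
\frac{dN}{d\ln D} = \sum_i \frac{N_i}{\sqrt{2\pi}\,\ln\sigma_i}
  \exp\!\left[-\frac{\left(\ln D - \ln D_{g,i}\right)^2}{2\ln^2\sigma_i}\right]
```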

Relevance: 100.00%

Abstract:

We show the effects of the granular structure of the initial conditions of a hydrodynamic description of high-energy nucleus-nucleus collisions on some observables, especially on the elliptic-flow parameter v2. Such a structure enhances the production of isotropically distributed high-pT particles, making v2 smaller there. It also reduces v2 in the forward and backward regions, where the global matter density is smaller and such effects therefore become more effective.
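
For reference, v2 is the second Fourier coefficient of the azimuthal distribution of produced particles with respect to the reaction plane Ψ_RP:

```latex
% Azimuthal Fourier expansion; v_2 is the elliptic-flow coefficient:
\frac{dN}{d\varphi} \propto 1 + \sum_{n \ge 1} 2 v_n \cos\!\big(n(\varphi - \Psi_{RP})\big),
\qquad
v_2 = \left\langle \cos\!\big(2(\varphi - \Psi_{RP})\big) \right\rangle
```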