934 results for "Decisional frame of reference"
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
A measurement of differential cross sections for the production of a pair of isolated photons in proton-proton collisions at √s = 7 TeV is presented. The data sample corresponds to an integrated luminosity of 5.0 fb⁻¹ collected with the CMS detector. A data-driven isolation template method is used to extract the prompt diphoton yield. The measured cross section for two isolated photons, with transverse energy above 40 and 25 GeV respectively, in the pseudorapidity range |η| < 2.5 with the region 1.44 < |η| < 1.57 excluded, and with an angular separation ΔR > 0.45, is 17.2 ± 0.2 (stat) ± 1.9 (syst) ± 0.4 (lumi) pb. Differential cross sections are measured as a function of the diphoton invariant mass, the diphoton transverse momentum, the azimuthal angle difference between the two photons, and the cosine of the polar angle in the Collins-Soper reference frame of the diphoton system. The results are compared to theoretical predictions at leading, next-to-leading, and next-to-next-to-leading order in quantum chromodynamics.
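Illustrative note (not part of the abstract): the Collins-Soper polar angle mentioned in the last observable is conventionally defined from the photons' light-cone momenta; the formula below is the standard definition, with notation assumed here rather than quoted from the paper.

```latex
% Collins-Soper polar angle of the diphoton system (standard definition;
% notation assumed, not quoted from the abstract)
\[
\cos\theta^{*}_{\mathrm{CS}}
  = \frac{2\,\bigl(p_{1}^{+}p_{2}^{-} - p_{1}^{-}p_{2}^{+}\bigr)}
         {m_{\gamma\gamma}\,\sqrt{m_{\gamma\gamma}^{2} + p_{T,\gamma\gamma}^{2}}},
\qquad
p_{i}^{\pm} = \frac{E_{i} \pm p_{z,i}}{\sqrt{2}},
\]
% where m_{\gamma\gamma} and p_{T,\gamma\gamma} are the invariant mass and
% transverse momentum of the photon pair; some analyses additionally fold in
% the sign of the pair's longitudinal momentum or quote |cos(theta*)| only.
```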
Spatial reference of black capuchin monkeys in Brazilian Atlantic Forest: egocentric or allocentric?
Abstract:
Wild primates occupy large home ranges and travel long distances to reach goals. However, how primates are able to remember goal locations and travel efficiently is unclear. Few studies present consistent results regarding what reference system primates use to navigate and what kind of spatial information they recognize. We analysed the navigation pattern of one wild group of black capuchin monkeys, Cebus nigritus, over 100 days in the Atlantic Forest of Carlos Botelho State Park (PECB), Brazil. We tested predictions based on the alternative hypotheses that black capuchin monkeys navigate using a sequence of landmarks as an egocentric reference system, an allocentric reference system, or both, depending on the availability of food resources. The group's location was recorded using a GPS device collecting coordinates at 5 min intervals, and route maps were generated using ArcView v9.3.1. The study group travelled along habitual routes in less than 30% of our study sample and revisited resources from different starting points, using different paths and routes, even when prominent landmarks near feeding locations were not visible. The study group used habitual routes more frequently when high-quality foods were scarce and navigated along different paths when revisiting food sources. The results support the hypothesis that black capuchin monkeys at PECB navigate using both egocentric and allocentric systems of reference, depending on the quality and distribution of the food resources they find. (C) 2010 The Association for the Study of Animal Behaviour. Published by Elsevier Ltd. All rights reserved.
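Illustrative note (not part of the abstract): travel paths from 5 min GPS fixes are typically reduced to step lengths before any route-overlap analysis. The Python sketch below shows one minimal way to do this with the haversine distance; the coordinates and function names are hypothetical and not taken from the study.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6371000.0  # mean Earth radius, metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def daily_path_length(fixes):
    """Sum of step lengths over one day's sequence of (lat, lon) fixes."""
    return sum(
        haversine_m(lat1, lon1, lat2, lon2)
        for (lat1, lon1), (lat2, lon2) in zip(fixes, fixes[1:])
    )

# Hypothetical example: three consecutive 5 min fixes
day_fixes = [(-24.0710, -47.9870), (-24.0715, -47.9861), (-24.0722, -47.9855)]
print(f"Daily travel distance: {daily_path_length(day_fixes):.0f} m")
```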
Abstract:
The determination of the hydrodynamic coefficients of full-scale underwater vehicles using system identification (SI) is an extremely powerful technique. The procedure is based on experimental runs and on the analysis of on-board sensor and thruster signals. The technique is cost effective and has high repeatability; however, for open-frame underwater vehicles it lacks accuracy due to sensor noise and the poor modeling of thruster-hull and thruster-thruster interaction effects. In this work, forced oscillation tests were undertaken with a full-scale open-frame underwater vehicle. These tests are unique in the sense that there are few examples in the literature taking advantage of a PMM installation for testing a prototype and, consequently, allowing comparison between the experimental results and those estimated by parameter identification. The inertia and drag coefficients of Morison's equation were estimated with two parameter identification methods, namely the weighted and the ordinary least-squares procedures. It was verified that the in-line force estimated from Morison's equation agrees well with the measured one, except in the region around the motion inversion points. The error analysis showed that ordinary least squares provided better accuracy and was therefore used to evaluate the ratio between inertia and drag forces for a range of Keulegan-Carpenter and Reynolds numbers. It was concluded that both the experimental and the estimation techniques are powerful tools for evaluating an open-frame underwater vehicle's hydrodynamic coefficients, and that the research provides a rich set of reference data for comparison with reduced models as well as for dynamic motion simulation of ROVs. [DOI: 10.1115/1.4004952]
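Illustrative note (not part of the abstract): the in-line force model referred to above is Morison's equation, whose inertia and drag coefficients are the quantities fitted by the weighted and ordinary least-squares procedures. The conventional form, with notation assumed here rather than quoted from the paper, is:

```latex
% Morison's equation for the in-line force on an oscillating body
% (conventional notation; assumed, not quoted from the paper)
\[
F(t) \;=\; \rho\,C_{M}\,V\,\dot{u}(t)
      \;+\; \tfrac{1}{2}\,\rho\,C_{D}\,A\,u(t)\,\lvert u(t)\rvert ,
\]
% where \rho is the water density, V and A are reference volume and area,
% and u(t) is the imposed oscillatory velocity. Sampling F, \dot{u} and
% u|u| at discrete times gives a linear model in (C_M, C_D) that can be
% solved by ordinary or weighted least squares.
```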
Abstract:
The estimation of reference evapotranspiration (ETo), used in the water balance, allows soil water content to be determined, assisting in irrigation management. The present study aimed to compare simple ETo estimation methods with the Penman-Monteith (FAO) method at the following time scales: daily, 5, 10, 15 and 30 days, and monthly, in the municipalities of Frederico Westphalen and Palmeira das Missões, Rio Grande do Sul state, Brazil. The methods tested had their efficiency improved by increasing the time scale of analysis, keeping the same performance for both locations. The highest and lowest ETo values occurred in December and June, respectively. Most methods underestimated ETo. For any of the time scales, the Makkink and FAO-24 Radiation methods can replace Penman-Monteith for estimating ETo.
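Illustrative note (not part of the abstract): the Penman-Monteith (FAO) benchmark referred to above is usually applied in its FAO-56 daily form, reproduced below with the standard FAO-56 notation (assumed here, not quoted from the study).

```latex
% FAO-56 Penman-Monteith reference evapotranspiration (mm day^-1),
% standard notation (assumed, not quoted from the study)
\[
ET_{o} \;=\;
\frac{0.408\,\Delta\,(R_{n}-G)
      \;+\; \gamma\,\dfrac{900}{T+273}\,u_{2}\,(e_{s}-e_{a})}
     {\Delta \;+\; \gamma\,\bigl(1 + 0.34\,u_{2}\bigr)},
\]
% where \Delta is the slope of the saturation vapour-pressure curve (kPa °C^-1),
% R_n the net radiation and G the soil heat flux (MJ m^-2 day^-1), T the mean
% daily air temperature (°C), u_2 the wind speed at 2 m (m s^-1), e_s - e_a
% the vapour-pressure deficit (kPa) and \gamma the psychrometric constant.
```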
Abstract:
Ontology design and population (core aspects of semantic technologies) have recently become fields of great interest due to the increasing need for domain-specific knowledge bases that can boost the use of the Semantic Web. For building such knowledge resources, the state-of-the-art tools for ontology design require a lot of human work. Producing meaningful schemas and populating them with domain-specific data is in fact a very difficult and time-consuming task, even more so if the task consists in modelling knowledge at web scale. The primary aim of this work is to investigate a novel and flexible methodology for automatically learning ontologies from textual data, lightening the human workload required for conceptualizing domain-specific knowledge, populating an extracted schema with real data, and speeding up the whole ontology production process. Here computational linguistics plays a fundamental role, from automatically identifying facts in natural language and extracting frames of relations among recognized entities, to producing linked data with which to extend existing knowledge bases or create new ones. In the state of the art, automatic ontology learning systems are mainly based on plain-pipelined linguistic classifiers performing tasks such as named entity recognition, entity resolution, and taxonomy and relation extraction [11]. These approaches present some weaknesses, especially in capturing the structures through which the meaning of complex concepts is expressed [24]. Humans, in fact, tend to organize knowledge in well-defined patterns, which include participant entities and meaningful relations linking entities with each other. In the literature, these structures have been called Semantic Frames by Fillmore [20] or, more recently, Knowledge Patterns [23]. Some NLP studies have recently shown the possibility of performing more accurate deep parsing with the ability of logically understanding the structure of discourse [7]. In this work, some of these technologies have been investigated and employed to produce accurate ontology schemas. The long-term goal is to collect large amounts of semantically structured information from the web of crowds, through an automated process, in order to identify and investigate the cognitive patterns used by humans to organize their knowledge.
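Illustrative note (not part of the abstract): the "plain-pipelined" baseline that the thesis contrasts itself with can be sketched in a few lines of Python with spaCy, chaining named-entity recognition with a naive subject-verb-object heuristic; the model name en_core_web_sm and the heuristic are assumptions for illustration, not the thesis's actual system.

```python
import spacy

# Hypothetical pipelined baseline: NER followed by a naive
# subject-verb-object triple heuristic (illustrative only).
nlp = spacy.load("en_core_web_sm")

def naive_entities_and_triples(text):
    doc = nlp(text)
    entities = [(ent.text, ent.label_) for ent in doc.ents]
    triples = []
    for token in doc:
        if token.pos_ == "VERB":
            subjects = [c for c in token.children if c.dep_ == "nsubj"]
            objects = [c for c in token.children if c.dep_ in ("dobj", "obj")]
            if subjects and objects:
                triples.append((subjects[0].text, token.lemma_, objects[0].text))
    return entities, triples

ents, rels = naive_entities_and_triples("Fillmore founded FrameNet at Berkeley.")
print(ents)  # recognized entities with labels
print(rels)  # naive (subject, verb, object) triples
```

Frame-based approaches of the kind discussed in the thesis aim to capture richer participant structures than this flat entity/triple view.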
Abstract:
The methane yield of ligno-cellulosic substrates (i.e. dedicated energy crops and agricultural residues) may be limited by their composition and structural features. Hence, biomass pre-treatments are envisaged to overcome this constraint. This thesis aimed at: i) assessing the biomass and methane yield of dedicated energy crops; ii) evaluating the effects of hydrothermal pre-treatments on the methane yield of Arundo; and iii) investigating the effects of NaOH pre-treatments and iv) acid pre-treatments on the chemical composition, physical structure and methane yield of two dedicated energy crops and one agricultural residue. Three multi-annual species (Arundo, Switchgrass and Sorghum Silk), three sorghum hybrids (Trudan Headless, B133 and S506) and a maize, as reference for anaerobic digestion (AD), were studied in the frame of point i). The results show remarkable variation in biomass yield, chemical characteristics and potential methane yield. The six species alternative to maize deserve attention in view of their low need for external inputs but require improvements in biodegradability. In the frame of point ii), Arundo was subjected to hydrothermal pre-treatments at different temperatures and times, with and without an acid catalyst (H2SO4). The pre-treatments had a variable effect on methane yield: pre-treatments without acid catalyst achieved up to +23% CH4 output, while pre-treatments with H2SO4 catalyst caused methanogenic inhibition. Two biomass crops (Arundo and B133) and an agricultural residue (barley straw) were subjected to NaOH and acid pre-treatments, in the frame of points iii) and iv), respectively. The different pre-treatments changed the chemical and physical structure and increased the methane yield: up to +30% and up to +62% CH4 output in Arundo with NaOH and acid pre-treatments, respectively. It is thereby demonstrated that pre-treatments can actually enhance the biodegradability and subsequent CH4 output of ligno-cellulosic substrates, although pre-treatment viability needs to be evaluated at the level of full-scale biogas plants with a view to profitable implementation.
Abstract:
Quantitative sensory tests are widely used in human research to evaluate the effect of analgesics and explore altered pain mechanisms, such as central sensitization. In order to apply these tests in clinical practice, knowledge of reference values is essential. The aim of this study was to determine the reference values of pain thresholds for mechanical and thermal stimuli, as well as withdrawal time for the cold pressor test in 300 pain-free subjects. Pain detection and pain tolerance thresholds to pressure, heat and cold were determined at three body sites: (1) lower back, (2) suprascapular region and (3) second toe (for pressure) or the lateral aspect of the leg (for heat and cold). The influences of gender, age, height, weight, body-mass index (BMI), body side of testing, depression, anxiety, catastrophizing and parameters of Short-Form 36 (SF-36) were analyzed by multiple regressions. Quantile regressions were performed to define the 5th, 10th and 25th percentiles as reference values for pain hypersensitivity and the 75th, 90th and 95th percentiles as reference values for pain hyposensitivity. Gender, age and/or the interaction of age with gender were the only variables that consistently affected the pain measures. Women were more pain sensitive than men. However, the influence of gender decreased with increasing age. In conclusion, normative values of parameters related to pressure, heat and cold pain stimuli were determined. Reference values have to be stratified by body region, gender and age. The determination of these reference values will now allow the clinical application of the tests for detecting abnormal pain reactions in individual patients.
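Illustrative note (not part of the abstract): once thresholds are collected, stratified reference percentiles of the kind described can be tabulated directly; the Python sketch below uses a simple group-wise percentile summary (a simplification of the study's quantile regressions), and the data frame and column names are hypothetical.

```python
import numpy as np
import pandas as pd

# Hypothetical pressure-pain detection thresholds (kPa); column names
# are illustrative, not the study's actual variables.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "gender": rng.choice(["F", "M"], size=300),
    "age_group": rng.choice(["20-39", "40-59", "60+"], size=300),
    "ppt_lower_back_kpa": rng.normal(400, 120, size=300).clip(50),
})

# Low percentiles flag hypersensitivity, high percentiles hyposensitivity.
percentiles = [5, 10, 25, 75, 90, 95]
reference = (
    df.groupby(["gender", "age_group"])["ppt_lower_back_kpa"]
      .describe(percentiles=[p / 100 for p in percentiles])
)
print(reference[[f"{p}%" for p in percentiles]])
```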
Abstract:
The use of virtual reality as a tool in the area of spatial cognition raises the question of the quality of learning transfer from a virtual to a real environment. It is first necessary to determine, with healthy subjects, the cognitive aids that improve the quality of transfer and the conditions required, especially since virtual reality can be used as an effective tool in cognitive rehabilitation. The purpose of this study was to investigate the influence of the exploration mode of a virtual environment (passive vs. active), according to route complexity (simple vs. complex), on the quality of spatial knowledge transfer in three spatial tasks. Ninety subjects (45 men and 45 women) participated. Spatial learning was evaluated by wayfinding, sketch-mapping and picture classification tasks in the context of the Bordeaux district. In the wayfinding task, results indicated that active learning in a virtual environment (VE) increased performance compared to the passive learning condition, irrespective of the route complexity factor. In the sketch-mapping task, active learning in a VE helped the subjects to transfer their spatial knowledge from the VE to reality, but only when the route was complex. In the picture classification task, active learning in a VE when the route was complex did not help the subjects to transfer their spatial knowledge. These results are explained in terms of knowledge levels and frame/strategy of reference [SW75, PL81, TH82].
Abstract:
We report a case of an acute hypertensive, intracerebral hemorrhage on post-mortem computed tomography (PMCT) in a decomposed corpse. In clinical radiology, the appearance of blood on cross-sectional imaging is used to estimate the age of intracranial hemorrhage. The findings from this case indicate that characteristics of intracerebral blood on PMCT provide a still frame of the hemorrhage, as it was at the time of death. This observation suggests that the appearance of blood on PMCT may be used to estimate the age of an intracerebral hemorrhage but not to estimate the post-mortem interval.
Abstract:
Three long-term temperature data series measured in Portugal were studied to detect and correct non-climatic homogeneity breaks and are now available for future studies of climate variability. Series of monthly minimum (Tmin) and maximum (Tmax) temperatures measured at the three Portuguese meteorological stations of Lisbon (from 1856 to 2008), Coimbra (from 1865 to 2005) and Porto (from 1888 to 2001) were studied to detect and correct non-climatic homogeneity breaks. These series, together with the monthly series of average temperature (Taver) and temperature range (DTR) derived from them, were tested in order to detect homogeneity breaks, using, firstly, metadata, secondly, a visual analysis and, thirdly, four widely used homogeneity tests: the von Neumann ratio test, the Buishand test, the standard normal homogeneity test and the Pettitt test. The homogeneity tests were used in absolute mode (using the temperature series themselves) and in relative mode (using sea-surface temperature anomaly series obtained from HadISST2 close to the Portuguese coast, or already corrected temperature series, as reference series). We considered the Tmin, Tmax and DTR series as the most informative for the detection of homogeneity breaks, because Tmin and Tmax could respond differently to changes in the position of a thermometer or other changes in the instrument's environment; the Taver series were used mainly as a control. The homogeneity tests show strong inhomogeneity of the original data series, which could have both internal climatic and non-climatic origins. Homogeneity breaks identified by the last three homogeneity tests were compared with the available metadata, containing information such as instrument changes, changes in station location and environment, observing procedures, etc. Significant homogeneity breaks (significance 95% or more) that coincide with known dates of instrumental changes have been corrected using standard procedures. It was also noted that some significant homogeneity breaks, which could not be connected to the known dates of any changes in the instrument park or in the stations' location and environment, could be caused by large volcanic eruptions. The corrected series were again tested for homogeneity: they were considered free of non-climatic breaks when the tests of most of the monthly series showed no significant (significance 95% or more) homogeneity breaks coinciding with dates of known instrument changes. The corrected series are now available in the frame of the ERA-CLIM FP7 project for future studies of climate variability.
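Illustrative note (not part of the abstract): of the four homogeneity tests named above, the Pettitt test is straightforward to sketch; the Python below computes the change-point statistic and its usual approximate p-value on a synthetic monthly anomaly series (the series and the break position are invented for illustration).

```python
import numpy as np

def pettitt_test(x):
    """Pettitt change-point test.

    Returns the index of the most probable break, the statistic K,
    and the approximate p-value 2*exp(-6*K^2 / (n^3 + n^2)).
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    # U_t = sum_{i<=t} sum_{j>t} sign(x_i - x_j)
    signs = np.sign(x[:, None] - x[None, :])
    u = np.array([signs[: t + 1, t + 1 :].sum() for t in range(n - 1)])
    t_break = int(np.argmax(np.abs(u)))
    k = float(abs(u[t_break]))
    p = 2.0 * np.exp(-6.0 * k**2 / (n**3 + n**2))
    return t_break, k, min(p, 1.0)

# Synthetic monthly Tmin anomalies with an artificial break after month 60
rng = np.random.default_rng(1)
series = np.concatenate([rng.normal(0.0, 0.5, 60), rng.normal(0.8, 0.5, 60)])
print(pettitt_test(series))
```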
Abstract:
This paper presents results of the benchmarking of COBAYA3 pin-by-pin for VVER-1000 obtained in the frame of the EU NURISP project. The 3D lattice solver in COBAYA3 uses a transport-corrected multi-group diffusion approximation with side-dependent interface discontinuity factors of GET or Selengut Black Box type. The objective of this study is to test the few-group calculation scheme when using structured and unstructured spatial meshes. An unstructured mesh is necessary to model the water gaps between the hexagonal assemblies. The benchmark problems include pin-by-pin calculations of 2D subsets of the core and comparison with APOLLO2 and TRIPOLI4 transport reference solutions. COBAYA3 solutions in 2, 4 and 8 energy groups have been tested. The results show excellent agreement with the reference ones when using side-dependent interface discontinuity factors.
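Illustrative note (not part of the abstract): side-dependent interface discontinuity factors of the GET type are, in the usual convention (notation assumed here, not quoted from the paper), ratios of the reference heterogeneous surface flux to the homogeneous few-group surface flux on each node face, and they enter the interface condition that couples adjacent nodes.

```latex
% GET side-dependent interface discontinuity factor on face s of node k,
% energy group g (conventional definition; notation assumed)
\[
f_{g,s}^{k} \;=\;
\frac{\overline{\phi}_{g,s}^{\,\mathrm{het},k}}
     {\overline{\phi}_{g,s}^{\,\mathrm{hom},k}},
\qquad
f_{g,s}^{k}\,\overline{\phi}_{g,s}^{\,\mathrm{hom},k}
  \;=\; f_{g,s}^{k'}\,\overline{\phi}_{g,s}^{\,\mathrm{hom},k'},
\]
% i.e. the homogeneous surface fluxes of adjacent nodes k and k' are allowed
% to be discontinuous so that the heterogeneous (transport reference)
% interface conditions are preserved by the few-group diffusion solution.
```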
Abstract:
"Class M.E. Periodical and serial literature, 1869-88", part 3, p. 109-133.
Abstract:
This is the story of an extraordinary Aboriginal woman, Princy Carlo, and the identity of place that she and her descendants fashioned within the confines of the Aboriginal settlement of Cherbourg (formerly Barambah) during the early twentieth century. The patch of Cherbourg that came to be known as 'Chinatown' has to date attracted only cursory reference in historical commentary on the south-eastern Queensland Aboriginal settlement. Yet, hidden beneath what may appear to be an inconsequential historical detail lies a fascinating illustration of the negotiation of place identity within a frame of triangulated group relations (Aboriginal-Chinese-White) in what remained, in essence, a colonial society. Incorporating primary written sources and oral accounts from descendants, the study analyses the forging of the Chinatown identity of place through a process of 'spatial othering', eliciting features unique to this indigenous identity-construct. The study provides an insight into Aboriginal connection and kinship with land following forced removal to a government settlement, and contributes to the historical records of the Cherbourg Aboriginal community and the Eidsvold district in Queensland, Australia. (C) 2003 Elsevier Science Ltd. All rights reserved.
Abstract:
The article compares the conclusions drawn from the models of Oliver Hart and his co-authors with Williamson's views on transaction costs. It shows that the two schools use different tools on the question of firm versus market, but argue along similar lines. The author covers Williamson's criticism of Hart, namely that bargaining in Hart's models carries no transaction costs, as well as the criticism of that criticism. Hart's ideas are supported by the recently developed reference-point theory within the property-rights approach, which also offers experimental means of testing the various assumptions.