44 results for 3D and 2D background modelling

in CentAUR: Central Archive at the University of Reading - UK


Relevance: 100.00%

Abstract:

Periocular recognition has recently become an active topic in biometrics. Typically it uses 2D image data of the periocular region. This paper is the first description of combining 3D shape structure with 2D texture. A simple and effective technique using iterative closest point (ICP) was applied for 3D periocular region matching. It proved its strength for relatively unconstrained eye region capture, and it requires no training. Local binary patterns (LBP) were applied for 2D image-based periocular matching. The two modalities were combined at the score level. This approach was evaluated using the Bosphorus 3D face database, which contains large variations in facial expressions, head poses and occlusions. The rank-1 accuracy achieved from the 3D data (80%) was better than that for 2D (58%), and the best accuracy (83%) was achieved by fusing the two types of data. This suggests that significant improvements to periocular recognition systems could be achieved using the 3D structure information that is now available from small and inexpensive sensors.
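
The two matchers are combined at the score level, but the fusion rule itself is not given in the abstract. The sketch below illustrates one common score-level fusion scheme (min-max normalisation followed by a weighted sum); the weights and example scores are invented for illustration and are not the authors' implementation.

```python
import numpy as np

def min_max_normalise(scores):
    """Rescale a 1-D array of match scores to the range [0, 1]."""
    scores = np.asarray(scores, dtype=float)
    lo, hi = scores.min(), scores.max()
    return (scores - lo) / (hi - lo) if hi > lo else np.zeros_like(scores)

def fuse_scores(scores_3d, scores_2d, w_3d=0.6, w_2d=0.4):
    """Weighted-sum fusion of 3D (ICP-based) and 2D (LBP-based) match scores.

    Both inputs are similarity scores against the same gallery ordering;
    the weights are arbitrary placeholders, not values from the paper.
    """
    return w_3d * min_max_normalise(scores_3d) + w_2d * min_max_normalise(scores_2d)

# Toy example: one probe scored against a four-subject gallery.
icp_scores = [0.42, 0.77, 0.51, 0.60]   # hypothetical 3D similarities
lbp_scores = [0.30, 0.55, 0.80, 0.45]   # hypothetical 2D similarities
fused = fuse_scores(icp_scores, lbp_scores)
print("Rank-1 match: gallery subject", int(np.argmax(fused)))
```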

Relevance: 100.00%

Abstract:

Information technology in construction (ITC) has been gaining wide acceptance and is being implemented in the construction research domains as a tool to assist decision makers. Most of the research into visualization technologies (VT) has focused on the wide range of 3D and simulation applications suitable for construction processes. Despite progress in interoperability and standardization of products, VT usage has remained very low when it comes to communicating with and addressing the needs of building end-users (BEU). This paper argues that building end-users are a source of experience and expertise that can be brought into the briefing stage for the evaluation of design proposals. It also suggests that the end-user is a source of new ideas promoting innovation. In this research a positivistic methodology is proposed that includes the comparison of 3D models with traditional 2D methods. It will help to identify "how much", if anything, a non-spatial specialist can gain in terms of "understanding" of a particular design proposal when it is presented using each method.

Relevance: 100.00%

Abstract:

Background Dermatosparaxis (Ehlers–Danlos syndrome in humans) is characterized by extreme fragility of the skin. It is due to the lack of mature collagen caused by a failure in the enzymatic processing of procollagen I. We investigated the condition in a commercial sheep flock. Hypothesis/Objectives Mutations in the ADAM metallopeptidase with thrombospondin type 1 motif, 2 (ADAMTS2) locus are involved in the development of dermatosparaxis in humans, cattle and the Dorper sheep breed; consequently, this locus was investigated in the flock. Animals A single affected lamb, its dam, the dam of a second affected lamb and the rams in the flock were studied. Methods DNA was purified from blood, PCR primers were used to amplify parts of the ADAMTS2 gene and nucleotide sequencing was performed using Sanger's procedure. Skin samples were examined using standard histology procedures. Results A missense mutation was identified in the catalytic domain of ADAMTS2. The mutation is predicted to cause the substitution of a valine residue by a methionine residue (V15M) in the mature ADAMTS2, affecting the catalytic domain of the enzyme. Both the 'sorting intolerant from tolerant' (SIFT) and the PolyPhen-2 methodologies predicted a damaging effect for the mutation. Three-dimensional modelling suggested that this mutation may alter the stability of the protein folding or distort the structure, causing the protein to malfunction. Conclusions and clinical importance Detection of the mutation responsible for the pathology allowed us to remove the heterozygous ram, thus preventing additional cases in the flock.
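
The reported change is given in the compact "V15M" notation (valine replaced by methionine at residue 15 of the mature protein). Purely as an illustration of that notation, and not as the genotyping pipeline used in the study, the sketch below compares two hypothetical aligned protein fragments and reports any amino-acid substitutions.

```python
def missense_substitutions(ref_protein, var_protein):
    """Compare two aligned protein sequences of equal length and report
    amino-acid substitutions in the compact 'X<position>Y' notation."""
    assert len(ref_protein) == len(var_protein), "sequences must be aligned"
    return [
        f"{ref}{pos}{var}"
        for pos, (ref, var) in enumerate(zip(ref_protein, var_protein), start=1)
        if ref != var
    ]

# Hypothetical 20-residue fragments of a mature protein; only position 15 differs.
reference = "MKTLLALGAVCLLAVSQGQE"
variant   = "MKTLLALGAVCLLAMSQGQE"   # valine -> methionine at residue 15
print(missense_substitutions(reference, variant))   # ['V15M']
```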

Relevance: 100.00%

Abstract:

Evaluating agents in decision-making applications requires assessing their skill and predicting their behaviour. Both are well developed in poker-like situations, but less so in more complex game and model domains. This paper addresses both tasks by using Bayesian inference in a benchmark space of reference agents. The concepts are explained and demonstrated using the game of chess, but the model applies generically to any domain with quantifiable options and fallible choice. Demonstration applications address questions frequently asked by the chess community regarding the stability of the rating scale, the comparison of players of different eras and/or leagues, and controversial incidents possibly involving fraud. The last category includes alleged under-performance, fabrication of tournament results, and clandestine use of computer advice during competition. Beyond the model world of games, the aim is to improve fallible human performance in complex, high-value tasks.
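
The abstract describes Bayesian inference over a benchmark space of reference agents but does not spell out the likelihood model. The sketch below illustrates the general idea only: each reference agent assigns a probability to the move actually played at each position, and Bayes' rule turns those likelihoods into a posterior over which reference agent the observed player most resembles. The agent names, prior, and probabilities are invented for illustration.

```python
import numpy as np

# Hypothetical benchmark space: reference agents of increasing strength.
reference_agents = ["novice", "club", "master", "engine"]
prior = np.full(len(reference_agents), 1.0 / len(reference_agents))

# For each observed move (rows), the probability that each reference agent
# (columns) would have played that move in that position -- invented numbers.
likelihood_per_move = np.array([
    [0.30, 0.40, 0.55, 0.70],
    [0.10, 0.25, 0.45, 0.65],
    [0.20, 0.35, 0.50, 0.60],
])

# Bayes' rule with moves treated as conditionally independent:
# posterior is proportional to prior * product over moves of P(move | agent).
unnormalised = prior * likelihood_per_move.prod(axis=0)
posterior = unnormalised / unnormalised.sum()

for agent, p in zip(reference_agents, posterior):
    print(f"P({agent} | observed moves) = {p:.3f}")
```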

Relevance: 100.00%

Abstract:

We survey the literature on spatial bio-economic and land-use modelling and assess its thematic development. Unobserved site-specific heterogeneity is a feature of almost all the surveyed works, and this feature, it seems, has stimulated significant methodological innovation. In an attempt to improve how well the prototype incorporates heterogeneity, we consider modelling alternatives and extensions. We discuss some solutions and conjecture others.

Relevance: 100.00%

Abstract:

We developed a stochastic simulation model incorporating most processes likely to be important in the spread of Phytophthora ramorum and similar diseases across the British landscape (covering Rhododendron ponticum in woodland and nurseries, and Vaccinium myrtillus in heathland). The simulation allows for movements of diseased plants within a realistically modelled trade network and for long-distance natural dispersal. A series of simulation experiments was run with the model, varying epidemic pressure, the linkage between natural vegetation and the horticultural trade, the presence or absence of disease spread through commercial trade, and the presence or absence of inspections with eradication, giving a 2 × 2 × 2 × 2 factorial design started at 10 arbitrary locations spread across England. Fifty replicate simulations were made at each set of parameter values. Individual epidemics varied dramatically in size due to stochastic effects throughout the model. Across a range of epidemic pressures, the size of the epidemic was 5–13 times larger when commercial movement of plants was included. A key unknown factor in the system is the area of susceptible habitat outside the nursery system. Inspections, made at 90-day intervals with an 80% probability of detection and 80% efficiency of infected-plant removal, reduced the size of epidemics by about 60% across the three sectors with a density of 1% susceptible plants in broadleaf woodland and heathland. Reducing this density to 0.1% largely isolated the trade network, so that inspections reduced the final epidemic size by over 90%, and most epidemics ended without escape into nature. Even in this case, however, major wild epidemics developed in a few percent of cases. Provided the number of new introductions remains low, the current inspection policy will control most epidemics. However, as the rate of introduction increases, it can overwhelm any reasonable inspection regime, largely because of spread prior to detection.
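
The experiment is a 2 × 2 × 2 × 2 factorial design with 50 stochastic replicates per treatment combination. The sketch below only illustrates how such an experiment could be organised around a simulation function; run_epidemic and its arguments are hypothetical placeholders, not the authors' model.

```python
import itertools
import random

def run_epidemic(high_pressure, linked_to_trade, trade_spread, inspections, seed):
    """Placeholder for the stochastic epidemic simulation; returns a fake
    'final epidemic size' so the experiment loop is runnable as a sketch."""
    rng = random.Random(seed)
    size = rng.lognormvariate(2.0, 1.0)           # stochastic baseline size
    size *= 8 if trade_spread else 1              # commercial movement inflates spread
    size *= 0.4 if inspections else 1.0           # inspections suppress it
    size *= 2 if high_pressure else 1
    size *= 1.5 if linked_to_trade else 1
    return size

factors = [(False, True)] * 4                     # 2 x 2 x 2 x 2 design
results = {}
for combo in itertools.product(*factors):
    sizes = [run_epidemic(*combo, seed=rep) for rep in range(50)]  # 50 replicates
    results[combo] = sum(sizes) / len(sizes)      # mean final size per treatment

for combo, mean_size in sorted(results.items()):
    print(combo, round(mean_size, 1))
```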

Relevance: 100.00%

Abstract:

Palaeoproxy records alone are seldom sufficient to provide a full assessment of regional palaeoclimates. To better understand the possible changes in the Mediterranean climate during the Holocene, a series of palaeoclimate integrations for periods spanning the last 12 000 years has been performed and the results diagnosed. These simulations use the HadSM3 global climate model, which is then dynamically downscaled to approximately 50 km using a consistent regional climate model (HadRM3). Changes in the model's seasonal-mean surface air temperatures and precipitation are discussed at both global and regional scales, along with the physical mechanisms underlying the changes. It is shown that the global model reproduces many of the large-scale features of the mid-Holocene climate (consistent with previous studies) and that many areas within the Mediterranean region were wetter during winter, with a stronger seasonal cycle of surface air temperatures, during the early Holocene. This precipitation signal in the regional model is strongest in the northeast Mediterranean (near Turkey), consistent with low-level wind patterns and earlier palaeosyntheses. It is, however, suggested that further work is required to fully understand the changes in the winter circulation patterns over the Mediterranean region.

Relevance: 100.00%

Abstract:

This paper presents a new approach to modelling flash floods in dryland catchments by integrating remote sensing and digital elevation model (DEM) data in a geographical information system (GIS). The spectral reflectance of channels affected by recent flash floods exhibits a marked increase, due to the deposition of fine sediments in these channels as the flood recedes. This allows the parts of a catchment that have been affected by a recent flood event to be discriminated from unaffected parts, using a time series of Landsat images. Using images of the Wadi Hudain catchment in southern Egypt, the hillslope areas contributing flow were inferred for different flood events. The SRTM3 DEM was used to derive flow direction, flow length, active channel cross-sectional areas and slope. The Manning equation was used to estimate the channel flow velocities, and hence the time-area zones of the catchment. A channel reach that was active during a 1985 runoff event, and that does not receive any tributary flow, was used to estimate a transmission loss rate of 7.5 mm h−1, given the maximum peak discharge estimate. Runoff patterns resulting from different flood events are quite variable; however, the southern part of the catchment appears to have experienced more floods during the period of study (1984–2000), perhaps because the bedrock hillslopes in this area are more effective at runoff production than other parts of the catchment, which are underlain by unconsolidated Quaternary sands and gravels. Because of high transmission losses, runoff generated within the upper reaches is rarely delivered to the alluvial fan and Shalateen city situated at the catchment outlet. The synthetic GIS-based time-area zones cannot, on their own, be relied on to model the hydrographs reliably; physical parameters, such as rainfall intensity, distribution and transmission loss, must also be considered.
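
Channel flow velocities are estimated with the Manning equation and combined with flow lengths to derive time-area zones. The formula itself is standard, v = (1/n) R^(2/3) S^(1/2) in SI units; the roughness coefficient, hydraulic radius, slope and reach length below are invented example values, not data from the Wadi Hudain study.

```python
def manning_velocity(n, hydraulic_radius_m, slope):
    """Mean flow velocity (m/s) from the Manning equation, SI units:
    v = (1/n) * R^(2/3) * S^(1/2)."""
    return (1.0 / n) * hydraulic_radius_m ** (2.0 / 3.0) * slope ** 0.5

def travel_time_hours(reach_length_m, velocity_m_s):
    """Travel time along a reach, used to assign it to a time-area zone."""
    return reach_length_m / velocity_m_s / 3600.0

# Illustrative values for a single dryland channel reach.
v = manning_velocity(n=0.035, hydraulic_radius_m=0.8, slope=0.005)
t = travel_time_hours(reach_length_m=12_000, velocity_m_s=v)
print(f"velocity = {v:.2f} m/s, travel time = {t:.1f} h")
```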

Relevance: 100.00%

Abstract:

The recent report of the Milburn Review into Social Mobility highlights the under-representation of young people from lower socio-economic groups in higher education and encourages universities and others to act to remedy this situation as a contribution to greater social mobility. The paper uses data from the Longitudinal Study of Young People in England to examine the relationship between social background, attainment and university participation. The results show that differences in school-level attainment associated with social background are by far the most important explanation for social background differences in university attendance. However, there remains a small proportion of the participation gap that is not accounted for by attainment. It is also the case that early intentions for higher education participation are highly predictive of actual participation. The results suggest that although there may be some scope for universities to act to improve participation by people from less advantaged backgrounds, a much more important focus of action is on improving the school-level achievement of these students.

Relevance: 100.00%

Abstract:

Global wetlands are believed to be climate sensitive, and are the largest natural emitters of methane (CH4). Increased wetland CH4 emissions could act as a positive feedback to future warming. The Wetland and Wetland CH4 Inter-comparison of Models Project (WETCHIMP) investigated our present ability to simulate large-scale wetland characteristics and corresponding CH4 emissions. To ensure inter-comparability, we used a common experimental protocol driving all models with the same climate and carbon dioxide (CO2) forcing datasets. The WETCHIMP experiments were conducted for model equilibrium states as well as transient simulations covering the last century. Sensitivity experiments investigated model response to changes in selected forcing inputs (precipitation, temperature, and atmospheric CO2 concentration). Ten models participated, covering the spectrum from simple to relatively complex, including models tailored either for regional or global simulations. The models also varied in their methods for calculating wetland size and location, with some models simulating wetland area prognostically, while other models relied on remotely sensed inundation datasets, or on an approach intermediate between the two. Four major conclusions emerged from the project. First, the models demonstrate extensive disagreement in their simulations of wetland areal extent and CH4 emissions, in both space and time. Simple metrics of wetland area, such as the latitudinal gradient, show large variability, principally between models that use inundation dataset information and those that independently determine wetland area. Agreement between the models improves for zonally summed CH4 emissions, but large variation between the models remains. For annual global CH4 emissions, the models vary by ±40% of the all-model mean (190 Tg CH4 yr−1). Second, all models show a strong positive response to increased atmospheric CO2 concentrations (857 ppm) in both CH4 emissions and wetland area. In response to increasing global temperatures (+3.4 °C, applied as a globally uniform change), the models on average decreased wetland area and CH4 fluxes, primarily in the tropics, but the magnitude and sign of the response varied greatly. Models were least sensitive to increased global precipitation (+3.9 %, applied as a globally uniform change), with a consistent small positive response in CH4 fluxes and wetland area. Results from the 20th century transient simulation show that interactions between climate forcings could have strong non-linear effects. Third, we presently do not have wetland methane observation datasets adequate to evaluate model fluxes at a spatial scale comparable to model grid cells (commonly 0.5°). This limitation severely restricts our ability to model global wetland CH4 emissions with confidence. Our simulated wetland extents are also difficult to evaluate because of extensive disagreements between wetland mapping and remotely sensed inundation datasets. Fourth, the large range in predicted CH4 emission rates leads to the conclusion that there is both substantial parameter and structural uncertainty in large-scale CH4 emission models, even after uncertainties in wetland areas are accounted for.

Relevance: 100.00%

Abstract:

The Wetland and Wetland CH4 Intercomparison of Models Project (WETCHIMP) was created to evaluate our present ability to simulate large-scale wetland characteristics and corresponding methane (CH4) emissions. A multi-model comparison is essential to evaluate the key uncertainties in the mechanisms and parameters leading to methane emissions. Ten modelling groups joined WETCHIMP to run eight global and two regional models with a common experimental protocol using the same climate and atmospheric carbon dioxide (CO2) forcing datasets. We reported the main conclusions from the intercomparison effort in a companion paper (Melton et al., 2013). Here we provide technical details for the six experiments, which included an equilibrium, a transient, and an optimized run plus three sensitivity experiments (temperature, precipitation, and atmospheric CO2 concentration). The diversity of approaches used by the models is summarized through a series of conceptual figures, and is used to evaluate the wide range of wetland extent and CH4 fluxes predicted by the models in the equilibrium run. We discuss relationships among the various approaches and patterns in the consistency of these model predictions. Within this group of models, there are three broad classes of methods used to estimate wetland extent: extent prescribed from wetland distribution maps, prognostic relationships between hydrological states and wetland extent based on satellite observations, and explicit hydrological mass balances. A larger variety of approaches was used to estimate the net CH4 fluxes from wetland systems. Even though modelling of wetland extent and CH4 emissions has progressed significantly over recent decades, large uncertainties still exist when estimating CH4 emissions: there is little consensus on model structure or complexity due to knowledge gaps, different aims of the models, and the range of temporal and spatial resolutions of the models.
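
The six experiments, together with the sensitivity perturbations quoted in the preceding WETCHIMP abstract, can be summarised as a simple configuration table. The sketch below is only a schematic restatement of that protocol; the dictionary keys and wording are invented for illustration, not taken from the papers.

```python
# Schematic of the WETCHIMP experimental protocol as described in the two
# abstracts above; keys and descriptions are illustrative only.
wetchimp_experiments = {
    "equilibrium": "fixed climate and atmospheric CO2 forcing",
    "transient":   "time-varying forcing covering the last century",
    "optimized":   "details given in the technical description paper",
    # Sensitivity runs: single, spatially uniform global perturbations
    # (values quoted in the preceding WETCHIMP abstract).
    "sens_temperature":   "+3.4 degC",
    "sens_precipitation": "+3.9 %",
    "sens_co2":           "857 ppm atmospheric CO2",
}

for name, description in wetchimp_experiments.items():
    print(f"{name:20s} {description}")
```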

Relevance: 100.00%

Abstract:

A modelling study is presented which investigates in-situ generated changes of the thermosphere and ionosphere during a solar eclipse. Neutral temperatures are expected to drop by up to 40 K at 240 km height in the totality footprint, with neutral winds of up to 26 m/s responding to the change of pressure. Both temperatures and winds are found to respond with a time lag of 30 min after the passing of the Moon's shadow. A gravity wave is generated in the neutral atmosphere and propagates into the opposite hemisphere at around 300 m/s. The combined effects of thermal cooling and downwelling lead to an overall increase in [O], while [N2] initially rises and then, for several hours after the eclipse, remains below the "steady state" level. An enhancement of NmF2 is found and explained by the atmosphere's contraction during, and the reduced [O]/[N2] ratio after, the eclipse.