901 results for Subfractals, Subfractal Coding, Model Analysis, Digital Imaging, Pattern Recognition


Relevance: 100.00%

Abstract:

This paper addresses the problem of tracking line segments in on-line handwriting obtained through a digitizer tablet. The approach is based on Kalman filtering to model linear portions of on-line handwriting, particularly handwritten numerals, and to detect abrupt changes in writing direction that signal a model change. It uses a Kalman filter framework constrained by a normalized line equation, in which quadratic terms are linearized through a first-order Taylor expansion. Modeling is carried out under the assumption that the state is deterministic and time-invariant, while detection relies on a double-thresholding mechanism that tests for violations of this assumption. The first threshold is based on the kinetics of the pen stroke. The second takes into account the jump in angle between the previously observed stroke direction and the current one. The proposed method enables real-time processing. To illustrate the methodology, some results obtained from handwritten numerals are presented.
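
A minimal sketch of this kind of tracker is given below, assuming the normalized line form x·cos(θ) + y·sin(θ) = ρ with the state (θ, ρ) held deterministic and time-invariant (zero process noise). The gating thresholds, noise value and helper names are illustrative, not the authors' settings.

```python
import numpy as np

def wrap_angle(a):
    """Wrap an angle into (-pi/2, pi/2]; line directions are defined modulo pi."""
    return (a + np.pi / 2) % np.pi - np.pi / 2

def track_segments(points, meas_var=0.25, dist_gate=2.0, angle_gate=np.radians(30)):
    """EKF-style tracker on the normalized line parameters (theta, rho):
    x*cos(theta) + y*sin(theta) - rho = 0 serves as a pseudo-measurement,
    linearized to first order.  A double threshold -- one on the innovation
    (a stand-in for the stroke-kinetics test) and one on the jump in stroke
    angle -- detects model changes and starts a new segment."""
    pts = np.asarray(points, dtype=float)
    segments, start = [], 0

    def init(i):
        dx, dy = pts[i + 1] - pts[i]
        theta = np.arctan2(dy, dx) + np.pi / 2          # line normal angle
        rho = pts[i] @ np.array([np.cos(theta), np.sin(theta)])
        return np.array([theta, rho]), np.eye(2)

    s, P = init(0)
    for k in range(2, len(pts)):
        x, y = pts[k]
        c, sn = np.cos(s[0]), np.sin(s[0])
        z = x * c + y * sn - s[1]                       # innovation: point-to-line distance
        H = np.array([[-x * sn + y * c, -1.0]])         # first-order Taylor Jacobian
        dx, dy = pts[k] - pts[k - 1]
        jump = abs(wrap_angle(np.arctan2(dy, dx) - (s[0] - np.pi / 2)))
        if abs(z) > dist_gate or jump > angle_gate:     # model change detected
            segments.append((start, k))
            start = k
            if k + 1 < len(pts):
                s, P = init(k)                          # restart on the new stroke
            continue
        S = float(H @ P @ H.T) + meas_var               # innovation variance
        K = (P @ H.T) / S                               # Kalman gain
        s = s + K.ravel() * z
        P = (np.eye(2) - K @ H) @ P
    segments.append((start, len(pts)))
    return segments
```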

Relevance: 100.00%

Abstract:

The principal driver of nitrogen (N) losses from the body, including excretion and secretion in milk, is N intake. However, other covariates may also play a role in modifying the partitioning of N. This study tests the hypothesis that N partitioning in dairy cows is affected by energy and protein interactions. A database containing 470 dairy cow observations was collated from calorimetry experiments. The data include N and energy parameters of the diet and N utilization by the animal. Univariate and multivariate meta-analyses that considered both within- and between-study effects were conducted to generate prediction equations based on N intake alone or with an energy component. The univariate models showed a strong positive linear relationship between N intake and N excretion in faeces, urine and milk. The slopes were 0.28 for faeces N, 0.38 for urine N and 0.20 for milk N. Multivariate model analysis did not improve the fit. Metabolizable energy intake had a significant positive effect on the amount of milk N in proportion to faeces and urine N, which is also supported by other studies. Another measure of energy considered as a covariate to N intake was diet quality, or metabolizability (the concentration of metabolizable energy relative to gross energy of the diet). Diet quality also had a positive linear relationship with the proportion of milk N relative to N excreted in faeces and urine. Metabolizability had the largest effect on faeces N, due to the lower protein digestibility of low-quality diets. Urine N was also affected by diet quality, and the magnitude of the effect was higher than for milk N. This research shows that including a measure of diet quality as a covariate with N intake in a model of N excretion can enhance our understanding of the effects of diet composition on N losses from dairy cows. The new prediction equations developed in this study could be used to monitor N losses from dairy systems.
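
The univariate slopes above translate directly into a simple partitioning calculation. The sketch below applies them to a given N intake; the abstract does not report intercepts, so proportionality is assumed, and the "retained" term is simply the residual.

```python
def predict_n_partition(n_intake_g_per_day):
    """Partition daily N intake using the univariate slopes reported in the
    abstract (0.28 faeces, 0.38 urine, 0.20 milk).  Intercepts were not
    given, so this sketch assumes proportionality through the origin."""
    return {
        "faeces_n": 0.28 * n_intake_g_per_day,
        "urine_n": 0.38 * n_intake_g_per_day,
        "milk_n": 0.20 * n_intake_g_per_day,
        "retained_n": (1 - 0.28 - 0.38 - 0.20) * n_intake_g_per_day,
    }

# e.g. a cow consuming 500 g N/day
print(predict_n_partition(500.0))
```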

Relevance: 100.00%

Abstract:

High quality wind measurements in cities are needed for numerous applications, including wind engineering. Such data-sets are rare, and measurement platforms may not be optimal for meteorological observations. Two years' wind data were collected on the BT Tower, London, UK, showing an upward deflection on average for all wind directions. Wind tunnel simulations were performed to investigate flow distortion around two scale models of the Tower. Using a 1:160 scale model it was shown that the Tower causes a small deflection (ca. 0.5°) compared to the lattice on top, on which the instruments were placed (ca. 0–4°). These deflections may have been underestimated due to wind tunnel blockage. Using a 1:40 model, the observed flow pattern was consistent with streamwise vortex pairs shed from the upstream lattice edge. Correction factors were derived for different wind directions and reduced deflection in the full-scale data-set by <3°. Instrumental tilt caused a sinusoidal variation in deflection of ca. 2°. The residual deflection (ca. 3°) was attributed to the Tower itself. Correction of the wind-speeds was small (average 1%); it was therefore deduced that flow distortion does not significantly affect the measured wind-speeds, and the wind climate statistics are reliable.
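
As a rough illustration of the tilt correction described above, the sketch below removes a deflection component that varies sinusoidally with wind direction. The ca. 2° amplitude comes from the text; the phase is a placeholder, since the abstract does not report it.

```python
import numpy as np

def correct_tilt(observed_deflection_deg, wind_dir_deg,
                 tilt_amp_deg=2.0, tilt_phase_deg=0.0):
    """Remove a sinusoidal instrument-tilt signature from a measured flow
    deflection.  Amplitude (ca. 2 deg) is taken from the abstract; the
    phase is an assumption for illustration only."""
    tilt = tilt_amp_deg * np.sin(np.radians(wind_dir_deg - tilt_phase_deg))
    return observed_deflection_deg - tilt

# Example: a 5-degree upward deflection observed with wind from 90 degrees
print(correct_tilt(5.0, 90.0))
```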

Relevance: 100.00%

Abstract:

The performance of flood inundation models is often assessed using satellite-observed data; however, these data have inherent uncertainty. In this study we assess the impact of this uncertainty when calibrating a flood inundation model (LISFLOOD-FP) for a flood event in December 2006 on the River Dee, North Wales, UK. The flood extent is delineated from an ERS-2 SAR image of the event using an active contour model (snake), and water levels at the flood margin are calculated through intersection of the shoreline vector with LiDAR topographic data. Gauged water levels are used to create a reference water surface slope for comparison with the satellite-derived water levels. Residuals between the satellite-observed data points and those from the reference line are spatially clustered into groups of similar values. We show that model calibration achieved using pattern matching of observed and predicted flood extent is negatively influenced by this spatial dependency in the data. By contrast, model calibration using water elevations produces realistic calibrated optimum friction parameters even when spatial dependency is present. To test the impact of removing spatial dependency, a new method of evaluating flood inundation model performance is developed using multiple random subsamples of the water surface elevation data points. By testing for spatial dependency using Moran’s I, multiple subsamples of water elevations that have no significant spatial dependency are selected. The model is then calibrated against these data and the results averaged. This gives a near-identical result to calibration using spatially dependent data, but has the advantage of being a statistically robust assessment of model performance in which we can have more confidence. Moreover, by using the variations found in the subsamples of the observed data it is possible to assess the effects of observational uncertainty on the assessment of flooding risk.
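
The subsampling procedure can be sketched as follows: draw many random subsets of the shoreline water-elevation points, compute Moran's I for each, and retain only subsets without notable spatial dependency. The inverse-distance weights and the fixed cutoff below are assumptions; the study itself tested Moran's I for statistical significance.

```python
import numpy as np

def morans_i(values, coords):
    """Moran's I with inverse-distance weights -- one common choice; the
    abstract does not specify the weighting scheme used."""
    v = np.asarray(values, float)
    c = np.asarray(coords, float)
    d = np.linalg.norm(c[:, None, :] - c[None, :, :], axis=-1)
    w = np.where(d > 0, 1.0 / np.maximum(d, 1e-12), 0.0)  # zero on the diagonal
    z = v - v.mean()
    return (len(v) / w.sum()) * (z @ w @ z) / (z @ z)

def independent_subsamples(values, coords, n_sub=50, size=30,
                           i_cutoff=0.1, seed=0):
    """Keep random subsamples of water-elevation points whose Moran's I
    suggests no strong spatial dependency.  The fixed cutoff is a stand-in
    for the significance test used in the study."""
    values, coords = np.asarray(values), np.asarray(coords)
    rng = np.random.default_rng(seed)
    keep = []
    for _ in range(n_sub):
        idx = rng.choice(len(values), size=size, replace=False)
        if abs(morans_i(values[idx], coords[idx])) < i_cutoff:
            keep.append(idx)
    return keep   # calibrate against each subset, then average the results
```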

Relevance: 100.00%

Abstract:

In recent years, the area of data mining has been experiencing considerable demand for technologies that extract knowledge from large and complex data sources. There has been substantial commercial interest, as well as active research, aimed at developing new and improved approaches for extracting information, relationships, and patterns from large datasets. Artificial neural networks (NNs) are popular biologically inspired intelligent methodologies whose classification, prediction, and pattern recognition capabilities have been utilized successfully in science, engineering, medicine, business, banking, telecommunication, and many other fields. This paper highlights, from a data mining perspective, the implementation of NNs using supervised and unsupervised learning for pattern recognition, classification, prediction, and cluster analysis, and focuses the discussion on their usage in bioinformatics and financial data analysis tasks. © 2012 Wiley Periodicals, Inc.
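
As a toy illustration of the supervised and unsupervised uses surveyed here, the sketch below trains a small feed-forward network for classification and runs a clustering pass on the same data. The dataset and hyperparameters are arbitrary, and k-means stands in for the unsupervised NN variants (e.g. self-organizing maps) that the survey covers but scikit-learn does not provide.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.cluster import KMeans

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Supervised learning: a small feed-forward NN for classification
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print("classification accuracy:", clf.score(X_te, y_te))

# Unsupervised learning: cluster analysis on the unlabelled feature matrix
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("cluster sizes:", [int((labels == k).sum()) for k in range(3)])
```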

Relevance: 100.00%

Abstract:

Scope: Fibers and prebiotics represent a useful dietary approach for modulating the human gut microbiome. Therefore, aim of the present study was to investigate the impact of four flours (wholegrain rye, wholegrain wheat, chickpeas and lentils 50:50, and barley milled grains), characterized by a naturally high content in dietary fibers, on the intestinal microbiota composition and metabolomic output. Methods and results: A validated three-stage continuous fermentative system simulating the human colon was used to resemble the complexity and diversity of the intestinal microbiota. Fluorescence in situ hybridization was used to evaluate the impact of the flours on the composition of the microbiota, while small-molecule metabolome was assessed by NMR analysis followed by multivariate pattern recognition techniques. HT29 cell-growth curve assay was used to evaluate the modulatory properties of the bacterial metabolites on the growth of intestinal epithelial cells. All the four flours showed positive modulations of the microbiota composition and metabolic activity. Furthermore, none of the flours influenced the growth-modulatory potential of the metabolites toward HT29 cells. Conclusion: Our findings support the utilization of the tested ingredients in the development of a variety of potentially prebiotic food products aimed at improving gastrointestinal health.
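
A typical first step in the multivariate pattern recognition mentioned above is principal component analysis of binned NMR spectra. The sketch below shows that step on placeholder data, since the study's spectra and preprocessing are not described in the abstract.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical matrix of binned 1H-NMR spectra: one row per fermentation
# sample, one column per chemical-shift bin (placeholder data).
rng = np.random.default_rng(0)
spectra = rng.random((24, 200))

pca = PCA(n_components=2)
scores = pca.fit_transform(spectra - spectra.mean(axis=0))  # mean-centred PCA
print("explained variance:", pca.explained_variance_ratio_)
# Plotting the scores coloured by flour would reveal any treatment clustering.
```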

Relevance: 100.00%

Abstract:

In a recent paper, Hu et al. (2011) suggest that the recovery of stratospheric ozone during the first half of this century will significantly enhance free tropospheric and surface warming caused by the anthropogenic increase of greenhouse gases, with the effects being most pronounced in Northern Hemisphere middle and high latitudes. These surprising results are based on a multi-model analysis of CMIP3 model simulations with and without prescribed stratospheric ozone recovery. Hu et al. suggest that in order to properly quantify the tropospheric and surface temperature response to stratospheric ozone recovery, it is necessary to run coupled atmosphere-ocean climate models with stratospheric ozone chemistry. The results of such an experiment are presented here, using a state-of-the-art chemistry-climate model coupled to a three-dimensional ocean model. In contrast to Hu et al., we find a much smaller Northern Hemisphere tropospheric temperature response to ozone recovery, and one of opposite sign. We suggest that their result is an artifact of the incomplete removal of the large effect of greenhouse gas warming between the two different sets of models.

Relevance: 100.00%

Abstract:

Many global climate models (GCMs) have trouble simulating Southern Annular Mode (SAM) variability correctly, particularly in the Southern Hemisphere summer season, when it tends to be too persistent. In this two-part study, a suite of experiments with the Canadian Middle Atmosphere Model (CMAM) is analyzed to improve our understanding of the dynamics of SAM variability and its deficiencies in GCMs. Here, an examination of the eddy-mean flow feedbacks is presented by quantifying the feedback strength as a function of zonal scale and season, using a new methodology that accounts for intraseasonal forcing of the SAM. In the observed atmosphere, in the summer season, a strong negative feedback by planetary-scale waves, in particular zonal wavenumber 3, is found in a localized region in the southwest Pacific. It cancels a large proportion of the positive feedback by synoptic- and smaller-scale eddies in the zonal mean, resulting in a very weak overall eddy feedback on the SAM. CMAM is deficient in this negative feedback by planetary-scale waves, which makes a substantial contribution to its bias in summertime SAM persistence. Furthermore, this bias is not alleviated by artificially improving the climatological circulation, suggesting that climatological circulation biases are not the cause of the planetary wave feedback deficiency in the model. Analysis of the summertime eddy feedbacks in the CMIP5 models confirms that this is indeed a common problem among GCMs, suggesting that understanding this planetary wave feedback, and the reason for its deficiency in GCMs, is key to improving the fidelity of simulated SAM variability in the summer season.
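
The scale separation underlying this kind of analysis can be sketched as a zonal FFT: isolate each zonal wavenumber's contribution to an eddy field, then regress that component onto a SAM index to quantify the feedback strength by scale. The function below shows the decomposition step on a hypothetical (time, longitude) array; it is illustrative, not the paper's diagnostic code.

```python
import numpy as np

def eddy_field_by_wavenumber(field, max_k=5):
    """Split a (time, longitude) eddy field into per-wavenumber components
    via the real FFT along longitude, keeping wavenumbers 1..max_k.  The
    wave-3 component, for example, carries the planetary-scale feedback
    discussed in the abstract."""
    fft = np.fft.rfft(field, axis=-1)
    n = field.shape[-1]
    components = {}
    for k in range(1, max_k + 1):
        single = np.zeros_like(fft)
        single[..., k] = fft[..., k]          # retain only wavenumber k
        components[k] = np.fft.irfft(single, n=n, axis=-1)
    return components

# Each component can then be regressed onto a SAM index time series to
# estimate the eddy feedback strength as a function of zonal scale.
```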

Relevance: 100.00%

Abstract:

The goal was to quantitatively estimate and compare the fidelity of images acquired with a digital imaging system (ADAR 5500) and images generated through scanning of color infrared aerial photographs (SCIRAP), using image-based metrics. Images were collected nearly simultaneously in two repetitive flights to generate multi-temporal datasets. Spatial fidelity of the ADAR images was lower than that of the SCIRAP images. Radiometric noise was higher for SCIRAP than for ADAR images, even though noise from misregistration effects was lower. These results suggest that, with careful control of film scanning, the overall fidelity of SCIRAP imagery can be comparable to that of digital multispectral camera data. Therefore, SCIRAP images can likely be used in conjunction with digital metric camera imagery in long-term land-cover change analyses.
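
One simple image-based metric of the kind used in such comparisons is sketched below: the standard deviation of the difference between co-registered repeat acquisitions, as a proxy for radiometric noise plus misregistration effects. The abstract does not name its exact metrics, so this is an assumption for illustration.

```python
import numpy as np

def repeat_pass_noise(img_t1, img_t2):
    """Standard deviation of the difference between two co-registered
    acquisitions of the same scene -- a crude combined measure of
    radiometric noise and residual misregistration."""
    diff = np.asarray(img_t1, float) - np.asarray(img_t2, float)
    return diff.std()
```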

Relevance: 100.00%

Abstract:

We have incorporated a semi-mechanistic isoprene emission module into the JULES land-surface scheme, as a first step towards a modelling tool that can be applied in studies of vegetation–atmospheric chemistry interactions, including chemistry-climate feedbacks. Here, we evaluate the coupled model against local above-canopy isoprene emission flux measurements from six flux tower sites, as well as satellite-derived estimates of isoprene emission over tropical South America and east and south Asia. The model simulates diurnal variability well: correlation coefficients are significant (at the 95 % level) for all flux tower sites. The model reproduces day-to-day variability with significant correlations (at the 95 % confidence level) at four of the six flux tower sites. At the UMBS site, a complete set of seasonal observations is available for two years (2000 and 2002). The model reproduces the seasonal pattern of emission during 2002, but does less well in the year 2000. The model overestimates observed emissions at all sites, partially because it does not include isoprene loss through the canopy. Comparison with the satellite-derived isoprene-emission estimates suggests that the model captures the main spatial patterns and the seasonal and inter-annual variability over tropical regions. The model yields a global annual isoprene emission of 535 ± 9 TgC yr−1 during the 1990s, 78 % of which comes from forested areas.
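
The flux-tower evaluation amounts to correlating modelled and observed emissions and testing significance at the 95 % level; a minimal sketch on placeholder arrays is below.

```python
import numpy as np
from scipy import stats

# Placeholder daily series standing in for one flux-tower comparison;
# not the study's data.
rng = np.random.default_rng(1)
observed = rng.random(120)
modelled = 0.8 * observed + 0.2 * rng.random(120)

r, p = stats.pearsonr(modelled, observed)
print(f"r = {r:.2f}, significant at the 95 % level: {p < 0.05}")
```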

Relevance: 100.00%

Abstract:

Aims: To investigate the relationship between adiposity and plasma free fatty acid levels, and the influence of total plasma free fatty acid level on insulin sensitivity and β-cell function. Methods: An insulin sensitivity index, acute insulin response to glucose and a disposition index, derived from i.v. glucose tolerance test minimal model analysis, together with total fasting plasma free fatty acid levels, were available for 533 participants in the Reading, Imperial, Surrey, Cambridge, Kings study. Bivariate correlations were made between insulin sensitivity index, acute insulin response to glucose and disposition index and both adiposity measures (BMI, waist circumference and body fat mass) and total plasma free fatty acid levels. Multivariate linear regression analysis was performed, controlling for age, sex, ethnicity and adiposity. Results: After adjustment, all adiposity measures were inversely associated with insulin sensitivity index (BMI: β = −0.357; waist circumference: β = −0.380; body fat mass: β = −0.375) and disposition index (BMI: β = −0.215; waist circumference: β = −0.248; body fat mass: β = −0.221) and positively associated with acute insulin response to glucose [BMI: β = 0.200; waist circumference: β = 0.195; body fat mass: β = 0.209 (P values <0.001)]. Adiposity explained 13, 4 and 5% of the variation in insulin sensitivity index, acute insulin response to glucose and disposition index, respectively. After adjustment, no adiposity measure was associated with free fatty acid level, but total plasma free fatty acid level was inversely associated with insulin sensitivity index (β = −0.133), acute insulin response to glucose (β = −0.148) and disposition index [β = −0.218 (P values <0.01)]. Plasma free fatty acid concentration accounted for 1.5, 2 and 4% of the variation in insulin sensitivity index, acute insulin response to glucose and disposition index, respectively. Conclusions: Plasma free fatty acid levels have a modest negative association with insulin sensitivity, β-cell secretion and disposition index, but no association with adiposity measures. It is unlikely that plasma free fatty acids are the primary mediators of obesity-related insulin resistance or β-cell dysfunction.
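
The adjusted analysis is a standard multivariate linear regression; the sketch below reproduces its shape on simulated data. Column names, covariates and the data itself are placeholders for the study's variables, not the actual dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated stand-in for the 533-participant dataset described above.
rng = np.random.default_rng(2)
n = 533
df = pd.DataFrame({
    "si": rng.normal(size=n),          # insulin sensitivity index
    "ffa": rng.normal(size=n),         # fasting plasma free fatty acids
    "age": rng.uniform(30, 55, n),
    "male": rng.integers(0, 2, n),
    "bmi": rng.normal(27, 4, n),       # one adiposity measure
})

# Regress the sensitivity index on FFA, controlling for age, sex and adiposity
X = sm.add_constant(df[["ffa", "age", "male", "bmi"]])
model = sm.OLS(df["si"], X).fit()
print(model.params["ffa"], model.pvalues["ffa"])
```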

Relevance: 100.00%

Abstract:

For general home monitoring, a system should automatically interpret people’s actions. The system should be non-intrusive, and able to deal with a cluttered background and loose clothing. An approach based on spatio-temporal local features and a Bag-of-Words (BoW) model is proposed for single-person action recognition from combined intensity and depth images. To restore the temporal structure lost in the traditional BoW method, a dynamic time alignment technique with temporal binning is applied in this work, which has not previously been implemented in the literature for human action recognition on depth imagery. A novel human action dataset with depth data has been created using two Microsoft Kinect sensors. The ReadingAct dataset contains 20 subjects and 19 actions, for a total of 2340 videos. To investigate the effect of using depth images and the proposed method, testing was conducted on three depth datasets, and the proposed method was compared to traditional Bag-of-Words methods. Results showed that the proposed method improves recognition accuracy when depth is added to the conventional intensity data, and has advantages when dealing with long actions.
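
A simplified stand-in for the temporal binning idea is sketched below: assign each spatio-temporal descriptor to its nearest visual word, build one BoW histogram per temporal bin, and concatenate them so coarse temporal order survives. The codebook, bin count and names are illustrative; the paper's dynamic time alignment is more involved than fixed binning.

```python
import numpy as np

def temporal_bow(descriptors, timestamps, codebook, n_bins=3):
    """Per-bin Bag-of-Words histograms for one video.
    descriptors: (n, dim) spatio-temporal local features
    timestamps:  (n,) frame times of those features
    codebook:    (n_words, dim) visual-word centres (e.g. from k-means)"""
    # assign each descriptor to its nearest visual word
    d2 = ((descriptors[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    words = d2.argmin(axis=1)
    # split the video's duration into equal temporal bins
    edges = np.linspace(timestamps.min(), timestamps.max(), n_bins + 1)
    bins = np.clip(np.searchsorted(edges, timestamps, side="right") - 1,
                   0, n_bins - 1)
    hist = np.zeros((n_bins, len(codebook)))
    for b, w in zip(bins, words):
        hist[b, w] += 1
    return hist.ravel()        # concatenated per-bin histograms
```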

Relevance: 100.00%

Abstract:

Digital imaging technologies enable a mastery of the visual that in recent mainstream cinema frequently manifests as certain kinds of spatial reach, orientation and motion. In such a context Michael Bay’s Transformers franchise can be framed as a digital re-tooling of a familiar fantasy of vehicular propulsion, US car culture writ large in digitally crafted spectacles of diegetic speed, the vehicular chase film ‘2.0’. Movement is central to these films, calling up Scott Bukatman’s observation that in spectacular visual media ‘movement has become more than a tool of bodily knowledge; it has become an end in itself’ (2003: 125). Not all movements and not all instances of vehicular propulsion are the same, however. How might we evaluate what is at stake in a film’s assertion of movement as an end in itself, and the form that assertion takes, its articulations of diegetic velocity, corporeality, and spatial penetration? Deploying an attentiveness towards the specificity of aesthetic detail and affective impact in Bay’s delineation of movement, this essay suggests that the franchise poses questions about the relationship of human movement to machine movement that exceed their narrative basis. Identifying a persistent rotational trope in the franchise that in its audio-visual articulation combines oddly anachronistic elements (evoking the mechanical rather than the digital), the article argues that the films prioritise certain fantasies of transformation and spatial penetration, and certain modes of corporeality, as one response to contemporary debates about digital technologisation, sustainable energy, and cinematic spectacle. In this way the franchise also represents a particular moment in a more widely discernible preoccupation in contemporary cinema with what we might call a ‘rotational aesthetics’ of action, a machine movement made possible by the digital, but which invokes earlier histories and fantasies of animation, propulsion and mechanization to particular ends.

Relevance: 100.00%

Abstract:

The emergence and development of digital imaging technologies and their impact on mainstream filmmaking is perhaps the most familiar special effects narrative associated with the years 1981-1999. This is in part because some of the questions raised by the rise of the digital still concern us now, but also because key milestone films showcasing advancements in digital imaging technologies appear in this period, including Tron (1982) and its computer generated image elements, the digital morphing in The Abyss (1989) and Terminator 2: Judgment Day (1991), computer animation in Jurassic Park (1993) and Toy Story (1995), digital extras in Titanic (1997), and ‘bullet time’ in The Matrix (1999). As a result it is tempting to characterize 1981-1999 as a ‘transitional period’ in which digital imaging processes grow in prominence and technical sophistication, and what we might call ‘analogue’ special effects processes correspondingly become less common. But such a narrative risks eliding the other practices that also shape effects sequences in this period. Indeed, the 1980s and 1990s are striking for the diverse range of effects practices in evidence in both big budget films and lower budget productions, and for the extent to which analogue practices persist independently of or alongside digital effects work in a range of production and genre contexts. The chapter seeks to document and celebrate this diversity and plurality, this sustaining of earlier traditions of effects practice alongside newer processes, this experimentation with materials and technologies old and new in the service of aesthetic aspirations alongside budgetary and technical constraints. The common characterization of the period as a series of rapid transformations in production workflows, practices and technologies will be interrogated in relation to the persistence of certain key figures such as Douglas Trumbull, John Dykstra, and James Cameron, but also through a consideration of the contexts for and influences on creative decision-making. Comparative analyses of the processes used to articulate bodies, space and scale in effects sequences drawn from different generic sites of special effects work, including science fiction, fantasy, and horror, will provide a further frame for the chapter’s mapping of the commonalities and specificities, continuities and variations in effects practices across the period. In the process, the chapter seeks to reclaim analogue processes’ contribution both to moments of explicit spectacle, and to diegetic verisimilitude, in the decades most often associated with the digital’s ‘arrival’.

Relevance: 100.00%

Abstract:

Ozone dynamics depend on meteorological characteristics such as wind, radiation, sunshine, air temperature and precipitation. The aim of this study was to determine ozone trajectories along the northern coast of Portugal during the summer months of 2005, when there was a spate of forest fires in the region, and to evaluate their impact on respiratory and cardiovascular health in the greater metropolitan area of Porto. We investigated the following diseases, as coded in the ninth revision of the International Classification of Diseases (ICD-9): hypertensive disease (codes 401-405); ischemic heart disease (codes 410-414); other cardiac diseases, including heart failure (codes 426-428); chronic obstructive pulmonary disease and allied conditions, including bronchitis and asthma (codes 490-496); and pneumoconiosis and other lung diseases due to external agents (codes 500-507). We evaluated ozone data from air quality monitoring stations in the study area, together with data collected through HYbrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model analysis of air mass circulation, and synoptic-scale zonal wind from National Centers for Environmental Prediction data. High ozone levels in rural areas were attributed to the dispersion of pollutants induced by local circulation, as well as by mesoscale and synoptic-scale processes. The fires of 2005 increased the levels of pollutants resulting from the direct emission of gases and particles into the atmosphere, especially when there were incoming frontal systems. For the meteorological case studies analyzed, peaks in ozone concentration were positively associated with higher rates of hospital admissions for cardiovascular diseases, although there were no significant associations between ozone peaks and admissions for respiratory diseases.
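
The ICD-9 ranges listed above map naturally onto a small lookup for grouping admissions; a sketch with illustrative names follows.

```python
# Group hospital admissions by the ICD-9 ranges listed in the abstract.
# The function and variable names are illustrative, not the study's code.
DISEASE_GROUPS = {
    "hypertensive disease": range(401, 406),
    "ischemic heart disease": range(410, 415),
    "other cardiac diseases (incl. heart failure)": range(426, 429),
    "COPD and allied conditions (incl. bronchitis, asthma)": range(490, 497),
    "pneumoconiosis and other external-agent lung diseases": range(500, 508),
}

def classify_admission(icd9_code):
    """Return the study group for a 3-digit ICD-9 code, or None if the
    admission falls outside the diagnoses considered."""
    for group, codes in DISEASE_GROUPS.items():
        if icd9_code in codes:
            return group
    return None

print(classify_admission(493))   # asthma -> COPD and allied conditions
```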