Abstract:
A finite difference scheme is presented for the solution of the two-dimensional shallow water equations in steady, supercritical flow. The scheme incorporates numerical characteristic decomposition, is shock-capturing by design, and uses space-marching, exploiting the assumption that the flow is wholly supercritical in at least one space dimension. Results are shown for problems involving oblique hydraulic jumps and reflection from a wall.
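The abstract does not give the discretisation itself, so the sketch below only illustrates the space-marching idea it rests on: when the flow is supercritical in x, the steady equations dF/dx + dG/dy = 0 can be integrated in x as if x were time, with the x-flux vector F playing the role of the conserved state. The first-order Lax-Friedrichs flux, the Newton inversion and all names are assumptions for illustration, not the paper's scheme.

```python
import numpy as np

g = 9.81  # gravitational acceleration (m/s^2)

def x_flux(h, u, v):
    """F(U) in dF/dx + dG/dy = 0 for the steady 2D shallow water equations."""
    return np.array([h * u, h * u**2 + 0.5 * g * h**2, h * u * v])

def y_flux(h, u, v):
    """G(U)."""
    return np.array([h * v, h * u * v, h * v**2 + 0.5 * g * h**2])

def primitives_from_x_flux(F, h_guess):
    """Invert F = (hu, hu^2 + g*h^2/2, huv) for (h, u, v).

    The depth solves 0.5*g*h^3 - F[1]*h + F[0]**2 = 0; starting Newton from
    the previous depth selects the supercritical root (u > sqrt(g*h))."""
    h = h_guess
    for _ in range(30):
        h -= (0.5 * g * h**3 - F[1] * h + F[0]**2) / (1.5 * g * h**2 - F[1])
    return h, F[0] / h, F[2] / F[0]

def march_step(h, u, v, dx, dy):
    """One marching step in x; Lax-Friedrichs differencing in y.

    Boundary cells are simply left unchanged (crude transmissive edges)."""
    F = np.array([x_flux(*s) for s in zip(h, u, v)])
    G = np.array([y_flux(*s) for s in zip(h, u, v)])
    # numerical flux at interfaces j+1/2, with LF dissipation acting on F
    Ghalf = 0.5 * (G[:-1] + G[1:]) - 0.5 * (dy / dx) * (F[1:] - F[:-1])
    Fn = F.copy()
    Fn[1:-1] = F[1:-1] - (dx / dy) * (Ghalf[1:] - Ghalf[:-1])
    hn, un, vn = map(np.array, zip(*(primitives_from_x_flux(Fn[j], h[j])
                                     for j in range(len(h)))))
    return hn, un, vn
```

A scheme of this type remains valid only while u > sqrt(g*h) everywhere on the marching front, which is exactly the wholly-supercritical assumption the abstract states.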
Abstract:
Light Detection And Ranging (LIDAR) is an important modality in terrain and land surveying for many environmental, engineering and civil applications. This paper presents the framework for a recently developed unsupervised classification algorithm called Skewness Balancing, which separates object points from ground points in airborne LIDAR data. The main advantages of the algorithm are that it is threshold-free and independent of LIDAR data format and resolution, while preserving object and terrain details. In this contribution, the framework is extended with a prediction model that categorises unknown LIDAR tiles as “hilly” or “moderate” terrain. Accuracy assessment of the model by cross-validation gives an overall accuracy of 95%. An extension to the algorithm is developed to address the over-classification issue on hilly terrain. For moderate terrain, the results show that the classified tiles separate detached objects (buildings and vegetation) and attached objects (bridges and motorway junctions) from bare earth (ground, roads and yards), which makes Skewness Balancing well suited for integration into geographic information system (GIS) software packages.
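As a rough sketch of the core idea (not the authors' implementation, and omitting the hilly-terrain extension): natural terrain tends to produce a near-symmetric elevation distribution, while buildings and vegetation skew it to the right, so points are peeled off from the top until the sample skewness is no longer positive. The function name and the naive O(n^2) loop are illustrative assumptions.

```python
import numpy as np
from scipy.stats import skew

def skewness_balancing(z):
    """Split LIDAR elevations z into (ground, objects).

    While the skewness of the remaining sample is positive, remove the
    highest point; the surviving points are labelled ground. No threshold
    is needed, matching the 'threshold-freedom' claimed in the abstract."""
    z = np.sort(np.asarray(z, dtype=float))
    n = len(z)
    while n > 3 and skew(z[:n]) > 0:
        n -= 1                      # peel off the current highest elevation
    return z[:n], z[n:]             # (ground, objects)
```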
Abstract:
Acrylamide forms from free asparagine and sugars during cooking, and products derived from the grain of cereals, including rye, contribute a large proportion of total dietary intake. In this study, free amino acid and sugar concentrations were measured in the grain of a range of rye varieties grown at locations in Hungary, France, Poland, and the United Kingdom and harvested in 2005, 2006, and 2007. Genetic and environmental (location and harvest year) effects on the levels of acrylamide precursors were assessed. The data showed free asparagine concentration to be the main determinant of acrylamide formation in heated rye flour, as it is in wheat. However, in contrast to wheat, sugar, particularly sucrose, concentration also correlated both with asparagine concentration and with acrylamide formed. Free asparagine concentration was shown to be under genetic (G), environmental (E), and integrated (G × E) control. The same was true for glucose, whereas maltose and fructose were affected mainly by environmental factors and sucrose was largely under genetic control. The ratio of variation due to varieties (genotype) to the total variation (a measure of heritability) for free asparagine concentration in the grain was 23%. Free asparagine concentration was closely associated with bran yield, whereas sugar concentration was associated with low Hagberg falling number. Rye grain was found to contain much higher concentrations of free proline than wheat grain, and less acrylamide formed per unit of asparagine in rye than in wheat flour.
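The "ratio of variation due to varieties (genotype) to the total variation" quoted above is a broad-sense heritability. Written out with a standard variance decomposition (an assumed notation; the abstract gives no formula), the reported figure for free asparagine reads:

$$H^2 = \frac{\sigma_G^2}{\sigma_G^2 + \sigma_E^2 + \sigma_{G\times E}^2 + \sigma_\varepsilon^2} \approx 0.23,$$

where $\sigma_G^2$, $\sigma_E^2$, $\sigma_{G\times E}^2$ and $\sigma_\varepsilon^2$ are the genotype, environment, interaction and residual variance components.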
Abstract:
We study weak solutions for a class of free-boundary problems which includes as a special case the classical problem of travelling gravity waves on water of finite depth. We show that such problems are equivalent to problems in fixed domains and study the regularity of their solutions. We also prove that in very general situations the free boundary is necessarily the graph of a function.
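For orientation, the special case mentioned (steady travelling gravity waves on water of finite depth) is classically posed via a stream function in the frame moving with the wave; the statement below is the textbook irrotational formulation, assumed here since the paper treats a more general class:

$$\Delta\psi = 0 \ \text{in}\ \{(x,y) : -d < y < \eta(x)\}, \qquad \psi = 0 \ \text{on}\ y = \eta(x), \qquad \psi = -m \ \text{on}\ y = -d,$$
$$|\nabla\psi|^2 + 2gy = Q \ \text{on}\ y = \eta(x),$$

with gravity $g$, mass flux $m$ and Bernoulli constant $Q$. The free surface $\eta$ is part of the unknown, and the result quoted above says that in very general situations it is necessarily the graph of a function.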
Abstract:
Through increases in net primary production (NPP), elevated CO2 is hypothesized to increase the amount of plant litter entering the soil. The fate of this extra carbon on the forest floor or in mineral soil is currently not clear. Moreover, increased rates of NPP can be maintained only if forests can escape nitrogen limitation. In a Free-Air CO2 Enrichment (FACE) experiment near Bangor, Wales, 4 ambient-CO2 and 4 FACE plots were planted with patches of Betula pendula, Alnus glutinosa and Fagus sylvatica on a former arable field. Four years after establishment, only a shallow litter (L) layer had formed on the forest floor, owing to intensive bioturbation. Total soil C and N contents increased irrespective of treatment and species as a result of afforestation. We could not detect an additional C sink in the soil, nor were soil C stabilization processes affected by FACE. We observed a decrease of leaf N content in Betula and Alnus under FACE, while the soil C/N ratio decreased regardless of CO2 treatment. The ratio of N taken up from the soil to N acquired by N2 fixation in Alnus was not affected by FACE. We infer that increased nitrogen-use efficiency is the mechanism by which increased NPP is sustained under elevated CO2 at this site.
Abstract:
A new autonomous ship collision-free (ASCF) trajectory navigation and control system is introduced, featuring a new recursive navigation algorithm based on analytic geometry and convex-set theory for collision-free ship guidance. The underlying assumption is that geometric information about the ship's environment is available in the form of a polygonal free space, which may easily be generated from a 2D image or from plots of physical hazards or other constraints such as collision-avoidance regulations. The navigation command is given as a heading-command sequence: each waypoint is generated within a small neighbourhood of the current position, and the sequence of waypoints along the trajectory is guaranteed, by convex-set theory, to lie within a bounded obstacle-free region. A neurofuzzy network predictor, which in practice uses only observed input/output data generated by onboard or external sensors (or a sensor-fusion algorithm) and controls the ship's heading angle through the rudder deflection angle, is used in simulations of an ESSO 190,000 dwt tanker model to demonstrate the effectiveness of the system.
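The convexity argument can be pictured with a small sketch (all names, the step rule and the fallback are hypothetical; the paper's recursive algorithm is more involved): a candidate waypoint in a small neighbourhood of the current position is accepted only if it lies inside the convex free-space polygon.

```python
import numpy as np

def inside_convex(poly, p):
    """True if p lies inside the convex polygon poly (counter-clockwise)."""
    poly = np.asarray(poly, dtype=float)
    for a, b in zip(poly, np.roll(poly, -1, axis=0)):
        edge, rel = b - a, p - a
        if edge[0] * rel[1] - edge[1] * rel[0] < 0:  # p is right of edge a->b
            return False
    return True

def next_heading(pos, goal, free_poly, step=0.5):
    """Generate a waypoint near pos aimed at goal and return a heading
    command toward it; hold position if the candidate would leave the
    convex free region (a real planner would re-aim instead)."""
    d = goal - pos
    wp = pos + step * d / np.linalg.norm(d)
    if not inside_convex(free_poly, wp):
        wp = pos
    return np.arctan2(wp[1] - pos[1], wp[0] - pos[0]), wp
```

Because the free region is convex, the straight segment between any two accepted waypoints also lies inside it, which is the essence of the bounded obstacle-free guarantee stated in the abstract.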
Abstract:
The academic discipline of television studies has been constituted by the claim that television is worth studying because it is popular. Yet this claim has also entailed a need to defend the subject against the triviality that is associated with the television medium because of its very popularity. This article analyses the many attempts in the later twentieth and twenty-first centuries to constitute critical discourses about television as a popular medium. It focuses on how the theoretical currents of Television Studies emerged and changed in the UK, where a disciplinary identity for the subject was founded by borrowing from related disciplines, yet argued for the specificity of the medium as an object of criticism. Eschewing technological determinism, moral pathologization and sterile debates about television's supposed effects, UK writers such as Raymond Williams addressed television as an aspect of culture. Television theory in Britain has been part of, and also separate from, the disciplinary fields of media theory, literary theory and film theory. It has focused its attention on institutions, audio-visual texts, genres, authors and viewers according to the ways that research problems and theoretical inadequacies have emerged over time. But a consistent feature has been the problem of moving from a descriptive discourse to an analytical and evaluative one, and from studies of specific texts, moments and locations of television to larger theories. By discussing some historically significant critical work about television, the article considers how academic work has constructed relationships between the different kinds of objects of study. The article argues that a fundamental tension between descriptive and politically activist discourses has confused academic writing about ›the popular‹. Television study in Britain arose not to supply graduate professionals to the television industry, nor to perfect the instrumental techniques of allied sectors such as advertising and marketing, but to analyse and critique the medium's aesthetic forms and to evaluate its role in culture. Since television cannot be made by ›the people‹, the empowerment that discourses of television theory and analysis aimed for was focused on disseminating the tools for critique. Recent developments in factual entertainment television (in Britain and elsewhere) have greatly increased the visibility of ›the people‹ in programmes, notably in docusoaps, game shows and other participative formats. This has led to renewed debates about whether such ›popular‹ programmes appropriately represent ›the people‹ and how factual entertainment that is often despised relates to genres hitherto considered to be of high quality, such as scripted drama and socially-engaged documentary television. A further aspect of this problem of evaluation is how television globalisation has been addressed, and the example that the issue has crystallised around most is the reality TV contest Big Brother. Television theory has been largely based on studying the texts, institutions and audiences of television in the Anglophone world, and thus in specific geographical contexts. The transnational contexts of popular television have been addressed as spaces of contestation, for example between Americanisation and national or regional identities. Commentators have been ambivalent about whether the discipline's role is to celebrate or critique television, and whether to do so within a national, regional or global context. 
In the discourses of the television industry, ›popular television‹ is a quantitative and comparative measure, and because of the overlap between the programming with the largest audiences and the scheduling of established programme types at the times of day when the largest audiences are available, it has a strong relationship with genre. The measurement of audiences and the design of schedules are carried out in predominantly national contexts, but the article refers to programmes like Big Brother that have been broadcast transnationally, and programmes that have been extensively exported, to consider in what ways they too might be called popular. Strands of work in television studies have at different times attempted to diagnose what is at stake in the most popular programme types, such as reality TV, situation comedy and drama series. This has centred on questions of how aesthetic quality might be discriminated in television programmes, and how quality relates to popularity. The interaction of the designations ›popular‹ and ›quality‹ is exemplified in the ways that critical discourse has addressed US drama series that have been widely exported around the world, and the article shows how the two critical terms are both distinct and interrelated. In this context and in the article as a whole, the aim is not to arrive at a definitive meaning for ›the popular‹ inasmuch as it designates programmes or indeed the medium of television itself. Instead the aim is to show how, in historically and geographically contingent ways, these terms and ideas have been dynamically adopted and contested in order to address a multiple and changing object of analysis.
Abstract:
Use of orthogonal space-time block codes (STBCs) with multiple transmitters and receivers can improve signal quality. However, in optical intensity-modulated systems the transmitter output is non-negative, so standard orthogonal STBC schemes need to be modified. A generalised framework for applying orthogonal STBCs to free-space intensity-modulation/direct-detection (IM/DD) optical links is presented.
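The abstract does not spell out the modification. One widely used workaround, assumed here purely for illustration, replaces the negated entry of the 2x2 Alamouti matrix by its complement relative to the peak intensity A, so every transmitted sample stays non-negative while the orthogonal structure survives (noise is omitted):

```python
import numpy as np

A = 1.0  # peak optical intensity; OOK symbols s1, s2 take values in [0, A]

def encode(s1, s2):
    """Two apertures x two slots. Classic Alamouti sends -s2 in slot 2;
    here the complement A - s2 is sent instead, keeping intensities >= 0."""
    return np.array([[s1,     s2],    # slot 1: (aperture 1, aperture 2)
                     [A - s2, s1]])   # slot 2

def decode(r1, r2, h1, h2):
    """Combine two received slots at one photodetector with real gains
    h1, h2. Subtracting the known bias h1*A restores the Alamouti form."""
    r2p = r2 - h1 * A                 # r2p = -h1*s2 + h2*s1
    den = h1**2 + h2**2
    s1_hat = (h1 * r1 + h2 * r2p) / den
    s2_hat = (h2 * r1 - h1 * r2p) / den
    return s1_hat, s2_hat
```

With r1 = h1*s1 + h2*s2 and r2 = h1*(A - s2) + h2*s1, the combiner recovers s1 and s2 exactly in the noiseless case, mirroring the orthogonality of the underlying STBC.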
Abstract:
A predominance of small, dense low-density lipoprotein (LDL) is a major component of an atherogenic lipoprotein phenotype, and a common, but modifiable, source of increased risk for coronary heart disease in the free-living population. While much of the atherogenicity of small, dense LDL is known to arise from its structural properties, the extent to which an increase in the number of small, dense LDL particles (hyper-apoprotein B) contributes to this risk of coronary heart disease is currently unknown. This study reports a method for the recruitment of free-living individuals with an atherogenic lipoprotein phenotype for a fish-oil intervention trial, and critically evaluates the relationship between LDL particle number and the predominance of small, dense LDL. In this group, volunteers were selected through local general practices on the basis of a moderately raised plasma triacylglycerol (triglyceride) level (>1.5 mmol/l) and a low concentration of high-density-lipoprotein cholesterol (<1.1 mmol/l). The screening of LDL subclasses revealed a predominance of small, dense LDL (LDL subclass pattern B) in 62% of the cohort. As expected, subjects with LDL subclass pattern B were characterized by higher plasma triacylglycerol and lower high-density-lipoprotein cholesterol levels and, less predictably, by lower LDL cholesterol and apoprotein B levels (P<0.05; LDL subclass A compared with subclass B). While hyper-apoprotein B was detected in only five subjects, the relative percentage of small, dense LDL-III in subjects with subclass B showed an inverse relationship with LDL apoprotein B (r=-0.57; P<0.001), identifying a subset of individuals with plasma triacylglycerol above 2.5 mmol/l and a low concentration of LDL almost exclusively in a small, dense form. These findings indicate that a predominance of small, dense LDL and hyper-apoprotein B do not always co-exist in free-living groups. Moreover, if coronary risk increases with increasing LDL particle number, these results imply that the risk arising from a predominance of small, dense LDL may actually be reduced in certain cases when plasma triacylglycerol exceeds 2.5 mmol/l.
Abstract:
The fatty acid composition of the diet of seven free-living subjects (five men and two women) aged 41–56 years was altered for 1 month. The aim was to increase the intake of monounsaturated fatty acids (MUFAs) from the subjects' habitual level of 12% of dietary energy to a target intake of 18%, and to decrease saturated fatty acid (SFA) intake from the habitual level of 16% of dietary energy to a target of 10%. The change in fatty acid intake was achieved by supplying volunteers with foods prepared using MUFA-containing spreads or olive oil (ready meals, sweet biscuits and cakes) and also by supplying spreads, cooking oil and MUFA-enriched milk for domestic use. Body weight and plasma total cholesterol were measured at baseline and at 2 and 4 weeks on the diet as an aid to maintaining subject compliance. MUFA consumption was significantly increased from 12% to 16% of dietary energy (P<0.01), and SFA intake was reduced from 16% to 6% of dietary energy (P<0.01) during the 4-week intervention. The diet failed to achieve the target increase in MUFA but exceeded the target reduction in SFA, because subjects reduced their total fat intake from a mean habitual level of 38% of dietary energy to 30%. During the dietary period, mean plasma cholesterol levels were lower at 2 weeks (P<0.01) and at 4 weeks (P<0.01) than at baseline, with a mean reduction of 20% over the dietary period. This study demonstrates the difficulty of achieving increased MUFA intakes (by SFA substitution) in free-living populations when only a limited range of fatty acid-modified food products is provided to volunteers.
Abstract:
We introduce the perspex machine which unifies projective geometry and Turing computation and results in a supra-Turing machine. We show two ways in which the perspex machine unifies symbolic and non-symbolic AI. Firstly, we describe concrete geometrical models that map perspexes onto neural networks, some of which perform only symbolic operations. Secondly, we describe an abstract continuum of perspex logics that includes both symbolic logics and a new class of continuous logics. We argue that an axiom in symbolic logic can be the conclusion of a perspex theorem. That is, the atoms of symbolic logic can be the conclusions of sub-atomic theorems. We argue that perspex space can be mapped onto the spacetime of the universe we inhabit. This allows us to discuss how a robot might be conscious, feel, and have free will in a deterministic, or semi-deterministic, universe. We ground the reality of our universe in existence. On a theistic point, we argue that preordination and free will are compatible. On a theological point, we argue that it is not heretical for us to give robots free will. Finally, we give a pragmatic warning as to the double-edged risks of creating robots that do, or alternatively do not, have free will.
Abstract:
The primary objective was to compare the fat and fatty acid contents of cooked retail chickens from intensive and free-range systems. Total fat comprised approximately 14, 2.5, 8, 9 and 15 g/100 g cooked weight in whole birds, skinless breast, breast with skin, skinless leg and leg meat with skin, respectively, with no effect of intensive compared with free-range systems. Free-range breast and leg meat contained significantly lower concentrations of polyunsaturated fatty acids (n-6 and n-3) than did those from intensive rearing and had a consistently higher n-6/n-3 ratio (6.0 vs. 7.9). Generally, the concentrations of long-chain n-3 fatty acids were considerably lower than those reported in earlier research studies. Overall, there was no evidence that meat from free-range chickens had a fatty acid profile that would be classified as healthier than that from intensively reared birds; indeed, in some respects the opposite was the case.