871 results for "scalable to larger studies"


Relevance: 100.00%

Abstract:

Time series of transports in the Agulhas region have been constructed by simulating Lagrangian drifter trajectories in a 1/10 degree two-way nested ocean model. Using these 34-year-long time series, it is shown that smaller (larger) Agulhas Current transport leads to larger (smaller) Indian-Atlantic inter-ocean exchange. When transport is low, the Agulhas Current detaches farther downstream from the African continental slope. Moreover, the lower inertia suppresses generation of anti-cyclonic vorticity. These two effects cause the Agulhas retroflection to move westward and enhance Agulhas leakage. In the model, a 1 Sv decrease in Agulhas Current transport at 32°S results in a 0.7 ± 0.2 Sv increase in Agulhas leakage.


The calculation of accurate and reliable vibrational potential functions and normal co-ordinates is discussed for those simple polyatomic molecules for which it may be possible. Such calculations should be corrected for the effects of anharmonicity and of resonance interactions between the vibrational states, and should be fitted to all the available information on all isotopic species: particularly the vibrational frequencies, Coriolis zeta constants and centrifugal distortion constants. The difficulties of making these corrections, and of making use of the observed data, are reviewed. A programme for the Ferranti Mercury Computer is described by means of which harmonic vibration frequencies and normal co-ordinate vectors, zeta factors and centrifugal distortion constants can be calculated from a given force field and from given G-matrix elements, etc. The programme has been used on up to 5 × 5 secular equations, for which a single calculation and output of results takes approximately 1 min; it can readily be extended to larger determinants. The best methods of using such a programme and the possibility of reversing the direction of calculation are discussed. The methods are applied to calculating the best possible vibrational potential function for the methane molecule, making use of all the observed data.
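The secular-equation step this abstract describes can be sketched numerically. In the Wilson GF formalism, the harmonic frequencies follow from the eigenvalues of the matrix product G·F; the 2×2 matrices below are invented illustrative numbers, not data for any real molecule.

```python
import numpy as np

# Symmetrised force-constant matrix F (mdyn/Å) and inverse kinetic
# energy matrix G (amu^-1) -- invented values for illustration only.
F = np.array([[5.0, 0.3],
              [0.3, 0.8]])
G = np.array([[1.05, -0.10],
              [-0.10, 2.20]])

# Eigenvalues lambda of G.F satisfy lambda = 4*pi^2*c^2*nu~^2.
lam = np.linalg.eigvals(G @ F)

# In these units, nu~[cm^-1] = 1302.79 * sqrt(lambda).
wavenumbers = 1302.79 * np.sqrt(np.sort(lam)[::-1])
print(wavenumbers)
```

Extending to a 5 × 5 secular equation, as the programme described above did, only changes the size of F and G; the eigenvalue step is identical.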


Surfactin is a bacterial lipopeptide produced by Bacillus subtilis and is a powerful surfactant, also having antiviral, antibacterial and antitumor properties. The recovery and purification of surfactin from complex fermentation broths is a major obstacle to its commercialization; therefore, a two-step membrane filtration process was developed using a lab-scale tangential flow filtration (TFF) unit with 10 kDa MWCO regenerated cellulose (RC) and polyethersulfone (PES) membranes at three different transmembrane pressures (TMPs) of 1.5 bar, 2.0 bar and 2.5 bar. Two modes of filtration were studied, with and without cleaning of the membranes prior to UF-2. In the first step of ultrafiltration (UF-1), surfactin was retained effectively by the membranes above its critical micelle concentration (CMC); subsequently, in UF-2, the retentate micelles were disrupted by addition of 50% (v/v) methanol solution to allow recovery of surfactin in the permeate. The main protein contaminants were effectively retained by the membrane in UF-2. Permeate flux and the rejection coefficients (R) of surfactin and protein were measured during the filtrations. Overall, the three different TMPs applied had no significant effect on the filtrations, and PES is the more suitable membrane for selectively separating surfactin from fermentation broth, achieving high recovery and a high level of purity. In addition, this two-step UF process is scalable to larger sample volumes without affecting the original functionality of surfactin, although membrane permeability can be affected by exposure to the methanolic solution used in UF-2.
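The rejection coefficient R measured here is conventionally defined as R = 1 − Cp/Cf, the permeate concentration over the feed-side concentration. A minimal sketch with invented concentrations, only to show how the two filtration steps differ:

```python
def rejection(c_permeate: float, c_feed: float) -> float:
    """Observed rejection coefficient of a membrane for one solute."""
    return 1.0 - c_permeate / c_feed

# UF-1: surfactin above its CMC is retained as micelles -> high rejection
r_uf1 = rejection(c_permeate=0.05, c_feed=1.0)   # close to 1

# UF-2: methanol disrupts the micelles, surfactin permeates -> low rejection
r_uf2 = rejection(c_permeate=0.85, c_feed=1.0)   # close to 0
print(r_uf1, r_uf2)
```

A high R in UF-1 and a low R in UF-2 for surfactin (with protein R staying high) is exactly the selectivity pattern the two-step process exploits.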


Time resolved gas-phase kinetic studies have contributed a great deal of fundamental information about the reactions and reactivity of heavy carbenes (silylenes, germylenes and stannylenes) during the past two decades. In this article we trace the development of our understanding through the mechanistic themes of intermediate complexes, third body assisted associations, catalysed reactions, non-observed reactions and substituent effects. Ab initio (quantum chemical) calculations have substantially assisted mechanistic interpretation and are discussed where appropriate. Trends in reactivity are identified and some signposts to future studies are indicated. This review, although detailed, is not comprehensive.


Purpose of review: This review critically evaluates recent studies investigating the effects of fatty acids on immune and inflammatory responses in both healthy individuals and in patients with inflammatory diseases, with some reference to animal studies where relevant. It examines recent findings describing the cellular and molecular basis for the modulation of immune function by fatty acids. The newly emerging area of diet-genotype interactions will also be discussed, with specific reference to the anti-inflammatory effects of fish oil.

Recent findings: Fatty acids are participants in many intracellular signalling pathways. They act as ligands for nuclear receptors regulating a host of cell responses, they influence the stability of lipid rafts, and modulate eicosanoid metabolism in cells of the immune system. Recent findings suggest that some or all of these mechanisms may be involved in the modulation of immune function by fatty acids.

Summary: Human studies investigating the relationship between dietary fatty acids and some aspects of the immune response have been disappointingly inconsistent. This review presents the argument that most studies have not been adequately powered to take into account the influence of variation (genotypic or otherwise) on parameters of immune function. There is well-documented evidence that fatty acids modulate T lymphocyte activation, and recent findings describe a range of potential cellular and molecular mechanisms. However, there are still many questions remaining, particularly with respect to the roles of nuclear receptors, for which fatty acids act as ligands, and the modulation of eicosanoid synthesis, for which fatty acids act as precursors.


Nutrition science finds itself at a major crossroad. On the one hand we can continue the current path, which has resulted in some substantial advances, but also many conflicting messages which impair the trust of the general population, especially those who are motivated to improve their health through diet. The other road is uncharted and is being built over the many exciting new developments in the life sciences. This new era of nutrition recognizes the complex relation between the health of the individual, their genome, and life-long dietary exposure, and has led to the realisation that nutrition is essentially a gene-environment interaction science. This review on the relation between genotype, diet and health is the first of a series dealing with the major challenges in molecular nutrition, analyzing the foundations of nutrition research. With the unravelling of the human genome and the linking of its variability to a multitude of phenotypes from "healthy" to an enormously complex range of predispositions, the dietary modulation of these propensities has become an area of active research. Classical genetic approaches applied so far in medical genetics have steered away from incorporating dietary effects in their models and, paradoxically, most genetic studies analyzing diet-associated phenotypes and diseases simply ignore diet. Yet a modest but increasing number of studies are accounting for diet as a modulator of genetic associations. These range from observational cohorts to intervention studies with prospectively selected genotypes. New statistical and bioinformatics approaches are becoming available to aid in the design and evaluation of these studies. This review discusses the various approaches used and provides concrete recommendations for future research.


OBJECTIVE: To compare insulin sensitivity (Si) from a frequently sampled intravenous glucose tolerance test (FSIGT) and subsequent minimal model analyses with surrogate measures of insulin sensitivity and resistance, and to compare features of the metabolic syndrome between Caucasians and Indian Asians living in the UK. SUBJECTS: In all, 27 healthy male volunteers (14 UK Caucasians and 13 UK Indian Asians), with a mean age of 51.2 +/- 1.5 y, BMI of 25.8 +/- 0.6 kg/m(2) and Si of 2.85 +/- 0.37. MEASUREMENTS: Si was determined from an FSIGT with subsequent minimal model analysis. The concentrations of insulin, glucose and nonesterified fatty acids (NEFA) were analysed in fasting plasma and used to calculate surrogate measures of insulin sensitivity (quantitative insulin sensitivity check index (QUICKI), revised QUICKI) and resistance (homeostasis model assessment of insulin resistance (HOMA-IR), fasting insulin resistance index (FIRI), Bennett's index, fasting insulin, insulin-to-glucose ratio). Plasma concentrations of triacylglycerol (TAG), total cholesterol, high-density lipoprotein cholesterol (HDL-C) and low-density lipoprotein cholesterol (LDL-C) were also measured in the fasted state. Anthropometric measurements were conducted to determine body-fat distribution. RESULTS: Correlation analysis identified the strongest relationship between Si and the revised QUICKI (r = 0.67; P < 0.001). Significant associations were also observed between Si and QUICKI (r = 0.51; P = 0.007), HOMA-IR (r = -0.50; P = 0.009), FIRI and fasting insulin. The Indian Asian group had lower HDL-C (P = 0.001), a higher waist-hip ratio (P = 0.01) and was significantly less insulin sensitive (Si) than the Caucasian group (P = 0.02). CONCLUSION: The revised QUICKI demonstrated a statistically strong relationship with the minimal model. However, it was unable to differentiate between insulin-sensitive and -resistant groups in this study.
Future larger studies in population groups with varying degrees of insulin sensitivity are recommended to investigate the general applicability of the revised QUICKI surrogate technique.
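The surrogate indices compared in this study have standard closed-form definitions, sketched below with invented fasting values for a single subject (unit conventions follow the original formulations of each index):

```python
import math

# QUICKI = 1 / (log10(I0) + log10(G0)); I0 in µU/mL, G0 in mg/dL.
def quicki(insulin_uU_ml, glucose_mg_dl):
    return 1.0 / (math.log10(insulin_uU_ml) + math.log10(glucose_mg_dl))

# Revised QUICKI adds a log10(NEFA) term, NEFA in mmol/L.
def revised_quicki(insulin_uU_ml, glucose_mg_dl, nefa_mmol_l):
    return 1.0 / (math.log10(insulin_uU_ml)
                  + math.log10(glucose_mg_dl)
                  + math.log10(nefa_mmol_l))

# HOMA-IR = (fasting insulin x fasting glucose) / 22.5, glucose in mmol/L.
def homa_ir(insulin_uU_ml, glucose_mmol_l):
    return insulin_uU_ml * glucose_mmol_l / 22.5

# Invented fasting values for one subject:
print(quicki(8.0, 95.0))                   # ≈ 0.347
print(revised_quicki(8.0, 95.0, 0.5))      # ≈ 0.388
print(homa_ir(8.0, 5.3))                   # ≈ 1.88
```

The revised QUICKI's extra NEFA term is what gave it the strongest correlation with Si in the study above.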


A fast Knowledge-based Evolution Strategy, KES, for the multi-objective minimum spanning tree problem is presented. The proposed algorithm is validated, for the bi-objective case, with an exhaustive search for small problems (4-10 nodes), and compared with a deterministic algorithm, EPDA, and with NSGA-II for larger problems (up to 100 nodes) using hard benchmark instances. Experimental results show that KES finds the true Pareto fronts for small instances of the problem and calculates good approximation Pareto sets for the larger instances tested. It is shown that the fronts calculated by KES are superior to the NSGA-II fronts and almost as good as those established by EPDA. KES is designed to be scalable to multi-objective problems and fast due to its small complexity.
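KES itself is not reproduced here, but the bi-objective machinery it relies on is easy to state: Pareto dominance and the non-dominated front. A minimal sketch over invented (cost1, cost2) pairs standing in for spanning-tree evaluations, with both objectives minimised:

```python
def dominates(a, b):
    """True if solution a dominates b: no worse in every objective,
    strictly better in at least one (minimisation)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated points."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

costs = [(4, 9), (5, 5), (7, 3), (6, 6), (9, 2)]
print(sorted(pareto_front(costs)))   # [(4, 9), (5, 5), (7, 3), (9, 2)]
```

An exhaustive search for 4-10 node graphs, as used in the validation above, simply enumerates all spanning trees and applies this filter to get the true front.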


In this article, an overview of some of the latest developments in the field of cerebral cortex to computer interfacing (CCCI) is given. This is posed in the more general context of Brain-Computer Interfaces in order to assess advantages and disadvantages. The emphasis is clearly placed on practical studies that have been undertaken and reported on, as opposed to those speculated, simulated or proposed as future projects. Related areas are discussed briefly only in the context of their contribution to the studies being undertaken. The area of focus is notably the use of invasive implant technology, where a connection is made directly with the cerebral cortex and/or nervous system. Tests and experimentation which do not involve human subjects are invariably carried out a priori to indicate the eventual possibilities before human subjects are themselves involved. Some of the more pertinent animal studies from this area are discussed. The paper goes on to describe human experimentation, in which neural implants have linked the human nervous system bidirectionally with technology and the internet. A view is taken as to the prospects for the future for CCCI, in terms of its broad therapeutic role.


Aims: We conducted a systematic review of studies examining relationships between measures of beverage alcohol tax or price levels and alcohol sales or self-reported drinking. A total of 112 studies of alcohol tax or price effects were found, containing 1003 estimates of the tax/price-consumption relationship. Design: The studies included analyses of alternative outcome measures, varying subgroups of the population, several statistical models and different units of analysis. Multiple estimates were coded from each study, along with numerous study characteristics. Using reported estimates, standard errors, t-ratios, sample sizes and other statistics, we calculated the partial correlation for the relationship between alcohol price or tax and sales or drinking measures for each major model or subgroup reported within each study. Random-effects models were used to combine studies for inverse variance weighted overall estimates of the magnitude and significance of the relationship between alcohol tax/price and drinking. Findings: Simple means of reported elasticities are -0.46 for beer, -0.69 for wine and -0.80 for spirits. Meta-analytical results document the highly significant relationships (P < 0.001) between alcohol tax or price measures and indices of sales or consumption of alcohol (aggregate-level r = -0.17 for beer, -0.30 for wine, -0.29 for spirits and -0.44 for total alcohol). Price/tax also affects heavy drinking significantly (mean reported elasticity = -0.28, individual-level r = -0.01, P < 0.01), but the magnitude of effect is smaller than effects on overall drinking. Conclusions: A large literature establishes that beverage alcohol prices and taxes are related inversely to drinking. Effects are large compared to other prevention policies and programs. Public policies that raise prices of alcohol are an effective means to reduce drinking.
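Two of the quantities described above, as they are conventionally computed: a partial correlation recovered from a reported t-ratio and its residual degrees of freedom, and an inverse-variance-weighted mean effect. The fixed-effect form is shown for brevity (the review used random-effects models), and all numbers are invented examples:

```python
import math

def partial_r(t: float, df: int) -> float:
    """Partial correlation implied by a regression t-ratio:
    r = t / sqrt(t^2 + df)."""
    return t / math.sqrt(t * t + df)

def inv_variance_mean(effects, variances):
    """Inverse-variance-weighted mean effect (fixed-effect form)."""
    weights = [1.0 / v for v in variances]
    return sum(w * e for w, e in zip(weights, effects)) / sum(weights)

print(partial_r(t=-4.2, df=120))                                  # ≈ -0.358
print(inv_variance_mean([-0.17, -0.30, -0.29], [0.004, 0.010, 0.008]))
```

Coding one such r per major model or subgroup in each of the 112 studies yields the pool of estimates that the random-effects combination then weights.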


The academic discipline of television studies has been constituted by the claim that television is worth studying because it is popular. Yet this claim has also entailed a need to defend the subject against the triviality that is associated with the television medium because of its very popularity. This article analyses the many attempts in the later twentieth and twenty-first centuries to constitute critical discourses about television as a popular medium. It focuses on how the theoretical currents of Television Studies emerged and changed in the UK, where a disciplinary identity for the subject was founded by borrowing from related disciplines, yet argued for the specificity of the medium as an object of criticism. Eschewing technological determinism, moral pathologization and sterile debates about television's supposed effects, UK writers such as Raymond Williams addressed television as an aspect of culture. Television theory in Britain has been part of, and also separate from, the disciplinary fields of media theory, literary theory and film theory. It has focused its attention on institutions, audio-visual texts, genres, authors and viewers according to the ways that research problems and theoretical inadequacies have emerged over time. But a consistent feature has been the problem of moving from a descriptive discourse to an analytical and evaluative one, and from studies of specific texts, moments and locations of television to larger theories. By discussing some historically significant critical work about television, the article considers how academic work has constructed relationships between the different kinds of objects of study. The article argues that a fundamental tension between descriptive and politically activist discourses has confused academic writing about ›the popular‹. 
Television study in Britain arose not to supply graduate professionals to the television industry, nor to perfect the instrumental techniques of allied sectors such as advertising and marketing, but to analyse and critique the medium's aesthetic forms and to evaluate its role in culture. Since television cannot be made by ›the people‹, the empowerment that discourses of television theory and analysis aimed for was focused on disseminating the tools for critique. Recent developments in factual entertainment television (in Britain and elsewhere) have greatly increased the visibility of ›the people‹ in programmes, notably in docusoaps, game shows and other participative formats. This has led to renewed debates about whether such ›popular‹ programmes appropriately represent ›the people‹ and how factual entertainment that is often despised relates to genres hitherto considered to be of high quality, such as scripted drama and socially-engaged documentary television. A further aspect of this problem of evaluation is how television globalisation has been addressed, and the example that the issue has crystallised around most is the reality TV contest Big Brother. Television theory has been largely based on studying the texts, institutions and audiences of television in the Anglophone world, and thus in specific geographical contexts. The transnational contexts of popular television have been addressed as spaces of contestation, for example between Americanisation and national or regional identities. Commentators have been ambivalent about whether the discipline's role is to celebrate or critique television, and whether to do so within a national, regional or global context. 
In the discourses of the television industry, ›popular television‹ is a quantitative and comparative measure, and because of the overlap between the programming with the largest audiences and the scheduling of established programme types at the times of day when the largest audiences are available, it has a strong relationship with genre. The measurement of audiences and the design of schedules are carried out in predominantly national contexts, but the article refers to programmes like Big Brother that have been broadcast transnationally, and programmes that have been extensively exported, to consider in what ways they too might be called popular. Strands of work in television studies have at different times attempted to diagnose what is at stake in the most popular programme types, such as reality TV, situation comedy and drama series. This has centred on questions of how aesthetic quality might be discriminated in television programmes, and how quality relates to popularity. The interaction of the designations ›popular‹ and ›quality‹ is exemplified in the ways that critical discourse has addressed US drama series that have been widely exported around the world, and the article shows how the two critical terms are both distinct and interrelated. In this context and in the article as a whole, the aim is not to arrive at a definitive meaning for ›the popular‹ inasmuch as it designates programmes or indeed the medium of television itself. Instead the aim is to show how, in historically and geographically contingent ways, these terms and ideas have been dynamically adopted and contested in order to address a multiple and changing object of analysis.


We present a novel algorithm for joint state-parameter estimation using sequential three-dimensional variational data assimilation (3D Var) and demonstrate its application in the context of morphodynamic modelling using an idealised two-parameter 1D sediment transport model. The new scheme combines a static representation of the state background error covariances with a flow-dependent approximation of the state-parameter cross-covariances. For the case presented here, this involves calculating a local finite difference approximation of the gradient of the model with respect to the parameters. The new method is easy to implement and computationally inexpensive to run. Experimental results are positive, with the scheme able to recover the model parameters to a high level of accuracy. We expect that there is potential for successful application of this new methodology to larger, more realistic models with more complex parameterisations.
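The flow-dependent ingredient described above, the gradient of the model with respect to its parameters, can be approximated by local finite differences just as the abstract says. The toy two-parameter 1D model below is invented purely to show the mechanics, not the authors' sediment transport model:

```python
import numpy as np

def model(x, p, dt=0.1):
    """Toy two-parameter 1D evolution step (illustrative stand-in)."""
    advect, diffuse = p
    return x + dt * (-advect * np.gradient(x)
                     + diffuse * np.gradient(np.gradient(x)))

def param_jacobian(x, p, eps=1e-6):
    """Finite-difference dM/dp: one column per parameter, used to build
    the state-parameter cross-covariance block."""
    base = model(x, p)
    cols = []
    for i in range(len(p)):
        dp = np.array(p, dtype=float)
        dp[i] += eps                      # perturb one parameter at a time
        cols.append((model(x, dp) - base) / eps)
    return np.stack(cols, axis=1)         # shape (n_state, n_params)

x0 = np.sin(np.linspace(0, np.pi, 20))    # invented initial state
J = param_jacobian(x0, [0.5, 0.1])
print(J.shape)                            # (20, 2)
```

In the assimilation, this Jacobian (scaled by the parameter error variances) supplies the cross-covariances that let state observations update the parameter estimates.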


The benefits of sector and regional diversification have been well documented in the literature but have not previously been investigated in Italy. In addition, previous studies have used geographically defined regions, rather than economically functional areas, when performing the analysis, even though most would argue that it is the economic structure of an area that leads to differences in demand and hence property performance. This study therefore uses economically defined regions of Italy to test the relative benefits of regional diversification versus sector diversification within the Italian real estate portfolio. To examine this issue we use constrained cross-section regressions on the sector and regional affiliation of 14 cities in Italy to extract the "pure" return effects of the different factors, using annual data over the period 1989 to 2003. In contrast to previous studies, we find that regional factor effects in Italy have a much greater influence on property returns than sector-specific effects, which is probably a direct result of using the extremely diverse economic regions of Italy rather than arbitrary geographic locations. Be that as it may, the results strongly suggest that diversification across the regions of Italy used here is likely to offer larger risk reduction benefits than a sector diversification strategy within a region. In other words, fund managers in Italy must monitor the regional composition of their portfolios more closely than their sector allocation. Additionally, the results support the contemporary position that 'regional areas' based on economic function provide greater diversification benefits than areas defined by geographical location.
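The constrained cross-section regression used to extract "pure" sector and regional effects can be sketched as a dummy-variable regression with sum-to-zero constraints (in the Heston-Rouwenhorst tradition). All data below are randomly generated, and the heavy-penalty trick for enforcing the constraints is a practical shortcut, not necessarily the authors' exact estimator:

```python
import numpy as np

rng = np.random.default_rng(0)
n_cities, n_sectors, n_regions = 14, 3, 4
sector = rng.integers(0, n_sectors, n_cities)     # invented affiliations
region = rng.integers(0, n_regions, n_cities)
returns = rng.normal(0.05, 0.02, n_cities)        # invented annual returns

# Design matrix: intercept + sector dummies + region dummies.
X = np.zeros((n_cities, 1 + n_sectors + n_regions))
X[:, 0] = 1.0
X[np.arange(n_cities), 1 + sector] = 1.0
X[np.arange(n_cities), 1 + n_sectors + region] = 1.0

# Identification constraints: sector effects and region effects each sum to 0.
C = np.zeros((2, X.shape[1]))
C[0, 1:1 + n_sectors] = 1.0
C[1, 1 + n_sectors:] = 1.0

# Enforce the constraints via a heavy penalty and solve by least squares.
A = np.vstack([X, 1e6 * C])
b = np.concatenate([returns, np.zeros(2)])
beta, *_ = np.linalg.lstsq(A, b, rcond=None)
print(beta[1:1 + n_sectors].sum())   # ≈ 0: constraint holds
```

Running this cross-section year by year gives time series of "pure" sector and regional returns, whose variances can then be compared as in the study.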


Global horizontal wavenumber kinetic energy spectra and spectral fluxes of rotational kinetic energy and enstrophy are computed for a range of vertical levels using a T799 ECMWF operational analysis. Above 250 hPa, the kinetic energy spectra exhibit a distinct break between steep and shallow spectral ranges, reminiscent of dual power-law spectra seen in aircraft data and high-resolution general circulation models. The break separates a large-scale "balanced" regime in which rotational flow strongly dominates divergent flow and a mesoscale "unbalanced" regime where divergent energy is comparable to or larger than rotational energy. Between 230 and 100 hPa, the spectral break shifts to larger scales (from n = 60 to n = 20, where n is the spherical harmonic index) as the balanced component of the flow preferentially decays. The location of the break remains fairly stable throughout the stratosphere. The spectral break in the analysis occurs at somewhat larger scales than the break seen in aircraft data. Nonlinear spectral fluxes defined for the rotational component of the flow maximize between about 300 and 200 hPa. Large-scale turbulence thus centers on the extratropical tropopause region, within which there are two distinct mechanisms of upscale energy transfer: eddy-eddy interactions sourcing the transient energy peak in synoptic scales, and zonal mean-eddy interactions forcing the zonal flow. A well-defined downscale enstrophy flux is clearly evident at these altitudes. In the stratosphere, the transient energy peak moves to planetary scales and zonal mean-eddy interactions become dominant.
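The spectra discussed here are built by standard bookkeeping: the energy at total spherical-harmonic wavenumber n is accumulated over zonal wavenumbers m. A schematic sketch with synthetic coefficients; the normalisation is simplified (the usual n(n+1)/a² factors are omitted) and the n⁻³ fall-off is illustrative, not the analysis data:

```python
import numpy as np

def ke_spectrum(coeffs):
    """E(n) = 0.5 * sum over m of |c_nm|^2 (schematic normalisation)."""
    return np.array([0.5 * sum(abs(c) ** 2 for c in cn) for cn in coeffs])

# Synthetic complex coefficients with amplitude ~ (n+1)^-2, so that the
# summed spectrum falls off roughly as n^-3, a steep "balanced" range.
rng = np.random.default_rng(1)
n_max = 40
coeffs = [[(rng.normal() + 1j * rng.normal()) * (n + 1) ** -2.0
           for m in range(n + 1)]
          for n in range(n_max + 1)]

E = ke_spectrum(coeffs)
print(E.shape)   # (41,)
```

A spectral break like the one described above would appear as a change of log-log slope in E(n), with the crossover wavenumber moving from n = 60 toward n = 20 with height.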