994 results for Large unilamellar vesicles


Relevance: 20.00%

Abstract:

Although current research indicates that increasing the number of options has negative effects on the cognitive ability of consumers, little attention has been paid to the consequences for producers and their strategic behavior. This article tests whether a large portfolio of products is beneficial to producers by observing UK consumer response to price promotions. The article shows that discounts induce mainly segment switching (74% of the total impact), with a limited effect on stockpiling (26%) and no impact on purchase incidence. Consequently, consumers prefer to “follow the discount” rather than purchase multiple units of the same wine. This result seems to explain the current structure of the market, and suggests that discounts may conflict with segment loyalty, a situation that disfavors producers, particularly in heavily populated segments. The results also cast doubt on the economic sustainability of competition based on intense product differentiation in the wine sector.

Relevance: 20.00%

Abstract:

Global climate and weather models tend to produce rainfall that is too light and too regular over the tropical ocean. This is likely because of convective parametrizations, but the problem is not well understood. Here, distributions of precipitation rates are analyzed for high-resolution UK Met Office Unified Model simulations of a 10-day case study over a large tropical domain (∼20°S–20°N and 42°E–180°E). Simulations with 12 km grid length and parametrized convection have too many occurrences of light rain and too few of heavier rain when interpolated onto a 1° grid and compared with Tropical Rainfall Measuring Mission (TRMM) data. In fact, this version of the model appears to have a preferred scale of rainfall around 0.4 mm h⁻¹ (10 mm day⁻¹), unlike observations of tropical rainfall. On the other hand, 4 km grid length simulations with explicit convection produce distributions much more similar to TRMM observations. The apparent preferred scale at lighter rain rates seems to be a feature of the convective parametrization rather than the coarse resolution, as demonstrated by results from 12 km simulations with explicit convection and 40 km simulations with parametrized convection. In fact, coarser-resolution models with explicit convection tend to have even more heavy rain than observed. Implications for models using convective parametrizations, including interactions of heating and moistening profiles with larger scales, are discussed. One important implication is that the explicit convection 4 km model has temperature and moisture tendencies that favour transitions in the convective regime. Also, the 12 km parametrized convection model produces a more stable temperature profile at its extreme high-precipitation range, which may reduce the chance of very heavy rainfall. Further study is needed to determine whether unrealistic precipitation distributions are due to some fundamental limitation of convective parametrizations or whether parametrizations can be improved, in order to better simulate these distributions.
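The comparison in this abstract rests on regridding model rainfall to a common 1° grid and then comparing distributions of precipitation rate against TRMM. The sketch below shows one way such a rain-rate histogram diagnostic can be computed; the bin edges and the synthetic gamma-distributed fields are illustrative assumptions, not the Unified Model or TRMM data.

```python
# Sketch of a rain-rate distribution diagnostic: histogram the precipitation
# rates of a model and an observational product on a common grid and compare
# the occupancy of each bin. Bin edges and synthetic inputs are assumptions.
import numpy as np

def rain_rate_histogram(rates_mm_per_h, bin_edges):
    """Fraction of (time, lat, lon) samples falling in each rain-rate bin."""
    counts, _ = np.histogram(rates_mm_per_h, bins=bin_edges)
    return counts / rates_mm_per_h.size

# Logarithmically spaced bins from drizzle to intense rain (mm/h),
# with an extra bin for near-zero rain.
edges = np.concatenate(([0.0], np.logspace(-2, 2, 41)))

# Placeholder arrays standing in for regridded model and TRMM fields
# with dimensions (time, lat, lon).
rng = np.random.default_rng(0)
model = rng.gamma(shape=0.5, scale=1.0, size=(240, 40, 138))
obs = rng.gamma(shape=0.3, scale=2.0, size=(240, 40, 138))

model_pdf = rain_rate_histogram(model, edges)
obs_pdf = rain_rate_histogram(obs, edges)

# Report bins where the model clearly over- or under-populates the
# distribution relative to the observations.
for lo, hi, m, o in zip(edges[:-1], edges[1:], model_pdf, obs_pdf):
    if abs(m - o) > 0.01:
        print(f"{lo:8.3f}-{hi:8.3f} mm/h  model {m:.3f}  obs {o:.3f}")
```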

Relevance: 20.00%

Abstract:

The K-Means algorithm for cluster analysis is one of the most influential and popular data mining methods. Its straightforward parallel formulation is well suited for distributed memory systems with reliable interconnection networks, such as massively parallel processors and clusters of workstations. However, in large-scale geographically distributed systems the straightforward parallel algorithm can be rendered useless by a single communication failure or high latency in communication paths. The lack of scalable and fault tolerant global communication and synchronisation methods in large-scale systems has hindered the adoption of the K-Means algorithm for applications in large networked systems such as wireless sensor networks, peer-to-peer systems and mobile ad hoc networks. This work proposes a fully distributed K-Means algorithm (EpidemicK-Means) which does not require global communication and is intrinsically fault tolerant. The proposed distributed K-Means algorithm provides a clustering solution which can approximate the solution of an ideal centralised algorithm over the aggregated data as closely as desired. A comparative performance analysis is carried out against state-of-the-art sampling methods and shows that the proposed method overcomes the limitations of the sampling-based approaches for skewed cluster distributions. The experimental analysis confirms that the proposed algorithm is very accurate and fault tolerant under unreliable network conditions (message loss and node failures) and is suitable for asynchronous networks of very large and extreme scale.
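The EpidemicK-Means algorithm itself is not spelled out in this abstract, so the following is only a schematic sketch of the general gossip-based idea: each node keeps per-cluster sums and counts for its own data and repeatedly averages that state with randomly chosen peers, so every node can converge towards the centroids a centralised K-Means would compute on the aggregated data, without any global communication step. The synchronous loop, node count and synthetic data below are assumptions for illustration only.

```python
# Illustrative sketch of a gossip-style distributed K-Means step (not the
# published EpidemicK-Means algorithm itself): every node keeps per-cluster
# (sum, count) statistics for its local data and averages them with random
# peers, so no global communication or synchronisation is needed.
import numpy as np

rng = np.random.default_rng(0)
K, DIM, NODES = 3, 2, 20

# Each node holds a small local data set (assumption: synthetic blobs).
local_data = [rng.normal(rng.integers(0, 10, DIM), 1.0, size=(50, DIM))
              for _ in range(NODES)]

centroids = rng.normal(5.0, 3.0, size=(K, DIM))   # shared initial guess
state = [None] * NODES                            # per-node (sums, counts)

for iteration in range(10):
    # Local step: assign each point to its nearest centroid and build the
    # per-cluster sums and counts on every node independently.
    for n, X in enumerate(local_data):
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        sums = np.zeros((K, DIM))
        counts = np.zeros(K)
        for k in range(K):
            sums[k] = X[labels == k].sum(axis=0)
            counts[k] = np.count_nonzero(labels == k)
        state[n] = (sums, counts)

    # Gossip step: repeated pairwise averaging drives every node's statistics
    # towards the network-wide average, tolerating lost messages and peers.
    for _ in range(60):
        a, b = rng.choice(NODES, size=2, replace=False)
        avg_sums = (state[a][0] + state[b][0]) / 2.0
        avg_counts = (state[a][1] + state[b][1]) / 2.0
        state[a] = state[b] = (avg_sums, avg_counts)

    # After gossip, every node holds approximately the same statistics, so
    # each can form (approximately) the same new centroids; node 0 shown here.
    sums, counts = state[0]
    centroids = sums / np.maximum(counts, 1e-9)[:, None]

print(np.round(centroids, 2))
```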

Relevance: 20.00%

Abstract:

We evaluated the accuracy of six watershed models of nitrogen export in streams (kg km⁻² yr⁻¹) developed for use in large watersheds and representing various empirical and quasi-empirical approaches described in the literature. These models differ in their methods of calibration and have varying levels of spatial resolution and process complexity, which potentially affect the accuracy (bias and precision) of the model predictions of nitrogen export and source contributions to export. Using stream monitoring data and detailed estimates of the natural and cultural sources of nitrogen for 16 watersheds in the northeastern United States (drainage areas of 475 to 70,000 km²), we assessed the accuracy of the model predictions of total nitrogen and nitrate-nitrogen export. The model validation included the use of an error modeling technique to identify biases caused by model deficiencies in quantifying nitrogen sources and biogeochemical processes affecting the transport of nitrogen in watersheds. Most models predicted stream nitrogen export to within 50% of the measured export in a majority of the watersheds. Prediction errors were negatively correlated with cultivated land area, indicating that the watershed models tended to overpredict export in less agricultural and more forested watersheds and underpredict it in more agricultural basins. The magnitude of these biases differed appreciably among the models. Those models having more detailed descriptions of nitrogen sources, land and water attenuation of nitrogen, and water flow paths were found to have considerably lower bias and higher precision in their predictions of nitrogen export.
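As a rough illustration of the validation described here, the sketch below computes per-watershed prediction errors, the share of watersheds predicted within 50% of the measured export, and the correlation of the (log-scale) error with cultivated land fraction; the variable names and toy numbers are assumptions, not the study's dataset.

```python
# Sketch of the accuracy assessment described above: per-watershed error in
# predicted nitrogen export, share of predictions within +/-50% of measured
# export, and correlation of error with cultivated land. Column names and
# the toy numbers are assumptions for illustration.
import numpy as np

measured = np.array([520.0, 760.0, 1400.0, 300.0, 980.0, 2100.0])  # kg km-2 yr-1
predicted = np.array([610.0, 690.0, 1100.0, 450.0, 900.0, 1500.0])
cultivated_frac = np.array([0.05, 0.10, 0.40, 0.02, 0.25, 0.60])

# Relative error and the "within 50% of measured" criterion.
rel_error = (predicted - measured) / measured
within_50 = np.abs(rel_error) <= 0.5
print(f"within 50%: {within_50.mean():.0%} of watersheds")

# Log-scale residuals are a common way to express multiplicative bias.
log_resid = np.log(predicted) - np.log(measured)

# Correlation of the error with cultivated land area: a negative value
# indicates over-prediction in forested basins and under-prediction in
# agricultural ones, as reported in the abstract.
r = np.corrcoef(log_resid, cultivated_frac)[0, 1]
print(f"correlation of error with cultivated fraction: {r:+.2f}")
```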

Relevance: 20.00%

Abstract:

The time evolution of the circulation change at the end of the Baiu season is investigated using ERA40 data. An end-day is defined for each of the 23 years based on the 850 hPa θe value at 40°N in the 130–140°E sector exceeding 330 K. Daily time series of variables are composited with respect to this day. These composite time series exhibit a clearer and more rapid change in the precipitation and the large-scale circulation over the whole East Asia region than composites performed using calendar days. The precipitation change includes the abrupt end of the Baiu rain, the northward shift of tropical convection perhaps starting a few days before this, and the start of the heavier rain at higher latitudes. The northward migration of lower-tropospheric warm, moist tropical air, a general feature of the seasonal march in the region, is fast over the continent and slow over the ocean. By mid to late July the cooler air over the Sea of Japan is surrounded on three sides by the tropical air, suggesting that the large-scale stage has been set for a jump to the post-Baiu state, i.e., for the end of the Baiu season. Two likely triggers for the actual change emerge from the analysis. The first is the northward movement of tropical convection into the Philippine region. The second is an equivalent barotropic Rossby wave train that develops downstream across Eurasia over a 10-day period. It appears likely that in most years one or both mechanisms can be important in triggering the actual end of the Baiu season.
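A minimal sketch of the compositing approach, under the assumption of a single daily θe series per year for the 130–140°E sector at 40°N: find the first day each year on which the 850 hPa θe exceeds 330 K and average another field relative to that day. The synthetic series below merely stand in for the ERA40 fields.

```python
# Sketch of defining a Baiu "end-day" from an 850 hPa theta-e threshold and
# compositing a field about that day. Synthetic data and names are
# assumptions; the real analysis used ERA40 fields over 23 years.
import numpy as np

rng = np.random.default_rng(1)
YEARS, DAYS = 23, 120          # e.g. May-August daily values per year

# theta-e at 40N, 130-140E sector (K): upward seasonal trend plus noise.
theta_e = 320 + 0.15 * np.arange(DAYS) + rng.normal(0, 2, (YEARS, DAYS))
precip = rng.gamma(2.0, 4.0, (YEARS, DAYS))      # field to composite

def end_day(series, threshold=330.0):
    """First day the series exceeds the threshold (argmax on a bool array)."""
    above = series > threshold
    return int(np.argmax(above)) if above.any() else None

# Composite precipitation relative to each year's end-day.
window = 20                                      # days either side
composite = np.full((YEARS, 2 * window + 1), np.nan)
for y in range(YEARS):
    d0 = end_day(theta_e[y])
    if d0 is None or d0 < window or d0 + window >= DAYS:
        continue                                 # skip incomplete windows
    composite[y] = precip[y, d0 - window:d0 + window + 1]

mean_composite = np.nanmean(composite, axis=0)   # day -20 ... +20
print(mean_composite.round(1))
```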

Relevance: 20.00%

Abstract:

Escherichia coli O26:K60, with genetic attributes consistent with a potential human enterohaemorrhagic E. coli, was isolated from the faeces of an eight-month-old heifer with dysentery. Attaching and effacing lesions were identified in the colon of a similarly affected heifer examined postmortem, and shown to be associated with E. coli O26 by specific immunolabelling.

Relevance: 20.00%

Abstract:

Shiga-toxigenic Escherichia coli O157:H7 (STEC O157:H7) is associated with potentially fatal human disease, and a persistent reservoir of the organism is present in some farm animal species, especially cattle and sheep. The mechanisms of persistent colonisation of the ruminant intestine by STEC O157:H7 are poorly understood but may be associated with intimate adherence to eukaryotic cells. Intimate adherence, as evidenced by induction of attaching-effacing (AE) lesions by STEC O157, has been observed in 6-day-old conventional lambs after deliberate oral infection but not in older animals. Thus, the present study used a ligated intestinal loop technique to investigate whether STEC O157:H7 and other attaching-effacing E. coli may adhere intimately to the sheep large intestinal mucosa. To do this, four STEC O157:H7 strains, one STEC O26:K60:H11 strain and one Shiga toxin-negative E. coli O157:H7 strain, suspended in either phosphate-buffered saline or Dulbecco's modified Eagle's medium, were inoculated into ligated spiral colon loops of each of two lambs. The loops were removed 6 h after inoculation, fixed and examined by light and electron microscopy. AE lesions on the intestinal mucosa were produced by all the inoculated strains. However, the lesions were sparse and small, typically comprising bacterial cells intimately adhered to a single enterocyte, or a few adjacent enterocytes. There was little correlation between the extent of intimate adherence in this model and the bacterial cell density, pre-inoculation growth conditions of the bacteria or the strain tested.

Relevance: 20.00%

Abstract:

We investigate the super-Brownian motion with a single point source in dimensions 2 and 3 as constructed by Fleischmann and Mueller in 2004. Using analytic facts we derive the long-time behavior of the mean in dimensions 2 and 3, thereby complementing previous work of Fleischmann, Mueller and Vogt. Using spectral theory and martingale arguments we prove a version of the strong law of large numbers for the two-dimensional superprocess with a single point source and finite variance.
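The abstract does not state the limit theorem explicitly; purely as an illustration of the type of statement meant, a strong law of large numbers for a superprocess $(X_t)_{t\ge 0}$ is usually of the following form, where $\phi$ is a suitable nonnegative test function and the exact normalisation and class of admissible $\phi$ in the single-point-source setting are those given in the paper.

```latex
% Generic shape of a strong law of large numbers for a superprocess X_t;
% shown only as an illustration, not the paper's precise statement.
\[
  \frac{\langle X_t, \phi \rangle}{\mathbb{E}\,\langle X_t, \phi \rangle}
  \;\longrightarrow\; W
  \qquad \text{almost surely as } t \to \infty ,
\]
```

Here $W \ge 0$ is a finite random variable, with $\mathbb{E}\,W = 1$ under a finite-variance assumption.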

Relevance: 20.00%

Abstract:

In a biomimetic design, two hydrophobic pentapeptides, Boc-Ile-Aib-Leu-Phe-Ala-OMe (I) and Boc-Gly-Ile-Aib-Leu-Phe-OMe (II) (Aib: alpha-aminoisobutyric acid), each containing one Aib residue, are found to undergo solvent-assisted self-assembly in methanol/water to form vesicular structures, which can be disrupted by simple addition of acid. The nanovesicles are found to encapsulate dye molecules that can be released by the addition of acid, as confirmed by fluorescence microscopy and UV studies. The influence of solvent polarity on the morphology of the materials generated from the peptides has been examined systematically, and shows that fibrillar structures are formed in the less polar chloroform/petroleum ether mixture and vesicular structures are formed in the more polar methanol/water mixture. Single-crystal X-ray diffraction studies reveal that while beta-sheet-mediated self-assembly leads to the formation of fibrillar structures, the solvated beta-sheet structure leads to the formation of vesicular structures. The results demonstrate that even hydrophobic peptides can generate vesicular structures in polar solvents, which may be employed in model studies of complex biological phenomena.

Relevance: 20.00%

Abstract:

Numerous Building Information Modelling (BIM) tools are well established and potentially beneficial in certain uses. However, issues of adoption and implementation persist, particularly for on-site use of BIM tools in the construction phase. We describe an empirical case-study of the implementation of an innovative ‘Site BIM’ system on a major hospital construction project. The main contractor on the project developed BIM-enabled tools to allow site workers using mobile tablet personal computers to access design information and to capture work quality and progress data on-site. Accounts show that ‘Site BIM’, while judged to be successful and actively supporting users, was delivered through an exploratory and emergent development process of informal prototyping. Technical IT skills were adopted into the construction project through personal relationships and arrangements rather than formal processes. Implementation was driven by construction project employees rather than controlled centrally by the corporate IT function.

Relevance: 20.00%

Abstract:

High-resolution simulations over a large tropical domain (∼20°S–20°N and 42°E–180°E) using both explicit and parameterized convection are analyzed and compared to observations during a 10-day case study of an active Madden-Julian Oscillation (MJO) event. The parameterized convection model simulations at both 40 km and 12 km grid spacing have a very weak MJO signal and little eastward propagation. A 4 km explicit convection simulation using Smagorinsky subgrid mixing in the vertical and horizontal dimensions exhibits the best MJO strength and propagation speed. 12 km explicit convection simulations also perform much better than the 12 km parameterized convection run, suggesting that the convection scheme, rather than horizontal resolution, is key for these MJO simulations. Interestingly, a 4 km explicit convection simulation using the conventional boundary layer scheme for vertical subgrid mixing (but still using Smagorinsky horizontal mixing) completely loses the large-scale MJO organization, showing that relatively high resolution with explicit convection does not guarantee a good MJO simulation. Models with a good MJO representation have a more realistic relationship between lower-free-tropospheric moisture and precipitation, supporting the idea that moisture-convection feedback is a key process for MJO propagation. There is also increased generation of available potential energy and conversion of that energy into kinetic energy in models with a more realistic MJO, which is related to larger zonal variance in convective heating and vertical velocity, larger zonal temperature variance around 200 hPa, and larger correlations between temperature and ascent (and between temperature and diabatic heating) between 500 and 400 hPa.
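The "relationship between lower-free-tropospheric moisture and precipitation" mentioned above is commonly examined by binning rain rate by a lower-tropospheric humidity measure and comparing the resulting curves between model and observations. The sketch below illustrates that kind of diagnostic with synthetic data; the humidity layer, bin widths and the assumed pickup of rain at high humidity are placeholders.

```python
# Sketch of a moisture-precipitation diagnostic: bin rain rate by a
# lower-free-tropospheric humidity measure and compute the mean rain per bin.
# The synthetic "pickup" of rain at high humidity is an assumption used only
# to make the example produce a recognisable curve.
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
lft_rh = rng.uniform(10.0, 95.0, n)              # e.g. 850-500 hPa mean RH (%)

scale = np.clip(0.05 * (lft_rh - 70.0), 0.01, None)
rain = np.where(lft_rh > 70.0, rng.gamma(2.0, scale), 0.0)   # mm/h

bins = np.arange(10, 100, 5)
idx = np.digitize(lft_rh, bins)
for i in range(1, len(bins)):
    sel = idx == i
    if sel.any():
        print(f"RH {bins[i - 1]:2d}-{bins[i]:2d}%  mean rain {rain[sel].mean():6.3f} mm/h")
```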

Relevance: 20.00%

Abstract:

Atmospheric Rivers (ARs), narrow plumes of enhanced moisture transport in the lower troposphere, are a key synoptic feature behind winter flooding in midlatitude regions. This article develops an algorithm which uses the spatial and temporal extent of the vertically integrated horizontal water vapor transport for the detection of persistent ARs (lasting 18 h or longer) in five atmospheric reanalysis products. Applying the algorithm to the different reanalyses in the vicinity of Great Britain during the winter half-years of 1980–2010 (31 years) demonstrates generally good agreement of AR occurrence between the products. The relationship between persistent AR occurrences and winter floods is demonstrated using winter peaks-over-threshold (POT) floods (with on average one flood peak per winter). In the nine study basins, the proportion of winter POT-1 floods associated with persistent ARs ranged from approximately 40% to 80%. A Poisson regression model was used to describe the relationship between the number of ARs in the winter half-years and the large-scale climate variability. A significant negative dependence was found between AR totals and the Scandinavian Pattern (SCP), with a greater frequency of ARs associated with lower SCP values.
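Detection hinges on the vertically integrated horizontal water vapour transport (IVT). The sketch below shows how IVT is typically computed from specific humidity and horizontal winds on pressure levels, followed by a crude "at least 18 h above threshold" persistence test on 6-hourly data; the threshold value and the synthetic single-point inputs are placeholders rather than the paper's detection criteria.

```python
# Sketch: vertically integrated water vapour transport (IVT) from specific
# humidity q and winds (u, v) on pressure levels, plus a simple persistence
# test for AR-like conditions on 6-hourly data. The threshold and synthetic
# inputs are assumptions, not the published detection criteria.
import numpy as np

G = 9.81                                          # gravity, m s-2

def ivt_magnitude(q, u, v, p_pa):
    """IVT (kg m-1 s-1); q, u, v have shape (levels, time), p_pa (levels,)."""
    dp = np.abs(np.diff(p_pa))[:, None]           # layer thickness in Pa
    qu = ((q * u)[:-1] + (q * u)[1:]) / 2.0 * dp  # trapezoidal layer transport
    qv = ((q * v)[:-1] + (q * v)[1:]) / 2.0 * dp
    return np.hypot(qu.sum(axis=0), qv.sum(axis=0)) / G

# Synthetic 6-hourly profiles at a single grid point for one month.
rng = np.random.default_rng(3)
p = np.array([1000.0, 925.0, 850.0, 700.0, 500.0, 300.0]) * 100.0  # Pa
nt = 4 * 30
q = rng.uniform(0.001, 0.012, (p.size, nt))       # specific humidity, kg kg-1
u = rng.normal(10.0, 8.0, (p.size, nt))           # zonal wind, m s-1
v = rng.normal(5.0, 8.0, (p.size, nt))            # meridional wind, m s-1

ivt = ivt_magnitude(q, u, v, p)

# Persistence: IVT above the threshold for >= 18 h, i.e. at least three
# consecutive 6-hourly time steps.
THRESHOLD = 500.0                                 # kg m-1 s-1 (placeholder)
above = ivt >= THRESHOLD
starts = [t for t in range(nt - 2) if above[t:t + 3].all()]
print(f"time steps starting a persistent AR-like episode: {starts}")
```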

Relevance: 20.00%

Abstract:

We compare the characteristics of synthetic European droughts generated by the HiGEM¹ coupled climate model run with present day atmospheric composition with observed drought events extracted from the CRU TS3 data set. The results demonstrate consistency in both the rate of drought occurrence and the spatiotemporal structure of the events. Estimates of the probability density functions for event area, duration and severity are shown to be similar with confidence > 90%. Encouragingly, HiGEM is shown to replicate the extreme tails of the observed distributions and thus the most damaging European drought events. The soil moisture state is shown to play an important role in drought development. Once a large-scale drought has been initiated it is found to be 50% more likely to continue if the local soil moisture is below the 40th percentile. In response to increased concentrations of atmospheric CO2, the modelled droughts are found to increase in duration, area and severity. The drought response can be largely attributed to temperature driven changes in relative humidity.

¹ HiGEM is based on the latest climate configuration of the Met Office Hadley Centre Unified Model (HadGEM1) with the horizontal resolution increased to 1.25 x 0.83 degrees in longitude and latitude in the atmosphere and 1/3 x 1/3 degrees in the ocean.
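The "50% more likely to continue" statement is a conditional probability, and the sketch below shows how such a figure can be estimated from time series of a drought indicator and soil moisture; the toy persistence process and the 40th-percentile split are only illustrative of the calculation, not of HiGEM output.

```python
# Sketch of estimating how soil moisture conditions drought persistence:
# compare P(drought continues | soil moisture below 40th percentile) with
# P(drought continues | soil moisture above it). The toy series below are
# assumptions standing in for model soil moisture and drought indicators.
import numpy as np

rng = np.random.default_rng(4)
nt = 5000
soil = rng.uniform(0.0, 1.0, nt)                 # soil-moisture index
q40 = np.quantile(soil, 0.4)

# Toy drought process: droughts start at random and persist more readily
# when the soil is dry (this built-in behaviour is what we then estimate).
drought = np.zeros(nt, dtype=bool)
for t in range(1, nt):
    p_cont = 0.75 if soil[t] < q40 else 0.50
    cont = drought[t - 1] and rng.random() < p_cont
    drought[t] = cont or rng.random() < 0.05

ongoing = drought[:-1]
continues = drought[1:]
dry = soil[1:] < q40

p_dry = continues[ongoing & dry].mean()
p_wet = continues[ongoing & ~dry].mean()
print(f"P(continue | dry soil)    = {p_dry:.2f}")
print(f"P(continue | wetter soil) = {p_wet:.2f}")
print(f"relative increase         = {100.0 * (p_dry / p_wet - 1.0):.0f}%")
```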

Relevance: 20.00%

Abstract:

Droughts tend to evolve slowly and affect large areas simultaneously, which suggests that improved understanding of the spatial coherence of drought would enable better mitigation of drought impacts through enhanced monitoring and forecasting strategies. This study employs an up-to-date dataset of over 500 river flow time series from 11 European countries, along with a gridded precipitation dataset, to examine the spatial coherence of drought in Europe using regional indicators of precipitation and streamflow deficit. The drought indicators were generated for 24 homogeneous regions and, for selected regions, historical drought characteristics were corroborated with previous work. The spatial coherence of drought characteristics was then examined at a European scale. Historical droughts generally have distinctive signatures in their spatio-temporal development, so there was limited scope for using the evolution of historical events to inform forecasting. Rather, relationships were explored between time series of drought indicators in different regions. Correlations were generally low, but multivariate analyses revealed broad continental-scale patterns, which appear to be related to large-scale atmospheric circulation indices (in particular, the North Atlantic Oscillation and the East Atlantic/West Russia pattern). A novel methodology for forecasting was developed (and demonstrated with reference to the United Kingdom), which predicts drought from drought, i.e. it uses the spatial coherence of drought to facilitate early warning in a target region from drought which is developing elsewhere in Europe. Whilst the skill of the methodology is relatively modest at present, this approach presents a potential new avenue for forecasting, which offers significant advantages in that it allows prediction for all seasons, and also shows some potential for forecasting the termination of drought conditions.
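The "predict drought from drought" idea amounts to checking whether a drought indicator in one region leads the indicator in a target region. A minimal sketch using lagged correlations between two synthetic regional indicator series is given below; the series, the lag range and the built-in three-month lead are assumptions for illustration.

```python
# Sketch of "forecasting drought from drought": correlate a drought indicator
# in a source region with the indicator in a target region at a range of
# lags, to see whether the source leads the target. Synthetic indicator
# series stand in for the regional precipitation/streamflow deficit indices.
import numpy as np

rng = np.random.default_rng(5)
nt = 600                                        # e.g. monthly values, 50 years
source = np.convolve(rng.normal(size=nt), np.ones(6) / 6, mode="same")
target = np.roll(source, 3) + 0.7 * rng.normal(size=nt)   # lags source by ~3

def lagged_corr(x, y, lag):
    """Correlation of x(t) with y(t + lag) for lag >= 0."""
    if lag == 0:
        return np.corrcoef(x, y)[0, 1]
    return np.corrcoef(x[:-lag], y[lag:])[0, 1]

for lag in range(0, 7):
    print(f"lag {lag} months: r = {lagged_corr(source, target, lag):+.2f}")
```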

Relevance: 20.00%

Abstract:

Advances in hardware and software technology enable us to collect, store and distribute large quantities of data on a very large scale. Automatically discovering and extracting hidden knowledge in the form of patterns from these large data volumes is known as data mining. Data mining technology is not only a part of business intelligence, but is also used in many other application areas such as research, marketing and financial analytics. For example, medical scientists can use patterns extracted from historic patient data to determine whether a new patient is likely to respond positively to a particular treatment; marketing analysts can use extracted patterns from customer data for future advertisement campaigns; finance experts have an interest in patterns that forecast the development of certain stock market shares for investment recommendations. However, extracting knowledge in the form of patterns from massive data volumes imposes a number of computational challenges in terms of processing time, memory, bandwidth and power consumption. These challenges have led to the development of parallel and distributed data analysis approaches and the utilisation of Grid and Cloud computing. This chapter gives an overview of parallel and distributed computing approaches and how they can be used to scale up data mining to large datasets.