70 results for diffractive methodology


Relevance: 20.00%

Abstract:

Response surface methodology was used to study the effect of temperature, cutting time, and calcium chloride addition level on curd moisture content, whey fat losses, and curd yield. Coagulation and syneresis were continuously monitored using 2 optical sensors detecting light backscatter. The effect of the factors on the sensors’ response was also examined. Retention of fat during cheese making was found to be a function of cutting time and temperature, whereas curd yield was found to be a function of those 2 factors and the level of calcium chloride addition. The main effect of temperature on curd moisture was to increase the rate at which whey was expelled. Temperature and calcium chloride addition level were also found to affect the light backscatter profile during coagulation whereas the light backscatter profile during syneresis was a function of temperature and cutting time. The results of this study suggest that there is an optimum firmness at which the gel should be cut to achieve maximum retention of fat and an optimum curd moisture content to maximize product yield and quality. It was determined that to maximize curd yield and quality, it is necessary to maximize firmness while avoiding rapid coarsening of the gel network and microsyneresis. These results could contribute to the optimization of the cheese-making process.
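The abstract does not reproduce the fitted model, so the snippet below is only a minimal sketch of how a second-order response surface of the kind used in such studies can be fitted; the coded factor levels (temperature, cutting time, CaCl2 addition), the response values and the coefficient structure are invented for illustration, not the authors' data.

```python
# Minimal sketch of a second-order response-surface fit on hypothetical data.
import itertools
import numpy as np

# Hypothetical coded design points (-1, 0, +1) for the three factors
X = np.array(list(itertools.product([-1, 0, 1], repeat=3)), dtype=float)
rng = np.random.default_rng(0)
y = 10 + 1.5 * X[:, 0] - 0.8 * X[:, 1] + 0.3 * X[:, 2] \
    - 0.5 * X[:, 0] * X[:, 1] + rng.normal(0, 0.1, len(X))   # fake curd yields

def quadratic_terms(x):
    """Full second-order model: intercept, linear, two-way interactions, squares."""
    t, c, ca = x
    return [1, t, c, ca, t * c, t * ca, c * ca, t * t, c * c, ca * ca]

A = np.array([quadratic_terms(row) for row in X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("fitted response-surface coefficients:", np.round(coef, 3))
```

The fitted coefficients would then be used to locate the optimum, e.g. the cutting firmness that maximises fat retention.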

Relevance: 20.00%

Abstract:

A new electronic software distribution (ESD) life cycle analysis (LCA) methodology and model structure were constructed to calculate energy consumption and greenhouse gas (GHG) emissions. In contrast to high-level, top-down modeling efforts, and to increase result accuracy, the focus was placed on device details and data routes. To allow ESD to be compared with a relevant physical distribution alternative, physical model boundaries and variables were also described. The methodology was compiled from the analysis and operational data of a major online store which provides both ESD and physical distribution options. The ESD method included the calculation of the power consumption of data center server and networking devices. An in-depth method to calculate server efficiency and utilization was also included to account for virtualization and server efficiency features. Internet transfer power consumption was analyzed taking into account the number of data hops and networking devices used. The power consumed by online browsing and downloading was also factored into the model. The embedded CO2e of server and networking devices was apportioned to each ESD process. Three U.K.-based ESD scenarios were analyzed using the model, which revealed potential CO2e savings of 83% when ESD was used instead of physical distribution. The results also highlighted the importance of server efficiency and utilization methods.
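The paper's model is not reproduced in the abstract, so the following is a hedged, bottom-up sketch of the kind of per-download accounting it describes: data-center, network-hop and end-user energy summed and converted to CO2e. Every parameter value (power ratings, hop count, emission factor) is an assumption chosen for illustration, not data from the study.

```python
# Rough bottom-up sketch of per-download ESD energy and CO2e; all values assumed.

def esd_energy_kwh(file_gb, server_w=400.0, server_utilisation=0.5,
                   server_gbps=1.0, kwh_per_gb_per_hop=0.002, hops=12,
                   client_w=60.0, download_hours=0.2):
    # Server energy allocated to serving this file, scaled by utilisation
    serve_hours = file_gb * 8 / (server_gbps * 3600)
    server_kwh = (server_w / 1000.0) * serve_hours / server_utilisation
    # Network transfer energy, modelled per GB per routing hop
    network_kwh = file_gb * kwh_per_gb_per_hop * hops
    # End-user device energy while browsing and downloading
    client_kwh = (client_w / 1000.0) * download_hours
    return server_kwh + network_kwh + client_kwh

grid_kg_co2e_per_kwh = 0.5   # assumed UK-style grid emission factor
energy = esd_energy_kwh(file_gb=4.0)
print(f"{energy:.3f} kWh, {energy * grid_kg_co2e_per_kwh:.3f} kg CO2e per download")
```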

Relevance: 20.00%

Abstract:

A new approach to the study of the local organization in amorphous polymer materials is presented. The method couples neutron diffraction experiments that explore the structure on the spatial scale of 1–20 Å with the reverse Monte Carlo fitting procedure to predict structures that accurately represent the experimental scattering results over the whole momentum transfer range explored. Molecular mechanics and molecular dynamics techniques are also used to produce atomistic models independently of any experimental input, thereby providing a test of the viability of the reverse Monte Carlo method in generating realistic models for amorphous polymeric systems. An analysis of the obtained models in terms of single-chain properties and of orientational correlations between chain segments is presented. We show the viability of the method with data from molten polyethylene. The analysis derives a model with average C-C and C-H bond lengths of 1.55 Å and 1.1 Å, respectively, an average backbone valence angle of 112°, a torsional angle distribution characterized by a fraction of trans conformers of 0.67 and, finally, a weak interchain orientational correlation at around 4 Å.
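As an illustration of the reverse Monte Carlo idea (not the authors' implementation), the sketch below runs single accept/reject moves against a target structure factor. The toy compute_sq here is a crude Debye-formula stand-in for a real S(Q) calculation, and all coordinates and settings are invented.

```python
# Schematic reverse Monte Carlo: displace one atom, accept/reject on chi^2 vs S(Q).
import numpy as np

rng = np.random.default_rng(1)

def compute_sq(positions, q):
    """Toy stand-in for a structure-factor calculation (Debye-like pair sum)."""
    diffs = positions[:, None, :] - positions[None, :, :]
    r = np.sqrt((diffs ** 2).sum(-1))[np.triu_indices(len(positions), k=1)]
    return np.array([np.sum(np.sinc(qi * r / np.pi)) for qi in q]) / len(positions)

def chi2(model_sq, expt_sq, sigma=0.05):
    return float(np.sum((model_sq - expt_sq) ** 2) / sigma ** 2)

def rmc_step(positions, expt_sq, q, max_move=0.2, temperature=1.0):
    """One RMC move: random displacement, Metropolis-style acceptance on chi^2."""
    old_chi2 = chi2(compute_sq(positions, q), expt_sq)
    trial = positions.copy()
    trial[rng.integers(len(trial))] += rng.uniform(-max_move, max_move, size=3)
    new_chi2 = chi2(compute_sq(trial, q), expt_sq)
    if new_chi2 < old_chi2 or rng.random() < np.exp(-(new_chi2 - old_chi2) / (2 * temperature)):
        return trial, new_chi2
    return positions, old_chi2

q = np.linspace(0.5, 10.0, 40)                        # momentum transfer grid (1/Å)
target = compute_sq(rng.uniform(0, 10, (50, 3)), q)   # pretend 'experimental' S(Q)
model = rng.uniform(0, 10, (50, 3))
for _ in range(200):
    model, fit = rmc_step(model, target, q)
print("final chi^2:", round(fit, 1))
```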

Relevance: 20.00%

Abstract:

A statistical methodology is proposed and tested for the analysis of extreme values of atmospheric wave activity at mid-latitudes. The adopted methods are the classical block-maximum and peak-over-threshold approaches, respectively based on the generalized extreme value (GEV) distribution and the generalized Pareto distribution (GPD). Time series of the ‘Wave Activity Index’ (WAI) and the ‘Baroclinic Activity Index’ (BAI) are computed from simulations of the General Circulation Model ECHAM4.6, which is run under perpetual January conditions. Both the GEV and the GPD analyses indicate that the extremes of WAI and BAI are Weibull distributed, which corresponds to distributions with an upper bound. However, a remarkably large variability is found in the tails of such distributions; distinct simulations carried out under the same experimental setup provide appreciably different estimates of the 200-yr WAI return level. The consequences of this phenomenon for applications of the methodology to climate change studies are discussed. The atmospheric configurations characteristic of the maxima and minima of WAI and BAI are also examined.
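A minimal sketch of the block-maximum branch of such an analysis, run on a synthetic WAI series rather than the model output, is shown below: fit a GEV with scipy and read off the 200-yr return level. The peak-over-threshold/GPD branch would proceed analogously with scipy.stats.genpareto.

```python
# Block-maximum GEV fit and 200-yr return level on synthetic block maxima.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
wai_block_maxima = rng.gumbel(loc=100.0, scale=10.0, size=200)   # fake block maxima

# Fit the GEV; note scipy's shape c has the opposite sign to the usual xi
# convention, so a bounded (Weibull-type) tail appears here as c > 0.
c, loc, scale = genextreme.fit(wai_block_maxima)

# The T-year return level is the value exceeded with probability 1/T per block
return_level_200 = genextreme.isf(1.0 / 200.0, c, loc, scale)
print(f"shape={c:.3f}, loc={loc:.2f}, scale={scale:.2f}, 200-yr level={return_level_200:.1f}")
```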

Relevance: 20.00%

Abstract:

This paper examines the changes in the length of commercial property leases over the last decade and presents an analysis of the consequent investment and occupational pricing implications for commercial property investments. It is argued that the pricing implications of a short lease to an investor are contingent upon the expected costs of the letting termination to the investor, the probability that the letting will be terminated, and the volatility of rental values. The paper examines the key factors influencing these variables and presents a framework for incorporating their effects into pricing models. Approaches to their valuation derived from option pricing are critically assessed. It is argued that such models also tend to neglect the price effects of specific risk factors such as tenant circumstances and the terms of the break clause. Specific risk factors have a significant bearing on the probability of letting termination and on the level of the resultant financial losses. The merits of a simulation methodology are examined for rental and capital valuations of short leases and properties with break clauses. It is concluded that, in addition to the rigour of its internal logic, the success of any methodology is predicated upon the accuracy of its inputs. The lack of reliable data on patterns in, and incidence of, lease termination and the lack of reliable time series of historic property performance limit the efficacy of financial models.
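To make the simulation argument concrete, here is a hedged Monte Carlo sketch (not the paper's model) of valuing a letting with a break clause: break probability, void period, rental-value volatility and discount rate are all hypothetical inputs, and the rental-value process is a simple lognormal random walk.

```python
# Illustrative Monte Carlo valuation of a 5-year letting with a year-3 break clause.
import numpy as np

rng = np.random.default_rng(7)

def simulate_pv(rent=100.0, years=5, break_year=3, p_break=0.3,
                void_years=1.0, rent_vol=0.1, discount=0.07, n_sims=10_000):
    pvs = np.empty(n_sims)
    for s in range(n_sims):
        cash = np.zeros(years)
        cash[:break_year] = rent
        if rng.random() < p_break:
            # Tenant exercises the break: void period, then re-let at the then
            # prevailing rental value drawn from a volatile (lognormal) process
            new_rent = rent * np.exp(rng.normal(-0.5 * rent_vol**2 * break_year,
                                                rent_vol * np.sqrt(break_year)))
            relet_from = break_year + int(round(void_years))
            cash[relet_from:] = new_rent
        else:
            cash[break_year:] = rent
        t = np.arange(1, years + 1)
        pvs[s] = np.sum(cash / (1 + discount) ** t)
    return pvs.mean()

print(f"expected PV of income: {simulate_pv():.1f}")
```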

Relevance: 20.00%

Abstract:

Blood clotting response (BCR) resistance tests are available for a number of anticoagulant rodenticides. However, during the development of these tests many of the test parameters have been changed, making meaningful comparisons between results difficult. It was recognised that a standard methodology was urgently required for future BCR resistance tests and, accordingly, this document presents a reappraisal of published tests, and proposes a standard protocol for future use (see Appendix). The protocol can be used to provide information on the incidence and degree of resistance in a particular rodent population; to provide a simple comparison of resistance factors between active ingredients, thus giving clear information about cross-resistance for any given strain; and to provide comparisons of susceptibility or resistance between different populations. The methodology has a sound statistical basis, being founded on the ED50 response, and requires far fewer animals than the resistance tests in current use. Most importantly, the tests can be used to give a clear indication of the likely practical impact of the resistance on field efficacy. The present study was commissioned and funded by the Rodenticide Resistance Action Committee (RRAC) of CropLife International.
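The ED50 basis of such a protocol can be illustrated with a simple dose-response fit; the doses and response counts below are invented, and the logistic curve is just one common choice rather than the protocol's prescribed analysis.

```python
# Fit a logistic dose-response curve to hypothetical responder counts and read off ED50.
import numpy as np
from scipy.optimize import curve_fit

doses = np.array([0.5, 1.0, 2.0, 4.0, 8.0])      # mg/kg, hypothetical
n_tested = np.array([10, 10, 10, 10, 10])
n_responding = np.array([1, 3, 5, 8, 10])
prop = n_responding / n_tested

def logistic(log_dose, log_ed50, slope):
    return 1.0 / (1.0 + np.exp(-slope * (log_dose - log_ed50)))

params, _ = curve_fit(logistic, np.log(doses), prop, p0=[np.log(2.0), 1.0])
ed50 = np.exp(params[0])
# A resistance factor would then be ED50(test strain) / ED50(susceptible strain)
print(f"estimated ED50 ~ {ed50:.2f} mg/kg")
```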

Relevance: 20.00%

Abstract:

Maincrop potato yields in Scotland have increased by 30–35 t ha-1 since 1960 as a result of many changes, but has changing climate contributed anything to this? The purpose of this work was to answer this question. Daily weather data for the period 1960–2006 were analysed for five locations covering the zones of potato growing on the east coast of Scotland (between 55.213 and 57.646°N) to determine trends in temperature, rainfall and solar radiation. A physiologically based potato yield model was validated using data obtained from a long-term field trial in eastern Scotland and then employed to simulate crop development and potential yield at each of the five sites. Over the 47 years, there were significant increases in annual air and 30 cm soil temperatures (0.27 and 0.30 K decade-1, respectively), but no significant changes in annual precipitation or in the timing of the last frost in spring and the first frost of autumn. There was no evidence of any north to south gradient of warming. Simulated emergence and canopy closure became earlier at all five sites over the period, with the advance being greater in the north (3.7 and 3.6 days decade-1, respectively) than in the south (0.5 and 0.8 days decade-1, respectively). Potential yield increased with time, generally reflecting the increased duration of the green canopy, at average rates of 2.8 t ha-1 decade-1 for chitted seed (sprouted prior to planting) and 2.5 t ha-1 decade-1 for unchitted seed. The measured warming could contribute potential yield increases of up to 13.2 t ha-1 for chitted potato (range 7.1–19.3 t ha-1) and 11.5 t ha-1 for unchitted potato (range 7.1–15.5 t ha-1), equivalent to 34–39% of the increased potential yield over the period or 23–26% of the increase in actual measured yields.
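For readers wanting to reproduce the style of trend estimate quoted above (e.g. a K-per-decade air-temperature trend), a minimal regression sketch on a synthetic annual series is given below; it uses neither the study's data nor its code.

```python
# Decadal warming trend via ordinary least-squares regression on a synthetic series.
import numpy as np
from scipy.stats import linregress

years = np.arange(1960, 2007)
rng = np.random.default_rng(3)
annual_mean_temp = 8.0 + 0.027 * (years - 1960) + rng.normal(0, 0.4, len(years))

fit = linregress(years, annual_mean_temp)
print(f"trend = {fit.slope * 10:.2f} K per decade (p = {fit.pvalue:.3f})")
```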

Relevance: 20.00%

Abstract:

Salmonella enterica serotypes Derby, Mbandaka, Montevideo, Livingstone, and Senftenberg were among the 10 most prevalent serotypes isolated from farm animals in England and Wales in 1999. These serotypes are of potential zoonotic relevance; however, there is currently no "gold standard" fingerprinting method for them. A collection of isolates representing the former serotypes and serotype Gold Coast were analyzed using plasmid profiling, pulsed-field gel electrophoresis (PFGE), and ribotyping. The success of the molecular methods in identifying DNA polymorphisms was different for each serotype. Plasmid profiling was particularly useful for serotype Derby isolates, and it also provided a good level of discrimination for serotype Senftenberg. For most serotypes, we observed a number of nontypeable plasmid-free strains, which represents a limitation of this technique. Fingerprinting of genomic DNA by ribotyping and PFGE produced a significant variation in results, depending on the serotype of the strain. Both PstI/SphI ribotyping and XbaI-PFGE provided a similar degree of strain differentiation for serotype Derby and serotype Senftenberg, only marginally lower than that achieved by plasmid profiling. Ribotyping was less sensitive than PFGE when applied to serotype Mbandaka or serotype Montevideo. Serotype Gold Coast isolates were found to be nontypeable by XbaI-PFGE, and a significant proportion of them were found to be plasmid free. A similar situation applies to a number of serotype Livingstone isolates which were nontypeable by plasmid profiling and/or PFGE. In summary, the serotype of the isolates has a considerable influence in deciding the best typing strategy; a single method cannot be relied upon for discriminating between strains, and a combination of typing methods allows further discrimination.

Relevance: 20.00%

Abstract:

Inducing rules from very large datasets is one of the most challenging areas in data mining. Several approaches exist for scaling up classification rule induction to large datasets, namely data reduction and the parallelisation of classification rule induction algorithms. In the area of parallelisation of classification rule induction algorithms, most of the work has concentrated on the Top Down Induction of Decision Trees (TDIDT), also known as the ‘divide and conquer’ approach. However, powerful alternative algorithms exist that induce modular rules. Most of these alternative algorithms follow the ‘separate and conquer’ approach to inducing rules, but very little work has been done to make the ‘separate and conquer’ approach scale better on large training data. This paper examines the potential of the recently developed blackboard-based J-PMCRI methodology for parallelising modular classification rule induction algorithms that follow the ‘separate and conquer’ approach. A concrete implementation of the methodology is evaluated empirically on very large datasets.
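To clarify the ‘separate and conquer’ strategy that J-PMCRI parallelises, here is a deliberately simplified, sequential covering sketch on a toy dataset; it is not Prism, PMCRI or the blackboard architecture itself, and it learns only single-condition rules.

```python
# Toy 'separate and conquer' (covering) loop: learn a rule, remove covered examples, repeat.

def learn_one_rule(examples, target):
    """Greedily pick the attribute-value test with the highest precision for the target class."""
    best, best_prec = None, -1.0
    attributes = [a for a in examples[0] if a != "class"]
    for attr in attributes:
        for value in {e[attr] for e in examples}:
            covered = [e for e in examples if e[attr] == value]
            prec = sum(e["class"] == target for e in covered) / len(covered)
            if prec > best_prec:
                best, best_prec = (attr, value), prec
    return best

def separate_and_conquer(examples, target):
    rules, remaining = [], list(examples)
    while any(e["class"] == target for e in remaining):
        rule = learn_one_rule(remaining, target)
        rules.append(rule)
        remaining = [e for e in remaining if e[rule[0]] != rule[1]]   # 'separate'
    return rules

data = [{"outlook": "sunny", "windy": "no", "class": "play"},
        {"outlook": "sunny", "windy": "yes", "class": "stay"},
        {"outlook": "rain", "windy": "no", "class": "play"},
        {"outlook": "rain", "windy": "yes", "class": "stay"}]
print(separate_and_conquer(data, "play"))
```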

Relevance: 20.00%

Abstract:

The Twitter network has been labelled the most commonly used microblogging application today. With an estimated 500 million registered users as of June 2012, Twitter has become a credible medium of sentiment and opinion expression. It has also been a notable medium for information dissemination, including breaking news on diverse issues, since it was launched in 2007. Many organisations, individuals and even government bodies follow activities on the network in order to learn how their audience reacts to tweets that affect them. We can use postings on Twitter (known as tweets) to analyse patterns associated with events by detecting the dynamics of the tweets. A common way of labelling a tweet is to include a number of hashtags that describe its contents. Association Rule Mining can find the likelihood of co-occurrence of hashtags. In this paper, we propose the use of temporal Association Rule Mining to detect rule dynamics, and consequently the dynamics of tweets. We have coined our methodology Transaction-based Rule Change Mining (TRCM). A number of patterns are identifiable in these rule dynamics, including new rules, emerging rules, unexpected rules and ‘dead’ rules. The linkage between the different types of rule dynamics is also investigated experimentally in this paper.
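A stripped-down sketch of the rule-comparison idea behind TRCM is given below: mine pairwise hashtag association rules in two consecutive time windows and diff the rule sets to flag ‘new’ and ‘dead’ rules. The thresholds, the toy tweets and the restriction to single-antecedent rules are simplifications, not the paper's definitions, and emerging/unexpected rules are not graded here.

```python
# Compare hashtag association rules across two time windows to flag new and dead rules.
from itertools import permutations
from collections import Counter

def mine_rules(tweets, min_support=2, min_conf=0.6):
    """Pairwise hashtag rules (A -> B) passing simple support/confidence thresholds."""
    item_counts = Counter(tag for t in tweets for tag in set(t))
    pair_counts = Counter(p for t in tweets for p in permutations(set(t), 2))
    return {(a, b) for (a, b), n in pair_counts.items()
            if n >= min_support and n / item_counts[a] >= min_conf}

window1 = [{"#election", "#debate"}, {"#election", "#debate"}, {"#music"}]
window2 = [{"#election", "#results"}, {"#election", "#results"}, {"#music"}]

old_rules, new_rules = mine_rules(window1), mine_rules(window2)
print("new rules:", new_rules - old_rules)    # appear only in the later window
print("dead rules:", old_rules - new_rules)   # present before, gone now
```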

Relevance: 20.00%

Abstract:

We present a simple sieving methodology to aid the recovery of large cultigen pollen grains, such as maize (Zea mays L.), manioc (Manihot esculenta Crantz), and sweet potato (Ipomoea batatas L.), among others, for the detection of food production using fossil pollen analysis of lake sediments in the tropical Americas. The new methodology was tested on three large study lakes located next to known and/or excavated pre-Columbian archaeological sites in South and Central America. Five paired samples, one treated by sieving, the other prepared using standard methodology, were compared for each of the three sites. Using the new methodology, chemically digested sediment samples were passed through a 53 µm sieve, and the residue was retained, mounted in silicone oil, and counted for large cultigen pollen grains. The filtrate was mounted and analysed for pollen according to standard palynological procedures. Zea mays (L.) was recovered from the sediments of all three study lakes using the sieving technique, where no cultigen pollen had been previously recorded using the standard methodology. Confidence intervals demonstrate there is no significant difference in pollen assemblages between the sieved versus unsieved samples. Equal numbers of exotic Lycopodium spores added to both the filtrate and residue of the sieved samples allow for direct comparison of cultigen pollen abundance with the standard terrestrial pollen count. Our technique enables the isolation and rapid scanning for maize and other cultigen pollen in lake sediments, which, in conjunction with charcoal and pollen records, is key to determining land-use patterns and the environmental impact of pre-Columbian societies.
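The role of the equal Lycopodium spikes can be shown with the standard exotic-marker arithmetic: concentration = (grains counted × spores added) / (spores counted × sediment volume). Because the same spike goes into both fractions, the residue and filtrate counts end up on the same scale; the worked numbers below are invented.

```python
# Exotic-marker (Lycopodium spike) concentration arithmetic with invented counts.
def concentration(grains_counted, spike_added, spike_counted, sediment_cm3):
    return grains_counted * spike_added / (spike_counted * sediment_cm3)

spike_added = 10_000          # Lycopodium spores added to each fraction
maize_residue = concentration(grains_counted=4, spike_added=spike_added,
                              spike_counted=250, sediment_cm3=1.0)
terrestrial_filtrate = concentration(grains_counted=300, spike_added=spike_added,
                                     spike_counted=500, sediment_cm3=1.0)
print(maize_residue, terrestrial_filtrate)   # grains per cm^3, directly comparable
```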