947 results for Distribution generation


Relevance: 20.00%

Abstract:

This paper provides a summary of what is known from social science research about the effects parents have on the charitable donations of their children. It then summarizes two ongoing research projects. The first provides estimates of the strength of the relationship between the charitable giving of parents and that of their adult children. The second provides estimates of the effect of inheritances on charitable donations. Both projects use data from the Center on Philanthropy Panel Study (COPPS); accordingly, the paper provides an introduction to these data. Finally, the paper draws implications for fundraisers from the two ongoing projects, and suggests several other areas in which COPPS can generate knowledge to improve the practice of fundraising.

Relevance: 20.00%

Abstract:

This study examines unsteady natural convection inside a triangular cavity filled with a saturated porous medium. The left inclined wall is non-isothermal, the bottom surface is isothermally heated, and the right inclined surface is isothermally cooled. Internal heat generation, dependent on the fluid temperature, is also considered. The governing equations are solved numerically by the finite volume method. The Prandtl number Pr of the fluid is taken as 0.7 (air), while the aspect ratio and the Rayleigh number Ra are 0.5 and 10^5 respectively. The effect of heat generation on the fluid flow and heat transfer is presented in the form of streamlines and isotherms. The rate of heat transfer through the three surfaces of the enclosure is also presented.
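For orientation, a plausible dimensionless form of the governing equations for this class of problem is sketched below, assuming Darcy flow in the porous medium, the Boussinesq approximation, and a heat-generation source proportional to the fluid temperature; the abstract does not state the exact formulation, so the streamfunction form and the Q·θ source term are assumptions.

```latex
% A minimal sketch, assuming Darcy flow and the Boussinesq approximation
% (the abstract does not give the exact formulation). \psi is the
% streamfunction, \theta the dimensionless temperature, Ra the Rayleigh
% number, and Q an assumed heat-generation coefficient for a source
% proportional to \theta.
\begin{align}
  \frac{\partial^{2}\psi}{\partial X^{2}}
    + \frac{\partial^{2}\psi}{\partial Y^{2}}
    &= -\,\mathrm{Ra}\,\frac{\partial\theta}{\partial X},\\[4pt]
  \frac{\partial\theta}{\partial\tau}
    + \frac{\partial\psi}{\partial Y}\frac{\partial\theta}{\partial X}
    - \frac{\partial\psi}{\partial X}\frac{\partial\theta}{\partial Y}
    &= \frac{\partial^{2}\theta}{\partial X^{2}}
     + \frac{\partial^{2}\theta}{\partial Y^{2}}
     + Q\,\theta.
\end{align}
```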

Relevance: 20.00%

Abstract:

Urban stormwater quality is multifaceted, and the use of a limited number of factors to represent catchment characteristics may not be adequate to explain the complexity of water quality response to a rainfall event or site-to-site differences in stormwater quality modelling. This paper presents the outcomes of a research study which investigated the adequacy of using land use and impervious area fraction alone to represent catchment characteristics in urban stormwater quality modelling. The research outcomes confirmed that these two parameters alone are inadequate to represent urban catchment characteristics in stormwater quality prediction. Urban form also needs to be taken into consideration, as it was found to have an important impact on stormwater quality by influencing pollutant generation, build-up and wash-off. Urban form refers to characteristics of an urban development such as road layout, spatial distribution of urban areas and urban design features.

Relevance: 20.00%

Abstract:

Load modelling plays an important role in power system dynamic stability assessment. One of the widely used methods for assessing the impact of load models on system dynamic response is parametric sensitivity analysis. A load sensitivity analysis framework based on a composite load model is proposed. It enables comprehensive investigation into the impact of load modelling on system stability, considering the dynamic interactions between load and system dynamics. The effect of the location of individual composite loads, as well as of patches of composite loads in the vicinity, on the sensitivity of the oscillatory modes is investigated. The impact of load composition on the overall sensitivity of the load is also investigated.
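For context, the parametric sensitivity of an oscillatory mode is conventionally computed from the eigenvectors of the system state matrix. The formula below is the standard small-signal result, shown here for orientation rather than taken from the paper, whose exact formulation the abstract does not give.

```latex
% First-order sensitivity of an oscillatory mode \lambda_i of the state
% matrix A(p) to a load-model parameter p, with right eigenvector \phi_i
% and left eigenvector \psi_i (a textbook small-signal result, not the
% paper's own derivation).
\begin{equation}
  \frac{\partial \lambda_i}{\partial p}
  = \frac{\psi_i^{T}\,\dfrac{\partial A}{\partial p}\,\phi_i}
         {\psi_i^{T}\phi_i}.
\end{equation}
```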

Relevance: 20.00%

Abstract:

The opening phrase of the title is from Charles Darwin's notebooks (Schweber 1977). It is a double reminder: firstly, that mainstream evolutionary theory is not just about describing nature but is particularly looking for mechanisms or 'causes'; and secondly, that there will usually be several causes affecting any particular outcome. The second part of the title reflects our concern at the almost universal rejection of the idea that biological mechanisms are sufficient for macroevolutionary changes, a rejection of a cornerstone of Darwinian evolutionary theory. Our primary aim here is to consider ways of making it easier to develop and to test hypotheses about evolution. Formalizing hypotheses can help generate tests. In an absolute sense, some of the discussion by scientists about evolution is little better than the lack of reasoning used by those advocating intelligent design. Our discussion here is in a Popperian framework, where science is defined as that area of study in which it is possible, in principle, to find evidence against hypotheses: they are in principle falsifiable. However, with time, the boundaries of science keep expanding. In the past, some aspects of evolution were outside the current boundaries of falsifiable science, but new techniques and ideas are increasingly expanding those boundaries, and it is appropriate to re-examine some topics. It often appears that over the last few decades there has been an increasingly strong assumption to look first (and only) for a physical cause. This decision is virtually never formally discussed; an assumption is simply made that some physical factor 'drives' evolution. It is necessary to examine our assumptions much more carefully: what is meant by physical factors 'driving' evolution, or by an 'explosive radiation'? Our discussion focuses on two of the six mass extinctions: the fifth, the events of the Late Cretaceous, and the sixth, which started at least 50,000 years ago and is ongoing.

The Cretaceous/Tertiary boundary: the rise of birds and mammals. We have had a long-term interest (Cooper and Penny 1997) in designing tests to help evaluate whether the processes of microevolution are sufficient to explain macroevolution. The real challenge is to formulate hypotheses in a testable way. For example, the number of lineages of birds and mammals that survive from the Cretaceous to the present is one test. Our first estimate was 22 for birds, and current work is tending to increase this value. This still does not consider lineages that survived into the Tertiary and then went extinct later. Our initial suggestion was probably too narrow in that it lumped four models from Penny and Phillips (2004) into one. This reduction is too simplistic, in that we need to know about survival, about ecological and morphological divergences during the Late Cretaceous, and about whether crown groups of avian or mammalian orders may have existed back into the Cretaceous. More recently (Penny and Phillips 2004) we have formalized hypotheses about dinosaurs and pterosaurs, with the prediction that interactions between mammals (and ground-feeding birds) and dinosaurs would be most likely to affect the smallest dinosaurs, and similarly that interactions between birds and pterosaurs would particularly affect the smaller pterosaurs. There is now evidence for both classes of interactions, with the smallest dinosaurs and pterosaurs declining first, as predicted. Thus, testable models are now possible.

Mass extinction number six: human impacts.
On a broad scale, there is a good correlation between the time of human arrival and increased extinctions (Hurles et al. 2003; Martin 2005; Figure 1). However, it is necessary to distinguish different time scales (Penny 2005), and on a finer scale there are still large numbers of possibilities. In Hurles et al. (2003) we mentioned habitat modification (including the use of fire) and introduced plants and animals (including kiore), in addition to direct predation (the 'overkill' hypothesis). We also need to consider the prey switching that occurs in early human societies, as evidenced by the results of Wragg (1995) on the middens of different ages on Henderson Island in the Pitcairn group. In addition, the presence of human-wary or human-adapted animals will affect the distribution in the subfossil record. A better understanding of human impacts world-wide, in conjunction with pre-scientific knowledge, will make it easier to discuss the issues by removing 'blame'. While spontaneous generation was still universally accepted, there was the expectation that animals would simply continue to reappear. New Zealand is one of the very best locations in the world to study many of these issues: apart from the marine fossil record, some human impact events are extremely recent and the remains are less disrupted by time.

Relevance: 20.00%

Abstract:

The effects of tumour motion during radiation therapy delivery have been widely investigated. Motion effects have become increasingly important with the introduction of dynamic radiotherapy delivery modalities such as enhanced dynamic wedges (EDWs) and intensity modulated radiation therapy (IMRT), where a dynamically collimated radiation beam is delivered to the moving target, resulting in dose blurring and interplay effects which are a consequence of the combined tumour and beam motion. Prior to this work, reported studies on EDW-based interplay effects had been restricted to experimental methods for assessing single-field, non-fractionated treatments. In this work, the interplay effects have been investigated for EDW treatments: single and multiple field treatments have been studied using experimental and Monte Carlo (MC) methods.

Initially, this work experimentally studies interplay effects for single-field, non-fractionated EDW treatments, using radiation dosimetry systems placed on a sinusoidally moving platform. A number of wedge angles (60°, 45° and 15°), field sizes (20 × 20, 10 × 10 and 5 × 5 cm²), amplitudes (10-40 mm in steps of 10 mm) and periods (2 s, 3 s, 4.5 s and 6 s) of tumour motion are analysed (using gamma analysis) for parallel and perpendicular motions (where the tumour and jaw motions are either parallel or perpendicular to each other). For parallel motion it was found that both the amplitude and the period of tumour motion affect the interplay; this becomes more prominent when the collimator and tumour speeds become identical. For perpendicular motion the amplitude of tumour motion is the dominant factor, whereas varying the period of tumour motion has no observable effect on the dose distribution. The wedge angle results suggest that a large wedge angle generates greater dose variation for both parallel and perpendicular motions. The use of a small field size with a large tumour motion results in the loss of the wedged dose distribution for both parallel and perpendicular motion. From these single-field measurements, a motion amplitude and period were identified which show the poorest agreement between the target motion and the dynamic delivery, and these are used as the "worst case" motion parameters.

The experimental work is then extended to multiple-field fractionated treatments. Here a number of pre-existing, multiple-field, wedged lung plans are delivered to the radiation dosimetry systems, employing the worst case motion parameters. Moreover, a four-field EDW lung plan (using a 4D CT data set) is delivered to the IMRT quality control phantom with a dummy tumour insert over four fractions, using the worst case parameters, i.e. 40 mm amplitude and 6 s period. The analysis of the film doses using gamma analysis at 3%/3 mm indicates that the interplay effects did not average out for this particular study, with a gamma pass rate of 49%.

To enable Monte Carlo modelling of the problem, the DYNJAWS component module (CM) of the BEAMnrc user code, recently introduced to model dynamic wedges, is validated and automated. DYNJAWS is commissioned for 6 MV and 10 MV photon energies. It is shown that this CM can accurately model the EDWs for a number of wedge angles and field sizes. The dynamic and step-and-shoot modes of the CM are compared for their accuracy in modelling the EDW, and the dynamic mode is shown to be more accurate. The DYNJAWS-specific input file, which specifies the probability of selection of a subfield and the respective jaw coordinates, has been automated; this automation simplifies the generation of the BEAMnrc input files for DYNJAWS. The commissioned DYNJAWS model is then used to study multiple-field EDW treatments using MC methods. The 4D CT data of an IMRT phantom with the dummy tumour are used to produce a set of Monte Carlo simulation phantoms, onto which the delivery of single-field and multiple-field EDW treatments is simulated. A number of static and motion multiple-field EDW plans have been simulated. The comparison of dose volume histograms (DVHs) and gamma volume histograms (GVHs) for four-field EDW treatments (where the collimator and patient motion is in the same direction) using small (15°) and large (60°) wedge angles indicates a greater mismatch between the static and motion cases for the large wedge angle.

Finally, to use gel dosimetry as a validation tool, a new technique called the "zero-scan" method is developed for reading gel dosimeters with x-ray computed tomography (CT). It is shown that multiple scans of a gel dosimeter (in this case 360 scans) can be used to reconstruct a zero-scan image, which has a precision similar to that of an image obtained by averaging the CT images, but without the additional dose delivered by the CT scans.

In summary, the interplay effects have been studied for single and multiple field fractionated EDW treatments using experimental and Monte Carlo methods. For the Monte Carlo component, the DYNJAWS component module of the BEAMnrc code has been validated and automated, and then used to study the interplay for multiple-field EDW treatments. The zero-scan method, a new gel dosimetry readout technique, has been developed for reading gel images using x-ray CT without loss of precision or accuracy.
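The abstract does not detail how the zero-scan image is reconstructed from the repeated scans; one plausible reading is a per-voxel linear fit of CT number against scan index, extrapolated back to scan zero (imaging dose accumulates with each scan, so the trend is removed by extrapolation). The sketch below illustrates that reading only; the function, array shapes and synthetic data are hypothetical.

```python
import numpy as np

def zero_scan_image(scans: np.ndarray) -> np.ndarray:
    """Estimate a 'zero-scan' CT image from repeated scans of a gel dosimeter.

    scans: array of shape (n_scans, ny, nx) holding sequential CT images.
    Assumes each voxel's CT number drifts roughly linearly with scan index
    (e.g. due to dose added by the imaging itself); fits that trend per
    voxel and extrapolates back to scan index 0.
    """
    n = scans.shape[0]
    idx = np.arange(n, dtype=float)            # scan indices 0..n-1
    flat = scans.reshape(n, -1)                # (n_scans, n_voxels)
    coeffs = np.polyfit(idx, flat, deg=1)      # per-voxel line fit, shape (2, n_voxels)
    intercepts = coeffs[1]                     # value extrapolated to scan 0
    return intercepts.reshape(scans.shape[1:])

# Synthetic usage example: 360 noisy scans of a 64x64 phantom with slow drift.
rng = np.random.default_rng(0)
true_img = rng.normal(50.0, 5.0, size=(64, 64))
drift = 0.01 * np.arange(360)[:, None, None]
scans = true_img + drift + rng.normal(0.0, 2.0, size=(360, 64, 64))
recon = zero_scan_image(scans)
print(float(np.abs(recon - true_img).mean()))  # small residual error
```

Like simple averaging, the fit pools all 360 scans to suppress noise, but evaluating the fit at index 0 removes the scan-induced drift that a plain average would bake in.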

Relevance: 20.00%

Abstract:

Determining the optimal black-start strategy is very important for speeding up the restoration of a power system after a global blackout. Most existing black-start decision-making methods are based on the assumption that all indexes are independent of each other, and little attention has been paid to group decision-making methods, which are more reliable. Given this background, the intuitionistic fuzzy set and the intuitionistic fuzzy Choquet integral operator are introduced, and a black-start decision-making method based on this operator is presented. Compared to existing methods, the proposed algorithm can not only deal with the dependence among the indexes but also overcome some shortcomings of the existing methods. Finally, an example is used to demonstrate the proposed method. © 2012 The Institution of Engineering and Technology.
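The Choquet integral handles dependent criteria because it aggregates with respect to a non-additive measure. Below is a minimal sketch of the classical discrete Choquet integral; the intuitionistic extension used in the paper is more involved, and the index names and measure values here are hypothetical illustrations, not taken from the paper.

```python
def choquet(values: dict, mu: dict) -> float:
    """Discrete Choquet integral of criterion scores `values` with respect
    to a fuzzy measure `mu` keyed by frozenset of criterion names:
    C = sum_i (x_(i) - x_(i-1)) * mu(A_(i)), with scores sorted ascending
    and A_(i) the set of criteria scoring at least x_(i).
    """
    items = sorted(values.items(), key=lambda kv: kv[1])  # ascending scores
    total, prev = 0.0, 0.0
    remaining = set(values)
    for name, score in items:
        total += (score - prev) * mu[frozenset(remaining)]
        prev = score
        remaining.discard(name)
    return total

# Hypothetical black-start indexes scored on [0, 1]. The measure is
# non-additive, so it can encode redundancy or synergy among indexes.
scores = {"ramp_rate": 0.7, "capacity": 0.5, "start_time": 0.9}
mu = {
    frozenset(): 0.0,
    frozenset({"ramp_rate"}): 0.3,
    frozenset({"capacity"}): 0.3,
    frozenset({"start_time"}): 0.4,
    frozenset({"ramp_rate", "capacity"}): 0.5,  # < 0.3 + 0.3: redundant pair
    frozenset({"ramp_rate", "start_time"}): 0.8,
    frozenset({"capacity", "start_time"}): 0.8,
    frozenset({"ramp_rate", "capacity", "start_time"}): 1.0,
}
print(choquet(scores, mu))  # 0.74
```

A weighted average is the special case where mu is additive; making the pair measure smaller than the sum of singletons is what lets the operator discount correlated indexes.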

Relevance: 20.00%

Abstract:

Purpose – The purpose of this paper is to provide a description and analysis of how a traditional industry is currently using e-learning, and to identify how the potential of e-learning can be realised whilst acknowledging the technological divide between younger and older workers. Design/methodology/approach – An exploratory qualitative methodology was employed to address three key questions: How is the Australian rail industry currently using e-learning? Are there age-related issues with the current use of e-learning in the rail industry? How could e-learning be used in future to engage different generations of learners in the rail industry? Data were collected in five case organisations from across the Australian rail industry. Findings – None of the rail organisations interviewed believed they were using e-learning to its full potential. The younger, more technologically literate employees are not having their expectations met, and therefore retention of younger workers has become an issue. The challenge for learning and development practitioners is balancing the preferences of an aging workforce with those of these younger, more "technology-savvy" learners, and the findings highlight some potential ways to begin addressing this balance. Practical implications – The findings identify the potential for organisations (even those in a traditional industry such as rail) to better utilise e-learning to attract and retain younger workers, but also warn against making assumptions about technological competency based on age. Originality/value – Data were gathered across an industry, and thus this paper takes an industry approach to considering the potential age-related issues with e-learning and the ways it may be used to meet the needs of different generations in the workplace.

Relevance: 20.00%

Abstract:

Using a mono-specific antiserum produced in rabbits against hog kidney aromatic L-amino acid decarboxylase (AADC), the enzyme was localized in rat kidney by immunoperoxidase staining. AADC was located predominantly in the proximal convoluted tubules; there was also weak staining in the distal convoluted tubules and collecting ducts. An increase in dietary potassium or sodium intake produced no change in the density or distribution of AADC staining in the kidney. An assay of AADC enzyme activity showed no difference in cortex or medulla with chronic potassium loading. A change in the distribution or activity of renal AADC therefore does not explain the postulated dopaminergic modulation of renal function that occurs with potassium or sodium loading.

Relevance: 20.00%

Abstract:

Key establishment is a crucial primitive for building secure channels in a multi-party setting. Without quantum mechanics, key establishment can only be done under the assumption that some computational problem is hard. Since digital communication can be easily eavesdropped and recorded, it is important to consider the secrecy of information in anticipation of future algorithmic and computational discoveries which could break the secrecy of past keys, violating the secrecy of the confidential channel. Quantum key distribution (QKD) can be used to generate secret keys that are secure against any future algorithmic or computational improvements. QKD protocols still require authentication of classical communication, although existing security proofs of QKD typically assume idealized authentication. It is generally considered folklore that QKD, when used with computationally secure authentication, is still secure against an unbounded adversary, provided the adversary did not break the authentication during the run of the protocol. We describe a security model for quantum key distribution extending classical authenticated key exchange (AKE) security models. Using our model, we characterize the long-term security of the BB84 QKD protocol with computationally secure authentication against an eventually unbounded adversary. By basing our model on traditional AKE models, we can more readily compare the relative merits of various forms of QKD and existing classical AKE protocols. This comparison illustrates in which types of adversarial environments different quantum and classical key agreement protocols can be secure.
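As background to the BB84 protocol analysed in the paper, the toy sketch below shows only the basis-sifting step of BB84: Alice prepares random bits in random bases, Bob measures in random bases, and positions with mismatched bases are discarded. It is purely illustrative; it models no channel noise, eavesdropping or authentication, and is not the paper's security model.

```python
import secrets

def bb84_sift(n_qubits: int) -> list:
    """Toy BB84 sifting, ignoring noise, Eve and authentication entirely."""
    alice_bits  = [secrets.randbelow(2) for _ in range(n_qubits)]
    alice_bases = [secrets.randbelow(2) for _ in range(n_qubits)]  # 0 = Z, 1 = X
    bob_bases   = [secrets.randbelow(2) for _ in range(n_qubits)]
    # When bases match, Bob recovers Alice's bit; when they differ, his
    # outcome is random, so that position is discarded during sifting.
    return [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases) if a == b]

key = bb84_sift(1024)
print(len(key))  # on average ~512 bits survive sifting
```

In a real run, the classical messages exchanged to compare bases are exactly the traffic that must be authenticated, which is where the paper's computationally secure authentication enters.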

Relevance: 20.00%

Abstract:

Background: Ethnic differences in body fat distribution contribute to ethnic differences in cardiovascular morbidities and diabetes. However, few data are available on differences in fat distribution in Asian children from various backgrounds. Therefore, the current study aimed to explore ethnic differences in body fat distribution among Asian children from four countries. Methods: A total of 758 children aged 8-10 y from China, Lebanon, Malaysia and Thailand were recruited using a non-random purposive sampling approach to enrol children encompassing a wide BMI range. Height, weight, waist circumference (WC), fat mass (FM, derived from total body water [TBW] estimation using the deuterium dilution technique) and skinfold thickness (SFT) at the biceps, triceps, subscapular, supraspinale and medial calf sites were collected. Results: After controlling for height and weight, Chinese and Thai children had a significantly higher WC than their Lebanese and Malay counterparts. Chinese and Thais tended to have higher trunk fat deposits than Lebanese and Malays, reflected in trunk SFT, trunk/upper-extremity ratio and supraspinale/upper-extremity ratio after adjustment for age and total body fat. The subscapular/supraspinale skinfold ratio was lower in Chinese and Thais compared with Lebanese and Malays after correcting for trunk SFT. Conclusions: Asian pre-pubertal children of different origins vary in body fat distribution. These results indicate the importance of population-specific WC cut-off points or other fat-distribution indices to identify populations at risk of obesity-related health problems.

Relevance: 20.00%

Abstract:

Background: Canonical serine protease inhibitors commonly bind to their targets through a rigid loop stabilised by an internal hydrogen bond network and disulfide bond(s). The smallest of these is sunflower trypsin inhibitor (SFTI-1), a potent and broad-range protease inhibitor. Recently, we re-engineered the contact β-sheet of SFTI-1 to produce a selective inhibitor of kallikrein-related peptidase 4 (KLK4), a protease associated with prostate cancer progression. However, modifications in the binding loop to achieve specificity may compromise structural rigidity and prevent re-engineered inhibitors from reaching optimal binding affinity. Methodology/Principal Findings: In this study, the effect of amino acid substitutions on the internal hydrogen bonding network of SFTI was investigated using an in silico screen of inhibitor variants in complex with KLK4 or trypsin. Substitutions favouring internal hydrogen bond formation directly correlated with increased potency of inhibition in vitro. This produced a second-generation inhibitor (SFTI-FCQR Asn14) which displayed both a 125-fold increased capacity to inhibit KLK4 (Ki = 0.0386 ± 0.0060 nM) and enhanced selectivity over off-target serine proteases. Further, SFTI-FCQR Asn14 was stable in cell culture and bioavailable in mice when administered by intraperitoneal perfusion. Conclusion/Significance: These findings highlight the importance of conserving the structural rigidity of the binding loop, in addition to optimising protease/inhibitor contacts, when re-engineering canonical serine protease inhibitors.