908 results for random number generation


Relevance: 20.00%

Publisher:

Abstract:

Mesenchymal stem cells (MSCs) are undifferentiated, multi-potent stem cells with the ability to self-renew. They can differentiate into many types of terminal cells, such as osteoblasts, chondrocytes, adipocytes, myocytes, and neurons. These cells have been applied in tissue engineering as the main cell type used to regenerate new tissues. However, a number of issues remain concerning the use of MSCs, such as cell surface markers, the determining factors responsible for their differentiation into terminal cells, and the mechanisms whereby growth factors stimulate MSCs. In this chapter, we discuss how proteomic techniques have contributed to our current knowledge and how they can be used to address issues currently facing MSC research. The application of proteomics has led to the identification of a specific pattern of cell surface protein expression on MSCs. The technique has also contributed to the study of the regulatory network of MSC differentiation into terminally differentiated cells, including osteocytes, chondrocytes, adipocytes, neurons, cardiomyocytes, hepatocytes, and pancreatic islet cells, and has helped elucidate mechanisms of growth factor–stimulated differentiation of MSCs. Proteomics cannot, however, reveal the precise role of a specific pathway and must therefore be combined with other approaches for this purpose. A new generation of proteomic techniques has recently been developed, which will enable a more comprehensive study of MSCs.

Relevance: 20.00%

Publisher:

Abstract:

The effects of tumor motion during radiation therapy delivery have been widely investigated. Motion effects have become increasingly important with the introduction of dynamic radiotherapy delivery modalities such as enhanced dynamic wedges (EDWs) and intensity modulated radiation therapy (IMRT), where a dynamically collimated radiation beam is delivered to the moving target, resulting in dose blurring and interplay effects which are a consequence of the combined tumor and beam motion. Prior to this work, reported studies on EDW-based interplay effects had been restricted to experimental methods for assessing single-field non-fractionated treatments. In this work, the interplay effects have been investigated for EDW treatments: single and multiple field treatments have been studied using experimental and Monte Carlo (MC) methods. Initially, this work experimentally studies interplay effects for single-field non-fractionated EDW treatments, using radiation dosimetry systems placed on a sinusoidally moving platform. A number of wedge angles (60°, 45° and 15°), field sizes (20 × 20, 10 × 10 and 5 × 5 cm²), amplitudes (10-40 mm in steps of 10 mm) and periods (2 s, 3 s, 4.5 s and 6 s) of tumor motion are analysed (using gamma analysis) for parallel and perpendicular motions (where the tumor and jaw motions are either parallel or perpendicular to each other). For parallel motion it was found that both the amplitude and period of tumor motion affect the interplay; this becomes more prominent when the collimator and tumor speeds become identical. For perpendicular motion the amplitude of tumor motion is the dominant factor, whereas varying the period of tumor motion has no observable effect on the dose distribution. The wedge angle results suggest that the use of a large wedge angle generates greater dose variation for both parallel and perpendicular motions.
The use of a small field size with a large tumor motion results in the loss of the wedged dose distribution for both parallel and perpendicular motion. From these single-field measurements, a motion amplitude and period have been identified which show the poorest agreement between the target motion and dynamic delivery; these are used as the "worst case motion parameters". The experimental work is then extended to multiple-field fractionated treatments. Here a number of pre-existing, multiple-field, wedged lung plans are delivered to the radiation dosimetry systems, employing the worst case motion parameters. Moreover, a four-field EDW lung plan (using a 4D CT data set) is delivered to the IMRT quality control phantom with a dummy tumor insert over four fractions using the worst case parameters, i.e. 40 mm amplitude and 6 s period. The analysis of the film doses using gamma analysis at 3%-3 mm indicates non-averaging of the interplay effects for this particular study, with a gamma pass rate of 49%. To enable Monte Carlo modelling of the problem, the DYNJAWS component module (CM) of the BEAMnrc user code, recently introduced to model the dynamic wedges, is validated and automated. DYNJAWS is commissioned for 6 MV and 10 MV photon energies. It is shown that this CM can accurately model the EDWs for a number of wedge angles and field sizes. The dynamic and step-and-shoot modes of the CM are compared for their accuracy in modelling the EDW, and the dynamic mode is shown to be more accurate. The DYNJAWS-specific input file, which specifies the probability of selection of a subfield and the respective jaw coordinates, has been automated; this simplifies the generation of BEAMnrc input files for DYNJAWS. The commissioned DYNJAWS model is then used to study multiple-field EDW treatments using MC methods.
The 4D CT data of an IMRT phantom with the dummy tumor are used to produce a set of Monte Carlo simulation phantoms, onto which the delivery of single-field and multiple-field EDW treatments is simulated. A number of static and motion multiple-field EDW plans have been simulated. The comparison of dose volume histograms (DVHs) and gamma volume histograms (GVHs) for four-field EDW treatments (where the collimator and patient motion is in the same direction) using small (15°) and large (60°) wedge angles indicates a greater mismatch between the static and motion cases for the large wedge angle. Finally, to use gel dosimetry as a validation tool, a new technique called the "zero-scan method" is developed for reading gel dosimeters with x-ray computed tomography (CT). It is shown that multiple scans of a gel dosimeter (in this case 360 scans) can be used to reconstruct a zero-scan image, which has a precision similar to that of an image obtained by averaging the CT images, without the additional dose delivered by the CT scans. In this investigation, interplay effects have been studied for single and multiple field fractionated EDW treatments using experimental and Monte Carlo methods. For the Monte Carlo methods, the DYNJAWS component module of the BEAMnrc code has been validated, automated and used to study the interplay for multiple-field EDW treatments. The zero-scan method, a new gel dosimetry readout technique, has been developed for reading gel images using x-ray CT without loss of precision or accuracy.
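The gamma analysis used throughout this study (e.g. the 3%/3 mm criterion) compares each reference dose point against nearby evaluated points using a combined dose-difference and distance-to-agreement metric. A minimal 1D global-gamma sketch, using an illustrative Gaussian profile rather than any measured data:

```python
import numpy as np

def gamma_1d(ref_dose, eval_dose, positions, dose_tol=0.03, dta_mm=3.0):
    """Minimal 1D global gamma index. For each reference point, search all
    evaluated points for the smallest combined dose-difference /
    distance-to-agreement (DTA) value; a point passes when gamma <= 1."""
    dd = dose_tol * ref_dose.max()            # global dose criterion (3%)
    gammas = np.empty_like(ref_dose)
    for i, (x_r, d_r) in enumerate(zip(positions, ref_dose)):
        dist_term = ((positions - x_r) / dta_mm) ** 2
        dose_term = ((eval_dose - d_r) / dd) ** 2
        gammas[i] = np.sqrt((dist_term + dose_term).min())
    return gammas

# Toy profiles: the "measured" profile is the reference shifted by 2 mm,
# which should pass a 3%/3 mm gamma test at every point.
x = np.arange(0.0, 100.0, 1.0)               # detector positions in mm
ref = np.exp(-((x - 50.0) / 20.0) ** 2)      # reference dose profile
meas = np.exp(-((x - 52.0) / 20.0) ** 2)     # 2 mm shifted profile
g = gamma_1d(ref, meas, x)
pass_rate = 100.0 * (g <= 1.0).mean()
```

A reported pass rate (such as the 49% quoted above) is simply the fraction of points with gamma at or below one; full clinical implementations interpolate between detector positions, which this sketch omits.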

Relevance: 20.00%

Publisher:

Abstract:

Monitoring environmental health is becoming increasingly important as human activity and climate change place greater pressure on global biodiversity. Acoustic sensors provide the ability to collect data passively, objectively and continuously across large areas for extended periods. While these factors make acoustic sensors attractive as autonomous data collectors, there are significant issues associated with large-scale data manipulation and analysis. We present our current research into techniques for analysing large volumes of acoustic data efficiently. We provide an overview of a novel online acoustic environmental workbench and discuss a number of approaches to scaling the analysis of acoustic data: online collaboration, and manual, automatic and human-in-the-loop analysis.

Relevance: 20.00%

Publisher:

Abstract:

Charge of the light brigade: A molecule is able to walk back and forth upon a five-foothold pentaethylenimine track without external intervention. The 1D random walk is highly processive (mean step number 530) and exchange takes place between adjacent amine groups in a stepwise fashion. The walker performs a simple task whilst walking: quenching of the fluorescence of an anthracene group sited at one end of the track.
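If the walker detaches with a fixed probability per step, a mean processivity of about 530 steps corresponds to a per-step detachment probability of roughly 1/530. A toy 1D random-walk simulation under that assumption (the constant-rate detachment model is hypothetical, not the paper's measured kinetics):

```python
import random

def walk_length(n_sites=5, p_detach=1.0 / 530.0, rng=random.Random(42)):
    """Number of hops a walker makes on an n-site track before detaching.
    Each iteration: detach with probability p_detach (a hypothetical
    constant per-step rate chosen so the mean processivity is ~530),
    otherwise hop to an adjacent foothold, reflecting at the track ends."""
    pos, steps = 0, 0
    while rng.random() > p_detach:
        pos = min(max(pos + rng.choice((-1, 1)), 0), n_sites - 1)
        steps += 1
    return steps

# Average over many walkers; should come out near (1 - p) / p ≈ 529.
mean_steps = sum(walk_length() for _ in range(20_000)) / 20_000
```

With a geometric detachment model the mean step count is (1 - p)/p, so the simulated average should sit close to the reported processivity of 530.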

Relevance: 20.00%

Publisher:

Abstract:

With the continued development of renewable energy generation technologies and increasing pressure to combat the global effects of greenhouse warming, plug-in hybrid electric vehicles (PHEVs) have received worldwide attention, finding applications in North America and Europe. When a large number of PHEVs are introduced into a power system, there will be extensive impacts on power system planning and operation, as well as on electricity market development. It is therefore necessary to properly control PHEV charging and discharging behaviors. Given this background, a new unit commitment model, and a solution method that takes into account optimal PHEV charging and discharging controls, is presented in this paper. A 10-unit, 24-hour unit commitment (UC) problem is employed to demonstrate the feasibility and efficiency of the developed method, and the impacts of the wide application of PHEVs on the operating costs and emissions of the power system are studied. Case studies are also carried out to investigate the impacts of different PHEV penetration levels and different PHEV charging modes on the results of the UC problem. A 100-unit system is employed for further analysis of the impacts of PHEVs on the UC problem in a larger system. Simulation results demonstrate that the employment of optimized PHEV charging and discharging modes is very helpful for smoothing the load curve profile and enhancing the ability of the power system to accommodate more PHEVs. Furthermore, an optimal Vehicle to Grid (V2G) discharging control provides economic and efficient backups and spinning reserves for the secure and economic operation of the power system.
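The load-smoothing effect of controlled charging can be illustrated with a simple valley-filling heuristic. The load profile and energy figures below are illustrative only, not the paper's 10-unit test system:

```python
# Valley-filling sketch: repeatedly assign each 1 MWh of PHEV charging
# energy to the hour with the lowest current total load, which raises the
# valleys of the load curve without touching the peak.
base_load = [70, 65, 60, 58, 57, 60, 75, 90, 100, 105, 103, 100,
             98, 97, 96, 98, 102, 110, 115, 112, 105, 95, 85, 75]  # MW, 24 h
phev_energy_mwh = 120                      # total charging energy to schedule
charge = [0.0] * 24

for _ in range(phev_energy_mwh):           # allocate in 1 MWh increments
    h = min(range(24), key=lambda i: base_load[i] + charge[i])
    charge[h] += 1.0

total = [b + c for b, c in zip(base_load, charge)]
spread_before = max(base_load) - min(base_load)   # peak-to-valley spread
spread_after = max(total) - min(total)            # smaller: valleys filled
```

A full UC formulation would co-optimize generator commitment with this schedule; the sketch shows only why shifting charging into off-peak hours flattens the curve.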

Relevance: 20.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to provide description and analysis of how a traditional industry is currently using e-learning, and to identify how the potential of e-learning can be realised whilst acknowledging the technological divide between younger and older workers.

Design/methodology/approach – An exploratory qualitative methodology was employed to analyse three key questions: How is the Australian rail industry currently using e-learning? Are there age-related issues with the current use of e-learning in the rail industry? How could e-learning be used in future to engage different generations of learners in the rail industry? Data were collected in five case organisations from across the Australian rail industry.

Findings – Of the rail organisations interviewed, none believed they were using e-learning to its full potential. The younger, more technologically literate employees are not having their expectations met, and therefore retention of younger workers has become an issue. The challenge for learning and development practitioners is balancing the preferences of an aging workforce with these younger, more "technology-savvy" learners, and the findings highlight some potential ways to begin addressing this balance.

Practical implications – The findings identify the potential for organisations (even those in a traditional industry such as rail) to better utilise e-learning to attract and retain younger workers, but also warn against making assumptions about technological competency based on age.

Originality/value – Data were gathered across an industry, and thus this paper takes an industry approach to considering the potential age-related issues with e-learning and the ways it may be used to meet the needs of different generations in the workplace.

Relevance: 20.00%

Publisher:

Abstract:

The Poisson distribution has often been used for count data such as accident counts. The Negative Binomial (NB) distribution has been adopted for count data to address the over-dispersion problem. However, the Poisson and NB distributions are incapable of taking into account unobserved heterogeneities due to spatial and temporal effects in accident data. To overcome this problem, Random Effect models have been developed. Another challenge with existing traffic accident prediction models is the presence of excess zero accident observations in some accident data. Although the Zero-Inflated Poisson (ZIP) model is capable of handling the dual-state system in accident data with excess zero observations, it does not accommodate the within-location and between-location correlation heterogeneities that are the basic motivation for Random Effect models. This paper proposes an effective way of fitting a ZIP model with location-specific random effects, and recommends Bayesian analysis for model calibration and assessment.

Relevance: 20.00%

Publisher:

Abstract:

An elevated particle number concentration (PNC) observed during nucleation events could make a significant contribution to the total particle load, and therefore to air pollution, in urban environments. Therefore, a field measurement study of PNC was undertaken to investigate the temporal and spatial variations of PNC within the urban airshed of Brisbane, Australia. PNC was monitored at urban (QUT), roadside (WOO) and semi-urban (ROC) sites around the Brisbane region during 2009. During the morning traffic peak period, the highest relative fraction of PNC reached about 5% at QUT and WOO on weekdays. PNC peaks were observed around noon, which correlated with the highest solar radiation levels at all three stations, suggesting that high PNC levels were likely associated with new particle formation caused by photochemical reactions. Wind rose plots showed relatively higher PNC for the NE direction, which was associated with industrial pollution, accounting for 12%, 9% and 14% of overall PNC at QUT, WOO and ROC, respectively. Although there was no significant correlation between PNC at each station, the variation of PNC was well correlated among the three stations during regional nucleation events. In addition, PNC at ROC was significantly influenced by upwind urban pollution during the nucleation burst events, with an average enrichment factor of 15.4. This study provides insight into the influence of regional nucleation events on PNC in the Brisbane region and is the first study to quantify the effect of urban pollution on semi-urban PNC through nucleation events.

Relevance: 20.00%

Publisher:

Abstract:

It is found in the literature that the existing scaling results for the boundary layer thickness, velocity and steady state time for the natural convection flow over an evenly heated plate provide a very poor prediction of the Prandtl number dependency of the flow, although they predict the dependency on the two other governing parameters, the Rayleigh number and the aspect ratio, well. Therefore, an improved scaling analysis using a triple-layer integral approach, together with direct numerical simulations, has been performed for the natural convection boundary layer along a semi-infinite flat plate with uniform surface heat flux. This heat flux is a ramp function of time: the temperature gradient on the surface increases with time up to some specific ramp time and then remains constant. The growth of the boundary layer strongly depends on the ramp time. If the ramp time is sufficiently long, the boundary layer reaches a quasi-steady mode before the growth of the temperature gradient is completed; in this mode, the thermal boundary layer at first grows in thickness and then contracts with increasing time. However, if the ramp time is sufficiently short, the boundary layer develops differently, but after the wall temperature gradient growth is completed, the boundary layer develops as though the startup had been instantaneous.
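For context, the "existing scalings" referred to are of the classical startup form for an isoflux plate (quoted here as commonly stated in the natural convection literature, not as the paper's improved Pr-dependent results): the thermal layer first grows by conduction and reaches a steady thickness when convection balances conduction,

```latex
\delta_T \sim \sqrt{\kappa t}, \qquad
t_s \sim \frac{H^2}{\kappa}\,\mathrm{Ra}^{-2/5}, \qquad
\delta_{T,s} \sim H\,\mathrm{Ra}^{-1/5},
```

where $\kappa$ is the thermal diffusivity, $H$ the plate length scale and $\mathrm{Ra}$ the flux Rayleigh number. The three forms are mutually consistent, since $\delta_{T,s} \sim \sqrt{\kappa t_s}$; it is the Prandtl number prefactors of these scalings that the triple-layer integral approach revises.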

Relevance: 20.00%

Publisher:

Abstract:

Client puzzles are cryptographic problems that are neither easy nor hard to solve. Most puzzles are based on either number-theoretic or hash-inversion problems. Hash-based puzzles are very efficient but have so far been shown secure only in the random oracle model; number-theoretic puzzles, while secure in the standard model, tend to be inefficient. In this paper, we solve the problem of constructing cryptographic puzzles that are secure in the standard model and are very efficient. We present an efficient number-theoretic puzzle that satisfies the puzzle security definition of Chen et al. (ASIACRYPT 2009). To prove the security of our puzzle, we introduce a new variant of the interval discrete logarithm assumption, which may be of independent interest, and show this new problem to be hard under reasonable assumptions. Our experimental results show that, for a 512-bit modulus, the solution verification time of our proposed puzzle can be up to 50x and 89x faster than the Karame-Capkun puzzle and Rivest et al.'s time-lock puzzle, respectively. In particular, the solution verification time of our puzzle is only 1.4x slower than that of Chen et al.'s efficient hash-based puzzle.
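The hash-based puzzles the abstract contrasts with are typically partial hash-inversion puzzles: cheap to verify, tunable to solve, but analysed in the random oracle model. A minimal sketch of that style of construction (illustrative only, not the paper's number-theoretic puzzle nor the exact Chen et al. scheme):

```python
import hashlib
import itertools
import os

def make_puzzle(difficulty_bits=12):
    """Server side: issue a fresh random nonce and a difficulty."""
    return os.urandom(16), difficulty_bits

def solve(nonce, bits):
    """Client side: brute-force a counter s until SHA-256(nonce || s)
    falls below the target, i.e. its top `bits` bits are zero.
    Expected work is about 2**bits hash evaluations."""
    target = 1 << (256 - bits)
    for s in itertools.count():
        digest = hashlib.sha256(nonce + s.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return s

def verify(nonce, bits, s):
    """Verification costs a single hash, which is what makes hash-based
    puzzles so efficient compared with number-theoretic ones."""
    digest = hashlib.sha256(nonce + s.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - bits))

nonce, bits = make_puzzle(difficulty_bits=12)   # small difficulty for a demo
solution = solve(nonce, bits)
```

The asymmetry (one hash to verify, ~2^bits hashes to solve) is the benchmark the paper's standard-model puzzle is measured against.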

Relevance: 20.00%

Publisher:

Abstract:

The potential of multiple distribution static synchronous compensators (DSTATCOMs) to improve the voltage profile of radial distribution networks has been reported in the literature by a few authors. However, the operation of multiple DSTATCOMs across a distribution feeder may introduce control interactions and/or voltage instability. This study proposes a control scheme that alleviates interactions among controllers and ensures proper reactive power sharing among DSTATCOMs. A generalised mathematical model is presented to analyse the interactions among any number of DSTATCOMs in the network, and the criterion for controller design is developed by conducting eigenvalue analysis on this model. The proposed control scheme is tested in the time domain on a sample radial distribution feeder installed with multiple DSTATCOMs, and test results are presented.
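The eigenvalue criterion behind such a controller design can be illustrated with a toy linearized model: the closed-loop system is acceptable only if every eigenvalue of its state matrix lies in the left half-plane. The 2×2 matrices below are hypothetical illustrations, not the paper's generalised model:

```python
import numpy as np

def is_small_signal_stable(A):
    """A linearized system dx/dt = A x is small-signal stable iff every
    eigenvalue of A has a negative real part -- the condition an
    eigenvalue analysis checks when tuning controller gains."""
    return bool(np.all(np.linalg.eigvals(A).real < 0))

# Hypothetical interaction matrices for two voltage controllers
# (illustrative numbers only):
A_weak_coupling = np.array([[-5.0, 0.2],
                            [0.3, -4.0]])    # small off-diagonal terms
A_strong_coupling = np.array([[-1.0, 6.0],
                              [6.0, -1.0]])  # interaction dominates damping
```

In the weakly coupled case both eigenvalues stay negative; in the strongly coupled case one eigenvalue crosses into the right half-plane, mirroring how controller interactions between nearby DSTATCOMs can destabilize an otherwise well-damped feeder.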

Relevance: 20.00%

Publisher:

Abstract:

The return of emotions to debates about crime and criminal justice has been a striking development of recent decades across many jurisdictions. This has been registered in the return of shame to justice procedures, a heightened focus on victims and their emotional needs, fear of crime as a major preoccupation of citizens and politicians, and highly emotionalised public discourses on crime and justice. But how can we best make sense of these developments? Do we need to create "emotionally intelligent" justice systems, or are we messing recklessly with the rational foundations of liberal criminal justice? This volume brings together leading criminologists and sociologists from across the world in a much-needed conversation about how to re-calibrate reason and emotion in crime and justice today. The contributions range from the micro-analysis of emotions in violent encounters to the paradoxes and tensions that arise from the emotionalisation of criminal justice in the public sphere. They explore the emotional labour of workers in police and penal institutions, the justice experiences of victims and offenders, and the role of vengeance, forgiveness and regret in the aftermath of violence and conflict resolution. The result is a set of original essays which offer a fresh and timely perspective on problems of crime and justice in contemporary liberal democracies.

Relevance: 20.00%

Publisher:

Abstract:

A new scaling analysis has been performed for the unsteady natural convection boundary layer under a downward-facing inclined plate with uniform heat flux. The development of the thermal or viscous boundary layers may be classified into three distinct stages, an early stage, a transitional stage and a steady stage, which can be clearly identified in the analytical as well as the numerical results. Earlier scaling analysis shows that the existing scaling laws for the boundary layer thickness, velocity and steady state time for the natural convection flow on a heated plate of uniform heat flux provide a very poor prediction of the Prandtl number dependency, although they perform very well with respect to the Rayleigh number and aspect ratio dependencies. In this study, a modified Prandtl number scaling has been developed using a triple-layer integral approach for Pr > 1. In comparison with direct numerical simulations, the new scaling performs considerably better than the previous scaling.