908 results for random number generation


Relevance: 20.00%

Abstract:

There is a growing area of scholarship that attests to the importance of understanding the impact of Post Traumatic Stress Disorder (PTSD) on the military family (Cozza, Chun, & Polo, 2005; Peach, 2005; Riggs, 2009; Siebler, 2003). Recent research highlights the critical role that the family plays in mitigating the effects of this condition for its members (Chase-Lansdale, Wakschlag, & Brooks-Gunn, 1995; Fiese, Foley, & Spagnola, 2006; Hetherington & Blechman, 1996; Pinkerton & Dolan, 2007; Seedat, Niehaus, & Stein, 2001; Serbin & Karp, 2003; Walsh, 2003), society (Jenson & Fraser, 2006; Seedat, Kaminer, Lockhat, & Stein, 2000; Wood & Geismar, 1989) and the next generation (Davidson & Mellor, 2001; Ender, 2006; Weber, 2005; Westerink & Giarratano, 1999). However, little is understood about the way people who grew up in Australian military families affected by PTSD describe their experiences and what the implications are for their participation in family life. This study addressed the following research questions: (1) ‘How does a child of a Vietnam veteran understand and describe the experience of PTSD in the family?’ and (2) ‘What are the implications of this understanding on their current participation in family life?’ These questions were addressed through a qualitative analysis of focus-group data collected from adults with a Vietnam veteran parent with PTSD. The key rationale for a qualitative approach was to develop an understanding of these questions in a way which was as faithful as possible to the way participants talked about their past and present family experiences. A number of experiential themes common to participants were identified through the data analysis. Participants’ experiences linked together to form a central theme of control, which revealed the overarching narrative of ‘It’s all about control and the fear of losing it’, which responds to the first research question.
The second research question led to a deeper analysis of the ‘control experiences’ to identify the ways in which participants responded to and managed these problematic aspects of family life, and the implications for their current sense of participation in family life. These responses can be understood through the overarching narrative of ‘Soldier on despite the differences’, which assists them to manage the impact of control and to develop the strategies required to maintain a semblance of personal normality and a normal family life. This intensive research has led to the development of theoretical propositions about this group’s experiences and responses that can be tested further in subsequent research to assist families and their members who may be experiencing the intergenerational impacts of psychological trauma acquired from military service.

Relevance: 20.00%

Abstract:

As civil infrastructures such as bridges age, there is a concern for safety and a need for cost-effective and reliable monitoring tools. Different diagnostic techniques are available nowadays for structural health monitoring (SHM) of bridges. Acoustic emission is one such technique with the potential of predicting failure. The phenomenon of rapid release of energy within a material by crack initiation or growth in the form of stress waves is known as acoustic emission (AE). The AE technique involves recording the stress waves by means of sensors and subsequent analysis of the recorded signals, which then convey information about the nature of the source. AE can be used as a local SHM technique to monitor specific regions with a visible presence of cracks or crack-prone areas, such as welded regions and joints with bolted connections, or as a global technique to monitor the whole structure. The strength of the AE technique lies in its ability to detect active crack activity, thus helping to prioritise maintenance work by focusing on active rather than dormant cracks. In spite of being a promising tool, some challenges still stand behind the successful application of the AE technique. One is the generation of large amounts of data during testing; hence effective data analysis and management are necessary, especially for long-term monitoring uses. Complications also arise as a number of spurious sources can give AE signals; therefore, different source discrimination strategies are necessary to distinguish genuine signals from spurious ones. Another major challenge is the quantification of the damage level by appropriate analysis of data. Intensity analysis using severity and historic indices, as well as b-value analysis, are some important methods and will be discussed and applied to the analysis of laboratory experimental data in this paper.
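The b-value analysis mentioned above fits a Gutenberg-Richter style relation to the amplitude distribution of AE hits. A minimal sketch, assuming hit peak amplitudes in dB and the common form log10 N(≥A) = a − b·(A/20), might look like this (the function name and fitting details are illustrative, not the paper's implementation):

```python
import numpy as np

def ae_b_value(amplitudes_db, step=1.0):
    """Estimate the AE b-value from hit peak amplitudes (dB).

    Fits log10 N(>=A) = a - b * (A / 20) by least squares, the
    Gutenberg-Richter analogue commonly used in AE monitoring.
    """
    amps = np.asarray(amplitudes_db, dtype=float)
    levels = np.arange(amps.min(), amps.max(), step)
    # cumulative count of hits at or above each amplitude level
    counts = np.array([(amps >= lv).sum() for lv in levels])
    # ignore sparsely populated tail bins, which only add noise
    mask = counts >= 10
    slope, _ = np.polyfit(levels[mask] / 20.0, np.log10(counts[mask]), 1)
    return -slope
```

A falling b-value over successive monitoring windows is commonly read as a shift from distributed microcracking towards localised macrocrack growth.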

Relevance: 20.00%

Abstract:

Accurate and detailed road models play an important role in a number of geospatial applications, such as infrastructure planning, traffic monitoring, and driver assistance systems. In this thesis, an integrated approach for the automatic extraction of precise road features from high resolution aerial images and LiDAR point clouds is presented. A framework of road information modeling has been proposed, for rural and urban scenarios respectively, and an integrated system has been developed to deal with road feature extraction using image and LiDAR analysis. For road extraction in rural regions, a hierarchical image analysis is first performed to maximize the exploitation of road characteristics in different resolutions. The rough locations and directions of roads are provided by the road centerlines detected in low resolution images, both of which can be further employed to facilitate the road information generation in high resolution images. The histogram thresholding method is then chosen to classify road details in high resolution images, where color space transformation is used for data preparation. After the road surface detection, anisotropic Gaussian and Gabor filters are employed to enhance road pavement markings while constraining other ground objects, such as vegetation and houses. Afterwards, pavement markings are obtained from the filtered image using Otsu's clustering method. The final road model is generated by superimposing the lane markings on the road surfaces, where the digital terrain model (DTM) produced by LiDAR data can also be combined to obtain the 3D road model. As the extraction of roads in urban areas is greatly affected by buildings, shadows, vehicles, and parking lots, we combine high resolution aerial images and dense LiDAR data to fully exploit the precise spectral and horizontal spatial resolution of aerial images and the accurate vertical information provided by airborne LiDAR.
Object-oriented image analysis methods are employed to process the feature classification and road detection in aerial images. In this process, we first utilize an adaptive mean shift (MS) segmentation algorithm to segment the original images into meaningful object-oriented clusters. Then the support vector machine (SVM) algorithm is further applied on the MS segmented image to extract road objects. The road surface detected in LiDAR intensity images is taken as a mask to remove the effects of shadows and trees. In addition, the normalized DSM (nDSM) obtained from LiDAR is employed to filter out other above-ground objects, such as buildings and vehicles. The proposed road extraction approaches are tested using rural and urban datasets respectively. The rural road extraction method is performed using pan-sharpened aerial images of the Bruce Highway, Gympie, Queensland. The road extraction algorithm for urban regions is tested using the datasets of Bundaberg, which combine aerial imagery and LiDAR data. Quantitative evaluation of the extracted road information for both datasets has been carried out. The experiments and the evaluation results using the Gympie datasets show that more than 96% of the road surfaces and over 90% of the lane markings are accurately reconstructed, and the false alarm rates for road surfaces and lane markings are below 3% and 2% respectively. For the urban test sites of Bundaberg, more than 93% of the road surface is correctly reconstructed, and the mis-detection rate is below 10%.
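Otsu's clustering method, used above to extract pavement markings from the filtered image, picks the grey-level threshold that maximises between-class variance. A self-contained numpy sketch (not the authors' code) is:

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's method: pick the threshold maximising between-class variance.

    `gray` is a 2-D array of 8-bit intensities (0..255), e.g. a filtered
    road image in which bright pavement markings are to be separated
    from darker asphalt.
    """
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()                  # intensity probabilities
    omega = np.cumsum(p)                   # class-0 weight for each cut
    mu = np.cumsum(p * np.arange(256))     # class-0 cumulative mean
    mu_t = mu[-1]                          # global mean
    # between-class variance; empty classes give 0/0, mapped to zero below
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b = np.nan_to_num(sigma_b)
    return int(np.argmax(sigma_b))
```

On the bimodal surface/marking histograms described above, the returned threshold falls between the two intensity clusters.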

Relevance: 20.00%

Abstract:

This paper reports on a unique study of a large, random sample of business start-ups that were identified prior to the actual, commercial launch of the ventures. The purpose of this paper is two-fold. First, to present frequencies on the involvement of the Swedish population in the small business sector (particularly in start-ups of firms) and to compare these with estimates from Norway and the USA, which are based on studies using a similar research design. The authors also discuss the possible reasons for the differences that emerge between countries. Second, the characteristics of nascent entrepreneurs (i.e. individuals trying to start an independent business) are analysed and compared for sub-groups within the sample and with characteristics of business founders as they appear in theoretical accounts or retrospective empirical studies of surviving firms. In order to get a representative sample from the working age population, respondents (n = 30,427) were randomly selected and interviewed by telephone. It was found that 2.0% of the Swedish population at the time of the interview were trying to start an independent business. Sweden had a significantly lower prevalence rate of nascent entrepreneurs compared to Norway and the USA. Nascent entrepreneurs were then compared to a control group of people not trying to start a business. The results confirmed findings from previous studies of business founders pointing to the importance of role models and the impression of self-employment obtained through these, employment status, age, education and experience. Marital status, the number of children in the household, and length of employment experience were unrelated to the probability of becoming a nascent entrepreneur. The gender of the respondent was the strongest distinguishing factor.
Importantly, the results suggest that while one has a reasonably good understanding of the characteristics associated with men going into business for themselves, the type of variables investigated here have very limited ability to predict nascent entrepreneur status for women.

Relevance: 20.00%

Abstract:

Distributed generators (DGs) are defined as generators that are connected to a distribution network. The direction of the power flow and short-circuit current in a network could be changed compared with one without DGs. The conventional protective relay scheme does not meet the requirement in this emerging situation. As the number and capacity of DGs in the distribution network increase, the problem of coordinating protective relays becomes more challenging. Given this background, the protective relay coordination problem in distribution systems is investigated, with directional overcurrent relays taken as an example, and formulated as a mixed integer nonlinear programming problem. A mathematical model describing this problem is first developed, and the well-developed differential evolution algorithm is then used to solve it. Finally, a sample system is used to demonstrate the feasibility and efficiency of the developed method.
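As a hedged illustration of the formulation described above, the sketch below optimises the time-multiplier settings (TMS) of three directional overcurrent relays with differential evolution, using the IEC standard-inverse characteristic and a penalty for violated coordination intervals. All network numbers are hypothetical, and SciPy's general-purpose optimiser stands in for the paper's own algorithm:

```python
import numpy as np
from scipy.optimize import differential_evolution

# IEC standard-inverse operating time: t = TMS * 0.14 / ((I/Ip)**0.02 - 1)
def op_time(tms, i_fault, i_pickup):
    return tms * 0.14 / ((i_fault / i_pickup) ** 0.02 - 1.0)

# Illustrative 3-relay radial feeder (all numbers hypothetical):
# relay k backs up relay k+1; fault currents and pickups are fixed here,
# so only the time-multiplier settings (TMS) are optimised.
I_FAULT = np.array([4000.0, 3000.0, 2000.0])   # A, fault seen by each relay
I_PICKUP = np.array([500.0, 400.0, 300.0])     # A
CTI = 0.3                                      # s, coordination time interval

def objective(tms):
    t = op_time(tms, I_FAULT, I_PICKUP)
    total = t.sum()
    # penalise violated coordination constraints: backup must lag primary
    penalty = 0.0
    for k in range(len(tms) - 1):
        t_backup = op_time(tms[k], I_FAULT[k + 1], I_PICKUP[k])
        miscoord = (t[k + 1] + CTI) - t_backup
        penalty += 1e3 * max(0.0, miscoord)
    return total + penalty

result = differential_evolution(objective, bounds=[(0.05, 1.0)] * 3, seed=1)
```

In this toy case the optimiser drives each TMS down to the smallest value that still leaves the 0.3 s coordination margin with its backup relay.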

Relevance: 20.00%

Abstract:

This paper provides a summary of what is known from social science research about the effects parents have on the donations of their children. It then goes on to summarize two on-going research projects. The first project provides estimates of the strength of the relationship between the charitable giving of parents and that of their adult children. The second provides estimates of the effect of inheritances on charitable donations. Both projects use data from the Center on Philanthropy Panel Study (COPPS); accordingly, the paper provides an introduction to these data. Finally, the paper draws implications for fundraisers from the two on-going projects, and suggests several other areas in which COPPS can generate knowledge to improve the practice of fundraising.

Relevance: 20.00%

Abstract:

The ninth release of the Toolbox represents over fifteen years of development and a substantial level of maturity. This version captures a large number of changes and extensions generated over the last two years which support my new book “Robotics, Vision & Control”. The Toolbox has always provided many functions that are useful for the study and simulation of classical arm-type robotics, for example kinematics, dynamics, and trajectory generation. The Toolbox is based on a very general method of representing the kinematics and dynamics of serial-link manipulators. These parameters are encapsulated in MATLAB® objects - robot objects can be created by the user for any serial-link manipulator, and a number of examples are provided for well-known robots such as the Puma 560 and the Stanford arm, amongst others. The Toolbox also provides functions for manipulating and converting between datatypes such as vectors, homogeneous transformations and unit-quaternions, which are necessary to represent 3-dimensional position and orientation. This ninth release of the Toolbox has been significantly extended to support mobile robots. For ground robots the Toolbox includes standard path planning algorithms (bug, distance transform, D*, PRM), kinodynamic planning (RRT), localization (EKF, particle filter), map building (EKF) and simultaneous localization and mapping (EKF), and a Simulink model of a non-holonomic vehicle. The Toolbox also includes a detailed Simulink model for a quadcopter flying robot.
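The Toolbox itself is MATLAB-based; as a language-neutral illustration of the serial-link kinematics it encapsulates, the following Python sketch chains standard Denavit-Hartenberg link transforms (the two-link arm at the end is a made-up example, not a Toolbox model):

```python
import numpy as np

def dh_matrix(theta, d, a, alpha):
    """Standard Denavit-Hartenberg link transform (4x4 homogeneous)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_params, joint_angles):
    """Chain the link transforms of a serial-link manipulator."""
    T = np.eye(4)
    for (d, a, alpha), theta in zip(dh_params, joint_angles):
        T = T @ dh_matrix(theta, d, a, alpha)
    return T

# Hypothetical 2-link planar arm: (d, a, alpha) per link, lengths 1 m and 0.5 m
planar_2r = [(0.0, 1.0, 0.0), (0.0, 0.5, 0.0)]
T = forward_kinematics(planar_2r, [np.pi / 2, -np.pi / 2])
```

With joint angles of +90° and −90° the links fold back parallel to the base x-axis, placing the end effector at (0.5, 1.0, 0).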

Relevance: 20.00%

Abstract:

Urban stormwater quality is multifaceted, and the use of a limited number of factors to represent catchment characteristics may not be adequate to explain the complexity of water quality response to a rainfall event or site-to-site differences in stormwater quality modelling. This paper presents the outcomes of a research study which investigated the adequacy of using land use and impervious area fraction only, to represent catchment characteristics in urban stormwater quality modelling. The research outcomes confirmed the inadequacy of the use of these two parameters alone to represent urban catchment characteristics in stormwater quality prediction. Urban form also needs to be taken into consideration, as it was found to have an important impact on stormwater quality by influencing pollutant generation, build-up and wash-off. Urban form refers to characteristics related to an urban development such as road layout, spatial distribution of urban areas and urban design features.

Relevance: 20.00%

Abstract:

The current investigation reports on diesel particulate matter emissions, with special interest in fine particles from the combustion of two base fuels. The base fuels selected were diesel fuel and marine gas oil (MGO). The experiments were conducted with a four-stroke, six-cylinder, direct injection diesel engine. The results showed that the fine particle number emissions measured by both SMPS and ELPI were higher with MGO compared to diesel fuel. It was observed that the fine particle number emissions with the two base fuels were quantitatively different but qualitatively similar. The gravimetric (mass basis) measurement also showed higher total particulate matter (TPM) emissions with the MGO. The smoke emissions, which were part of TPM, were also higher for the MGO. No significant changes in the mass flow rate of fuel and the brake-specific fuel consumption (BSFC) were observed between the two base fuels.

Relevance: 20.00%

Abstract:

Unlicensed driving remains a serious problem in many jurisdictions, and while it does not play a direct causative role in road crashes, it undermines driver licensing systems and is linked to other high risk driving behaviours. Roadside licence check surveys represent the most direct means of estimating the prevalence of unlicensed driving. The current study involved the Queensland Police Service (QPS) checking the licences of 3,112 drivers intercepted at random breath testing operations across Queensland between February and April 2010. Data was matched with official licensing records from Transport and Main Roads (TMR) via the drivers’ licence number. In total, 2,914 (93.6%) records were matched, with the majority of the 198 unmatched cases representing international or interstate licence holders (n = 156), leaving 42 unknown cases. Among the drivers intercepted at the roadside, 20 (0.6%) were identified as being unlicensed at the time, while a further 11 (0.4%) were driving unaccompanied on a Learner Licence. However, the examination of TMR licensing records revealed that an additional 9 individuals (0.3%) had a current licence sanction but were not identified as unlicensed by QPS. Thus, in total 29 of the drivers were unlicensed at the time, representing 0.9% of all the drivers intercepted and 1% of those for whom their licence records could be checked. This is considerably lower than the involvement of unlicensed drivers in fatal and serious injury crashes in Queensland, which is consistent with other research confirming the increased crash risk of this group. However, the number of unmatched records suggests that it is possible the on-road survey may have under-estimated the prevalence of unlicensed driving, so further development of the survey method is recommended.
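The prevalence figures quoted above can be reproduced directly from the counts in the paragraph:

```python
# Reproducing the prevalence arithmetic reported above.
checked = 3112          # drivers intercepted at the roadside
matched = 2914          # records matched to TMR licensing data
unlicensed = 20 + 9     # detected at roadside + found via TMR records

rate_all = 100 * unlicensed / checked      # % of all intercepted drivers
rate_matched = 100 * unlicensed / matched  # % of drivers with matched records

print(round(rate_all, 1), round(rate_matched, 1))  # → 0.9 1.0
```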

Relevance: 20.00%

Abstract:

The ubiquitin (Ub)-proteasome pathway is the major nonlysosomal pathway of proteolysis in human cells and accounts for the degradation of most short-lived, misfolded or damaged proteins. This pathway is important in the regulation of a number of key biological regulatory mechanisms. Proteins are usually targeted for proteasome-mediated degradation by polyubiquitinylation, the covalent addition of multiple units of the 76 amino acid protein Ub, which are ligated to the ε-amino groups of lysine residues in the substrate. Polyubiquitinylated proteins are degraded by the 26S proteasome, a large, ATP-dependent multicatalytic protease complex, which also regenerates monomeric Ub. The targets of this pathway include key regulators of cell proliferation and cell death. An alternative form of the proteasome, termed the immunoproteasome, also has important functions in the generation of peptides for presentation by MHC class I molecules. In recent years there has been a great deal of interest in the possibility that proteasome inhibitors, through elevation of the levels of proteasome targets, might prove useful as a novel class of anti-cancer drugs. Here we review the progress made to date in this area and highlight the potential advantages and weaknesses of this approach.

Relevance: 20.00%

Abstract:

The opening phrase of the title is from Charles Darwin’s notebooks (Schweber 1977). It is a double reminder, firstly that mainstream evolutionary theory is not just about describing nature but is particularly looking for mechanisms or ‘causes’, and secondly, that there will usually be several causes affecting any particular outcome. The second part of the title is our concern at the almost universal rejection of the idea that biological mechanisms are sufficient for macroevolutionary changes, thus rejecting a cornerstone of Darwinian evolutionary theory. Our primary aim here is to consider ways of making it easier to develop and to test hypotheses about evolution. Formalizing hypotheses can help generate tests. In an absolute sense, some of the discussion by scientists about evolution is little better than the lack of reasoning used by those advocating intelligent design. Our discussion here is in a Popperian framework where science is defined by that area of study where it is possible, in principle, to find evidence against hypotheses – they are in principle falsifiable. However, with time, the boundaries of science keep expanding. In the past, some aspects of evolution were outside the current boundaries of falsifiable science, but increasingly new techniques and ideas are expanding the boundaries of science and it is appropriate to re-examine some topics. It often appears that over the last few decades there has been an increasingly strong assumption to look first (and only) for a physical cause. This decision is virtually never formally discussed; it is simply assumed that some physical factor ‘drives’ evolution. It is necessary to examine our assumptions much more carefully. What is meant by physical factors ‘driving’ evolution, or what is an ‘explosive radiation’? Our discussion focuses on two of the six mass extinctions, the fifth being events in the Late Cretaceous, and the sixth starting at least 50,000 years ago (and is ongoing).
Cretaceous/Tertiary boundary; the rise of birds and mammals. We have had a long-term interest (Cooper and Penny 1997) in designing tests to help evaluate whether the processes of microevolution are sufficient to explain macroevolution. The real challenge is to formulate hypotheses in a testable way. For example, the number of lineages of birds and mammals that survive from the Cretaceous to the present is one test. Our first estimate was 22 for birds, and current work is tending to increase this value. This still does not consider lineages that survived into the Tertiary, and then went extinct later. Our initial suggestion was probably too narrow in that it lumped four models from Penny and Phillips (2004) into one model. This reduction is too simplistic in that we need to know about survival and ecological and morphological divergences during the Late Cretaceous, and whether Crown groups of avian or mammalian orders may have existed back into the Cretaceous. More recently (Penny and Phillips 2004) we have formalized hypotheses about dinosaurs and pterosaurs, with the prediction that interactions between mammals (and ground-feeding birds) and dinosaurs would be most likely to affect the smallest dinosaurs, and similarly interactions between birds and pterosaurs would particularly affect the smaller pterosaurs. There is now evidence for both classes of interactions, with the smallest dinosaurs and pterosaurs declining first, as predicted. Thus, testable models are now possible. Mass extinction number six: human impacts. On a broad scale, there is a good correlation between time of human arrival, and increased extinctions (Hurles et al. 2003; Martin 2005; Figure 1). However, it is necessary to distinguish different time scales (Penny 2005) and on a finer scale there are still large numbers of possibilities. In Hurles et al.
(2003) we mentioned habitat modification (including the use of fire), introduced plants and animals (including kiore) in addition to direct predation (the ‘overkill’ hypothesis). We need also to consider prey switching that occurs in early human societies, as evidenced by the results of Wragg (1995) on the middens of different ages on Henderson Island in the Pitcairn group. In addition, the presence of human-wary or human-adapted animals will affect the distribution in the subfossil record. A better understanding of human impacts world-wide, in conjunction with pre-scientific knowledge, will make it easier to discuss the issues by removing ‘blame’. While continued spontaneous generation was accepted universally, there was the expectation that animals continued to reappear. New Zealand is one of the very best locations in the world to study many of these issues. Apart from the marine fossil record, some human impact events are extremely recent and the remains less disrupted by time.

Relevance: 20.00%

Abstract:

Fusion techniques have received considerable attention for achieving performance improvement with biometrics. While a multi-sample fusion architecture reduces false rejects, it also increases false accepts. This impact on performance also depends on the nature of subsequent attempts, i.e., random or adaptive. Expressions for error rates are presented and experimentally evaluated in this work by considering the multi-sample fusion architecture for text-dependent speaker verification using HMM based digit dependent speaker models. Analysis incorporating correlation modeling demonstrates that the use of adaptive samples improves overall fusion performance compared to randomly repeated samples. For a text dependent speaker verification system using digit strings, sequential decision fusion of seven instances with three random samples is shown to reduce the overall error of the verification system by 26%, which can be further reduced by 6% for adaptive samples. This analysis, novel in its treatment of random and adaptive multiple presentations within a sequential fused decision architecture, is also applicable to other biometric modalities such as fingerprints and handwriting samples.
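The paper's expressions for error rates incorporate correlation between attempts; under the simplifying assumption of independent random presentations and accept-on-any-success fusion, they reduce to the familiar forms sketched below (an illustration, not the paper's model):

```python
def fused_error_rates(frr, far, attempts):
    """Error rates for accept-on-any-success fusion of `attempts`
    independent presentations (a simplification: the paper models
    correlated, adaptive attempts, which this sketch ignores).

    A genuine user is rejected only if every attempt fails;
    an impostor is accepted if any single attempt succeeds.
    """
    frr_fused = frr ** attempts
    far_fused = 1.0 - (1.0 - far) ** attempts
    return frr_fused, far_fused

# e.g. a verifier with 5% FRR / 1% FAR, allowed three attempts
frr3, far3 = fused_error_rates(0.05, 0.01, 3)
```

This makes the trade-off in the opening sentences concrete: three attempts cut the false reject rate from 5% to about 0.01%, while the false accept rate grows from 1% to roughly 3%.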

Relevance: 20.00%

Abstract:

A Cooperative Collision Warning System (CCWS) is an active safety technology for road vehicles that can potentially reduce traffic accidents. It provides a driver with situational awareness and early warnings of any possible collisions through an on-board unit. CCWS is still under active research, and one of the important technical problems is safety message dissemination. Safety messages are disseminated in a high-speed mobile environment using wireless communication technology such as Dedicated Short Range Communication (DSRC). The wireless communication in CCWS has a limited bandwidth and can become unreliable when used inefficiently, particularly given the dynamic nature of road traffic conditions. Unreliable communication may significantly reduce the performance of CCWS in preventing collisions. There are two types of safety messages: Routine Safety Messages (RSMs) and Event Safety Messages (ESMs). An RSM contains the up-to-date state of a vehicle, and it must be disseminated repeatedly to its neighbouring vehicles. An ESM is a warning message that must be sent to all the endangered vehicles. Existing RSM and ESM dissemination schemes are inefficient, unscalable, and unable to give priority to vehicles in the most danger. Thus, this study investigates more efficient and scalable RSM and ESM dissemination schemes that can make use of the context information generated from a particular traffic scenario. Therefore, this study tackles three technical research problems: vehicular traffic scenario modelling and context information generation, context-aware RSM dissemination, and context-aware ESM dissemination. The most relevant context information in CCWS is the information about possible collisions among vehicles given a current vehicular traffic situation. To generate the context information, this study investigates techniques to model interactions among multiple vehicles based on their up-to-date motion state obtained via RSM.
To date, there is no existing model that can represent interactions among multiple vehicles in a specific region and at a particular time. The major outcome from the first problem is a new interaction graph model that can be used to easily identify the endangered vehicles and their danger severity. By identifying the endangered vehicles, RSM and ESM dissemination can be optimised while improving safety at the same time. The new model enables the development of context-aware RSM and ESM dissemination schemes. To disseminate RSM efficiently, this study investigates a context-aware dissemination scheme that can optimise the RSM dissemination rate to improve safety in various vehicle densities. The major outcome from the second problem is a context-aware RSM dissemination protocol. The context-aware protocol can adaptively adjust the dissemination rate based on an estimated channel load and the danger severity of vehicle interactions given by the interaction graph model. Unlike existing RSM dissemination schemes, the proposed adaptive scheme can reduce channel congestion and improve safety by prioritising vehicles that are most likely to crash with other vehicles. The proposed RSM protocol has been implemented and evaluated by simulation. The simulation results have shown that the proposed RSM protocol outperforms existing protocols in terms of efficiency, scalability and safety. To disseminate ESM efficiently, this study investigates a context-aware ESM dissemination scheme that can reduce unnecessary transmissions and deliver ESMs to endangered vehicles as fast as possible. The major outcome from the third problem is a context-aware ESM dissemination protocol that uses a multicast routing strategy. Existing ESM protocols use broadcast routing, which is not efficient because ESMs may be sent to a large number of vehicles in the area. Using multicast routing improves efficiency because ESMs are sent only to the endangered vehicles.
The endangered vehicles can be identified using the interaction graph model. The proposed ESM protocol has been implemented and evaluated by simulation. The simulation results have shown that the proposed ESM protocol is more effective at preventing potential accidents than existing ESM protocols. The context model and the RSM and ESM dissemination protocols can be implemented in any CCWS development to improve the communication and safety performance of CCWS. In effect, the outcomes contribute to the realisation of CCWS that will ultimately improve road safety and save lives.
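As a rough illustration of the interaction-graph idea, the sketch below connects vehicle pairs whose one-dimensional time-to-collision falls below a threshold, weighting edges by danger severity. The thesis's model is richer (2-D motion, acceleration, RSM-derived state); every number and name here is hypothetical:

```python
import math

def time_to_collision(v1, v2):
    """1-D time-to-collision between a follower and its leader.

    Each vehicle is (position_m, speed_mps) along the same lane; returns
    math.inf when the gap is not closing. A real CCWS model would work in
    2-D with heading and acceleration; this is a minimal sketch.
    """
    (x1, s1), (x2, s2) = v1, v2
    follower, leader = ((x1, s1), (x2, s2)) if x1 < x2 else ((x2, s2), (x1, s1))
    closing = follower[1] - leader[1]
    if closing <= 0:
        return math.inf
    return (leader[0] - follower[0]) / closing

def interaction_graph(vehicles, ttc_threshold=5.0):
    """Edges connect vehicle pairs whose TTC is below the threshold;
    the weight (danger severity) grows as TTC shrinks."""
    edges = {}
    ids = list(vehicles)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            ttc = time_to_collision(vehicles[a], vehicles[b])
            if ttc < ttc_threshold:
                edges[(a, b)] = 1.0 / ttc   # severity weight
    return edges

# three vehicles in one lane: A is closing fast on B; C is far behind
g = interaction_graph({"A": (0.0, 30.0), "B": (40.0, 20.0), "C": (-200.0, 25.0)})
```

Here only the (A, B) pair is endangered, so an ESM multicast would target those two vehicles and leave C alone — the efficiency argument made above.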

Relevance: 20.00%

Abstract:

Voltage drop and rise at network peak and off-peak periods, along with voltage unbalance, are the major power quality problems in low voltage distribution networks. Usually, utilities try to adjust transformer tap changers as a solution for voltage drop, and to distribute loads equally as a solution for the network voltage unbalance problem. On the other hand, the ever increasing energy demand, along with the necessity of cost reduction and higher reliability requirements, is driving modern power systems towards Distributed Generation (DG) units. These can take the form of small rooftop photovoltaic cells (PV), Plug-in Electric Vehicles (PEVs) or Micro Grids (MGs). Rooftop PVs, typically with power levels ranging from 1-5 kW and installed by householders, are gaining popularity due to their financial benefits for the householders. PEVs will also soon emerge in residential distribution networks, behaving as large residential loads while being charged; later generations are also expected to support the network as small DG units that transfer the energy stored in their batteries into the grid. Furthermore, the MG, a cluster of loads and several DG units such as diesel generators, PVs, fuel cells and batteries, has recently been introduced to distribution networks. The voltage unbalance in the network can increase due to the uncertainties in the random connection points of PVs and PEVs to the network, their nominal capacity and time of operation. Therefore, it is of high interest to investigate the voltage unbalance in these networks as the result of MG, PV and PEV integration into low voltage networks. In addition, the network might experience non-standard voltage drop due to high penetration of PEVs being charged at night, or non-standard voltage rise due to high penetration of PVs and PEVs generating electricity back into the grid during network off-peak periods.
In this thesis, a voltage unbalance sensitivity analysis and stochastic evaluation are carried out for PVs installed by householders, with their installation point, nominal capacity and penetration level treated as different uncertainties. A similar analysis is carried out for PEV penetration in the network operating in two different modes: grid-to-vehicle and vehicle-to-grid. Furthermore, conventional methods for improving voltage unbalance within these networks are discussed. New and efficient methods are then proposed for voltage profile improvement at network peak and off-peak periods and for voltage unbalance reduction. In addition, voltage unbalance reduction is investigated for MGs, and new improvement methods are proposed and applied to the MG test bed planned to be established at Queensland University of Technology (QUT). The MATLAB and PSCAD/EMTDC simulation packages are used to verify the analyses and the proposals.
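The voltage unbalance analysed in the thesis is conventionally quantified by the voltage unbalance factor, computed from the Fortescue (symmetrical component) transform as |V_negative| / |V_positive|. A minimal numpy sketch, with a made-up 5% single-phase sag as input:

```python
import numpy as np

A = np.exp(2j * np.pi / 3)   # 120-degree rotation operator

def vuf_percent(va, vb, vc):
    """Voltage unbalance factor: |V_negative| / |V_positive| * 100,
    from the three phase-voltage phasors via the Fortescue transform."""
    v_pos = (va + A * vb + A**2 * vc) / 3.0
    v_neg = (va + A**2 * vb + A * vc) / 3.0
    return 100.0 * abs(v_neg) / abs(v_pos)

# a balanced 230 V set gives 0%; a 5% sag on phase b introduces unbalance
va = 230.0
vb = 0.95 * 230.0 * np.exp(-2j * np.pi / 3)
vc = 230.0 * np.exp(2j * np.pi / 3)
unbalance = vuf_percent(va, vb, vc)
```

A 5% sag on one phase yields a VUF of about 1.7%, near the 2% limit commonly applied in low voltage networks, which is why the random placement of single-phase PVs and PEVs matters.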