356 results for Random Number Generation


Relevance:

20.00%

Publisher:

Abstract:

With the growing number of XML documents on the Web it becomes essential to effectively organise these XML documents in order to retrieve useful information from them. A possible solution is to apply clustering on the XML documents to discover knowledge that promotes effective data management, information retrieval and query processing. However, many issues arise in discovering knowledge from these types of semi-structured documents due to their heterogeneity and structural irregularity. Most of the existing research on clustering techniques focuses only on one feature of the XML documents, this being either their structure or their content, due to scalability and complexity problems. The knowledge gained in the form of clusters based on the structure or the content is not suitable for real-life datasets. It therefore becomes essential to include both the structure and content of XML documents in order to improve the accuracy and meaning of the clustering solution. However, the inclusion of both these kinds of information in the clustering process results in a huge overhead for the underlying clustering algorithm because of the high dimensionality of the data. The overall objective of this thesis is to address these issues by: (1) proposing methods to utilise frequent pattern mining techniques to reduce the dimension; (2) developing models to effectively combine the structure and content of XML documents; and (3) utilising the proposed models in clustering. This research first determines the structural similarity in the form of frequent subtrees and then uses these frequent subtrees to represent the constrained content of the XML documents in order to determine the content similarity. A clustering framework with two types of models, implicit and explicit, is developed. The implicit model uses a Vector Space Model (VSM) to combine the structure and the content information. 
The explicit model uses a higher order model, namely a 3rd-order Tensor Space Model (TSM), to explicitly combine the structure and the content information. This thesis also proposes a novel incremental technique to decompose large-sized tensor models and to utilise the decomposed solution for clustering the XML documents. The proposed framework and its components were extensively evaluated on several real-life datasets exhibiting extreme characteristics to understand the usefulness of the proposed framework in real-life situations. Additionally, this research evaluates the outcome of the clustering process on the collection selection problem in information retrieval using the Wikipedia dataset. The experimental results demonstrate that the proposed frequent pattern mining and clustering methods outperform the related state-of-the-art approaches. In particular, the proposed framework of utilising frequent structures for constraining the content shows an improvement in accuracy over content-only and structure-only clustering results. The scalability evaluation experiments conducted on large-scale datasets clearly show the strengths of the proposed methods over state-of-the-art methods. In particular, this thesis work contributes to effectively combining the structure and the content of XML documents for clustering, in order to improve the accuracy of the clustering solution. In addition, it also contributes by addressing the research gaps in frequent pattern mining to generate efficient and concise frequent subtrees with various node relationships that could be used in clustering.
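The implicit model's core idea (keep only the content terms that occur under a document's frequent subtrees, then compare documents in a vector space) can be sketched roughly as follows; the function and identifier names are hypothetical, not the thesis's implementation:

```python
from collections import Counter
from math import sqrt

def doc_vector(frequent_subtrees, content_terms):
    """Keep only content terms that occur under one of the
    document's frequent subtrees (structure-constrained content)."""
    kept = [term for node, term in content_terms if node in frequent_subtrees]
    return Counter(kept)

def cosine(u, v):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(u[k] * v[k] for k in u)
    nu = sqrt(sum(x * x for x in u.values()))
    nv = sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# content_terms: (enclosing subtree id, term) pairs per document
d1 = doc_vector({"s1", "s2"}, [("s1", "xml"), ("s2", "cluster"), ("s9", "noise")])
d2 = doc_vector({"s1"}, [("s1", "xml"), ("s1", "cluster")])
print(cosine(d1, d2))  # → 1.0
```

A clustering algorithm (e.g. k-means over these vectors) would then use this similarity in place of content-only or structure-only measures.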

Relevance:

20.00%

Publisher:

Abstract:

An improved scaling analysis and direct numerical simulations are performed for the unsteady natural convection boundary layer adjacent to a downward facing inclined plate with uniform heat flux. The development of the thermal or viscous boundary layers may be classified into three distinct stages: a start-up stage, a transitional stage and a steady stage, which can be clearly identified in the analytical as well as the numerical results. Previous scaling shows that the existing scaling laws of the boundary layer thickness, velocity and steady state time scale for the natural convection flow on a heated plate of uniform heat flux provide a very poor prediction of the Prandtl number dependency of the flow. However, those scalings perform very well with Rayleigh number and aspect ratio dependency. In this study, a modified Prandtl number scaling is developed using a triple layer integral approach for Pr > 1. It is seen that in comparison to the direct numerical simulations, the modified scaling performs considerably better than the previous scaling.

Relevance:

20.00%

Publisher:

Abstract:

The next generation of service-oriented architecture (SOA) needs to scale for flexible service consumption, beyond organizational and application boundaries, into communities, ecosystems and business networks. In wider and, ultimately, global settings, new capabilities are needed so that business partners can efficiently and reliably enable, adapt and expose services. Those services can then be discovered, ordered, consumed, metered and paid for, through new applications and opportunities, driven by third parties in the global “village”. This trend is already underway, in different ways, through different early adopter market segments. This paper proposes an architectural strategy for the provisioning and delivery of services in communities, ecosystems and business networks – a Service Delivery Framework (SDF). The SDF is intended to support multiple industries and deployments where a SOA platform is needed for collaborating partners and diverse consumers. Specifically, it is envisaged that the SDF allows providers to publish their services into network directories so that they can be repurposed, traded and consumed, leveraging network utilities such as B2B gateways and cloud hosting. To support these different facets of service delivery, the SDF extends the conventional service provider, service broker and service consumer of the Web Services Architecture to include the service gateway, service hoster, service aggregator and service channel maker.
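The extended role set that the SDF adds to the Web Services Architecture can be illustrated with a minimal registry sketch; the `Role` and `Directory` names and all values below are invented for illustration, not part of the paper's framework:

```python
from dataclasses import dataclass, field
from enum import Enum

# Conventional Web Services roles plus the four SDF extensions
Role = Enum("Role", [
    "PROVIDER", "BROKER", "CONSUMER",
    "GATEWAY", "HOSTER", "AGGREGATOR", "CHANNEL_MAKER",
])

@dataclass
class Directory:
    """Toy network directory: providers publish, others discover."""
    services: dict = field(default_factory=dict)

    def publish(self, name, provider):
        self.services[name] = provider

    def discover(self, name):
        return self.services.get(name)

d = Directory()
d.publish("payments", "acme-provider")
print(d.discover("payments"))  # → acme-provider
```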

Relevance:

20.00%

Publisher:

Abstract:

There is a growing area of scholarship that attests to the importance of understanding the impact of Post Traumatic Stress Disorder (PTSD) on the military family (Cozza, Chun, & Polo, 2005; Peach, 2005; Riggs, 2009; Siebler, 2003). Recent research highlights the critical role that the family plays in mitigating the effects of this condition for its members (Chase-Lansdale, Wakschlag, & Brooks-Gunn, 1995; Fiese, Foley, & Spagnola, 2006; Hetherington & Blechman, 1996; Pinkerton & Dolan, 2007; Seedat, Niehaus, & Stein, 2001; Serbin & Karp, 2003; Walsh, 2003), society (Jenson & Fraser, 2006; Seedat, Kaminer, Lockhat, & Stein, 2000; Wood & Geismar, 1989) and the next generation (Davidson & Mellor, 2001; Ender, 2006; Weber, 2005; Westerink & Giarratano, 1999). However, little is understood about the way people who grew up in Australian military families affected by PTSD describe their experiences and what the implications are for their participation in family life. This study addressed the following research questions: (1) ‘How does a child of a Vietnam veteran understand and describe the experience of PTSD in the family?’ and (2) ‘What are the implications of this understanding on their current participation in family life?’ These questions were addressed through a qualitative analysis of focus-group data collected from adults with a Vietnam veteran parent with PTSD. The key rationale for a qualitative approach was to develop an understanding of these questions in a way which was as faithful as possible to the way participants talked about their past and present family experiences. A number of experiential themes common to participants were identified through the data analysis. Participants’ experiences linked together to form a central theme of control, which revealed the overarching narrative of ‘It’s all about control and the fear of losing it’, which responds to the first research question. 
The second research question led to a deeper analysis of the ‘control experiences’ to identify the ways in which participants responded to and managed these problematic aspects of family life, and the implications for their current sense of participation in family life. These responses can be understood through the overarching narrative of: ‘Soldier on despite the differences’ which assists them to optimise the impact of control and develop strategies required to maintain a semblance of personal normality and a normal family life. This intensive research has led to the development of theoretical propositions about this group’s experiences and responses that can be tested further in subsequent research to assist families and their members who may be experiencing the intergenerational impacts of psychological trauma acquired from military service.

Relevance:

20.00%

Publisher:

Abstract:

As civil infrastructures such as bridges age, there is a concern for safety and a need for cost-effective and reliable monitoring tools. Different diagnostic techniques are available nowadays for structural health monitoring (SHM) of bridges. Acoustic emission is one such technique with the potential of predicting failure. The phenomenon of rapid release of energy within a material by crack initiation or growth in the form of stress waves is known as acoustic emission (AE). The AE technique involves recording the stress waves by means of sensors and subsequent analysis of the recorded signals, which then convey information about the nature of the source. AE can be used as a local SHM technique to monitor specific regions with a visible presence of cracks or crack-prone areas such as welded regions and joints with bolted connections, or as a global technique to monitor the whole structure. The strength of the AE technique lies in its ability to detect active crack activity, thus helping to prioritise maintenance work by focusing on active cracks rather than dormant cracks. In spite of being a promising tool, some challenges still exist behind the successful application of the AE technique. One is the generation of large amounts of data during testing; hence effective data analysis and management are necessary, especially for long-term monitoring uses. Complications also arise as a number of spurious sources can give AE signals; therefore, different source discrimination strategies are necessary to identify genuine signals from spurious ones. Another major challenge is the quantification of the damage level by appropriate analysis of data. Intensity analysis using severity and historic indices, as well as b-value analysis, are some important methods and will be discussed and applied to the analysis of laboratory experimental data in this paper.
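The b-value analysis mentioned above adapts the Gutenberg-Richter relation to AE amplitudes: the b-value is the negated slope of log10(cumulative hit count) against amplitude/20 dB. A minimal sketch under that standard definition (not this paper's specific procedure; the synthetic data below are invented):

```python
import math

def ae_b_value(amplitudes_db):
    """Estimate the AE b-value: negated least-squares slope of
    log10(count of hits with amplitude >= A) versus A/20 dB."""
    levels = sorted(set(amplitudes_db))
    xs, ys = [], []
    for a in levels:
        n = sum(1 for v in amplitudes_db if v >= a)  # cumulative count
        xs.append(a / 20.0)
        ys.append(math.log10(n))
    # ordinary least squares for the slope
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return -slope

# synthetic hit amplitudes following an exact log-linear distribution
data = [40] * 90 + [60] * 9 + [80]
print(round(ae_b_value(data), 3))  # → 1.0
```

A falling b-value over successive load stages is commonly read as macro-crack growth, which is what makes the index useful for damage quantification.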

Relevance:

20.00%

Publisher:

Abstract:

Accurate and detailed road models play an important role in a number of geospatial applications, such as infrastructure planning, traffic monitoring, and driver assistance systems. In this thesis, an integrated approach for the automatic extraction of precise road features from high resolution aerial images and LiDAR point clouds is presented. A framework of road information modeling has been proposed, for rural and urban scenarios respectively, and an integrated system has been developed to deal with road feature extraction using image and LiDAR analysis. For road extraction in rural regions, a hierarchical image analysis is first performed to maximize the exploitation of road characteristics in different resolutions. The rough locations and directions of roads are provided by the road centerlines detected in low resolution images, both of which can be further employed to facilitate the road information generation in high resolution images. The histogram thresholding method is then chosen to classify road details in high resolution images, where color space transformation is used for data preparation. After the road surface detection, anisotropic Gaussian and Gabor filters are employed to enhance road pavement markings while constraining other ground objects, such as vegetation and houses. Afterwards, pavement markings are obtained from the filtered image using Otsu's clustering method. The final road model is generated by superimposing the lane markings on the road surfaces, where the digital terrain model (DTM) produced by LiDAR data can also be combined to obtain the 3D road model. As the extraction of roads in urban areas is greatly affected by buildings, shadows, vehicles, and parking lots, we combine high resolution aerial images and dense LiDAR data to fully exploit the precise spectral and horizontal spatial resolution of aerial images and the accurate vertical information provided by airborne LiDAR. 
Object-oriented image analysis methods are employed to process the feature classification and road detection in aerial images. In this process, we first utilize an adaptive mean shift (MS) segmentation algorithm to segment the original images into meaningful object-oriented clusters. Then the support vector machine (SVM) algorithm is further applied on the MS segmented image to extract road objects. The road surface detected in LiDAR intensity images is taken as a mask to remove the effects of shadows and trees. In addition, the normalized DSM (nDSM) obtained from LiDAR is employed to filter out other above-ground objects, such as buildings and vehicles. The proposed road extraction approaches are tested using rural and urban datasets respectively. The rural road extraction method is performed using pan-sharpened aerial images of the Bruce Highway, Gympie, Queensland. The road extraction algorithm for urban regions is tested using the datasets of Bundaberg, which combine aerial imagery and LiDAR data. Quantitative evaluation of the extracted road information for both datasets has been carried out. The experiments and the evaluation results using the Gympie datasets show that more than 96% of the road surfaces and over 90% of the lane markings are accurately reconstructed, and the false alarm rates for road surfaces and lane markings are below 3% and 2% respectively. For the urban test sites of Bundaberg, more than 93% of the road surface is correctly reconstructed, and the mis-detection rate is below 10%.
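Otsu's method, used above for pavement-marking extraction, picks the grey-level threshold that maximises the between-class variance of the image histogram. A generic, self-contained sketch (not the thesis code):

```python
def otsu_threshold(pixels, levels=256):
    """Otsu's method: return the threshold t that maximises the
    between-class variance; pixels <= t form the dark class."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w0 = sum0 = 0
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w0 += hist[t]                 # pixels in the dark class
        if w0 == 0:
            continue
        w1 = total - w0               # pixels in the bright class
        if w1 == 0:
            break
        sum0 += t * hist[t]
        m0 = sum0 / w0                # class means
        m1 = (sum_all - sum0) / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# two well-separated modes: the threshold lands between them
print(otsu_threshold([10] * 50 + [200] * 50))
```

On a filtered road-surface image, pixels above the returned threshold would be kept as candidate lane markings.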

Relevance:

20.00%

Publisher:

Abstract:

This paper reports on a unique study of a large, random sample of business start-ups that were identified prior to the actual, commercial launch of the ventures. The purpose of this paper is two-fold. First, to present frequencies on the involvement of the Swedish population in the small business sector (particularly in start-ups of firms) and to compare these with estimates from Norway and the USA, which are based on studies using a similar research design. The authors also discuss the possible reasons for the differences that emerge between countries. Second, the characteristics of nascent entrepreneurs (i.e. individuals trying to start an independent business) are analysed and compared for sub-groups within the sample and with characteristics of business founders as they appear in theoretical accounts or retrospective empirical studies of surviving firms. In order to get a representative sample from the working age population, respondents (n = 30,427) were randomly selected and interviewed by telephone. It was found that 2.0% of the Swedish population at the time of the interview were trying to start an independent business. Sweden had a significantly lower prevalence rate of nascent entrepreneurs compared to Norway and the USA. Nascent entrepreneurs were then compared to a control group of people not trying to start a business. The results confirmed findings from previous studies of business founders pointing to the importance of role models and the impression of self-employment obtained through these, employment status, age, education and experience. Marital status, the number of children in the household, and length of employment experience were unrelated to the probability of becoming a nascent entrepreneur. The gender of the respondent was the strongest distinguishing factor. 
Importantly, the results suggest that while one has a reasonably good understanding of the characteristics associated with men going into business for themselves, the type of variables investigated here have very limited ability to predict nascent entrepreneur status for women.
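For the headline prevalence figure, a normal-approximation confidence interval shows how precise an estimate of 2.0% from n = 30,427 respondents is; the count of 608 below is a hypothetical value consistent with the reported percentage, not a figure from the paper:

```python
from math import sqrt

def prevalence_ci(k, n, z=1.96):
    """Normal-approximation 95% confidence interval for a
    sample prevalence k/n (valid for large n, moderate p)."""
    p = k / n
    se = sqrt(p * (1 - p) / n)
    return p, p - z * se, p + z * se

# roughly the reported Swedish figure: ~2.0% of 30,427 respondents
p, lo, hi = prevalence_ci(608, 30427)
print(f"{p:.3%} (95% CI {lo:.3%} to {hi:.3%})")
```

Non-overlapping intervals of this kind are one simple way cross-country prevalence differences, such as Sweden versus Norway and the USA, can be judged significant.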

Relevance:

20.00%

Publisher:

Abstract:

Distributed generators (DGs) are defined as generators that are connected to a distribution network. The direction of the power flow and the short-circuit current in a network could be changed compared with a network without DGs. The conventional protective relay scheme does not meet the requirements in this emerging situation. As the number and capacity of DGs in the distribution network increase, the problem of coordinating protective relays becomes more challenging. Given this background, the protective relay coordination problem in distribution systems is investigated, with directional overcurrent relays taken as an example, and formulated as a mixed integer nonlinear programming problem. A mathematical model describing this problem is first developed, and the well-developed differential evolution algorithm is then used to solve it. Finally, a sample system is used to demonstrate the feasibility and efficiency of the developed method.
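The differential evolution algorithm used to solve the coordination problem can be sketched generically. The objective below is a toy sphere function standing in for the paper's mixed integer nonlinear relay-coordination model; the parameter names follow the common DE/rand/1/bin scheme:

```python
import random

def differential_evolution(f, bounds, np_=20, F=0.8, CR=0.9, gens=200, seed=1):
    """Minimal DE/rand/1/bin: mutate with scaled difference vectors,
    crossover, and keep the trial if it is no worse."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(np_)]
    fit = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(np_):
            a, b, c = rng.sample([j for j in range(np_) if j != i], 3)
            jr = rng.randrange(dim)  # forced crossover index
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == jr:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)  # clip to bounds
                else:
                    v = pop[i][j]
                trial.append(v)
            ft = f(trial)
            if ft <= fit[i]:
                pop[i], fit[i] = trial, ft
    best = min(range(np_), key=lambda i: fit[i])
    return pop[best], fit[best]

# toy stand-in objective: minimise the sphere function over 3 variables
x, fx = differential_evolution(lambda v: sum(t * t for t in v), [(-5, 5)] * 3)
print(fx)
```

In the relay setting, the decision variables would instead be relay time-dial and pickup settings, with coordination margins handled as constraints, which the sketch does not attempt.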

Relevance:

20.00%

Publisher:

Abstract:

This paper provides a summary of what is known from social science research about the effects parents have on the donations of their children. It then goes on to summarize two on-going research projects. The first project provides estimates of the strength of the relationship between the charitable giving of parents and that of their adult children. The second provides estimates of the effect of inheritances on charitable donations. Both projects use data from the Center on Philanthropy Panel Study (COPPS); accordingly, the paper provides an introduction to these data. Finally, the paper draws implications for fundraisers from the two on-going projects, and suggests several other areas in which COPPS can generate knowledge to improve the practice of fundraising.

Relevance:

20.00%

Publisher:

Abstract:

The ninth release of the Toolbox represents over fifteen years of development and a substantial level of maturity. This version captures a large number of changes and extensions generated over the last two years which support my new book “Robotics, Vision & Control”. The Toolbox has always provided many functions that are useful for the study and simulation of classical arm-type robotics, for example such things as kinematics, dynamics, and trajectory generation. The Toolbox is based on a very general method of representing the kinematics and dynamics of serial-link manipulators. These parameters are encapsulated in MATLAB® objects - robot objects can be created by the user for any serial-link manipulator and a number of examples are provided for well-known robots such as the Puma 560 and the Stanford arm, amongst others. The Toolbox also provides functions for manipulating and converting between datatypes such as vectors, homogeneous transformations and unit-quaternions which are necessary to represent 3-dimensional position and orientation. This ninth release of the Toolbox has been significantly extended to support mobile robots. For ground robots the Toolbox includes standard path planning algorithms (bug, distance transform, D*, PRM), kinodynamic planning (RRT), localization (EKF, particle filter), map building (EKF) and simultaneous localization and mapping (EKF), and a Simulink model of a non-holonomic vehicle. The Toolbox also includes a detailed Simulink model for a quadcopter flying robot.
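The unit-quaternion conversions mentioned above follow standard formulas. The Toolbox itself is MATLAB, so this Python sketch only illustrates the underlying conversion, not the Toolbox API:

```python
import math

def quat_to_rotm(q):
    """Unit quaternion (w, x, y, z) -> 3x3 rotation matrix."""
    w, x, y, z = q
    return [
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)],
        [2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)],
    ]

# 90 degree rotation about z: q = (cos 45°, 0, 0, sin 45°)
q = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
R = quat_to_rotm(q)
# R is approximately [[0, -1, 0], [1, 0, 0], [0, 0, 1]]
```

Embedding R as the upper-left block of a 4x4 identity gives the corresponding homogeneous transformation, the other datatype the abstract mentions.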

Relevance:

20.00%

Publisher:

Abstract:

Urban stormwater quality is multifaceted and the use of a limited number of factors to represent catchment characteristics may not be adequate to explain the complexity of water quality response to a rainfall event or site-to-site differences in stormwater quality modelling. This paper presents the outcomes of a research study which investigated the adequacy of using land use and impervious area fraction only, to represent catchment characteristics in urban stormwater quality modelling. The research outcomes confirmed the inadequacy of the use of these two parameters alone to represent urban catchment characteristics in stormwater quality prediction. Urban form also needs to be taken into consideration as it was found to have an important impact on stormwater quality by influencing pollutant generation, build-up and wash-off. Urban form refers to characteristics related to an urban development such as road layout, spatial distribution of urban areas and urban design features.

Relevance:

20.00%

Publisher:

Abstract:

The current investigation reports on diesel particulate matter emissions, with special interest in fine particles from the combustion of two base fuels. The base fuels selected were diesel fuel and marine gas oil (MGO). The experiments were conducted with a four-stroke, six-cylinder, direct injection diesel engine. The results showed that the fine particle number emissions measured by both SMPS and ELPI were higher with MGO compared to diesel fuel. It was observed that the fine particle number emissions with the two base fuels were quantitatively different but qualitatively similar. The gravimetric (mass basis) measurement also showed higher total particulate matter (TPM) emissions with the MGO. The smoke emissions, which were part of TPM, were also higher for the MGO. No significant changes in the mass flow rate of fuel and the brake-specific fuel consumption (BSFC) were observed between the two base fuels.

Relevance:

20.00%

Publisher:

Abstract:

Unlicensed driving remains a serious problem in many jurisdictions, and while it does not play a direct causative role in road crashes, it undermines driver licensing systems and is linked to other high risk driving behaviours. Roadside licence check surveys represent the most direct means of estimating the prevalence of unlicensed driving. The current study involved the Queensland Police Service (QPS) checking the licences of 3,112 drivers intercepted at random breath testing operations across Queensland between February and April 2010. Data were matched with official licensing records from Transport and Main Roads (TMR) via the drivers’ licence number. In total, 2,914 (93.6%) records were matched, with the majority of the 198 unmatched cases representing international or interstate licence holders (n = 156), leaving 42 unknown cases. Among the drivers intercepted at the roadside, 20 (0.6%) were identified as being unlicensed at the time, while a further 11 (0.4%) were driving unaccompanied on a Learner Licence. However, the examination of TMR licensing records revealed that an additional 9 individuals (0.3%) had a current licence sanction but were not identified as unlicensed by QPS. Thus, in total 29 of the drivers were unlicensed at the time, representing 0.9% of all the drivers intercepted and 1% of those for whom their licence records could be checked. This is considerably lower than the involvement of unlicensed drivers in fatal and serious injury crashes in Queensland, which is consistent with other research confirming the increased crash risk of this group. However, the number of unmatched records suggests that the on-road survey may have under-estimated the prevalence of unlicensed driving, so further development of the survey method is recommended.
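The record-matching step (roadside licence numbers looked up against licensing records, with currently sanctioned licences counted as unlicensed) can be sketched with a toy example; all licence numbers and statuses below are invented, not survey data:

```python
def match_rates(roadside_licences, licence_records):
    """Match roadside licence numbers against a licensing database and
    count matched records, unmatched records, and current sanctions."""
    matched = [l for l in roadside_licences if l in licence_records]
    unmatched = len(roadside_licences) - len(matched)
    sanctioned = sum(1 for l in matched if licence_records[l] == "sanctioned")
    return len(matched), unmatched, sanctioned

# invented records: status keyed by licence number
records = {"A1": "current", "A2": "sanctioned", "A3": "current"}
m, u, s = match_rates(["A1", "A2", "A3", "B9"], records)
print(m, u, s)  # → 3 1 1
```

The unmatched count matters because, as the study notes, every unmatched record is a case whose licence status could not be verified, biasing the prevalence estimate downward.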

Relevance:

20.00%

Publisher:

Abstract:

The ubiquitin (Ub)-proteasome pathway is the major nonlysosomal pathway of proteolysis in human cells and accounts for the degradation of most short-lived, misfolded or damaged proteins. This pathway is important in the regulation of a number of key biological regulatory mechanisms. Proteins are usually targeted for proteasome-mediated degradation by polyubiquitinylation, the covalent addition of multiple units of the 76 amino acid protein Ub, which are ligated to ε-amino groups of lysine residues in the substrate. Polyubiquitinylated proteins are degraded by the 26S proteasome, a large, ATP-dependent multicatalytic protease complex, which also regenerates monomeric Ub. The targets of this pathway include key regulators of cell proliferation and cell death. An alternative form of the proteasome, termed the immunoproteasome, also has important functions in the generation of peptides for presentation by MHC class I molecules. In recent years there has been a great deal of interest in the possibility that proteasome inhibitors, through elevation of the levels of proteasome targets, might prove useful as a novel class of anti-cancer drugs. Here we review the progress made to date in this area and highlight the potential advantages and weaknesses of this approach.

Relevance:

20.00%

Publisher:

Abstract:

The opening phrase of the title is from Charles Darwin’s notebooks (Schweber 1977). It is a double reminder, firstly that mainstream evolutionary theory is not just about describing nature but is particularly looking for mechanisms or ‘causes’, and secondly, that there will usually be several causes affecting any particular outcome. The second part of the title is our concern at the almost universal rejection of the idea that biological mechanisms are sufficient for macroevolutionary changes, thus rejecting a cornerstone of Darwinian evolutionary theory. Our primary aim here is to consider ways of making it easier to develop and to test hypotheses about evolution. Formalizing hypotheses can help generate tests. In an absolute sense, some of the discussion by scientists about evolution is little better than the lack of reasoning used by those advocating intelligent design. Our discussion here is in a Popperian framework where science is defined by that area of study where it is possible, in principle, to find evidence against hypotheses – they are in principle falsifiable. However, with time, the boundaries of science keep expanding. In the past, some aspects of evolution were outside the current boundaries of falsifiable science, but increasingly new techniques and ideas are expanding the boundaries of science and it is appropriate to re-examine some topics. It often appears that over the last few decades there has been an increasingly strong assumption to look first (and only) for a physical cause. This decision is virtually never formally discussed; an assumption is simply made that some physical factor ‘drives’ evolution. It is necessary to examine our assumptions much more carefully. What is meant by physical factors ‘driving’ evolution, or by an ‘explosive radiation’? Our discussion focuses on two of the six mass extinctions, the fifth being events in the Late Cretaceous, and the sixth starting at least 50,000 years ago (and ongoing). 
Cretaceous/Tertiary boundary; the rise of birds and mammals. We have had a long-term interest (Cooper and Penny 1997) in designing tests to help evaluate whether the processes of microevolution are sufficient to explain macroevolution. The real challenge is to formulate hypotheses in a testable way. For example the number of lineages of birds and mammals that survive from the Cretaceous to the present is one test. Our first estimate was 22 for birds, and current work is tending to increase this value. This still does not consider lineages that survived into the Tertiary, and then went extinct later. Our initial suggestion was probably too narrow in that it lumped four models from Penny and Phillips (2004) into one model. This reduction is too simplistic in that we need to know about survival and ecological and morphological divergences during the Late Cretaceous, and whether Crown groups of avian or mammalian orders may have existed back into the Cretaceous. More recently (Penny and Phillips 2004) we have formalized hypotheses about dinosaurs and pterosaurs, with the prediction that interactions between mammals (and ground-feeding birds) and dinosaurs would be most likely to affect the smallest dinosaurs, and similarly interactions between birds and pterosaurs would particularly affect the smaller pterosaurs. There is now evidence for both classes of interactions, with the smallest dinosaurs and pterosaurs declining first, as predicted. Thus, testable models are now possible. Mass extinction number six: human impacts. On a broad scale, there is a good correlation between time of human arrival, and increased extinctions (Hurles et al. 2003; Martin 2005; Figure 1). However, it is necessary to distinguish different time scales (Penny 2005) and on a finer scale there are still large numbers of possibilities. In Hurles et al. 
(2003) we mentioned habitat modification (including the use of fire), introduced plants and animals (including kiore) in addition to direct predation (the ‘overkill’ hypothesis). We need also to consider prey switching that occurs in early human societies, as evidenced by the results of Wragg (1995) on the middens of different ages on Henderson Island in the Pitcairn group. In addition, the presence of human-wary or human-adapted animals will affect the distribution in the subfossil record. A better understanding of human impacts world-wide, in conjunction with pre-scientific knowledge, will make it easier to discuss the issues by removing ‘blame’. While continued spontaneous generation was accepted universally, there was the expectation that animals continued to reappear. New Zealand is one of the very best locations in the world to study many of these issues. Apart from the marine fossil record, some human impact events are extremely recent and the remains less disrupted by time.