Abstract:
Many interacting factors contribute to a student's choice of a university. This study takes a systems perspective on this choice and develops a Bayesian Network to represent and quantify these factors and their interactions. The systems model is illustrated through a small study of traditional school leavers in Australia, and highlights similarities and differences between universities' perceptions of student choices, students' perceptions of the factors they should consider, and how students actually make choices. The study shows the range of information that can be gained from this approach, including identification of important factors and scenario assessment.
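The abstract does not reproduce the network itself, but the approach can be sketched with a small hypothetical model. The sketch below, assuming pgmpy and invented factor nodes (Reputation, Distance) with made-up probabilities, shows the kind of scenario assessment the study describes:

```python
# A hypothetical two-factor network; structure and CPDs are invented for
# illustration, not taken from the paper.
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("Reputation", "Choice"), ("Distance", "Choice")])

cpd_rep = TabularCPD("Reputation", 2, [[0.6], [0.4]])    # P(high) = 0.4
cpd_dist = TabularCPD("Distance", 2, [[0.7], [0.3]])     # P(far) = 0.3
cpd_choice = TabularCPD(
    "Choice", 2,
    [[0.9, 0.6, 0.5, 0.2],    # P(not chosen | Reputation, Distance)
     [0.1, 0.4, 0.5, 0.8]],   # P(chosen | Reputation, Distance)
    evidence=["Reputation", "Distance"], evidence_card=[2, 2],
)
model.add_cpds(cpd_rep, cpd_dist, cpd_choice)
assert model.check_model()

# Scenario assessment: how does high reputation shift the choice probability?
inference = VariableElimination(model)
print(inference.query(["Choice"], evidence={"Reputation": 1}))
```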
Abstract:
This paper presents an analytical method for analyzing the effect of the X/R ratio and the impedance values of branches on the observability of a network, based on the undecoupled formulation of state estimation (SE) and the null space of matrices. The results show that the X/R ratio of branches has no effect on the observability of networks. It is also shown that the observability of some networks is affected by impedance values while that of others is not. In addition, a simple and fast method is developed for branch observability analysis of radial networks. Illustrative examples of networks at transmission and distribution voltages demonstrate the effectiveness of the proposed methods.
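As a minimal illustration of the null-space test the method rests on (with a made-up measurement Jacobian H for a three-bus network, not one from the paper), observability can be checked by asking whether H has a trivial null space once the reference state is removed:

```python
import numpy as np
from scipy.linalg import null_space

# Hypothetical measurement Jacobian for a 3-bus network: rows are
# measurements, columns are bus angle states (bus 0 is the reference).
H = np.array([
    [1.0, -1.0,  0.0],   # flow measurement on branch 0-1
    [0.0,  1.0, -1.0],   # flow measurement on branch 1-2
])

H_red = H[:, 1:]             # drop the reference-bus column
N = null_space(H_red)        # basis of {x : H_red @ x = 0}

# The network is observable iff the (reduced) null space is trivial.
print("null-space dimension:", N.shape[1])
print("observable:", N.shape[1] == 0)
```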
Abstract:
In urban scholarship, Master Planned Estates (MPEs) are viewed as illustrative of broader changes to the urban environment and characterised as homogeneous, affluent enclaves where community life is largely orchestrated by the developer. Yet no study has fully considered whether, and to what extent, MPEs can be distinguished from other suburb types in terms of their residential composition and their levels of sociability and community attachment. In this article, we empirically test whether MPEs differ from ‘conventional’ suburbs by examining them structurally in terms of their demographic and socio-economic characteristics, as well as in terms of their key community social processes. Using data from a 2008 study of 148 suburbs across Brisbane, Australia (which includes data from two MPEs), we undertake a comparative analysis of suburbs and examine the density of neighbour networks, residents' reports of place attachment and cohesion, and neighbourly contact in MPEs compared to other residential suburbs. Our findings suggest that MPEs are not distinct in terms of their degree of homogeneity and socio-economic characteristics, but that connections among residents are weaker than in other suburbs despite, or perhaps because of, the active interventions of the developer.
Abstract:
Violence in entertainment districts is a major problem across urban landscapes throughout the world. Research shows that licensed premises are the third most common location for homicides and serious assaults, accounting for one in ten fatal and nonfatal assaults. One class of interventions that aims to reduce violence in entertainment districts involves the use of civil remedies: a group of strategies that use civil or regulatory measures as legal "levers" to reduce problem behavior. One specific civil remedy used to reduce problematic behavior in entertainment districts is the manipulation of licensed premises' trading hours. This article uses generalized linear models to analyze the impact of lockout legislation on recorded violent offences in two entertainment districts in the Australian state of Queensland. Our research shows that 3 a.m. lockout legislation led to a direct and significant reduction in the number of violent incidents inside licensed premises. Indeed, the lockouts cut the level of violent crime inside licensed premises by half. Despite these impressive results for the control of violence inside licensed premises, we found no evidence that the lockout had any impact on violence on the streets and footpaths outside licensed premises, which were the site of more than 80 percent of entertainment district violence. Overall, our analysis suggests that lockouts are an important mechanism for controlling the level of violence inside licensed premises, but that finely grained, contextual responses to alcohol-related problems are needed rather than one-size-fits-all solutions.
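A hedged sketch of the kind of generalized linear model the article describes, using synthetic monthly incident counts and an illustrative lockout indicator rather than the Queensland data:

```python
# Synthetic monthly counts of violent incidents; the lockout indicator and
# effect size are illustrative, not the study's dataset.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
months = np.arange(48)
lockout = (months >= 24).astype(float)              # 3 a.m. lockout from month 24
counts = rng.poisson(20 * np.exp(-0.7 * lockout))   # rate roughly halved

X = sm.add_constant(np.column_stack([months, lockout]))
fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()

# exp(lockout coefficient) estimates the post-intervention rate ratio.
print("rate ratio:", np.exp(fit.params[2]))
print(fit.summary())
```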
Abstract:
The study of international news flows has been a dominant topic of international communication research over the past 50 years. This paper critically reviews past approaches to the analysis of news flows and identifies the main strands of research in this field. In line with some previous critiques of the field, we argue that past research has for too long been influenced by dichotomous debates that fail to take account of the complexities of international news decisions. A new direction is needed for news flow research to provide better answers to its recurring questions. This new direction is not a break from past approaches but rather an integration of the different approaches, providing researchers with a more holistic framework for analyzing international news flows. The new approach calls for a combination of political, economic, geographic, historical, social and cultural factors, and incorporates perspectives from other disciplines, such as anthropology and linguistics.
Abstract:
Mass spectrometry is now an indispensable tool for lipid analysis and is arguably the driving force in the renaissance of lipid research. In its various forms, mass spectrometry is uniquely capable of resolving the extensive compositional and structural diversity of lipids in biological systems. Furthermore, it provides the ability to accurately quantify molecular-level changes in lipid populations associated with changes in metabolism and environment, bringing lipid science into the "omics" age. The recent explosion of mass spectrometry-based surface analysis techniques is fuelling further expansion of the lipidomics field, as evidenced by the numerous papers published on mass spectrometric imaging of lipids in recent years. While imaging mass spectrometry provides new and exciting possibilities, it is but one of the many opportunities direct surface analysis offers the lipid researcher. In this review we describe the current state of the art in the direct surface analysis of lipids, with a focus on tissue sections, intact cells and thin-layer chromatography substrates. The suitability of these different approaches for the analysis of the major lipid classes, along with their current and potential applications in the field of lipid analysis, is evaluated.
Abstract:
Samples of sea water contain phytoplankton taxa in varying amounts, and marine scientists are interested in the relative abundance of each taxon. Relative biomass can be ascertained indirectly by measuring the quantity of various pigments using high-performance liquid chromatography. However, the conversion from pigment to taxa is mathematically nontrivial: it is a positive matrix factorisation problem in which both matrices are unknown beyond the level of initial estimates. Prior information on the pigment-to-taxa conversion matrix is used to give the problem a unique solution. Alternating between two non-negative least squares algorithms gives satisfactory results. Analysis of sample data indicates prospects for this type of analysis. An alternative, more computationally intensive approach using Bayesian methods is discussed.
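A minimal sketch of the alternating non-negative least squares approach, using synthetic pigment data and SciPy's nnls rather than the authors' implementation:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
n_samples, n_taxa, n_pigments = 30, 4, 8

# Synthetic "truth": S ~ C_true @ F_true, with S samples x pigments.
C_true = rng.uniform(size=(n_samples, n_taxa))
F_true = rng.uniform(size=(n_taxa, n_pigments))
S = C_true @ F_true + 0.01 * rng.standard_normal((n_samples, n_pigments))

# A prior estimate of the pigment-to-taxa ratio matrix anchors the solution.
F = F_true * rng.uniform(0.8, 1.2, size=F_true.shape)
C = np.zeros((n_samples, n_taxa))

for _ in range(20):                      # alternate the two NNLS problems
    for i in range(n_samples):           # biomass row: min ||F.T @ c - S[i]||
        C[i], _ = nnls(F.T, S[i])
    for j in range(n_pigments):          # ratio column: min ||C @ f - S[:, j]||
        F[:, j], _ = nnls(C, S[:, j])

print("relative reconstruction error:",
      np.linalg.norm(S - C @ F) / np.linalg.norm(S))
```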
Abstract:
Recently, mean-variance analysis has been proposed as a novel paradigm for modelling document ranking in Information Retrieval. The main merit of this approach is that it diversifies the ranking of retrieved documents. In its original formulation, the strategy considers both the mean of the relevance estimates of retrieved documents and their variance. However, when this strategy has been empirically instantiated, the concepts of mean and variance are discarded in favour of a point-wise estimation of relevance (to replace the mean) and of a parameter to be tuned or, alternatively, a quantity dependent upon the document length (to replace the variance). In this paper we revisit this ranking strategy by going back to its roots: mean and variance. For each retrieved document, we infer a relevance distribution from a series of point-wise relevance estimates provided by a number of different systems, and use it to compute the mean and variance of the document's relevance estimates. On the TREC Clueweb collection, we show that this approach improves retrieval performance. This development could lead to new strategies for fusing relevance estimates provided by different systems.
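A small sketch of the mean-variance re-ranking idea, with a synthetic matrix of per-system relevance scores and an illustrative risk parameter b (the paper's exact objective may differ):

```python
import numpy as np

# scores[s, d]: relevance estimate from system s for document d (synthetic).
scores = np.array([
    [0.9, 0.7, 0.4, 0.6],
    [0.8, 0.8, 0.5, 0.3],
    [0.7, 0.9, 0.6, 0.5],
])

mean = scores.mean(axis=0)          # per-document mean relevance
var = scores.var(axis=0, ddof=1)    # per-document variance across systems

b = 1.0                             # risk aversion: penalise uncertain documents
objective = mean - b * var
ranking = np.argsort(-objective)    # rank by the mean-variance trade-off
print("ranking (document indices):", ranking)
```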
Abstract:
The assumptions underlying the Probability Ranking Principle (PRP) have led to a number of alternative approaches that cater for, or compensate for, the PRP's limitations. All of these alternatives deviate from the PRP by incorporating dependencies, resulting in a re-ranking that promotes or demotes documents depending upon their relationship with the documents that have already been ranked. In this paper, we compare and contrast the behaviour of state-of-the-art ranking strategies and principles. To do so, we tease out analytical relationships between the ranking approaches and investigate document kinematics to visualise the effects of the different approaches on document ranking.
Abstract:
In this paper we present a truncated differential analysis of reduced-round LBlock by computing the differential distribution of every nibble of the state. The LLR statistical test is used as a tool to apply the distinguishing and key-recovery attacks. To build the distinguisher, all possible differences are traced through the cipher and the truncated differential probability distribution is determined for every output nibble. We concatenate additional rounds to the beginning and end of the truncated differential distribution to apply the key-recovery attack. By exploiting properties of the key schedule, we obtain a large overlap of key bits used in the beginning and final rounds. This allows us to significantly increase the differential probabilities and hence reduce the attack complexity. We validate the analysis by implementing the attack on LBlock reduced to 12 rounds. Finally, we apply single-key and related-key attacks on 18-round and 21-round LBlock, respectively.
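To illustrate the LLR test used for distinguishing (with an invented nibble-difference distribution, not LBlock's actual truncated differential distribution), the statistic can be computed as follows:

```python
import numpy as np

rng = np.random.default_rng(2)
uniform = np.full(16, 1 / 16)              # 4-bit nibble: 16 possible differences
p = np.array([0.12] + [0.88 / 15] * 15)    # hypothetical cipher distribution

samples = rng.choice(16, size=2000, p=p)   # observed output-nibble differences
counts = np.bincount(samples, minlength=16)

# LLR = sum over differences of N(x) * log(p(x) / uniform(x));
# positive values favour the cipher hypothesis over the random one.
llr = float(counts @ np.log(p / uniform))
print("LLR statistic:", llr, "-> cipher" if llr > 0 else "-> random")
```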
Abstract:
Numerous research studies have evaluated whether distance learning is a viable alternative to traditional learning methods. These studies have generally made use of cross-sectional surveys for collecting data, comparing distance learners to traditional learners with the intent of validating the former as a viable educational tool. Inherent fundamental differences between traditional and distance learning pedagogies, however, reduce the reliability of these comparative studies and constrain the validity of the resulting analyses. This article presents the results of a research project undertaken to analyze the expectations and experiences of distance learners in their degree programs. Students were given surveys designed to examine factors expected to affect their overall value assessment of their distance learning program. Multivariate statistical analyses were used to analyze the correlations among variables of interest and to test hypothesized relationships among them. Focusing on distance learners overcomes some of the limitations of assessments that compare off- and on-campus student experiences. Evaluation and modeling of distance learner responses on the perceived value for money of the distance education they received indicate that the two most important influences are course communication requirements, which had a negative effect, and course logistical simplicity, which had a positive effect. Combined, these two factors accounted for approximately 47% of the variability in the perceived value for money of the sampled students' educational programs. A detailed focus on comparing the expectations and outcomes of distance learners complements the existing literature, which is dominated by comparative studies of distance and nondistance learners.
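A sketch of the kind of multivariate model described, fitting perceived value for money on two synthetic predictors (variable names and effect sizes are illustrative, not the study's data):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 200
communication = rng.normal(size=n)     # course communication requirements
simplicity = rng.normal(size=n)        # course logistical simplicity
value = -0.5 * communication + 0.6 * simplicity + rng.normal(scale=1.0, size=n)

X = sm.add_constant(np.column_stack([communication, simplicity]))
fit = sm.OLS(value, X).fit()
print(fit.params)                      # expect negative and positive coefficients
print("R-squared:", fit.rsquared)      # share of variability explained
```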
Abstract:
The construction industry has long been burdened with inherently adversarial relationships among its parties and the resulting disputes. Dispute review boards (DRBs) have emerged as alternatives for settling construction-related disputes outside the courts. Although DRBs have found support in some quarters of the construction industry, the quantitative assessment of their impact has not been adequately addressed. This paper presents the results of a research project undertaken to assess the impact of DRBs on the construction program of a large-scale highway agency. Three dimensions of DRB impact were assessed: (1) influence on project cost and schedule performance, (2) effectiveness of DRBs in preventing and resolving construction disputes, and (3) costs of DRB implementation. The analyses encompass data from approximately 3,000 projects extending over a 10-year period (2000–2009). Quantitative measures of performance were developed and analyzed for each category. Projects that used DRBs experienced lower cost and schedule growth (6.88% and 12.92%, respectively) than non-DRB projects (11.53% and 28.96%). DRBs were also found to be effective in avoiding and settling disputes; the number of arbitration cases declined consistently after DRB implementation, and DRBs had a success rate of 97% in settling the disputes for which they were used. Moreover, the costs of DRBs were found to comprise a relatively small fraction (approximately 0.3%) of total project budgets. It was concluded that DRBs are effective dispute prevention and resolution alternatives with no significant adverse effects on project performance.
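As an illustration of the kind of quantitative comparison reported (the 3,000-project dataset is not public, so the figures below are synthetic draws centred on the reported growth rates):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
drb_growth = rng.normal(loc=6.88, scale=5.0, size=150)       # % cost growth
non_drb_growth = rng.normal(loc=11.53, scale=8.0, size=150)

# Welch's t-test: does mean cost growth differ between the two groups?
t, p = stats.ttest_ind(drb_growth, non_drb_growth, equal_var=False)
print(f"mean DRB growth: {drb_growth.mean():.2f}%")
print(f"mean non-DRB growth: {non_drb_growth.mean():.2f}%")
print(f"Welch t-test: t={t:.2f}, p={p:.4f}")
```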
Abstract:
Temporary Traffic Control Plans (TCPs), which provide construction phasing to maintain traffic during construction operations, are an integral component of highway construction project design. Using the initial design, designers develop estimated quantities for the required TCP devices, and these become the basis for bids submitted by highway contractors. However, actual as-built quantities often differ significantly from the engineer's original estimate. The total cost of TCP phasing on highway construction projects amounts to 6–10% of total construction cost. Variations between engineer-estimated quantities and final quantities contribute to reduced cost control, increased chances of cost-related litigation, and distorted bid rankings and selection. Statistical analyses of over 2,000 highway construction projects were performed to determine the sources of variation, which were later used as the basis for developing an automated hybrid prediction model that uses multiple regression and heuristic rules to provide accurate TCP quantities and costs. The predictive accuracy of the model was demonstrated through several case studies.
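A hedged sketch of a hybrid predictor in the spirit described, combining a regression estimate with rule-based adjustments; the features, coefficients, and rules are invented for illustration:

```python
def regression_estimate(duration_days: float, lane_miles: float) -> float:
    """Hypothetical fitted multiple regression for a TCP device quantity."""
    return 50.0 + 2.5 * duration_days + 12.0 * lane_miles

def apply_heuristics(quantity: float, urban: bool, night_work: bool) -> float:
    """Heuristic rules layered on the regression output."""
    if urban:
        quantity *= 1.15       # denser traffic control in urban corridors
    if night_work:
        quantity *= 1.10       # extra lighting/signage for night operations
    return quantity

estimate = apply_heuristics(regression_estimate(120, 3.5),
                            urban=True, night_work=False)
print(f"predicted TCP quantity: {estimate:.0f} units")
```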
Abstract:
Studies on the quantitative fit analysis of precontoured fracture fixation plates have emerged only within the last few years, so there is a wide research gap in this area. Quantitative fit assessment enables measurement of the gap between a fracture fixation plate and the underlying bone, and specifies the required plate fit criteria. For a clinically meaningful fit assessment outcome, it is necessary to establish appropriate criteria and parameters. The present paper studies this subject and recommends using multiple fit criteria, with the maximum distance between the plate and the underlying bone as the fit parameter, for clinically relevant outcomes. We also propose the development of a software tool for automatic plate positioning and fit assessment, for the purpose of implant design validation and optimization, in an effort to provide better-fitting implants that can assist proper fracture healing. The fundamental specifications of the software are discussed.
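A minimal sketch of the proposed fit parameter, computing the maximum plate-to-bone distance over synthetic point clouds with a nearest-neighbour query (the point clouds and the fit threshold are illustrative assumptions):

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(5)
bone = rng.uniform(size=(5000, 3))                          # bone surface points (mm)
plate = bone[:200] + rng.normal(scale=0.5, size=(200, 3))   # plate surface points

tree = cKDTree(bone)
dist, _ = tree.query(plate)        # nearest bone point for each plate point
max_gap = dist.max()               # the proposed fit parameter

fit_threshold_mm = 2.0             # illustrative fit criterion
print(f"max plate-bone gap: {max_gap:.2f} mm ->",
      "fits" if max_gap <= fit_threshold_mm else "does not fit")
```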