908 results for "average complexity"


Relevance: 20.00%

Abstract:

Underlying all assessments are human judgements regarding the quality of students’ understandings. Despite their ubiquity, those judgements are conceptually elusive. The articles selected for inclusion in this issue explore the complexity of judgement practice, raising critical questions that challenge existing views and accepted policy and practice.

Relevance: 20.00%

Abstract:

Student performance on examinations is influenced by the level of difficulty of the questions. It therefore seems reasonable to propose that assessing the difficulty of exam questions could be used to gauge the level of skills and knowledge expected at the end of a course. This paper reports the results of a study investigating the difficulty of exam questions using a subjective assessment of difficulty and a purpose-built exam question complexity classification scheme. The scheme, devised for exams in introductory programming courses, assesses the complexity of each question using six measures: external domain references, explicitness, linguistic complexity, conceptual complexity, length of code involved in the question and/or answer, and intellectual complexity (Bloom level). We apply the scheme to 20 introductory programming exam papers from five countries and find substantial variation across the exams for all measures. Most exams include a mix of questions of low, medium, and high difficulty, although seven of the 20 have no questions of high difficulty. All of the complexity measures correlate with the subjective assessment of difficulty, indicating that the difficulty of an exam question relates to each of these more specific measures. We discuss the implications of these findings for the development of measures to assess learning standards in programming courses.
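As a rough illustration of the kind of analysis this implies, the sketch below correlates each of the six measures with a subjective difficulty rating. All scores here are invented for illustration and are not data from the study.

```python
import numpy as np

# Hypothetical per-question scores (one list entry per exam question) for the
# six complexity measures named in the scheme, coded on small ordinal scales.
measures = {
    "external_domain_refs": [0, 1, 2, 0, 1],
    "explicitness":         [2, 1, 0, 2, 1],
    "linguistic":           [1, 2, 2, 0, 1],
    "conceptual":           [1, 2, 3, 1, 2],
    "code_length":          [0, 2, 3, 1, 2],
    "bloom_level":          [2, 3, 4, 1, 3],
}
# Hypothetical subjective difficulty ratings for the same five questions.
difficulty = np.array([1, 3, 4, 1, 2], dtype=float)

# Pearson correlation of each complexity measure with the difficulty rating.
for name, scores in measures.items():
    r = np.corrcoef(np.asarray(scores, dtype=float), difficulty)[0, 1]
    print(f"{name:>22s}: r = {r:+.2f}")
```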

Relevance: 20.00%

Abstract:

Average speed enforcement is a relatively new approach gaining popularity throughout Europe and Australia. This paper reviews the evidence regarding the impact of this approach on vehicle speeds, crash rates, and a number of additional road safety and public health outcomes. The economic and practical viability of the approach as a road safety countermeasure is also explored. A literature review, with an international scope, of both published and grey literature was conducted. There is a growing body of evidence to suggest a number of road safety benefits associated with average speed enforcement, including high rates of compliance with speed limits, reductions in average and 85th percentile speeds, and reduced speed variability between vehicles. Moreover, the approach has been demonstrated to be particularly effective in reducing excessive speeding behaviour. Reductions in crash rates have also been reported in association with average speed enforcement, particularly in relation to fatal and serious injury crashes. In addition, the approach has been shown to improve traffic flow, reduce vehicle emissions, and achieve high levels of public acceptance. Average speed enforcement offers a more network-wide approach to managing speeds, one that reduces the impact of the time and distance halo effects associated with other automated speed enforcement approaches. Although comparatively expensive, it represents a highly reliable approach to speed enforcement that produces considerable returns on investment through reduced social and economic costs associated with crashes.

Relevance: 20.00%

Abstract:

This paper presents a methodology for real-time estimation of exit movement-specific average travel time on urban routes by integrating real-time cumulative plots, probe vehicles, and historical cumulative plots. Two approaches, component based and extreme based, are discussed for route travel time estimation. The methodology is tested with simulation and validated with real data from Lucerne, Switzerland, which demonstrate its potential for accurate estimation. Both approaches provide similar results. The component-based approach is more reliable, with a greater chance of obtaining a probe vehicle in each interval, although additional data from each component are required. The extreme-based approach is simpler and requires only data from upstream and downstream of the route, but the chance of obtaining a probe that traverses the entire route might be low. The performance of the methodology is also compared with that of a probe-only method: the proposed methodology requires only a few probes for accurate estimation, whereas the probe-only method requires significantly more probes.
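A minimal numerical sketch of the extreme-based idea follows, with entirely hypothetical counts and a single hypothetical probe; it illustrates the cumulative-plot principle only, not the paper's exact formulation.

```python
import numpy as np

# Hypothetical cumulative vehicle counts (one value per minute) at the
# upstream entry and downstream exit of a route (extreme-based approach).
t = np.arange(10.0)                                   # minutes
upstream   = np.array([0, 12, 25, 40, 55, 68, 80, 90, 101, 110], float)
downstream = np.array([0,  0, 10, 22, 37, 52, 66, 78,  88,  99], float)

# One probe vehicle is assumed to enter at t = 2 and exit at t = 5 min.
# Its observed travel time anchors the two curves: the unknown vertical
# offset between them (vehicles initially on the route plus any detector
# counting error) is chosen so the curves reproduce the probe's travel time.
probe_entry, probe_exit = 2.0, 5.0
offset = np.interp(probe_entry, t, upstream) - np.interp(probe_exit, t, downstream)
downstream_corrected = downstream + offset

def travel_time(rank):
    """Horizontal distance between the two curves at cumulative rank(s)."""
    t_in = np.interp(rank, upstream, t)
    t_out = np.interp(rank, downstream_corrected, t)
    return t_out - t_in

# Average travel time over a range of vehicle ranks in the analysis interval.
ranks = np.arange(30, 60)
print("estimated average travel time (min):", round(travel_time(ranks).mean(), 2))
```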

Relevance: 20.00%

Abstract:

Automated process discovery techniques aim at extracting models from information system logs in order to shed light on the business processes supported by these systems. Existing techniques in this space are effective when applied to relatively small or regular logs, but otherwise generate large, spaghetti-like models. In previous work, trace clustering has been applied in an attempt to reduce the size and complexity of automatically discovered process models. The idea is to split the log into clusters and to discover one model per cluster. The result is a collection of process models, each representing a variant of the business process, as opposed to an all-encompassing model. Still, models produced in this way may exhibit unacceptably high complexity. In this setting, this paper presents a two-way divide-and-conquer process discovery technique, wherein the discovered process models are split both by variant and hierarchically by means of subprocess extraction. The proposed technique allows users to set a desired bound on the complexity of the produced models. Experiments on real-life logs show that the technique produces collections of models that are up to 64% smaller than those extracted under the same complexity bounds by existing trace clustering techniques.
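The variant-splitting half of such a divide-and-conquer scheme can be sketched as follows. Here `discover`, `complexity`, and `split` stand in for a concrete discovery algorithm, a model complexity metric, and a trace clustering step; the toy usage is purely illustrative, and the hierarchical subprocess extraction described in the paper is omitted.

```python
from typing import Callable, List

Trace = List[str]          # a trace is a sequence of event labels
Log = List[Trace]

def discover_variants(log: Log,
                      discover: Callable[[Log], object],
                      complexity: Callable[[object], float],
                      split: Callable[[Log], List[Log]],
                      bound: float) -> List[object]:
    """Recursively split the log by trace clustering until every
    discovered variant model meets the complexity bound."""
    model = discover(log)
    if complexity(model) <= bound or len(log) <= 1:
        return [model]
    sublogs = split(log)
    if len(sublogs) <= 1:          # clustering cannot split the log further
        return [model]
    models: List[object] = []
    for sublog in sublogs:
        models.extend(discover_variants(sublog, discover, complexity, split, bound))
    return models

# Toy usage: the "model" is just the set of distinct traces, its "complexity"
# the number of distinct traces, and clustering splits traces on their first event.
log = [["a", "b", "c"], ["a", "c", "b"], ["d", "e"], ["d", "e", "e"]]
variants = discover_variants(
    log,
    discover=lambda l: frozenset(map(tuple, l)),
    complexity=len,
    split=lambda l: [[tr for tr in l if tr[0] == s] for s in sorted({tr[0] for tr in l})],
    bound=2,
)
print(variants)
```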

Relevance: 20.00%

Abstract:

We consider the problem of maximizing the secure connectivity in wireless ad hoc networks, and analyze the complexity of the post-deployment key establishment process constrained by physical layer properties such as connectivity, energy consumption, and interference. Two approaches, based on graph augmentation problems with nonlinear edge costs, are formulated. The first one is based on establishing a secret key using only the links that are already secured by shared keys. This problem is NP-hard and does not admit a polynomial-time approximation scheme (PTAS), since the minimum cutsets to be augmented do not admit constant costs. The second one extends the first problem by increasing the power level between a pair of nodes that share a secret key so that they can connect physically. This problem can be formulated as an optimal key establishment problem with interference constraints and two objectives: (i) maximizing the concurrent key establishment flow and (ii) minimizing the cost. We prove that both problems are NP-hard and MAX-SNP-hard, via a reduction from the MAX3SAT problem.
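As a small concrete starting point (hypothetical data, and only the easy part): the subgraph of links already secured by shared keys can be examined with a union-find pass to see whether the network is securely connected and which components any augmentation would have to join. The NP-hard augmentation with nonlinear edge costs itself is not attempted here.

```python
from collections import defaultdict

def secure_components(nodes, keyed_links):
    """Connected components of the graph of links already secured by shared
    keys. More than one component means new keyed links are needed before
    keys can be established over secure paths network-wide."""
    parent = {n: n for n in nodes}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    for u, v in keyed_links:
        parent[find(u)] = find(v)

    comps = defaultdict(list)
    for n in nodes:
        comps[find(n)].append(n)
    return list(comps.values())

# Hypothetical 5-node network with two keyed links: not yet securely connected.
print(secure_components(range(5), [(0, 1), (2, 3)]))
```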

Relevance: 20.00%

Abstract:

In this study, a tandem LC-MS (Waters Xevo TQ) MRM-based method was developed for rapid, broad profiling of hydrophilic metabolites from biological samples, in either positive or negative ion mode, without the need for an ion-pairing reagent, using a reversed-phase pentafluorophenylpropyl (PFPP) column. The developed method was successfully applied to analyze various biological samples from C57BL/6 mice, including urine, duodenum, liver, plasma, kidney, heart, and skeletal muscle. As a result, a total of 112 hydrophilic metabolites were detected within an 8 min run time, yielding a metabolite profile of the biological samples. The analysis of this number of hydrophilic metabolites is significantly faster than in previous studies. Classification and separation of metabolites from different tissues were analyzed globally using the PCA, PLS-DA, and HCA biostatistical methods. Overall, most of the hydrophilic metabolites were found to have a "fingerprint" characteristic of tissue dependency. In general, higher levels of most metabolites were found in urine, duodenum, and kidney. Altogether, these results suggest that this method has potential application for targeted metabolomic analyses of hydrophilic metabolites in a wide range of biological samples.
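As a rough sketch of the unsupervised step, the snippet below runs PCA on a synthetic sample-by-metabolite intensity matrix. The random values merely stand in for the 112 measured metabolites, scikit-learn is assumed to be available, and no real tissue separation should be expected from synthetic data.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Hypothetical intensity matrix: rows = samples from different tissues,
# columns = the ~112 hydrophilic metabolites quantified by the MRM method.
tissues = ["urine", "liver", "kidney", "plasma"] * 3
X = rng.lognormal(mean=2.0, sigma=1.0, size=(len(tissues), 112))

# Log-transform and autoscale before unsupervised analysis, as is common in
# metabolomics, then project onto the first two principal components.
Xs = np.log(X)
Xs = (Xs - Xs.mean(axis=0)) / Xs.std(axis=0)
scores = PCA(n_components=2).fit_transform(Xs)

for tissue, (pc1, pc2) in zip(tissues, scores):
    print(f"{tissue:>7s}: PC1 = {pc1:+6.2f}, PC2 = {pc2:+6.2f}")
```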

Relevance: 20.00%

Abstract:

Prior to the completion of the Human Genome Project, the human genome was thought to contain a greater number of genes, as it seemed structurally and functionally more complex than simpler organisms. This, along with the belief in “one gene, one protein”, was demonstrated to be incorrect. The mismatch between the number of genes and the number of proteins gave rise to the theory of alternative splicing (AS). AS is a mechanism by which one gene gives rise to multiple protein products. Numerous databases and online bioinformatic tools are available for the detection and analysis of AS. Bioinformatics provides an important approach to studying mRNA and protein diversity through resources such as expressed sequence tag (EST) sequences obtained from completely processed mRNA. Microarray and deep sequencing approaches also aid in the detection of splicing events. Initially it was postulated that AS occurred in only about 5% of all genes, but it was later found to be more abundant. Using bioinformatic approaches, the level of AS in human genes was found to be fairly high, with 35-59% of genes having at least one AS form. Our ability to determine and predict AS is important, as disruptions in splicing patterns may lead to abnormal splice variants, resulting in genetic diseases. In addition, the diversity of proteins produced by AS poses a challenge for successful drug discovery, and therefore a greater understanding of AS would be beneficial.
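For illustration only, the fraction of alternatively spliced genes in an annotation could be estimated along these lines; the gene and transcript records below are invented, not drawn from the studies the abstract refers to.

```python
from collections import defaultdict

# Hypothetical gene -> transcript -> exon-structure annotation. A gene is
# counted as alternatively spliced if it has more than one distinct exon chain.
transcripts = {
    ("GENE_A", "tx1"): ((100, 200), (300, 400), (500, 600)),
    ("GENE_A", "tx2"): ((100, 200), (500, 600)),            # exon skipping
    ("GENE_B", "tx1"): ((50, 150), (250, 350)),
}

isoforms = defaultdict(set)
for (gene, _tx), exon_chain in transcripts.items():
    isoforms[gene].add(exon_chain)

as_genes = [g for g, chains in isoforms.items() if len(chains) > 1]
print(f"{len(as_genes)}/{len(isoforms)} genes with at least one AS form")
```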

Relevance: 20.00%

Abstract:

Project Management (PM) as an academic field is relatively new in Australian universities. Moreover, the field is distributed across four main areas: business (management), built environment and construction, engineering, and more recently ICT (information systems). At an institutional level, with notable exceptions, there is little engagement between researchers working in those individual areas. Consequently, an initiative was launched in 2009 to create a network of PM researchers to build a disciplinary base for PM in Australia. The initiative took the form of a bi-annual forum. The first forum established the constituency and spread of PM research in Australia (Sense et al., 2011). This special issue of IJPM arose out of the second forum, held in 2012, which explored the notion of an Australian perspective on PM. At the forum, researchers were invited to collaborate to explore issues, methodological approaches, and theoretical positions underpinning their research and to answer the question: is there a distinctly Australian research agenda which responds to the current challenges of large and complex projects in our region? From a research point of view, it was abundantly clear at the forum that many of the issues facing Australian researchers are shared around the world. However, what emerged from the forum as the Australian perspective was a set of themes and research issues that dominate the Australian research agenda.

Relevance: 20.00%

Abstract:

In this paper, a polynomial-time algorithm is presented for solving the Eden problem for graph cellular automata. The algorithm is based on our neighborhood elimination operation, which removes local neighborhood configurations that cannot be used in a pre-image of a given configuration. This paper presents a detailed derivation of our algorithm from first principles, and a detailed complexity and accuracy analysis is also given. In the case of time complexity, it is shown that the average-case time complexity of the algorithm is \Theta(n^2), and the best and worst cases are \Omega(n) and O(n^3), respectively. This represents a vast improvement in the upper bound over current methods, without compromising average-case performance.
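To make the decision problem itself concrete: the Eden problem asks whether a given configuration has any pre-image under the automaton's global map. The sketch below answers it by exhaustive search for an elementary CA on a small ring; this brute force is exponential in the number of cells and only illustrates the question, it is not the paper's polynomial-time neighborhood-elimination algorithm.

```python
from itertools import product

def has_preimage(config, rule=110):
    """Exhaustively check whether `config` (a tuple of 0/1 cell states on a
    ring) is reachable in one step of the given elementary CA rule.  A
    return value of False means `config` is a Garden-of-Eden configuration."""
    n = len(config)
    table = [(rule >> i) & 1 for i in range(8)]       # Wolfram rule table

    def step(cells):
        return tuple(
            table[(cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]]
            for i in range(n)
        )

    return any(step(candidate) == tuple(config)
               for candidate in product((0, 1), repeat=n))

print(has_preimage((0, 1, 0, 1, 0, 1)))      # small 6-cell example under rule 110
```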

Relevance: 20.00%

Abstract:

With unpredictable workloads and a need for a multitude of specialized skills, many main contractors rely heavily on subcontracting to reduce their risks (Bresnen et al., 1985; Beardsworth et al., 1988). This is especially the case in Hong Kong, where the average direct labour content accounts for only around 1% of the total contract sum (Lai, 1987). Extensive use of subcontracting is also reported in many other countries, including the UK (Gray and Flanagan, 1989) and Japan (Bennett et al., 1987). In addition, depending upon the scale and complexity of the works, it is not uncommon for subcontractors to further sublet their work to lower-tier subcontractors. Richter and Mitchell (1982) argued that main contractors can obtain a higher profit margin by reducing their performance costs through subcontracting work to those who have the necessary resources to perform the work more efficiently and economically. Subcontracting is also used strategically to allow firms to employ a minimum workforce under fluctuating demand (Usdiken and Sözen, 1985). Through subcontracting, the risks of main contractors are also reduced, as errors in estimating or additional costs caused by delays or extra labour requirements can be absorbed by the subcontractors involved (Woon and Ofori, 2000). Despite these benefits, the quality of work can suffer when incapable or inexperienced subcontractors are employed. Additional problems also exist in the form of bid shopping, unclear accountability, and high fragmentation (Palaneeswaran et al., 2002). A recent report produced by the Hong Kong Construction Industry Review Committee (CIRC) points to the development of a framework to help distinguish between capable and incapable subcontractors (Tang, 2001). This paper describes research aimed at identifying and prioritising criteria for use in such a framework.

Relevance: 20.00%

Abstract:

Lens average and equivalent refractive indices are required for purposes such as lens thickness estimation and optical modeling. We modeled the refractive index gradient as a power function of the normalized distance from the lens center. The average index along the lens axis was estimated by integration. The equivalent index was estimated by raytracing through a model eye to establish the ocular refraction, and then by backward raytracing to determine the constant refractive index yielding the same refraction. Assuming the center and edge indices remained constant with age, at 1.415 and 1.37 respectively, the average axial refractive index increased (1.408 to 1.411) and the equivalent index decreased (1.425 to 1.420) as age increased from 20 to 70 years. These values agree well with experimental estimates based on different techniques, although the latter show considerable scatter. The simple model of the index gradient gives reasonable estimates of the average and equivalent lens indices, although refinements in both modeling and measurement are required.
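The axial averaging step lends itself to a short numerical sketch. Assuming a gradient of the form n(r) = n_center - (n_center - n_edge) * r^p, with r the normalized distance from the lens center and p an arbitrary illustrative exponent (not a value fitted in the paper), the average axial index is the integral of n(r) over r in [0, 1]:

```python
import numpy as np

def axial_index(r, p, n_center=1.415, n_edge=1.37):
    """Power-function index gradient along the lens axis; r is the normalized
    distance from the lens center (0) to the edge (1)."""
    return n_center - (n_center - n_edge) * r**p

# Average axial index = integral of n(r) over [0, 1]; for this model it also
# has the closed form n_center - (n_center - n_edge) / (p + 1).
r = np.linspace(0.0, 1.0, 100001)
for p in (4, 6, 8):
    numeric = axial_index(r, p).mean()                 # fine-grid approximation
    analytic = 1.415 - (1.415 - 1.37) / (p + 1)
    print(f"p = {p}: average axial index {numeric:.4f} (analytic {analytic:.4f})")
```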

Relevance: 20.00%

Abstract:

Using a quasi-natural voting experiment encompassing a 160-year period (1848–2009) in Switzerland, we investigate whether a higher level of complexity leads to increased reliance on trusted parliamentary representatives. We find that when more referenda are held on the same day, constituents are more likely to refer to parliamentary recommendations when making their decisions. This finding holds true even when we narrow our focus to referenda with a relatively lower voter turnout on days on which more than one referendum is held. We also demonstrate that when constituents face a higher level of complexity, they follow the parliamentary recommendations rather than those of interest groups. "Viewed as a geometric figure, the ant’s path is irregular, complex, hard to describe. But its complexity is really a complexity in the surface of the beach, not a complexity in the ant." ([1] p. 51)