948 results for Set of dimensions of fractality


Relevance:

100.00%

Publisher:

Abstract:

Abstract: To detect errors in decision tables one needs to decide whether a given set of constraints is feasible or not. This paper describes an algorithm to do so when the constraints are linear in variables that take only integer values. Decision tables with such constraints occur frequently in business data processing and in nonnumeric applications. The aim of the algorithm is to exploit the abundance of very simple constraints that occur in typical decision table contexts. Essentially, the algorithm is a backtrack procedure where the solution space is pruned by using the set of simple constraints. After some simplifications, the simple constraints are captured in an acyclic directed graph with weighted edges. Further, only those partial vectors are considered for extension which can be extended to assignments that will at least satisfy the simple constraints. This is how pruning of the solution space is achieved. For every partial assignment considered, the graph representation of the simple constraints provides a lower bound for each variable which is not yet assigned a value. These lower bounds play a vital role in the algorithm, and they are obtained efficiently by updating older lower bounds. Our present algorithm also incorporates an idea by which it can be checked whether or not an (m - 2)-ary vector can be extended to a solution vector of m components, thereby reducing backtracking by one component.
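The core of the procedure described here is a backtrack search in which the simple constraints supply lower bounds that prune partial assignments. The sketch below is a minimal illustration of that idea only, not the authors' algorithm: the encoding of simple constraints as difference edges, the relaxation loop, and the toy example are assumptions, and the general linear constraints are checked only at complete assignments.

# Hypothetical sketch (not the paper's algorithm): backtracking feasibility
# check for integer variables, pruning with lower bounds propagated from
# "simple" difference constraints x_j >= x_i + w kept as weighted edges.
def propagate_lower_bounds(base, simple_edges):
    # Relax x_j >= x_i + w edges; assumes the edge graph is acyclic, as in the paper.
    lb = list(base)
    changed = True
    while changed:
        changed = False
        for i, j, w in simple_edges:
            if lb[i] + w > lb[j]:
                lb[j] = lb[i] + w
                changed = True
    return lb

def feasible(domains, simple_edges, general, partial=()):
    # domains: per-variable (lo, hi); general: (coeffs, rhs) meaning sum(coeffs*x) <= rhs.
    n, k = len(domains), len(partial)
    if k == n:  # full assignment: check the remaining (non-simple) constraints
        return all(sum(c * x for c, x in zip(coeffs, partial)) <= rhs
                   for coeffs, rhs in general)
    base = [partial[i] if i < k else domains[i][0] for i in range(n)]
    lb = propagate_lower_bounds(base, simple_edges)
    if any(lb[i] > partial[i] for i in range(k)):
        return False  # the partial vector cannot satisfy the simple constraints
    # Extend only with values that at least satisfy the simple constraints.
    for v in range(max(lb[k], domains[k][0]), domains[k][1] + 1):
        if feasible(domains, simple_edges, general, partial + (v,)):
            return True
    return False

# Toy usage: x1 >= x0 + 2 (simple), domains 0..5, and x0 + x1 <= 4 (general).
print(feasible([(0, 5), (0, 5)], [(0, 1, 2)], [((1, 1), 4)]))  # True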

Relevance:

100.00%

Publisher:

Abstract:

The so-called “Scheme of Squares”, displaying an interconnectivity of heterogeneous electron transfer and homogeneous (e.g., proton transfer) reactions, is analysed. Explicit expressions for the various partial currents under potentiostatic conditions are given. The formalism is applicable to several electrode geometries and models (e.g., semi-infinite linear diffusion, rotating disk electrodes, spherical or cylindrical systems) and the analysis is exact. The steady-state (t→∞) expressions for the current are given directly in terms of constant matrices, whereas the transients are obtained as Laplace transforms that need to be inverted by approximate or numerical methods. The methodology employs a systems approach which replaces a system of partial differential equations (governing the concentrations of the several electroactive species) by an equivalent set of difference equations obeyed by the various partial currents.
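For the transient case, where a Laplace-domain expression has to be inverted numerically, one standard option is the Gaver-Stehfest algorithm. The sketch below is a generic illustration of that kind of inversion with an invented test transform; it is not the paper's formalism or its current expressions.

# Hypothetical sketch: Gaver-Stehfest numerical inversion of a Laplace-domain
# function F(s), shown on a transform with a known inverse.
from math import exp, factorial, log

def stehfest_weights(N):
    # Standard Gaver-Stehfest weights; N must be even.
    V = []
    for k in range(1, N + 1):
        total = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            total += (j ** (N // 2) * factorial(2 * j)) / (
                factorial(N // 2 - j) * factorial(j) * factorial(j - 1)
                * factorial(k - j) * factorial(2 * j - k))
        V.append((-1) ** (k + N // 2) * total)
    return V

def invert(F, t, N=12):
    # Approximate f(t) from its Laplace transform F(s).
    V = stehfest_weights(N)
    a = log(2.0) / t
    return a * sum(V[k - 1] * F(k * a) for k in range(1, N + 1))

# Check: F(s) = 1/(s + 1) has inverse f(t) = exp(-t)
print(invert(lambda s: 1.0 / (s + 1.0), 1.0), exp(-1.0))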

Relevance:

100.00%

Publisher:

Abstract:

The aim of this research was to develop a set of reliable, valid preparedness metrics, built around a comprehensive framework for assessing hospital preparedness. This research used a combination of qualitative and quantitative methods, which included interviews and a Delphi study as well as a survey of hospitals in the Sichuan Province of China. The resultant framework is constructed around the stages of disaster management and includes nine key elements. Factor analysis identified four contributing factors. The comparison of hospitals' preparedness using these four factors revealed that tertiary-grade, teaching and general hospitals performed better than secondary-grade, non-teaching and non-general hospitals.

Relevance:

100.00%

Publisher:

Abstract:

Statistical methods are often used to analyse commercial catch and effort data to provide standardised fishing effort and/or a relative index of fish abundance for input into stock assessment models. Achieving reliable results has proved difficult in Australia's Northern Prawn Fishery (NPF), due to a combination of such factors as the biological characteristics of the animals, some aspects of the fleet dynamics, and the changes in fishing technology. For this set of data, we compared four modelling approaches (linear models, mixed models, generalised estimating equations, and generalised linear models) with respect to the outcomes of the standardised fishing effort or the relative index of abundance. We also varied the number and form of vessel covariates in the models. Within a subset of data from this fishery, modelling correlation structures did not alter the conclusions from simpler statistical models. The random-effects models also yielded similar results. This is because the estimators are all consistent even if the correlation structure is mis-specified, and the data set is very large. However, the standard errors from different models differed, suggesting that different methods have different statistical efficiency. We suggest that there is value in modelling the variance function and the correlation structure, to make valid and efficient statistical inferences and gain insight into the data. We found that fishing power was separable from the indices of prawn abundance only when we offset the impact of vessel characteristics at assumed values from external sources. This may be due to the large degree of confounding within the data, and the extreme temporal changes in certain aspects of individual vessels, the fleet and the fleet dynamics.
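As a rough illustration of the kind of comparison described, the sketch below fits a plain linear model, a mixed model with a random vessel effect, and a GEE with exchangeable within-vessel correlation to simulated log-CPUE data. The data, column names (log_cpue, hull_len, vessel) and effect sizes are invented placeholders, not the NPF data or the authors' model specifications.

# Hypothetical sketch comparing three of the modelling approaches named above.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_vessels, n_years = 30, 10
vessel_effect = rng.normal(0.0, 0.2, n_vessels)
rows = []
for v in range(n_vessels):
    hull_len = 15 + 5 * (v % 3)                      # invented vessel covariate
    for year in range(n_years):
        rows.append({"vessel": v, "year": year, "hull_len": hull_len,
                     "log_cpue": 0.05 * year + 0.02 * hull_len
                                 + vessel_effect[v] + rng.normal(0.0, 0.3)})
df = pd.DataFrame(rows)

lm = smf.ols("log_cpue ~ C(year) + hull_len", data=df).fit()         # linear model
mm = smf.mixedlm("log_cpue ~ C(year) + hull_len", data=df,
                 groups="vessel").fit()                              # random vessel effect
gee = smf.gee("log_cpue ~ C(year) + hull_len", groups="vessel", data=df,
              cov_struct=sm.cov_struct.Exchangeable()).fit()         # within-vessel correlation

# Year coefficients (the relative index on the log scale) are typically similar
# across the three fits, while the standard errors differ.
for name, fit in [("ols", lm), ("mixed", mm), ("gee", gee)]:
    print(name, fit.params.filter(like="year").round(3).values[:3],
          fit.bse.filter(like="year").round(3).values[:3])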

Relevance:

100.00%

Publisher:

Abstract:

This dissertation is a theoretical study of finite-state based grammars used in natural language processing. The study is concerned with certain varieties of finite-state intersection grammars (FSIG) whose parsers define regular relations between surface strings and annotated surface strings. The study focuses on the following three aspects of FSIGs. (i) Computational complexity of grammars under limiting parameters: the computational complexity in practical natural language processing is approached through performance-motivated parameters on structural complexity. Each parameter splits some grammars in the Chomsky hierarchy into an infinite set of subset approximations. When the approximations are regular, they seem to fall into the logarithmic-time hierarchy and the dot-depth hierarchy of star-free regular languages. This theoretical result is important and possibly relevant to grammar induction. (ii) Linguistically applicable structural representations: related to the linguistically applicable representations of syntactic entities, the study contains new bracketing schemes that cope with dependency links, left- and right-branching, crossing dependencies and spurious ambiguity. New grammar representations that resemble the Chomsky-Schützenberger representation of context-free languages are presented in the study, and they include, in particular, representations for mildly context-sensitive non-projective dependency grammars whose performance-motivated approximations are parseable in linear time. (iii) Compilation and simplification of linguistic constraints: efficient compilation methods for certain regular operations such as generalized restriction are presented. These include an elegant algorithm that has already been adopted as the approach in a proprietary finite-state tool. In addition to the compilation methods, an approach to on-the-fly simplifications of finite-state representations for parse forests is sketched. These findings are tightly coupled with each other under the theme of locality. I argue that the findings help us to develop better, linguistically oriented formalisms for finite-state parsing and to develop more efficient parsers for natural language processing. Keywords: syntactic parsing, finite-state automata, dependency grammar, first-order logic, linguistic performance, star-free regular approximations, mildly context-sensitive grammars
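The "intersection" idea behind FSIG can be pictured with a toy example: a candidate annotated string counts as a parse only if every constraint automaton accepts it. The automata, alphabet and constraints below are invented for illustration and do not reproduce the grammars or bracketing schemes of the dissertation.

# Hypothetical toy illustration of finite-state intersection: a candidate is
# accepted only if all constraint automata accept it.
class DFA:
    def __init__(self, start, accept, delta):
        self.start, self.accept, self.delta = start, accept, delta
    def accepts(self, symbols):
        state = self.start
        for s in symbols:
            state = self.delta.get((state, s))
            if state is None:          # undefined transition: reject
                return False
        return state in self.accept

# Constraint 1: brackets are balanced (depth at most 1 in this toy alphabet)
balanced = DFA(0, {0}, {(0, "["): 1, (1, "]"): 0,
                        (0, "w"): 0, (1, "w"): 1})
# Constraint 2: at least one word symbol "w" occurs
nonempty = DFA(0, {1}, {(0, "w"): 1, (0, "["): 0, (0, "]"): 0,
                        (1, "w"): 1, (1, "["): 1, (1, "]"): 1})

grammar = [balanced, nonempty]         # FSIG-style: parse = member of the intersection
candidate = list("[ww]")
print(all(c.accepts(candidate) for c in grammar))   # True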

Relevance:

100.00%

Publisher:

Abstract:

In this dissertation, I present an overall methodological framework for studying linguistic alternations, focusing specifically on lexical variation in denoting a single meaning, that is, synonymy. As the practical example, I employ the synonymous set of the four most common Finnish verbs denoting THINK, namely ajatella, miettiä, pohtia and harkita 'think, reflect, ponder, consider'. As a continuation to previous work, I describe in considerable detail the extension of statistical methods from dichotomous linguistic settings (e.g., Gries 2003; Bresnan et al. 2007) to polytomous ones, that is, concerning more than two possible alternative outcomes. The applied statistical methods are arranged into a succession of stages with increasing complexity, proceeding from univariate via bivariate to multivariate techniques. As the central multivariate method, I argue for the use of polytomous logistic regression and demonstrate its practical implementation to the studied phenomenon, thus extending the work by Bresnan et al. (2007), who applied simple (binary) logistic regression to a dichotomous structural alternation in English. The results of the various statistical analyses confirm that a wide range of contextual features across different categories are indeed associated with the use and selection of the selected THINK lexemes; however, a substantial part of these features are not exemplified in current Finnish lexicographical descriptions. The multivariate analysis results indicate that the semantic classifications of syntactic argument types are on the average the most distinctive feature category, followed by overall semantic characterizations of the verb chains, and then syntactic argument types alone, with morphological features pertaining to the verb chain and extra-linguistic features relegated to the last position. In terms of overall performance of the multivariate analysis and modeling, the prediction accuracy seems to reach a ceiling at a Recall rate of roughly two-thirds of the sentences in the research corpus. The analysis of these results suggests a limit to what can be explained and determined within the immediate sentential context and applying the conventional descriptive and analytical apparatus based on currently available linguistic theories and models. The results also support Bresnan's (2007) and others' (e.g., Bod et al. 2003) probabilistic view of the relationship between linguistic usage and the underlying linguistic system, in which only a minority of linguistic choices are categorical, given the known context – represented as a feature cluster – that can be analytically grasped and identified. Instead, most contexts exhibit degrees of variation as to their outcomes, resulting in proportionate choices over longer stretches of usage in texts or speech.
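A minimal sketch of the central multivariate step, polytomous (multinomial) logistic regression over the four lexemes, is given below. The features and data are simulated stand-ins for the corpus features; this is not the author's model, feature set or results.

# Hypothetical sketch: multinomial logistic regression predicting which of the
# four synonymous lexemes is chosen from (invented) contextual features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 400
X = rng.normal(size=(n, 3))            # three invented contextual features
lexemes = np.array(["ajatella", "miettiä", "pohtia", "harkita"])
# Tie the outcome loosely to the features so the model has structure to learn.
y = lexemes[np.argmax(X @ rng.normal(size=(3, 4)) + rng.gumbel(size=(n, 4)), axis=1)]

model = LogisticRegression(max_iter=1000)          # multinomial for >2 classes
model.fit(X, y)
print(model.predict_proba(X[:1]).round(3))         # per-lexeme choice probabilities
print(cross_val_score(model, X, y, cv=5).mean())   # mean accuracy; cf. the ceiling noted above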

Relevance:

100.00%

Publisher:

Abstract:

The properties of the manifold of a Lie group G, fibered by the cosets of a subgroup H, are exploited to obtain a geometrical description of gauge theories in space-time G/H. Gauge potentials and matter fields are pullbacks of equivariant fields on G. Our concept of a connection is more restricted than that in the similar scheme of Ne'eman and Regge, so that its degrees of freedom are just those of a set of gauge potentials for G, on G/H, with no redundant components. The “translational” gauge potentials give rise in a natural way to a nonsingular tetrad on G/H. The underlying group G to be gauged is the group of left translations on the manifold G and is associated with a “trivial” connection, namely the Maurer-Cartan form. Gauge transformations are all those diffeomorphisms on G that preserve the fiber-bundle structure.
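For reference, the Maurer-Cartan form mentioned here as the “trivial” connection is, in one conventional notation (a standard identity, not necessarily the paper's own notation),

\theta = g^{-1}\,\mathrm{d}g, \qquad \mathrm{d}\theta + \tfrac{1}{2}\,[\theta, \theta] = 0,

i.e. the left-invariant, Lie-algebra-valued one-form on G, whose structure equation expresses that it is flat (pure gauge).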

Relevance:

100.00%

Publisher:

Abstract:

This dissertation is a synchronic description of the phonology and grammar of two dialects of the Rajbanshi language (Eastern Indo-Aryan) as spoken in Jhapa, Nepal. I have primarily confined the analysis to the oral expression, since the emerging literary form is still in its infancy. The grammatical analysis is therefore based, for the most part, on a corpus of oral narrative text which was recorded and transcribed from three informants from north-east Jhapa. An informant speaking a dialect from south-west Jhapa cross-checked this text corpus and provided additional elicited material. I have described the phonology, morphology and syntax of the language, and also one aspect of its discourse structure. For the most part the phonology follows the basic Indo-Aryan pattern. Derivational morphology, compounding, reduplication, echo formation and onomatopoeic constructions are considered, as well as number, noun classes (their assignment and grammatical function), pronouns, and case and postpositions. In verbal morphology I cover causative stems, the copula, primary and secondary agreement, tense, aspect, mood, auxiliary constructions and non-finite forms. The term secondary agreement here refers to genitive agreement, dative-subject agreement and patient (and sometimes patient-agent) agreement. The breaking of default agreement rules gives rise to a range of pragmatic inferences. I argue that a distinction, based on formal, semantic and statistical grounds, should be made between conjunct verbs, derivational compound verbs and quasi-aspectual compound verbs. Rajbanshi has an open set of adjectives, and it additionally makes use of a restricted set of nouns which can function as adjectives. Various particles, and the emphatic and conjunctive clitics are also considered. The syntactic structures studied include: non-declarative speech acts, phrase-internal and clause-internal constituent order, negation, subordination, coordination and valence adjustment. I explain how the future, present and past tenses in Rajbanshi oral narratives do not seem to maintain a time reference, but rather to indicate a distinction between background and foreground information. I call this "tense neutralisation".

Relevance:

100.00%

Publisher:

Abstract:

This study examines strategies used to translate various thematic and character-delineating allusions in two of Reginald Hill's detective novels, The Wood Beyond and On Beulah Height, and their Swedish translations Det mörka arvet and Dalen som dränktes. In this study, thematic allusions and allusions used in character delineation are regarded as intertextual networks. Intertextual networks comprise all the texts that are in one way or another embedded into a text, all the texts referred to in it, and even the texts somehow rejected from a text's own canon. Studying allusions as intertextual networks makes it warranted to pay minute attention to even the smallest of details. Seen together, these little details form extensive networks of meaning that readers use to interpret the text. Allusion can be defined as a reference, often covert or indirect, to another text in a way that brings into the text some of the associations of that other text. A text is here understood broadly, hence sources of allusions include all cultural texts from literature and history to cinema and television serials. Allusions are culture-bound and each culture tends to allude to its own cultural products. The set of transcultural allusions is therefore fairly small. Translation strategies are translatorial ways of solving translation problems. Being culture-bound, allusions are potential translation problems. In order to transmit to target-text readers the thoughts that the allusions evoke in source-text readers, translators may add guidance to the translated text. Often guidance is not added, which may result in the handling of themes or character delineation, clear in the source text, becoming confusing or incomprehensible in the target text. However, norms in the target culture may not always allow translators to make the text comprehensible. My analyses of translation strategies show that minimum change is a very frequently used strategy in the two translated novels studied. This results in themes and character delineation losing some of the effect they have in the source texts. Perhaps surprisingly, the result is very much the same even where it is possible to discern that the two translators have had differing translation principles. Keywords: allusions, intertextuality, literary translation, translation strategies, norms, crime fiction, Hill, Reginald

Relevance:

100.00%

Publisher:

Abstract:

This study sets out to provide new information about the interaction between abstract religious ideas and actual acts of violence in the early crusading movement. The sources are asked whether such a concept as religious violence can be sorted out as an independent or distinguishable source of aggression at the moment of actual bloodshed. The analysis concentrates on the practitioners of sacred violence, the crusaders, and their mental processing of the use of violence, the concept of the violent act, and the set of values and attitudes defining this concept. The scope of the study, the early crusade movement, covers the period from the late 1080s to the crusader conquest of Jerusalem on 15 July 1099. The research has been carried out by contextual reading of relevant sources. Eyewitness reports are compared with texts that were produced by ecclesiastics in Europe. Critical reading of the texts reveals both connecting ideas and interesting differences between them. The sources share a positive attitude towards crusading, and have principally been written to propagate the crusade institution and find new recruits. The emphasis of the study is on the interpretation of images: the sources are not asked what really happened in chronological order, but what the crusader understanding of the reality was like. Fictional material can be even more crucial for the understanding of the crusading mentality. Crusader sources from around the turn of the twelfth century accept violent encounters with non-Christians on the grounds of external hostility directed towards the Christian community. The enemies of Christendom can be identified with non-Christians living outside the Christian society (Muslims), non-Christians living within the Christian society (Jews), or Christian heretics. Western Christians are described as both victims and avengers of the surrounding forces of diabolical evil. Although the ideal of universal Christianity and gradual eradication of the non-Christian is present, the practical means of achieving a united Christendom are not discussed. The objective of crusader violence was thus entirely Christian: the punishment of the wicked and the restoration of Christian morals and the divine order. Meanwhile, the means used to achieve these objectives were not. Given the scarcity of written regulations concerning the use of force in bello, perceptions concerning the practical use of violence were drawn from a multitude of notions comprising an adaptable network of secular and ecclesiastical, pre-Christian and Christian traditions. Though essentially ideological and often religious in character, the early crusader concept of the practice of violence was not exclusively rooted in Christian thought. The main conclusion of the study is that there existed a definable crusader ideology of the use of force by 1100. The crusader image of violence involved several levels of thought. Predominantly, violence indicates a means of achieving higher spiritual rewards: eternal salvation and immortal glory.

Relevance:

100.00%

Publisher:

Abstract:

Fluidised bed-heat pump drying technology offers distinctive advantages over the existing drying technology employed in the Australian food industry. However, as is the case with many other examples of innovations that have had clear relative advantages, the rates of adoption and diffusion of this technology have been very slow. "Why does this happen?" is the theme of this research study, which has been undertaken with the objective of analysing a range of issues related to the market acceptance of technological innovations. The research methodology included the development of an integrated conceptual model based on an extensive review of literature in the areas of innovation diffusion, technology transfer and industrial marketing. Three major determinants associated with the market acceptance of innovations were identified: the characteristics of the innovation, adopter information-processing capability, and the influence of the innovation supplier on the adoption process. This was followed by a study involving more than 30 small and medium enterprises identified as potential adopters of fluidised bed-heat pump drying technology in the Australian food industry. The findings revealed that judgment was the key evaluation strategy employed by potential adopters in the particular industry sector. Further, it was evident that the innovations were evaluated against predetermined criteria covering a range of aspects, with emphasis on a selected set of attributes of the innovation. The implications of these findings for the commercialisation of fluidised bed-heat pump drying technology were established, and a series of recommendations was made to the innovation supplier (DPI/FT) to enable it to develop an effective commercialisation strategy.

Relevance:

100.00%

Publisher:

Abstract:

An on-line algorithm is developed for the location of single cross-point faults in a PLA (FPLA). The main feature of the algorithm is the determination of a fault set corresponding to the response obtained for a failed test. For the apparently small number of faults in this set, all other tests are generated and a fault table is formed. Subsequently, an adaptive procedure is used to diagnose the fault. A functional equivalence test is carried out to determine the actual fault class if the adaptive testing results in a set of faults with identical tests. The large amounts of computation time and storage required to determine, a priori, all the fault equivalence classes or to construct a fault dictionary are not needed here. A brief study of functional equivalence among the cross-point faults is also made.
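The adaptive stage described above can be pictured with the toy sketch below: a fault table over the candidate cross-point faults is repeatedly split by an informative remaining test until one candidate (or a set with identical test behaviour) is left. The fault names, tests and observed responses are invented; this is an illustration of the idea, not the paper's procedure in detail.

# Hypothetical sketch of adaptive diagnosis over a small fault table.
def diagnose(fault_table, apply_test):
    # fault_table: {fault: {test: expected_fail (bool)}}; apply_test(t) -> observed fail.
    candidates = set(fault_table)
    tests = {t for row in fault_table.values() for t in row}
    while len(candidates) > 1 and tests:
        # Pick the test whose expected outcomes split the remaining candidates most evenly.
        t = min(tests, key=lambda t: abs(
            sum(fault_table[f][t] for f in candidates) - len(candidates) / 2))
        failed = apply_test(t)
        candidates = {f for f in candidates if fault_table[f][t] == failed}
        tests.discard(t)
    # If several candidates with identical rows remain, a functional-equivalence
    # check (not shown) would decide the actual fault class.
    return candidates

table = {"growth@(1,2)": {"t1": True,  "t2": False},
         "shrink@(2,1)": {"t1": True,  "t2": True},
         "appear@(3,3)": {"t1": False, "t2": True}}
print(diagnose(table, apply_test=lambda t: {"t1": True, "t2": True}[t]))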

Relevance:

100.00%

Publisher:

Abstract:

The widespread and increasing resistance of internal parasites to anthelmintic control is a serious problem for the Australian sheep and wool industry. As part of control programmes, laboratories use the Faecal Egg Count Reduction Test (FECRT) to determine resistance to anthelmintics. It is important to have confidence in the measure of resistance, not only for the producer planning a drenching programme but also for companies investigating the efficacy of their products. The determination of resistance and corresponding confidence limits as given in anthelmintic efficacy guidelines of the Standing Committee on Agriculture (SCA) is based on a number of assumptions. This study evaluated the appropriateness of these assumptions for typical data and compared the effectiveness of the standard FECRT procedure with the effectiveness of alternative procedures. Several sets of historical experimental data from sheep and goats were analysed; these showed that a negative binomial distribution describes pre-treatment helminth egg counts in faeces more appropriately than a normal distribution. Simulated egg counts for control animals were generated stochastically from negative binomial distributions and those for treated animals from negative binomial and binomial distributions. Three methods for determining resistance when percent reduction is based on arithmetic means were applied. The first was that advocated in the SCA guidelines; the second was similar to the first but based the variance estimates on negative binomial distributions; and the third used Wadley’s method, with the distribution of the response variate assumed to be negative binomial and a logit link transformation. These were also compared with a fourth method recommended by the International Co-operation on Harmonisation of Technical Requirements for Registration of Veterinary Medicinal Products (VICH) programme, in which percent reduction is based on the geometric means. A wide selection of parameters was investigated and for each set 1000 simulations were run. Percent reduction and confidence limits were then calculated for the methods, together with the number of times in each set of 1000 simulations the theoretical percent reduction fell within the estimated confidence limits and the number of times resistance would have been said to occur. These simulations provide the basis for setting conditions under which the methods could be recommended. The authors show that, given the distribution of helminth egg counts found in Queensland flocks, the method based on arithmetic, not geometric, means should be used and suggest that resistance be redefined as occurring when the upper level of percent reduction is less than 95%. At least ten animals per group are required in most circumstances, though even 20 may be insufficient where effectiveness of the product is close to the cut-off point for defining resistance.
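One simulation step of the kind described can be sketched as follows: egg counts are drawn from negative binomial distributions, percent reduction is computed from arithmetic means, and a confidence interval is attached (here with a simple bootstrap rather than the SCA or VICH variance formulae). All parameter values are invented for illustration.

# Hypothetical sketch of a single FECRT-style simulation with arithmetic means.
import numpy as np

rng = np.random.default_rng(1)
n, k, mean_epg, efficacy = 10, 0.7, 500, 0.90   # k = negative binomial dispersion (invented)

control = rng.negative_binomial(k, k / (k + mean_epg), size=n)
treated = rng.negative_binomial(k, k / (k + mean_epg * (1 - efficacy)), size=n)

def reduction(c, t):
    # Percent reduction based on arithmetic means.
    return 100.0 * (1.0 - t.mean() / c.mean())

boot = [reduction(rng.choice(control, n), rng.choice(treated, n))
        for _ in range(2000)]
lower, upper = np.percentile(boot, [2.5, 97.5])
print(f"reduction {reduction(control, treated):.1f}%, 95% CI ({lower:.1f}, {upper:.1f})")
# Under the redefinition suggested above, resistance would be declared when the
# upper limit of percent reduction is below 95%.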

Relevance:

100.00%

Publisher:

Abstract:

This paper presents the validation of a manoeuvring model for a novel 127 m vehicle-passenger trimaran via full-scale trials. The adopted structure of the model is based on a model previously proposed in the literature, with some simplifications. The structure of the model is discussed. Then initial parameter estimates are computed, and the final set of parameters is obtained via adjustments based on engineering judgement and application of a genetic algorithm so as to match the data of the trials. The validity of the model is also assessed with data from a trial different from the one used for the parameter adjustment. The model shows good agreement with the trial data.
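The final adjustment step can be illustrated with a generic sketch: an evolutionary optimiser (here SciPy's differential evolution, standing in for the genetic algorithm) tunes a model coefficient so that simulated output matches trial data. The one-parameter "model" and the synthetic trial record are placeholders, not the trimaran model or the full-scale trial data.

# Hypothetical sketch: tuning a model coefficient to match trial data.
import numpy as np
from scipy.optimize import differential_evolution

t = np.linspace(0.0, 50.0, 200)
true_gain = 0.8
trial_yaw_rate = (true_gain * (1.0 - np.exp(-t / 10.0))
                  + np.random.default_rng(2).normal(0.0, 0.01, t.size))

def simulate(params):
    gain, = params
    return gain * (1.0 - np.exp(-t / 10.0))   # stand-in for the manoeuvring model

def misfit(params):
    # Mean squared error between simulated and recorded response.
    return np.mean((simulate(params) - trial_yaw_rate) ** 2)

result = differential_evolution(misfit, bounds=[(0.1, 2.0)], seed=0)
print(result.x, result.fun)   # recovered gain should be close to 0.8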

Relevance:

100.00%

Publisher:

Abstract:

Background: Prescribing is a complex task, requiring specific knowledge and skills, and the execution of effective, context-specific clinical reasoning. Systematic reviews indicate medical prescribing errors have a median rate of 7% [IQR 2%-14%] of medication orders [1-3]. For podiatrists pursuing prescribing rights, a clear need exists to ensure practitioners develop a well-defined set of prescribing skills, which will contribute to competent, safe and appropriate practice. Aim: To investigate the methods employed to teach and assess the principles of effective prescribing in the undergraduate podiatry program, and to compare and contrast these findings with those of four other non-medical professions that undertake prescribing after training at Queensland University of Technology. Method: The NPS National Prescribing Competency Standards were employed as the prescribing standard. A curriculum mapping exercise was undertaken to determine whether the prescribing principles articulated in the competency standards were addressed by each profession. Results: A range of methods are currently utilised to teach prescribing across disciplines. Application of prescribing competencies to the context of each profession appears to influence the teaching methods used. Most competencies were taught using a multimodal format, including interactive lectures, self-directed learning, tutorial sessions and clinical placement. In particular, clinical training was identified as the most consistent form of educating safe prescribers across all five disciplines. Assessment of prescribing competency utilised multiple techniques, including written and oral examinations, research tasks, case studies, objective structured clinical examination exercises and the assessment of clinical practice. Effective and reliable assessment of prescribing undertaken by students in diverse settings, e.g. in the clinical practice environment, remains challenging. Conclusion: Recommendations were made to refine curricula and to promote efficient cross-discipline teaching by staff from the disciplines of podiatry, pharmacy, nurse practitioner, optometry and paramedic science. Students now experience a sophisticated level of multidisciplinary learning in the clinical setting, which integrates the expertise and skills of experienced prescribers with innovative information technology platforms (CCTV and live patient assessments). Further work is required to establish a practical, effective approach to the assessment of prescribing competence, especially between the university and clinical settings.