25 results for CHEVERUDS CONJECTURE


Relevance: 10.00%

Abstract:

This paper explores a new interpretation of experiments on foil rolling. The assumption that the roll remains convex is relaxed so that the strip profile may become concave, or thicken in the roll gap. However, we conjecture that the concave profile is associated with phenomena which occur after the rolls have stopped. We argue that the yield criterion must be satisfied in a nonconventional manner if such a phenomenon is caused plastically. Finite element analysis on an extrusion problem appears to confirm this conjecture.

Relevance: 10.00%

Abstract:

The results of a numerical investigation into the errors for least squares estimates of function gradients are presented. The underlying algorithm is obtained by constructing a least squares problem using a truncated Taylor expansion. An error bound associated with this method contains in its numerator terms related to the Taylor series remainder, while its denominator contains the smallest singular value of the least squares matrix. Perhaps for this reason, the error bounds are often found to be pessimistic by several orders of magnitude. The circumstances under which these poor estimates arise are elucidated, and an empirical correction of the theoretical error bounds is conjectured and investigated numerically. This is followed by an indication of how the conjecture is supported by a rigorous argument.
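
As a purely illustrative sketch (not the authors' algorithm or code), the snippet below builds a least squares gradient estimate from a first-order truncated Taylor expansion and reports the smallest singular value of the least squares matrix, the quantity that appears in the denominator of the error bound; the function and variable names, sample points and test function are all invented for this example.

```python
import numpy as np

def ls_gradient(x0, points, values, f0):
    """Estimate grad f(x0) by least squares on a truncated (first-order) Taylor expansion.

    Solves min_g ||A g - b||_2 with rows A_i = x_i - x0 and b_i = f(x_i) - f(x0).
    The smallest singular value of A is returned because it sits in the
    denominator of the error bound discussed in the abstract.
    """
    A = np.asarray(points) - np.asarray(x0)        # displacement (least squares) matrix
    b = np.asarray(values) - f0                    # differences in function values
    g, *_ = np.linalg.lstsq(A, b, rcond=None)      # least squares gradient estimate
    sigma_min = np.linalg.svd(A, compute_uv=False)[-1]
    return g, sigma_min

# Toy check: f(x, y) = x**2 + 3y has gradient (2, 3) at the point (1, 2).
rng = np.random.default_rng(0)
x0 = np.array([1.0, 2.0])
pts = x0 + 0.01 * rng.standard_normal((20, 2))     # sample points near x0
f = lambda p: p[0] ** 2 + 3.0 * p[1]
vals = np.array([f(p) for p in pts])
grad, sigma_min = ls_gradient(x0, pts, vals, f(x0))
print(grad, sigma_min)   # gradient is close to [2, 3]; a tiny sigma_min inflates the bound
```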

Relevance: 10.00%

Abstract:

The Intention to Notice: the collection, the tour and ordinary landscapes is concerned with how ordinary landscapes and places are enabled and conserved through making itineraries that are framed around the ephemera encountered by chance, and the practices that make possible the endurance of these material traces. Through observing and then examining the material and temporal aspects of a variety of sites/places, the museum and the expanded garden are identified as spaces where the expression of contemporary political, ecological and social attitudes to cultural landscapes can be realised through a curatorial approach to design, to effect minimal intervention. Three notions are proposed to encourage investigation into contemporary cultural landscapes: to traverse slowly, allowing space for speculations framed by the topographies and artefacts encountered; to [re]make/[re]write cultural landscapes as discursive landscapes that provoke the intention to notice; and to reveal and conserve the fabric of everyday places. A series of walking, recording and making projects undertaken across a variety of cultural landscapes in remote South Australia, Melbourne, Sydney, London, Los Angeles, Chandigarh, Padova and Istanbul investigates how communities of practice are facilitated through the invitation to notice and intervene in ordinary landscapes, informed by the theory and practice of postproduction and the reticent auteur. This community-of-practice approach draws upon chance encounters and seeks to encourage creative investigation into places. The Intention to Notice is a practice of facilitation that also leads to the recording of traces and events, large and small, material and immaterial, and that encourages both conjecture and archive. Most importantly, there is an open-ended invitation to commit and exchange through design interaction.

Relevance: 10.00%

Abstract:

In the current thesis, the reasons for the differential impact of Holocaust trauma on Holocaust survivors, and for the differential intergenerational transmission of this trauma to survivors’ children and grandchildren, were explored. A model specifically related to Holocaust trauma and its transmission was developed based on trauma, family systems and attachment theories as well as theoretical and anecdotal conjecture in the Holocaust literature. The Model of the Differential Impact of Holocaust Trauma across Three Generations was tested firstly by extensive meta-analyses of the literature pertaining to the psychological health of Holocaust survivors and their descendants and secondly via analysis of empirical study data. The meta-analyses reported in this thesis represent the first conducted with research pertaining to Holocaust survivors and grandchildren of Holocaust survivors. The meta-analysis of research conducted with children of survivors is the first to include both published and unpublished research. Meta-analytic techniques such as meta-regression and sub-set meta-analyses provided new information regarding the influence of a number of unmeasured demographic variables on the psychological health of Holocaust survivors and descendants. Based on the results of the meta-analyses, it was concluded that Holocaust survivors and their children and grandchildren suffer from a statistically significantly higher level or greater severity of psychological symptoms than the general population. However, it was also concluded that there is statistically significant variation in psychological health within the Holocaust survivor and descendant populations. Demographic variables which may explain a substantial amount of this variation have been largely under-assessed in the literature, and so an empirical study was needed to clarify the role of demographics in determining survivor and descendant mental health. A total of 124 participants took part in the empirical study conducted for this thesis: 27 Holocaust survivors, 69 children of survivors and 28 grandchildren of survivors. A worldwide recruitment process was used to obtain these participants. Among the demographic variables assessed in the empirical study, aspects of the survivors’ Holocaust trauma (namely the exact nature of their Holocaust experiences, the extent of family bereavement and their country of origin) were found to be particularly potent predictors not only of the survivors’ own psychological health but also of the psychological health of their descendants. Further highlighting the continuing influence of the Holocaust was the finding that the number of Holocaust-affected ancestors was the strongest demographic predictor of the psychological health of grandchildren of survivors. Apart from demographic variables, the current thesis considered family environment dimensions which have been hypothesised to play a role in the transmission of the traumatic impact of the Holocaust from survivors to their descendants. Within the empirical study, parent-child attachment was found to be a key determinant in the transmission of Holocaust trauma from survivors to their children, and insecure parent-child attachment continues to reverberate through the generations. In addition, survivors’ communication about the Holocaust and their Holocaust experiences to their children was found to be more influential than general communication within the family.
Ten case studies (derived from the empirical study data set) are also provided: five Holocaust survivors, three children of survivors and two grandchildren of survivors. These cases add further to the picture of heterogeneity of the survivor and descendant populations in both experiences and adaptations. It is concluded that the legacy of the Holocaust continues to leave its mark on both its direct survivors and their descendants. Even two generations removed, the direct and indirect effects of the Holocaust have yet to be completely nullified. Research with Holocaust survivor families serves to highlight the differential impacts of state-based trauma and the ways in which its effects continue to be felt for generations. The revised and empirically tested Model of the Differential Impact of Holocaust Trauma across Three Generations presented at the conclusion of this thesis represents a further clarification of existing trauma theories as well as the first attempt at determining the relative importance of cognitive and interpersonal/interfamilial interaction processes and demographic variables in post-trauma psychological health and the transmission of traumatic impact.
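
For readers unfamiliar with the pooling step behind such meta-analyses, the sketch below shows a standard DerSimonian-Laird random-effects computation; it is illustrative only, the effect sizes and variances are invented, and it is not the thesis's data or analysis code.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooling of study effect sizes (DerSimonian-Laird).

    Returns the pooled effect, its standard error and the between-study
    variance tau^2, given per-study effect sizes and sampling variances.
    """
    y = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v                                   # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fixed) ** 2)            # heterogeneity statistic
    df = len(y) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / c)                 # between-study variance
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    pooled = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, se, tau2

# Hypothetical standardised mean differences across five studies.
effects = [0.45, 0.30, 0.62, 0.10, 0.51]
variances = [0.02, 0.05, 0.04, 0.03, 0.06]
print(dersimonian_laird(effects, variances))
```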

Relevance: 10.00%

Abstract:

There has been much conjecture of late as to whether the patentable subject matter standard contains a physicality requirement. The issue came to a head when the Federal Circuit introduced the machine-or-transformation test in In re Bilski and declared it to be the sole test for determining subject matter eligibility. Many commentators criticized the test, arguing that it is inconsistent with Supreme Court precedent and the need for the patent system to respond appropriately to all new and useful innovation in whatever form it arises. Those criticisms were vindicated when, on appeal, the Supreme Court in Bilski v. Kappos dispensed with any suggestion that the patentable subject matter test involves a physicality requirement. In this article, the issue is addressed from a normative perspective: it asks whether the patentable subject matter test should contain a physicality requirement. The conclusion reached is that it should not, because such a limitation is not an appropriate means of encouraging much of the valuable innovation we are likely to witness during the Information Age. It is contended that it is not only traditionally-recognized mechanical, chemical and industrial manufacturing processes that are patent eligible, but that patent eligibility extends to include non-machine implemented and non-physical methods that do not have any connection with a physical device and do not cause a physical transformation of matter. Concerns raised that there is a trend of overreaching commoditization or propertization, where the boundaries of patent law have been expanded too far, are unfounded since the strictures of novelty, nonobviousness and sufficiency of description will exclude undeserving subject matter from patentability. The argument made is that introducing a physicality requirement will have unintended adverse effects in various fields of technology, particularly those emerging technologies that are likely to have a profound social effect in the future.

Relevance: 10.00%

Abstract:

In Australia, the extent of a mortgagee’s duty when exercising power of sale has long been the subject of conjecture. With the advent of the global financial crisis in the latter part of 2008, there has been some concern to ensure that the interests of mortgagors are adequately protected. In Queensland, concern of this type resulted in the enactment of the Property Law (Mortgagor Protection) Amendment Act 2008 (Qld). This amending legislation operates to both extend and strengthen the operation of s 85 of the Property Law Act 1974 (Qld), which regulates the mortgagee’s power of sale in Queensland. This article examines the impact of this amending legislation, which was hastily introduced and passed by the Queensland Parliament without consultation, and which introduces a level of prescription, in relation to a sale under a prescribed mortgage, that is without precedent elsewhere in Australia.

Relevance: 10.00%

Abstract:

We present new expected risk bounds for binary and multiclass prediction, and resolve several recent conjectures on sample compressibility due to Kuzmin and Warmuth. By exploiting the combinatorial structure of concept class F, Haussler et al. achieved a VC(F)/n bound for the natural one-inclusion prediction strategy. The key step in their proof is a d = VC(F) bound on the graph density of a subgraph of the hypercube, the one-inclusion graph. The first main result of this paper is a density bound of n·choose(n−1, ≤d−1)/choose(n, ≤d) < d, which positively resolves a conjecture of Kuzmin and Warmuth relating to their unlabeled Peeling compression scheme and also leads to an improved one-inclusion mistake bound. The proof uses a new form of VC-invariant shifting and a group-theoretic symmetrization. Our second main result is an algebraic topological property of maximum classes of VC-dimension d as being d-contractible simplicial complexes, extending the well-known characterization that d = 1 maximum classes are trees. We negatively resolve a minimum degree conjecture of Kuzmin and Warmuth (the second part of a conjectured proof of correctness for Peeling) that every class has one-inclusion minimum degree at most its VC-dimension. Our final main result is a k-class analogue of the d/n mistake bound, replacing the VC-dimension by the Pollard pseudo-dimension and the one-inclusion strategy by its natural hypergraph generalization. This result improves on known PAC-based expected risk bounds by a factor of O(log n) and is shown to be optimal up to an O(log k) factor. The combinatorial technique of shifting takes a central role in understanding the one-inclusion (hyper)graph and is a running theme throughout.
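
The density bound is easy to check numerically. The following sketch (an illustration only, not part of the proof) evaluates n·choose(n−1, ≤d−1)/choose(n, ≤d), where choose(m, ≤k) denotes the sum of the binomial coefficients choose(m, i) for i = 0, ..., k, and confirms that it stays strictly below d on a small grid of n and d.

```python
from math import comb

def choose_leq(n, d):
    """Sum of binomial coefficients C(n, 0) + ... + C(n, d)."""
    return sum(comb(n, i) for i in range(d + 1))

def density_bound(n, d):
    """The quantity n * choose(n-1, <= d-1) / choose(n, <= d) from the abstract."""
    return n * choose_leq(n - 1, d - 1) / choose_leq(n, d)

# Spot-check that the bound is strictly below d for a range of n and d.
for d in range(1, 8):
    for n in range(d + 1, 40):
        assert density_bound(n, d) < d

print(density_bound(20, 3))   # about 2.83, comfortably below d = 3
```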

Relevance: 10.00%

Abstract:

The Early Years Generalising Project involves Australian students in Years 1-4 (ages 5-9) and explores how the students grasp and express generalisations. This paper focuses on data collected from clinical interviews with the Year 3 and 4 cohorts in an investigative study of the identification, prediction and justification of function rules. It reports on students' attempts to generalise from function machine contexts, describing the various ways students express generalisation and highlighting the different levels of justification given by students. Finally, we conjecture that there is a set of stages in the expression and justification of generalisations that assists students to reach generality within tasks.

Relevance: 10.00%

Abstract:

The head direction (HD) system in mammals contains neurons that fire to represent the direction the animal is facing in its environment. The ability of these cells to reliably track head direction even after the removal of external sensory cues implies that the HD system is calibrated to function effectively using just internal (proprioceptive and vestibular) inputs. Rat pups and other infant mammals display stereotypical warm-up movements prior to locomotion in novel environments, and similar warm-up movements are seen in adult mammals with certain brain lesion-induced motor impairments. In this study we propose that synaptic learning mechanisms, in conjunction with appropriate movement strategies based on warm-up movements, can calibrate the HD system so that it functions effectively even in darkness. To examine the link between physical embodiment and neural control, and to determine that the system is robust to real-world phenomena, we implemented the synaptic mechanisms in a spiking neural network and tested it on a mobile robot platform. Results show that the combination of the synaptic learning mechanisms and warm-up movements is able to reliably calibrate the HD system so that it accurately tracks real-world head direction, and that calibration breaks down in systematic ways if certain movements are omitted. This work confirms that targeted, embodied behaviour can be used to calibrate neural systems, demonstrates that ‘grounding’ of modeled biological processes in the real world can reveal underlying functional principles (supporting the importance of robotics to biology), and proposes a functional role for stereotypical behaviours seen in infant mammals and those animals with certain motor deficits. We conjecture that these calibration principles may extend to the calibration of other neural systems involved in motion tracking and the representation of space, such as grid cells in entorhinal cortex.
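
The study itself uses a spiking network running on a robot; the sketch below is a drastically simplified illustration of the calibration principle, reduced here, by assumption, to learning a single velocity gain from visually grounded warm-up rotations with a delta rule. All names and numbers are invented and none of this is the authors' code.

```python
def calibrate_gain(visual_velocity, vestibular_signal, steps=500, lr=0.05, gain=0.5):
    """Delta-rule calibration of the gain mapping a vestibular (self-motion)
    signal onto head-direction updates.

    During stereotyped 'warm-up' rotations a visual heading signal is still
    available, so its rate of change provides a teaching signal for the
    internally generated velocity estimate (gain * vestibular_signal).
    """
    for _ in range(steps):
        error = visual_velocity - gain * vestibular_signal
        gain += lr * error * vestibular_signal      # LMS / delta-rule update
    return gain

# Suppose the raw vestibular signal under-reports a 1.0 rad/s rotation as 0.5:
gain = calibrate_gain(visual_velocity=1.0, vestibular_signal=0.5)
print(round(gain, 3))          # converges towards 2.0

# After calibration, the HD estimate can be maintained in darkness by pure
# integration of the vestibular signal with the learned gain:
dt, heading = 0.01, 0.0
for _ in range(1000):          # 10 s of rotation at 1.0 rad/s
    heading += gain * 0.5 * dt
print(round(heading, 2))       # close to the true 10.0 rad of accumulated rotation
```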

Relevance: 10.00%

Abstract:

Spatial navigation requires the processing of complex, disparate and often ambiguous sensory data. The neurocomputations underpinning this vital ability remain poorly understood. Controversy remains as to whether multimodal sensory information must be combined into a unified representation, consistent with Tolman's "cognitive map", or whether differential activation of independent navigation modules suffices to explain observed navigation behaviour. Here we demonstrate that key neural correlates of spatial navigation in darkness cannot be explained if the path integration system acts independently of boundary (landmark) information. In vivo recordings demonstrate that the rodent head direction (HD) system becomes unstable within three minutes without vision. In contrast, rodents maintain stable place fields and grid fields for over half an hour without vision. Using a simple HD error model, we show analytically that idiothetic path integration (iPI) alone cannot be used to maintain any stable place representation beyond two to three minutes. We then use a measure of place stability based on information theoretic principles to prove that featureless boundaries alone cannot be used to improve localization above chance level. Having shown that neither iPI nor boundaries alone are sufficient, we then address the question of whether their combination is sufficient and, we conjecture, necessary to maintain place stability for prolonged periods without vision. We address this question in simulations and robot experiments using a navigation model comprising a particle filter and a boundary map. The model replicates published experimental results on place field and grid field stability without vision, and makes testable predictions including place field splitting and grid field rescaling if the true arena geometry differs from the acquired boundary map. We discuss our findings in light of current theories of animal navigation and neuronal computation, and elaborate on their implications and significance for the design, analysis and interpretation of experiments.
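
The authors' model couples a particle filter with an acquired 2-D boundary map; the sketch below is a deliberately minimal 1-D analogue (invented corridor, noise levels and variable names) showing how noisy path-integration updates can be fused with a featureless boundary observation, the distance to the nearest wall, to keep a position estimate stable.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D corridor of length L; the only "landmark" cue is a noisy distance
# to the nearest boundary (a featureless wall).
L, N, STEPS = 10.0, 500, 100
ODOM_NOISE, SENSE_NOISE = 0.05, 0.2

true_x = 2.0
particles = rng.uniform(0.0, L, N)          # uniform prior over the corridor
weights = np.full(N, 1.0 / N)

for t in range(STEPS):
    v = 0.05                                # commanded forward step
    true_x = np.clip(true_x + v, 0.0, L)
    # Idiothetic path-integration (motion) update with odometric noise:
    particles = np.clip(particles + v + rng.normal(0.0, ODOM_NOISE, N), 0.0, L)
    # Boundary (measurement) update: likelihood of the sensed wall distance.
    z = min(true_x, L - true_x) + rng.normal(0.0, SENSE_NOISE)
    expected = np.minimum(particles, L - particles)
    weights *= np.exp(-0.5 * ((z - expected) / SENSE_NOISE) ** 2)
    weights /= weights.sum()
    # Resample when the effective sample size collapses:
    if 1.0 / np.sum(weights ** 2) < N / 2:
        idx = rng.choice(N, size=N, p=weights)
        particles, weights = particles[idx], np.full(N, 1.0 / N)

print(true_x, np.sum(weights * particles))  # the estimate ends near the true 7.0
```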

Relevance: 10.00%

Abstract:

In this paper we present substantial evidence for the existence of a bias in the distribution of births of leading US politicians in favor of those who were the oldest in their cohort at school. This “relative age effect” has been proven to influence performance at school and in sports, but evidence on its impact on people’s vocational success has been rare. We find a marked break in the density of politicians’ birthdates using a maximum likelihood test and McCrary’s (2008) nonparametric test. We conjecture that being relatively old in a peer group may create long-term advantages which can play a significant role in the ability to succeed in a highly competitive environment like the race for top political offices in the USA. The magnitude of the effect we estimate is larger than what most other studies on the relative age effect for a broader (adult) population find, but is broadly in line with studies that look at populations in high-competition environments.
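
McCrary's estimator is a local-linear density discontinuity test; the toy below is far cruder, an exact binomial comparison of hypothetical birth counts on either side of a school-entry cutoff, and is included only to illustrate what a break in the birthdate density means. The counts are invented, not the paper's data.

```python
from math import comb

def binomial_tail(k, n, p=0.5):
    """Exact one-sided p-value P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical counts of politicians born in the 90 days before vs. after a
# school-entry cutoff: those born before it were among the oldest in their cohort.
older, younger = 210, 150
# Under the null of a smooth birthdate density, each birth falls on either
# side of the cutoff with probability 0.5.
p_value = binomial_tail(older, older + younger)
print(p_value)   # about 0.0009: the density of birthdates jumps at the cutoff
```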

Relevance: 10.00%

Abstract:

Communication processes are vital in the lifecycle of BPM projects. With this in mind, much research has been performed into facilitating this key component between stakeholders. Amongst the methods used to support this process are personalised process visualisations. In this paper, we review the development of this visualisation trend and then propose a theoretical analysis framework based upon communication theory. We use this framework to provide theoretical support for the conjecture that 3D virtual worlds are powerful tools for communicating personalised visualisations of processes within a workplace. Meta-requirements are then derived and applied, via 3D virtual world functionalities, to generate example visualisations containing personalised aspects, which we believe enhance the process of communication between analysts and stakeholders in BPM process (re)design activities.

Relevance: 10.00%

Abstract:

While social enterprises have gained increasing policy attention as vehicles for generating innovative responses to complex social and environmental problems, surprisingly little is known about them. In particular, the social innovation produced by social enterprises (Mulgan, Tucker, Ali, & Sander, 2007) has been presumed rather than demonstrated, and remains under-investigated in the literature. While social enterprises are held to be inherently innovative as they seek to respond to social needs (Nicholls, 2010), there has been conjecture that the collaborative governance arrangements typical in social enterprises may be conducive to innovation (Lumpkin, Moss, Gras, Kato, & Amezcua, in press), as members and volunteers provide a source of creative ideas and are unfettered in such thinking by responsibility to deliver organisational outcomes (Hendry, 2004). However, this is complicated by the sheer array of governance arrangements which exist in social enterprises, ranging from flat participatory democratic structures through to hierarchical arrangements. In continental Europe, there has been a stronger focus on democratic participation as a characteristic of social enterprises than in, for example, the USA. In response to this gap in knowledge, a research project was undertaken to identify the population of social enterprises in Australia. The size, composition and social innovations initiated by these enterprises have been reported elsewhere (see Barraket, 2010). The purpose of this paper is to undertake a closer examination of innovation in social enterprises, particularly how the collaborative governance of social enterprises might influence innovation. Given the pre-paradigmatic state of social entrepreneurship research (Nicholls, 2010), and the importance of drawing on established theories in order to advance theory (Short, Moss, & Lumpkin, 2009), a number of conceptual steps are needed in order to examine how collaborative governance might influence innovation in social enterprises. In this paper, we commence by advancing a definition of what a social enterprise is. In light of our focus on the potential role of collaborative governance in social innovation amongst social enterprises, we go on to consider the collaborative forms of governance prevalent in the Third Sector. Then, collaborative innovation is explored. Drawing on this information and our research data, we finally consider how collaborative governance might affect innovation amongst social enterprises.

Relevance: 10.00%

Abstract:

Recent advances in the area of ‘Transformational Government’ position the citizen at the centre of focus. This paradigm shift from a department-centric to a citizen-centric focus requires governments to re-think their approach to service delivery, thereby decreasing costs and increasing citizen satisfaction. The introduction of franchises as a virtual business layer between the departments and their citizens is intended to provide a solution. Franchises are structured to address the needs of citizens independently of internal departmental structures. For delivering services online, governments pursue the development of a One-Stop Portal, which structures information and services through those franchises. Thus, each franchise can be mapped to a specific service bundle, which groups together services that are deemed to be of relevance to a specific citizen need. This study focuses on the development and evaluation of these service bundles. In particular, two research questions guide the line of investigation of this study. Research Question 1: What methods can be used by governments to identify service bundles as part of governmental One-Stop Portals? Research Question 2: How can the quality of service bundles in governmental One-Stop Portals be evaluated? The first research question concerns the identification of suitable service bundling methods. A literature review was conducted to first conceptualise the service bundling task in general. As a consequence, a 4-layer model of service bundling and a morphological box were created, detailing characteristics that are of relevance when identifying service bundles. Furthermore, a literature review of Decision-Support Systems was conducted to identify approaches of relevance in different bundling scenarios. These initial findings were complemented by targeted studies of multiple leading governments in the e-government domain, as well as by consultation with a local expert in the field. Here, the aim was to identify the current status of online service delivery and service bundling in practice. These findings led to the conceptualisation of two service bundle identification methods applicable in the context of the Queensland Government: on the one hand, a provider-driven approach based on service description languages, attributes, and relationships between services; on the other, a citizen-driven approach based on analysing the outcomes of content identification and grouping workshops with citizens. Both methods were then applied and evaluated in practice. The conceptualisation of the provider-driven method for service bundling required the initial specification of relevant attributes that could be used to identify similarities between services, called relationships; these relationships then formed the basis for the identification of service bundles. This study conceptualised and defined seven relationships, namely ‘Co-location’, ‘Resource’, ‘Co-occurrence’, ‘Event’, ‘Consumer’, ‘Provider’, and ‘Type’. The relationships, and the bundling method itself, were applied and refined as part of six Action Research cycles in collaboration with the Queensland Government. The findings show that attributes and relationships can be used effectively as a means of bundle identification, provided that distinct decision rules are in place to prescribe how services are to be identified.
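
By way of illustration only (the services, attribute values and the single decision rule below are invented, though the relationship names follow the seven defined in the study), a provider-driven bundling step might look like the following sketch, which groups services that share a value on the 'Event' relationship.

```python
from collections import defaultdict

# Illustrative service records; names and attribute values are made up.
services = {
    "register-birth":       {"Event": "having a baby", "Provider": "Registry"},
    "apply-parental-leave": {"Event": "having a baby", "Provider": "Employment"},
    "enrol-school":         {"Event": "starting school", "Provider": "Education"},
    "school-transport":     {"Event": "starting school", "Provider": "Transport"},
}

def bundle_by(services, relationship):
    """Decision rule (one of many possible): services sharing a value on the
    given relationship attribute are placed in the same bundle."""
    bundles = defaultdict(list)
    for name, attrs in services.items():
        bundles[attrs[relationship]].append(name)
    return dict(bundles)

for event, bundle in bundle_by(services, "Event").items():
    print(event, "->", bundle)
# having a baby -> ['register-birth', 'apply-parental-leave']
# starting school -> ['enrol-school', 'school-transport']
```
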
For the conceptualisation of the citizen-driven method, insights from the case studies led to the decision to involve citizens through card sorting activities. Based on an initial list of services relevant to a certain franchise, participating citizens grouped the services according to their own preferences. The card sorting activity, as well as the required analysis and aggregation of the individual card sorting results, was analysed in depth as part of this study. A framework was developed that can be used as a decision-support tool to assist in deciding which card sorting analysis method should be utilised in a given scenario. The characteristic features associated with card sorting in a government context led to the decision to utilise statistical analysis approaches, such as cluster analysis and factor analysis, to aggregate card sorting results. The second research question asks how the quality of service bundles can be assessed. An extensive literature review was conducted focussing on bundle, portal, and e-service quality. It was found that different studies use different constructs, terminology, and units of analysis, which makes comparing these models a difficult task. As a direct result, a framework was conceptualised that can be used to position past and future studies in this research domain. Complementing the literature review, interviews conducted as part of the case studies with leaders in e-government indicated that satisfaction is typically evaluated for the overall portal once the portal is online, but that quality tests are not conducted during the development phase. Consequently, a research model which appropriately defines perceived service bundle quality needed to be developed from scratch. Based on existing theory, such as the Theory of Reasoned Action, Expectation Confirmation Theory, and the Theory of Affordances, perceived service bundle quality was defined as an inferential belief and positioned within the nomological net of services. Based on the literature analysis on quality, and on the subsequent work of a focus group, the hypothesised antecedents (descriptive beliefs) of the construct and the associated question items were defined and the research model conceptualised. The model was then tested, refined, and finally validated during six Action Research cycles. Results show no significant difference in perceived quality or satisfaction among users between the provider-driven and the citizen-driven method. The decision on which method to choose, it was found, should be based on contextual factors, such as objectives, resources, and the need for visibility. The constructs of the bundle quality model were examined. While the quality of bundles identified through the citizen-centric approach could be explained through the constructs ‘Navigation’, ‘Ease of Understanding’, and ‘Organisation’, the quality of bundles identified through the provider-driven approach could be explained solely through the constructs ‘Navigation’ and ‘Ease of Understanding’. An active labelling style for bundles, as part of the provider-driven Information Architecture, had a larger impact on ‘Quality’ than the topical labelling style used in the citizen-centric Information Architecture. However, ‘Organisation’, reflecting the internal, logical structure of the Information Architecture, was a significant factor impacting on ‘Quality’ only in the citizen-driven Information Architecture.
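
The aggregation of individual card sorts can likewise be sketched. The example below uses invented cards and sorts and only one of the statistical approaches mentioned (cluster analysis): it builds a card-by-card co-occurrence matrix from the piles made by each participant and cuts an average-linkage hierarchical clustering into two candidate bundles.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Hypothetical card sorts: each participant grouped the same six service
# "cards" into piles of their own choosing (pile numbers are arbitrary).
cards = ["register-birth", "parental-leave", "child-immunisation",
         "enrol-school", "school-transport", "uniform-subsidy"]
sorts = [
    [0, 0, 0, 1, 1, 1],      # participant 1
    [0, 0, 1, 1, 1, 1],      # participant 2
    [0, 0, 0, 1, 1, 2],      # participant 3
]

n = len(cards)
co = np.zeros((n, n))
for sort in sorts:                        # co-occurrence: how often two cards
    for i in range(n):                    # ended up in the same pile
        for j in range(n):
            co[i, j] += sort[i] == sort[j]
distance = 1.0 - co / len(sorts)          # dissimilarity between cards

# Average-linkage hierarchical clustering, cut into two candidate bundles.
Z = linkage(squareform(distance), method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
for card, label in zip(cards, labels):
    print(label, card)                    # baby-related vs. school-related bundles
```
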
Hence, it was concluded that active labelling can compensate for a lack of logical structure. Further studies are needed to test this conjecture. Such studies may involve building alternative models and conducting additional empirical research (e.g. use of an active labelling style for the citizen-driven Information Architecture). This thesis contributes to the body of knowledge in several ways. Firstly, it presents an empirically validated model of the factors explaining and predicting a citizen’s perception of service bundle quality. Secondly, it provides two alternative methods that can be used by governments to identify service bundles when structuring the content of a One-Stop Portal. Thirdly, it provides a detailed narrative suggesting how the recent paradigm shift in the public sector towards a citizen-centric focus can be pursued by governments; the research methodology followed by this study can serve as an exemplar for governments seeking to achieve a citizen-centric approach to service delivery.