910 results for Dimension stones


Relevance:

10.00%

Publisher:

Abstract:

Thermogravimetry combined with evolved gas mass spectrometry has been used to ascertain the stability of the ‘cave’ mineral brushite. X-ray diffraction shows that brushite from the Jenolan Caves is very pure. Thermogravimetric analysis coupled with ion current mass spectrometry shows a mass loss at 111°C due to loss of water of hydration. A further decomposition step occurs at 190°C with the conversion of hydrogen phosphate to a mixture of calcium ortho-phosphate and calcium pyrophosphate. TG-DTG shows the mineral is not stable above 111°C. A mechanism for the formation of brushite on calcite surfaces is proposed, and this mechanism has relevance to the formation of brushite in urinary tracts.
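
The two mass-loss steps reported above can be summarised as decomposition reactions. The scheme below is an illustrative reconstruction based on the standard formula of brushite (CaHPO4·2H2O) and the phases named in the abstract; the stoichiometry of the second step assumes the textbook condensation of hydrogen phosphate to pyrophosphate and is not quoted from the paper.

```latex
% Illustrative decomposition scheme (temperatures are those reported by the TG results above)
\begin{align*}
\underbrace{\mathrm{CaHPO_4\cdot 2H_2O}}_{\text{brushite}}
  &\xrightarrow{\;\sim 111\,^{\circ}\mathrm{C}\;} \mathrm{CaHPO_4} + 2\,\mathrm{H_2O} \\
2\,\mathrm{CaHPO_4}
  &\xrightarrow{\;\sim 190\,^{\circ}\mathrm{C}\;} \mathrm{Ca_2P_2O_7} + \mathrm{H_2O}
  \quad\text{(partial: an orthophosphate/pyrophosphate mixture remains)}
\end{align*}
```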

Relevance:

10.00%

Publisher:

Abstract:

The creative work of this study is a novel-length work of literary fiction called Keeping House (published as Grace's Table, by University of Queensland Press, April 2014). Grace has not had twelve people at her table for a long time. Hers isn't the kind of family who share regular Sunday meals. As Grace prepares the feast, she reflects on her life, her marriage and her friendships. When the three generations of her family come together, simmering tensions from the past threaten to boil over. The one thing that no one can talk about is the one thing that no one can forget. Grace's Table is a moving and often funny novel using food as a language to explore the power of memory and the family rituals that define us. The exegetical component of this study does not adhere to traditional research pedagogies. Instead, it follows the model of what the literature describes as fictocriticism. The exegesis is intended to be read as a hybrid genre, one that combines creative practice and theory and blurs the boundaries between philosophy and fiction. In offering itself as an alternative to the exegetical canon it provides a model for the multiplicity of knowledge production suited to the discipline of practice-led research. The exegesis mirrors structural elements of the creative work by inviting twelve guests into the domestic space of the novel to share a meal. The guests, chosen for their diverse thinking, enable examination of the various agents of power involved in the delivery of food. Their ideas cross genders, ages and time periods; their motivations and opinions often collide. Some are more concerned with the spatial politics of where food is consumed, others with its actual preparation and consumption. Each, however, provides a series of creative reflective conversations throughout the meal which help to answer the research question: How can disempowered women take authority within their domestic space? Michel de Certeau must defend his "operational tactics" or "art of the weak" 1 as a means by which women can subvert the colonisation of their domestic space against Michel Foucault's ideas about the functions of a "disciplinary apparatus". 2 Erving Goffman argues that the success of de Certeau's "tactics" depends upon his theories of "performance" and "masquerade" 3; a claim de Certeau refutes. Doreen Massey and the author combine forces in arguing for space, time and politics to be seen as interconnected, non-static and often contested. The author calls for identity, or sense of self, to be considered a further dimension which impacts on the function of spatial models. Yi-Fu Tuan speaks of the intimacy of kitchens; Gaston Bachelard the power of daydreams; and Jean Anthelme Brillat-Savarin gives the reader a taste of the nourishing arts. Roland Barthes forces the author to reconsider her function as a writer and her understanding of the reader's relationship with a text. Fictional characters from two texts have a place at the table – Marian from The Edible Woman by Margaret Atwood 4 and Lilian from Lilian's Story by Kate Grenville. 5 Each explores how they successfully subverted expectations of their gender. The author interprets and applies elements of the conversations to support Grace's tactics in the novel as well as those related to her own creative research practice. Grace serves her guests, reflecting on what is said and how it relates to her story. Over coffee, the two come together to examine what each has learned.

Relevance:

10.00%

Publisher:

Abstract:

Website customization can help to better fulfill the needs and wants of individual customers. It is an important aspect of customer satisfaction with online banking, especially among the younger generation. This dimension, however, is poorly addressed, particularly in the Australian context. The proposed research aims to fill this gap by exploring the use of a popular Web 2.0 technology known as tags, or user-assigned metadata, to facilitate customization at the interaction level. A prototype is proposed to demonstrate the various interaction-based customization types, to be evaluated through a series of experiments that assess the impact on customer satisfaction. The expected research outcome is a set of guidelines, akin to interaction design patterns, for aiding the design and implementation of the proposed tag-based approach.
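
As a rough illustration of the kind of interaction-level, tag-based customization the abstract describes, the sketch below maps user-assigned tags on banking actions to personalised shortcuts. It is a toy sketch only: the class and method names (UserTagStore, suggest_shortcuts) are hypothetical and are not drawn from the proposed prototype.

```python
from collections import Counter, defaultdict


class UserTagStore:
    """Toy store of user-assigned tags (Web 2.0-style metadata) on banking actions."""

    def __init__(self):
        # action id -> list of tags the user has assigned to it
        self._tags = defaultdict(list)

    def tag(self, action_id: str, tag: str) -> None:
        """Record that the user tagged a banking action (e.g. 'monthly')."""
        self._tags[action_id].append(tag.lower())

    def suggest_shortcuts(self, top_n: int = 3) -> list[str]:
        """Surface the most heavily tagged actions as personalised interface shortcuts."""
        counts = Counter({action: len(tags) for action, tags in self._tags.items()})
        return [action for action, _ in counts.most_common(top_n)]


# Example: the interface adapts to whatever this user tags most often.
store = UserTagStore()
store.tag("transfer:savings", "monthly")
store.tag("transfer:savings", "salary")
store.tag("billpay:electricity", "utilities")
print(store.suggest_shortcuts())  # ['transfer:savings', 'billpay:electricity']
```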

Relevance:

10.00%

Publisher:

Abstract:

We examined properties of culture-level personality traits in ratings of targets (N=5,109) ages 12 to 17 in 24 cultures. Aggregate scores were generalizable across gender, age, and relationship groups and showed convergence with culture-level scores from previous studies of self-reports and observer ratings of adults, but they were unrelated to national character stereotypes. Trait profiles also showed cross-study agreement within most cultures, 8 of which had not previously been studied. Multidimensional scaling showed that Western and non-Western cultures clustered along a dimension related to Extraversion. A culture-level factor analysis replicated earlier findings of a broad Extraversion factor but generally resembled the factor structure found in individuals. Continued analysis of aggregate personality scores is warranted.
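
The aggregate-then-ordinate pipeline described above (averaging individual trait ratings within each culture, then ordinating cultures by the distances between their aggregate profiles with multidimensional scaling) can be illustrated with a minimal sketch. This is not the authors' analysis code: the data values, culture labels, trait columns, and the use of scikit-learn's MDS are all assumptions for illustration only.

```python
import numpy as np
import pandas as pd
from sklearn.manifold import MDS

# Hypothetical input: one row per rated target, with a culture label and trait scores.
ratings = pd.DataFrame({
    "culture": ["AU", "AU", "JP", "JP", "DE", "DE"],
    "N": [2.9, 3.1, 3.4, 3.2, 2.8, 3.0],
    "E": [3.6, 3.8, 2.9, 3.1, 3.3, 3.4],
    "O": [3.2, 3.4, 3.0, 3.1, 3.5, 3.3],
    "A": [3.1, 3.0, 3.3, 3.4, 3.0, 3.1],
    "C": [3.0, 3.2, 3.5, 3.4, 3.6, 3.5],
})

# Culture-level (aggregate) trait scores: mean across targets within each culture.
culture_means = ratings.groupby("culture").mean()

# Pairwise distances between aggregate trait profiles, then a 2-D MDS ordination.
distances = np.linalg.norm(
    culture_means.values[:, None, :] - culture_means.values[None, :, :], axis=-1
)
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(distances)
for culture, (x, y) in zip(culture_means.index, coords):
    print(f"{culture}: ({x:.2f}, {y:.2f})")
```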

Relevance:

10.00%

Publisher:

Abstract:

This panel discusses the impact of Green IT on information systems and how information systems can meet environmental challenges and ensure sustainability. We wish to highlight the role of green business processes, and specifically the contribution that the management of these processes can make in leveraging the transformative power of IS in order to create an environmentally sustainable society. The management of business processes has typically been thought of in terms of business improvement along the dimensions of time, cost, quality, and flexibility – the so-called ‘devil’s quadrangle’. Contemporary organizations, however, are increasingly aware of the need to create more sustainable, IT-enabled business processes that are successful in terms of their economic, ecological, and social impact. Exemplary ecological key performance indicators that increasingly find their way onto managers' agendas include carbon emissions, data-center energy use, and renewable energy consumption (SAP 2010). The key challenge, therefore, is to extend the devil’s quadrangle to a devil’s pentagon, including sustainability as an important fifth dimension in process change.
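
Purely as an illustration of the ‘devil’s pentagon’ idea, the sketch below extends the four classic process-performance dimensions with a sustainability dimension carrying the kinds of ecological KPIs mentioned above. The structure and field names are hypothetical and are not drawn from SAP (2010) or any particular BPM tool.

```python
from dataclasses import dataclass, field

# The four classic dimensions of the 'devil's quadrangle'.
CLASSIC_DIMENSIONS = ("time", "cost", "quality", "flexibility")


@dataclass
class ProcessPerformance:
    """Toy record of process performance across the extended 'devil's pentagon'."""
    time_days: float
    cost_eur: float
    quality_defect_rate: float
    flexibility_variants: int
    # Fifth dimension: sustainability, tracked via ecological KPIs.
    sustainability: dict = field(default_factory=lambda: {
        "carbon_emissions_t": 0.0,
        "data_centre_energy_kwh": 0.0,
        "renewable_energy_share": 0.0,
    })


# Example: an order-to-cash process scored on all five dimensions.
order_to_cash = ProcessPerformance(
    time_days=4.5, cost_eur=120.0, quality_defect_rate=0.02, flexibility_variants=3,
    sustainability={"carbon_emissions_t": 1.8, "data_centre_energy_kwh": 950.0,
                    "renewable_energy_share": 0.4},
)
print(order_to_cash.sustainability["carbon_emissions_t"])
```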

Relevance:

10.00%

Publisher:

Abstract:

This special issue presents an excellent opportunity to study applied epistemology in public policy. This is an important task because the arena of public policy is the social domain in which macro conditions for ‘knowledge work’ and ‘knowledge industries’ are defined and created. We argue that knowledge-related public policy has become overly concerned with creating the politico-economic parameters for the commodification of knowledge. Our policy scope is broader than that of Fuller (1988), who emphasizes the need for a social epistemology of science policy. We extend our focus to a range of policy documents that include communications, science, education and innovation policy (collectively called knowledge-related public policy in acknowledgement of the fact that there is no defined policy silo called ‘knowledge policy’), all of which are central to policy concerned with the ‘knowledge economy’ (Rooney and Mandeville, 1998). However, what we will show here is that, as Fuller (1995) argues, ‘knowledge societies’ are not industrial societies permeated by knowledge, but that knowledge societies are permeated by industrial values. Our analysis is informed by an autopoietic perspective. Methodologically, we approach it from a sociolinguistic position that acknowledges the centrality of language to human societies (Graham, 2000). Here, what we call ‘knowledge’ is posited as a social and cognitive relationship between persons operating on and within multiple social and non-social (or, crudely, ‘physical’) environments. Moreover, knowing, we argue, is a sociolinguistically constituted process. Further, we emphasize that the evaluative dimension of language is most salient for analysing contemporary policy discourses about the commercialization of epistemology (Graham, in press). Finally, we provide a discourse analysis of a sample of exemplary texts drawn from a 1.3 million-word corpus of knowledge-related public policy documents that we compiled from local, state, national and supranational legislatures throughout the industrialized world. Our analysis exemplifies a propensity for resorting to technocratic, instrumentalist and anti-intellectual views of knowledge in policy. We argue that what underpins these patterns is a commodity-based conceptualization of knowledge, which rests on an axiology of narrowly economic imperatives at odds with the very nature of knowledge. The commodity view of knowledge, therefore, is flawed in its ignorance of the social systemic properties of knowing.

Relevance:

10.00%

Publisher:

Abstract:

The major purpose of Vehicular Ad Hoc Networks (VANETs) is to provide motorists with access to safety-related messages so that they can react or make life-critical decisions that enhance road safety. Accessing safety-related information through the use of VANET communications, therefore, must be protected, as motorists may make critical decisions in response to emergency situations in VANETs. If introducing security services into VANETs causes considerable transmission latency or processing delays, this would defeat the purpose of using VANETs to improve road safety. Current research in secure messaging for VANETs appears to focus on employing a certificate-based Public Key Cryptosystem (PKC) to support security. The security overhead of such a scheme, however, creates a transmission delay and introduces a time-consuming verification process to VANET communications. This paper proposes an efficient public key management system for VANETs: the Public Key Registry (PKR) system. Not only does this paper demonstrate that the proposed PKR system can maintain security, but it also asserts that it can improve overall performance and scalability at lower cost than the certificate-based PKC scheme. It is believed that the proposed PKR system will add a new dimension to key management and verification services for VANETs.
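
To make the contrast concrete, here is a toy sketch of the general idea of looking up a sender's public key in a trusted registry and verifying a signed safety message, rather than validating a certificate chain per message. It is an illustration only: the class and method names are hypothetical, the sketch uses the third-party cryptography package with Ed25519 signatures purely for convenience, and it omits everything that makes the actual PKR proposal work (vetted registration, revocation, latency handling, and so on).

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)


class ToyPublicKeyRegistry:
    """Illustrative vehicle-ID -> public-key registry (not the paper's PKR design)."""

    def __init__(self):
        self._keys: dict[str, Ed25519PublicKey] = {}

    def register(self, vehicle_id: str, public_key: Ed25519PublicKey) -> None:
        # Real registration would be vetted by a trusted authority.
        self._keys[vehicle_id] = public_key

    def verify(self, vehicle_id: str, message: bytes, signature: bytes) -> bool:
        # One registry lookup plus one signature check; no certificate chain to walk.
        public_key = self._keys.get(vehicle_id)
        if public_key is None:
            return False
        try:
            public_key.verify(signature, message)
            return True
        except InvalidSignature:
            return False


# Example: a vehicle signs a safety message; receivers verify it via the registry.
sender_key = Ed25519PrivateKey.generate()
registry = ToyPublicKeyRegistry()
registry.register("veh-42", sender_key.public_key())
msg = b"EMERGENCY_BRAKE ahead"
print(registry.verify("veh-42", msg, sender_key.sign(msg)))  # True
```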

Relevance:

10.00%

Publisher:

Abstract:

This paper introduces the underlying design concepts of I8DAT, a food image sharing application that has been developed as part of a three-year research project – Eat, Cook, Grow: Ubiquitous Technology for Sustainable Food Culture in the City (http://www.urbaninformatics.net/projects/food) – exploring urban food practices to engage people in healthier, more environmentally and socially sustainable eating, cooking, and growing food in their everyday lives. The key aim of the project is to produce actionable knowledge, which is then applied to create and test several accessible, user-centred interactive design solutions that motivate user engagement through playful and social means rather than authoritative information distribution. Through the design and implementation processes, we envisage integrating these design interventions to create a sustainable food network that is both technical and socio-cultural in nature (technosocial). Our primary research locale is Brisbane, Australia, with additional work carried out in three reference cities with divergent geographic, socio-cultural, and technological backgrounds: Seoul, South Korea, for its global leadership in ubiquitous technology, broadband access, and high population density; Lincoln, UK, for the regional and peri-urban dimension it provides; and Portland, Oregon, US, for its international standing as a hub of the sustainable food movement.

Relevance:

10.00%

Publisher:

Abstract:

In this chapter I look at some issues around the transfer of cultural industry policy between two very different national contexts, the UK and Russia. Specifically, the chapter draws on a partnership project between Manchester and St. Petersburg financed by the European Union as part of a program to promote economic development through knowledge transfer between Europe and the countries of the former Soviet Union. The project attempted to place the cultural industries squarely within the dimension of economic development, and drew on the expertise of Manchester’s Creative Industries Development Service and other partners to effect this policy transfer.

Relevance:

10.00%

Publisher:

Abstract:

Sample complexity results from computational learning theory, when applied to neural network learning for pattern classification problems, suggest that for good generalization performance the number of training examples should grow at least linearly with the number of adjustable parameters in the network. Results in this paper show that if a large neural network is used for a pattern classification problem and the learning algorithm finds a network with small weights that has small squared error on the training patterns, then the generalization performance depends on the size of the weights rather than the number of weights. For example, consider a two-layer feedforward network of sigmoid units, in which the sum of the magnitudes of the weights associated with each unit is bounded by A and the input dimension is n. We show that the misclassification probability is no more than a certain error estimate (that is related to squared error on the training set) plus A^3 √((log n)/m) (ignoring log A and log m factors), where m is the number of training patterns. This may explain the generalization performance of neural networks, particularly when the number of training examples is considerably smaller than the number of weights. It also supports heuristics (such as weight decay and early stopping) that attempt to keep the weights small during training. The proof techniques appear to be useful for the analysis of other pattern classifiers: when the input domain is a totally bounded metric space, we use the same approach to give upper bounds on misclassification probability for classifiers with decision boundaries that are far from the training examples.
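
Restated in symbols, as a paraphrase of the bound described above (not a quotation of the paper's exact statement), with A bounding the per-unit weight magnitudes, n the input dimension, m the number of training patterns, and the error estimate on the training set written as err-hat:

```latex
% Paraphrase of the stated bound; log A and log m factors are ignored.
\Pr[\text{misclassification}]
  \;\le\; \widehat{\mathrm{err}}
  \;+\; O\!\left(A^{3}\sqrt{\frac{\log n}{m}}\right)
```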

Relevance:

10.00%

Publisher:

Abstract:

This important work describes recent theoretical advances in the study of artificial neural networks. It explores probabilistic models of supervised learning problems, and addresses the key statistical and computational questions. Chapters survey research on pattern classification with binary-output networks, including a discussion of the relevance of the Vapnik-Chervonenkis dimension, and of estimates of the dimension for several neural network models. In addition, Anthony and Bartlett develop a model of classification by real-output networks, and demonstrate the usefulness of classification with a "large margin." The authors explain the role of scale-sensitive versions of the Vapnik-Chervonenkis dimension in large margin classification, and in real prediction. Key chapters also discuss the computational complexity of neural network learning, describing a variety of hardness results, and outlining two efficient, constructive learning algorithms. The book is self-contained and accessible to researchers and graduate students in computer science, engineering, and mathematics.
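
For readers unfamiliar with the central quantity mentioned above, the Vapnik-Chervonenkis dimension of a class H of binary-valued functions can be stated briefly. This is the standard textbook definition, not a quotation from the book:

```latex
% A set S = {x_1, ..., x_d} is shattered by H if H realises all 2^d labellings of S;
% the VC dimension is the size of the largest shattered set.
\mathrm{VCdim}(H) \;=\; \max\Bigl\{\, d \;:\; \exists\, S = \{x_1,\dots,x_d\}
  \ \text{with}\ \bigl|\{(h(x_1),\dots,h(x_d)) : h \in H\}\bigr| = 2^{d} \,\Bigr\}
```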

Relevance:

10.00%

Publisher:

Abstract:

We study model selection strategies based on penalized empirical loss minimization. We point out a tight relationship between error estimation and data-based complexity penalization: any good error estimate may be converted into a data-based penalty function, and the performance of the resulting estimate is governed by the quality of the error estimate. We consider several penalty functions, involving error estimates on independent test data, empirical VC dimension, empirical VC entropy, and margin-based quantities. We also consider the maximal difference between the error on the first half of the training data and that on the second half, and the expected maximal discrepancy, a closely related capacity estimate that can be calculated by Monte Carlo integration. Maximal discrepancy penalty functions are appealing for pattern classification problems, since their computation is equivalent to empirical risk minimization over the training data with some labels flipped.
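
In symbols, the selection rule and the half-versus-half discrepancy penalty described above can be written as follows. The notation is ours, chosen to match the verbal description: F_1, F_2, ... are the candidate model classes, ℓ the loss, Z_1, ..., Z_n the training sample, and L-hat the empirical loss on the full sample.

```latex
% Penalized empirical loss minimization over candidate classes F_k:
\hat{f} \;=\; \operatorname*{arg\,min}_{f \in F_k,\; k \ge 1}
    \Bigl( \hat{L}_n(f) + \mathrm{pen}(k) \Bigr)

% Maximal-discrepancy penalty: the largest gap between the empirical errors of f
% on the first and second halves of the training data.
\mathrm{pen}(k) \;=\; \max_{f \in F_k}
    \Bigl( \tfrac{2}{n} \sum_{i=1}^{n/2} \ell(f, Z_i)
         \;-\; \tfrac{2}{n} \sum_{i=n/2+1}^{n} \ell(f, Z_i) \Bigr)
```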

Relevance:

10.00%

Publisher:

Abstract:

We present new expected risk bounds for binary and multiclass prediction, and resolve several recent conjectures on sample compressibility due to Kuzmin and Warmuth. By exploiting the combinatorial structure of concept class F, Haussler et al. achieved a VC(F)/n bound for the natural one-inclusion prediction strategy. The key step in their proof is a d = VC(F) bound on the graph density of a subgraph of the hypercube, the one-inclusion graph. The first main result of this paper is a density bound of n·choose(n−1, ≤d−1)/choose(n, ≤d) < d, which positively resolves a conjecture of Kuzmin and Warmuth relating to their unlabeled Peeling compression scheme and also leads to an improved one-inclusion mistake bound. The proof uses a new form of VC-invariant shifting and a group-theoretic symmetrization. Our second main result is an algebraic topological property of maximum classes of VC-dimension d as being d-contractible simplicial complexes, extending the well-known characterization that d = 1 maximum classes are trees. We negatively resolve a minimum degree conjecture of Kuzmin and Warmuth—the second part to a conjectured proof of correctness for Peeling—that every class has one-inclusion minimum degree at most its VC-dimension. Our final main result is a k-class analogue of the d/n mistake bound, replacing the VC-dimension by the Pollard pseudo-dimension and the one-inclusion strategy by its natural hypergraph generalization. This result improves on known PAC-based expected risk bounds by a factor of O(log n) and is shown to be optimal up to an O(log k) factor. The combinatorial technique of shifting takes a central role in understanding the one-inclusion (hyper)graph and is a running theme throughout.
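
For clarity, the "≤ d" binomial coefficients in the density bound above abbreviate sums of binomial coefficients, so the bound reads:

```latex
% choose(n, <= d) counts the subsets of an n-element set of size at most d.
\binom{n}{\le d} \;=\; \sum_{i=0}^{d} \binom{n}{i},
\qquad
\frac{\,n \binom{n-1}{\le d-1}\,}{\binom{n}{\le d}} \;<\; d
```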

Relevance:

10.00%

Publisher:

Abstract:

We present new expected risk bounds for binary and multiclass prediction, and resolve several recent conjectures on sample compressibility due to Kuzmin and Warmuth. By exploiting the combinatorial structure of concept class F, Haussler et al. achieved a VC(F)/n bound for the natural one-inclusion prediction strategy. The key step in their proof is a d = VC(F) bound on the graph density of a subgraph of the hypercube, the one-inclusion graph. The first main result of this report is a density bound of n·choose(n−1, ≤d−1)/choose(n, ≤d) < d, which positively resolves a conjecture of Kuzmin and Warmuth relating to their unlabeled Peeling compression scheme and also leads to an improved one-inclusion mistake bound. The proof uses a new form of VC-invariant shifting and a group-theoretic symmetrization. Our second main result is an algebraic topological property of maximum classes of VC-dimension d as being d-contractible simplicial complexes, extending the well-known characterization that d = 1 maximum classes are trees. We negatively resolve a minimum degree conjecture of Kuzmin and Warmuth—the second part to a conjectured proof of correctness for Peeling—that every class has one-inclusion minimum degree at most its VC-dimension. Our final main result is a k-class analogue of the d/n mistake bound, replacing the VC-dimension by the Pollard pseudo-dimension and the one-inclusion strategy by its natural hypergraph generalization. This result improves on known PAC-based expected risk bounds by a factor of O(log n) and is shown to be optimal up to an O(log k) factor. The combinatorial technique of shifting takes a central role in understanding the one-inclusion (hyper)graph and is a running theme throughout.