148 results for Form finding
Abstract:
Pressure-sensitive adhesives (PSAs) have applications in the fields of packaging, joining, wound care, and personal care. Depending on the application of the PSA, different performance tests are carried out when new products are developed or the quality of existing products is checked. Tack is the property of an adhesive that enables it to form an instant bond on a surface under light pressure. The tack of a PSA strongly depends on the way the bond is created: parameters such as the bonded area, the contact time, and the nature of the materials in contact all affect the measured tack force. In the development of any PSA, it is desirable to correlate performance-related properties such as tack and peel strength with the rheological behaviour. Finding these correlations would make it possible to evaluate the performance of a PSA from its rheological characteristics. In this investigation we have studied the influence of the rheological behaviour of three different PSAs on their tackiness: a low molecular weight rosin ester, a high molecular weight rosin ester, and dicyclopentadiene. Rheological properties such as viscosity, phase angle, and the elastic and viscous moduli are measured as functions of frequency and temperature, and the tack properties are evaluated at various removal speeds and temperatures. Analysis of the results indicates different performance among the three PSAs, which can be related to their rheological properties, especially the phase angle, at different frequencies and temperatures. The PSA based on the high molecular weight rosin ester is the most sensitive to temperature and shows drastic changes in tackiness between high and low temperatures, whereas the low molecular weight rosin ester is less sensitive to temperature changes. © 2010 VSP.
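The abstract above reports the phase angle alongside the elastic and viscous moduli; as a minimal sketch of how these quantities relate, the snippet below computes the loss tangent and phase angle from hypothetical frequency-sweep values for G' and G''. The numbers and array names are illustrative assumptions, not data from the study.

```python
import numpy as np

# Hypothetical frequency-sweep data (illustrative values only, not from the study).
frequency_hz = np.array([0.1, 1.0, 10.0, 100.0])       # oscillation frequency
storage_modulus_pa = np.array([2e4, 5e4, 1.5e5, 4e5])  # elastic modulus G'
loss_modulus_pa = np.array([3e4, 6e4, 1.2e5, 2.5e5])   # viscous modulus G''

# Loss tangent and phase angle: tan(delta) = G'' / G'.
tan_delta = loss_modulus_pa / storage_modulus_pa
phase_angle_deg = np.degrees(np.arctan(tan_delta))

for f, d in zip(frequency_hz, phase_angle_deg):
    print(f"{f:6.1f} Hz -> phase angle = {d:5.1f} deg")
```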
Abstract:
An intriguing feature of mitochondrial complex I from several species is the so-called A/D transition, whereby the idle enzyme spontaneously converts from the active (A) form to the de-active (D) form. The A/D transition plays an important role in tissue response to the lack of oxygen, and hypoxic de-activation of the enzyme is one of the key regulatory events that occur in mitochondria during ischaemia. We demonstrate for the first time that the A/D conformational change of complex I does not affect the macromolecular organisation of supercomplexes in vitro, as revealed by two types of native electrophoresis. Cysteine 39 of the mitochondrially encoded ND3 subunit is known to become exposed upon de-activation. Here we show that even when complex I is a constituent of the I + III + IV (S) supercomplex, cysteine 39 is accessible for chemical modification only in the D-form. Using lysine-specific fluorescent labelling and a DIGE-like approach, we further identified two new subunits involved in structural rearrangements during the A/D transition: ND1 (MT-ND1) and the 39 kDa subunit (NDUFA9). These results clearly show that the structural rearrangements during de-activation of complex I involve several subunits located at the junction between the hydrophilic and hydrophobic domains, in the region of the quinone binding site. De-activation of mitochondrial complex I results in a concerted structural rearrangement of membrane subunits which leads to the disruption of the sealed quinone chamber required for catalytic turnover.
Abstract:
Purpose
– Traditionally, most studies focus on institutionalized, management-driven actors to understand technology management innovation. The purpose of this paper is to argue that there is a need for research into the nature and role of dissident non-institutionalized actors (i.e. outsourced web designers and rapid application software developers). The authors propose that, through online social knowledge sharing, non-institutionalized actors' solution-finding tensions enable technology management innovation.
Design/methodology/approach
– A synthesis of the literature and an analysis of the data (21 interviews) provided insights into three areas of solution-finding tensions that enable management innovation. The authors frame the analysis around peripherally deviant work and the ways in which dissident non-institutionalized actors deviate from their clients' (understood as the firm's) original contracted objectives.
Findings
– The findings provide insights into the productive role of solution-finding tensions in enabling opportunities for management service innovation. Furthermore, deviant practices that leverage non-institutionalized actors' online social knowledge to fulfill customers' requirements are not interpreted negatively, but rather as a positive willingness to proactively explore alternative paths.
Research limitations/implications
– The findings demonstrate the importance of dissident non-institutionalized actors in technology management innovation. However, this work is based on a single country (the USA), and additional research is needed to validate and generalize the findings in other cultural and institutional settings.
Originality/value
– This paper provides new insights into the perceptions of dissident non-institutionalized actors in the practice of IT managerial decision making. The work departs from, but also extends, the previous literature, demonstrating that peripherally deviant work in solution-finding practice creates tensions that enable management innovation between IT providers and users.
Abstract:
Currently there is extensive theoretical work on inconsistencies in logic-based systems. Recently, algorithms for identifying inconsistent clauses in a single conjunctive formula have demonstrated that practical application of this work is possible. However, these algorithms have not been extended to full knowledge base systems and have not been applied to real-world knowledge. To address these issues, we propose a new algorithm for finding inconsistencies in a knowledge base using existing algorithms for finding inconsistent clauses in a formula. An implementation of this algorithm is then presented as an automated tool for finding inconsistencies in a knowledge base and measuring the inconsistency of formulae. Finally, we look at a case study of a network security rule set for exploit detection (QRadar) and suggest how these automated tools can be applied.
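The abstract does not spell out the algorithm, so the sketch below only illustrates the general idea of reducing an inconsistent set of formulae to a minimal inconsistent core with a consistency oracle, using a standard deletion-based procedure. The is_consistent callback and the toy propositional check are hypothetical placeholders, not the paper's method or the QRadar rule format.

```python
from typing import Callable, List

def minimal_inconsistent_subset(
    formulae: List[str],
    is_consistent: Callable[[List[str]], bool],
) -> List[str]:
    """Deletion-based extraction of one minimal inconsistent subset.

    Assumes the full set is inconsistent; is_consistent is a hypothetical
    oracle (e.g. backed by a SAT solver) supplied by the caller.
    """
    core = list(formulae)
    for f in list(core):
        trial = [g for g in core if g != f]
        if not is_consistent(trial):
            core = trial  # f is not needed to produce the inconsistency
    return core

# Toy usage with a deliberately naive oracle that only spots "p" vs "not p" pairs.
def toy_is_consistent(fs: List[str]) -> bool:
    return not any(("not " + f) in fs for f in fs)

kb = ["p", "q", "not p", "r"]
if not toy_is_consistent(kb):
    print(minimal_inconsistent_subset(kb, toy_is_consistent))  # -> ['p', 'not p']
```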
Abstract:
Purpose: A systematic review of the validity, reliability and sensitivity of the Short Form (SF) health survey measures among breast cancer survivors.
Methods: We searched a number of databases for peer-reviewed papers. The methodological quality of the papers was assessed using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN).
Results: The review identified seven papers that assessed the psychometric properties of the SF-36 (n = 5), partial SF-36 (n = 1) and SF-12 (n = 1) among breast cancer survivors. Internal consistency scores for the SF measures ranged from acceptable to good across a range of language and ethnic sub-groups. The SF-36 demonstrated good convergent validity with respective subscales of the Functional Assessment of Cancer Treatment—General scale and two lymphedema-specific measures. Divergent validity between the SF-36 and the Lymph-ICF was modest. The SF-36 demonstrated good factor structure in the total breast cancer survivor study samples. However, the factor structure appeared to differ between specific language and ethnic sub-groups. The SF-36 discriminated between survivors who did or did not report symptoms on the Breast Cancer Prevention Trial Symptom Checklist, and the SF-36 physical sub-scales, but not the mental sub-scales, discriminated between survivors with or without lymphedema. Methodological quality scores varied between and within papers.
Conclusion: Short Form measures appear to provide a reliable and valid indication of general health status among breast cancer survivors, although the limited data suggest that particular caution is required when interpreting scores from non-English-language groups. Further research is required to test the sensitivity or responsiveness of these measures.
Abstract:
Many graph datasets are labelled with discrete and numeric attributes. Most frequent substructure discovery algorithms ignore numeric attributes; in this paper we show how numeric attributes can be used to improve search performance and discrimination. Our thesis is that the most descriptive substructures are those which are normative both in terms of their structure and in terms of their numeric values. We explore the relationship between graph structure and the distribution of attribute values and propose an outlier-detection step, which is used as a constraint during substructure discovery. By pruning anomalous vertices and edges, more weight is given to the most descriptive substructures. Our method is applicable to multi-dimensional numeric attributes; we outline how it can be extended to high-dimensional data. We support our findings with experiments on transaction graphs and single large graphs from the domains of physical building security and digital forensics, measuring the effect on runtime, memory requirements and coverage of discovered patterns, relative to the unconstrained approach.
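As a rough sketch of an outlier-detection step used as a pruning constraint (under assumed data structures, not the authors' implementation), the snippet below flags vertices whose numeric attribute has a large z-score and removes them, together with their incident edges, before any substructure discovery would run.

```python
import statistics

# Hypothetical labelled graph: vertex id -> (discrete label, numeric attribute).
vertices = {
    0: ("door", 3.1), 1: ("door", 2.9), 2: ("door", 3.3),
    3: ("door", 3.0), 4: ("door", 3.2), 5: ("door", 11.0),
}
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]

# Flag vertices whose numeric attribute deviates strongly from the mean (|z| > threshold).
values = [attr for _, attr in vertices.values()]
mean, stdev = statistics.mean(values), statistics.stdev(values)
threshold = 1.5  # illustrative choice
outliers = {v for v, (_, attr) in vertices.items()
            if stdev > 0 and abs(attr - mean) / stdev > threshold}

# Prune anomalous vertices and their incident edges before substructure mining.
pruned_vertices = {v: lab for v, lab in vertices.items() if v not in outliers}
pruned_edges = [(u, v) for u, v in edges if u not in outliers and v not in outliers]
print(outliers)      # {5}
print(pruned_edges)  # [(0, 1), (1, 2), (2, 3), (3, 4)]
```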
Abstract:
There is extensive theoretical work on measures of inconsistency for arbitrary formulae in knowledge bases. Many of these are defined in terms of the set of minimal inconsistent subsets (MISes) of the base. However, few have been implemented or experimentally evaluated to support their viability, since computing all MISes is intractable in the worst case. Fortunately, recent work on a related problem of minimal unsatisfiable sets of clauses (MUSes) offers a viable solution in many cases. In this paper, we begin by drawing connections between MISes and MUSes through algorithms based on a MUS generalization approach and a new optimized MUS transformation approach to finding MISes. We implement these algorithms, along with a selection of existing measures for flat and stratified knowledge bases, in a tool called mimus. We then carry out an extensive experimental evaluation of mimus using randomly generated arbitrary knowledge bases. We conclude that these measures are viable for many large and complex random instances. Moreover, they represent a practical and intuitive tool for inconsistency handling.
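To make MIS-based inconsistency measurement concrete, the sketch below computes two simple quantities often defined over the set of minimal inconsistent subsets: the number of MISes and the fraction of formulae that occur in at least one MIS. It is a generic illustration, not the set of measures implemented in mimus, and the toy knowledge base and its MISes are assumed.

```python
from typing import FrozenSet, List, Set

def mis_count(mises: List[FrozenSet[str]]) -> int:
    # Number of minimal inconsistent subsets of the base.
    return len(mises)

def problematic_ratio(kb: Set[str], mises: List[FrozenSet[str]]) -> float:
    # Fraction of formulae that participate in at least one minimal inconsistent subset.
    problematic = set().union(*mises) if mises else set()
    return len(problematic) / len(kb) if kb else 0.0

# Toy knowledge base with its (precomputed, hypothetical) MISes.
kb = {"p", "not p", "q", "q -> r", "not r", "s"}
mises = [frozenset({"p", "not p"}), frozenset({"q", "q -> r", "not r"})]
print(mis_count(mises))                        # 2
print(round(problematic_ratio(kb, mises), 2))  # 5 of 6 formulae are problematic -> 0.83
```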
Abstract:
The title of this short (about 4,500 words) intervention translates to "To Nail a Jellyfish? Finding a progressive agenda for EU anti-discrimination law". I engage with those criticising EU anti-discrimination law as yet another emanation of the EU's "neo-liberal" nature which fails to establish a viable social policy regime. I criticise this position in two directions. First, I take issue with the theory that anti-discrimination law and policy have to be part of social policy. In fact, the field has a mission that differs from social policy, in that it addresses disadvantage resulting from othering, combating stereotypes as well as promoting accommodation of difference. Second, I show that the critique of the judicialisation of policy is not unique to anti-discrimination law and policy. The so-called turn to rights-based employment law has been criticised on similar grounds by those who fear that collective labour law mechanisms will become less prevalent. Further, those who have engaged with anti-discrimination law for far longer than its current critics have also devised means to overcome the individualistic tendencies of rights adjudication. They have (partly successfully) argued in favour of establishing equality bodies and creating positive obligations. Thus, the critique neglects the field it takes on, and does not accept that anti-discrimination law and policy must be considered a field in its own right rather than the servant of social law and policy.
Now, this is more a summary than an abstract, since I realise that not everyone reads German.
Abstract:
There is now a strong body of research suggesting that the form of the built environment can influence levels of physical activity, leading to an increasing interest in incorporating health objectives into spatial planning and regeneration policies and projects. There have been a number of strands to this research, one of which has sought to develop "objective" measurements of the built environment using Geographic Information Systems (GIS), involving measures of connectivity and proximity to compare the relative "walkability" of different neighbourhoods. The "walkability index" (e.g. Leslie et al 2007, Frank et al 2010) has become a popular indicator of the spatial distribution of those features of the built environment considered to have the greatest positive influence on levels of physical activity. The success of this measure rests on its ability to succinctly capture built environment correlates of physical activity using routinely available spatial data, including the use of road centre lines as a proxy for connectivity.
This paper discusses two key aspects of the walkability index. First, it follows the suggestion of Chin et al (2008) that the use of a footpath network (where available), rather than road centre lines, may be far more effective in evaluating walkability. This may be particularly important for assessing changes in walkability arising from pedestrian-focused infrastructure projects, such as greenways. Second, the paper explores the implications of this for how connectivity can be measured. The paper takes six different measures of connectivity, first analysing the relationships between them and then testing their correlation with actual levels of physical activity among local residents in Belfast, Northern Ireland. The analysis finds that the best-performing measures appear to be intersection density and metric reach, and uses this finding to discuss the implications for developing tools that may better support decision-making in spatial planning.
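As an illustration of one of the connectivity measures named above, the following sketch computes a simple intersection density (nodes where three or more links meet, per square kilometre) for a hypothetical footpath network held in a networkx graph. The graph, the degree threshold, and the area value are assumptions for illustration; the published walkability index may define the measure differently.

```python
import networkx as nx

def intersection_density(graph: nx.Graph, area_km2: float, min_degree: int = 3) -> float:
    """Intersections (nodes where >= min_degree links meet) per square kilometre."""
    intersections = [n for n, deg in graph.degree() if deg >= min_degree]
    return len(intersections) / area_km2

# Hypothetical footpath network: nodes are junctions or path ends, edges are path segments.
G = nx.Graph()
G.add_edges_from([(1, 2), (2, 3), (2, 4), (3, 4), (4, 5), (4, 6)])
print(intersection_density(G, area_km2=0.5))  # nodes 2 and 4 qualify -> 4.0 per km^2
```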