7 results for Mean Inter-Arrival Claim Intensity
in Aston University Research Archive
Abstract:
The full set of partial structure factors for glassy germania, or GeO2, was accurately measured by using the method of isotopic substitution in neutron diffraction in order to elucidate the nature of the pair correlations for this archetypal strong glass former. The results show that the basic tetrahedral Ge(O1/2)4 building blocks share corners with a mean inter-tetrahedral Ge-O-Ge bond angle of 132(2)°. The topological and chemical ordering in the resultant network displays two characteristic length scales at distances greater than the nearest neighbour. One of these describes the intermediate range order and manifests itself by the appearance of a first sharp diffraction peak in the measured diffraction patterns at a scattering vector k_FSDP ≈ 1.53 Å⁻¹, while the other describes so-called extended range order and is associated with the principal peak at k_PP = 2.66(1) Å⁻¹. We find that there is an interplay between the relative importance of the ordering on these two length scales for tetrahedral network-forming glasses, with the extended range ordering becoming dominant as the glass fragility increases. The measured partial structure factors for glassy GeO2 are used to reproduce the total structure factor measured by using high-energy x-ray diffraction, and the experimental results are also compared to those obtained by using classical and first-principles molecular dynamics simulations.
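For reference, in the Faber-Ziman formalism the total structure factor of a binary glass such as GeO2 is a weighted sum of the three partial structure factors (Ge-Ge, Ge-O and O-O). A schematic form, with notation assumed here rather than taken from the paper, is

    S(k) - 1 = \frac{\sum_{\alpha\beta} c_\alpha c_\beta f_\alpha(k) f_\beta(k)\,[S_{\alpha\beta}(k) - 1]}{\bigl[\sum_\alpha c_\alpha f_\alpha(k)\bigr]^2},

where c_alpha are the atomic concentrations and f_alpha(k) are the x-ray form factors (for neutrons, the k-independent scattering lengths b_alpha). Combining the measured neutron partials S_GeGe, S_GeO and S_OO with x-ray weights in this way is how a high-energy x-ray total structure factor can be reconstructed from the neutron data.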
Abstract:
Guest editorial: This special issue has been drawn from papers that were published as part of the Second European Conference on Management of Technology (EuroMOT), which was held at Aston Business School (Birmingham, UK) on 10-12 September 2006. This was the official European conference for the International Association for Management of Technology (IAMOT); the overall theme of the conference was “Technology and global integration.” There were many high-calibre papers submitted to the conference and published in the associated proceedings (Bennett et al., 2006). The streams of interest that emerged from these submissions were the importance of: technology strategy, innovation, process technologies, managing change, national policies and systems, research and development, supply chain technology, service and operational technology, education and training, small company incubation, technology transfer, virtual operations, technology in developing countries, partnership and alliance, and financing and investment. This special issue focuses upon the streams of interest that accentuate the importance of collaboration between different organisations. Such organisations vary greatly in character; for instance, they may be large or small, publicly or privately owned, and operate in manufacturing or service sectors. Despite these varying characteristics they all have something in common: they all stress the importance of inter-organisational collaboration as a critical success factor for their organisation. In today's global economy it is essential that organisations decide what their core competencies are and what those of complementary organisations are. Core competences should be developed to become a basis of differentiation, leverage and competitive advantage, whilst those that are less mature should be outsourced to other organisations that can claim more recognition and success in that particular competence (Porter, 2001). This strategic trend can be observed throughout advanced economies and is growing strongly. If a posteriori reasoning is applied here it follows that organisations could continue to become more specialised in fewer areas whilst simultaneously becoming more dependent upon other organisations for critical parts of their operations. Such actions seem to fly in the face of rational business strategy and so the question must be asked: why are organisations developing this way? The answer could lie in recent changes in the endogenous and exogenous factors of the organisation; the former emphasising resource-based issues in the short term and strategic positioning in the long term, whilst the latter emphasises transaction costs in the short term and the acquisition of new skills and knowledge in the long term. A harmonious balance of these forces requires organisations firstly to declare a shared meta-strategy, and then to put cross-organisational processes into place whose routine operations are automated as far as possible. A rolling business plan would review, assess and reposition each organisation within this meta-strategy according to how well it has contributed (Binder and Clegg, 2006). The important common issue here is that an increasing number of businesses today are gaining direct benefit from increasing their levels of inter-organisational collaboration. Such collaboration has largely been possible due to recent technological advances which can make organisational structures more agile (e.g.
the extended or the virtual enterprise), organisational infrastructure more connected, and the sharing of real-time information an operational reality. This special issue consists of research papers that have explored the above phenomenon in some way. For instance, the role of government intervention, the use of internet-based technologies, the role of research and development organisations, the changing relationships between start-ups and established firms, the importance of cross-company communities of practice, the practice of networking, the front-loading of large-scale projects, innovation, and the probabilistic uncertainties that organisations experience are all explored in these papers. The cases cited in these papers are limited in that they have a Eurocentric focus. However, it is hoped that readers of this special issue will gain a valuable insight into the increasing importance of collaborative practices via these studies.
Abstract:
The concept of plagiarism is not uncommonly associated with the concept of intellectual property, for both historical and legal reasons: the approach to the ownership of ‘moral’, non-material goods has evolved into the right to individual property, and consequently a need arose to establish a legal framework to cope with the infringement of those rights. The solution to plagiarism therefore falls most often under two categories: ethical and legal. On the ethical side, education and intercultural studies have addressed plagiarism critically, not only as a means to improve academic ethics policies (PlagiarismAdvice.org, 2008), but mainly to demonstrate that, if anything, the concept of plagiarism is far from being universal (Howard & Robillard, 2008). Even if differently, Howard (1995) and Scollon (1994, 1995) argued, and Angèlil-Carter (2000) and Pecorari (2008) later emphasised, that the concept of plagiarism cannot be studied on the grounds that one definition is clearly understandable by everyone. Scollon (1994, 1995), for example, claimed that authorship attribution is particularly a problem in non-native writing in English, and so did Pecorari (2008) in her comprehensive analysis of academic plagiarism. If among higher education students plagiarism is often a problem of literacy, with prior, conflicting social discourses that may interfere with academic discourse, as Angèlil-Carter (2000) demonstrates, then a distinction should be made between intentional and inadvertent plagiarism: plagiarism should be prosecuted when intentional, but if it is part of the learning process and results from the plagiarist’s unfamiliarity with the text or topic it should be considered ‘positive plagiarism’ (Howard, 1995: 796) and hence not an offense. Determining the intention behind the instances of plagiarism therefore determines the nature of the disciplinary action adopted. Unfortunately, in order to demonstrate the intention to deceive and charge students with accusations of plagiarism, teachers necessarily have to position themselves as ‘plagiarism police’, although it has been argued otherwise (Robillard, 2008). Practice demonstrates that in their daily activities teachers will find themselves required to have a command of investigative skills and tools that they most often lack. We thus claim that the ‘intention to deceive’ cannot necessarily be dissociated from plagiarism as a legal issue, even if Garner (2009) asserts that plagiarism is generally immoral but not illegal, and Goldstein (2003) draws the same distinction. However, these claims, and the claim that only cases of copyright infringement tend to go to court, have recently been challenged, mainly by forensic linguists, who have been actively involved in cases of plagiarism. Turell (2008), for instance, demonstrated that plagiarism often connotes an illegal appropriation of ideas. Previously, she (Turell, 2004) had demonstrated, by comparing four Spanish translations of Shakespeare’s Julius Caesar, that linguistic evidence can establish instances of plagiarism. This challenge is also reinforced by practice in international organisations, such as the IEEE, for whom plagiarism potentially has ‘severe ethical and legal consequences’ (IEEE, 2006: 57). What the plagiarism definitions used by publishers and organisations have in common – and what academia usually lacks – is their focus on the legal nature of plagiarism.
We speculate that this is due to the relation they intentionally establish with copyright laws, whereas in education the focus tends to shift from the legal to the ethical aspects. However, the number of plagiarism cases taken to court is very small, and jurisprudence is still being developed on the topic. In countries within the Civil Law tradition, Turell (2008) claims, (forensic) linguists are seldom called upon as expert witnesses in cases of plagiarism, either because plagiarists are rarely taken to court or because there is little tradition of accepting linguistic evidence. In spite of the investigative and evidential potential of forensic linguistics to demonstrate the plagiarist’s intention or otherwise, this potential is restricted by the ability to identify a text as being suspect of plagiarism. In an era of such massive textual production, ‘policing’ plagiarism thus becomes an extraordinarily difficult task without the assistance of plagiarism detection systems. Although plagiarism detection has attracted the attention of computer engineers and software developers for years, a lot of research is still needed. Given the investigative nature of academic plagiarism, plagiarism detection necessarily has to consider not only concepts from education and computational linguistics, but also from forensic linguistics, especially if it is intended to counter claims of being a ‘simplistic response’ (Robillard & Howard, 2008). In this paper, we use a corpus of essays written by university students who were accused of plagiarism to demonstrate that a forensic linguistic analysis of improper paraphrasing in suspect texts has the potential to identify and provide evidence of intention. A linguistic analysis of the corpus texts shows that the plagiarist acts on the paradigmatic axis to replace relevant lexical items with a related word from the same semantic field, i.e. a synonym, a subordinate, a superordinate, etc. In other words, relevant lexical items were replaced with related, but not identical, ones. Additionally, the analysis demonstrates that the word order is often changed intentionally to disguise the borrowing. On the other hand, the linguistic analysis of linking and explanatory verbs (i.e. referencing verbs) and prepositions shows that these have the potential to discriminate between instances of ‘patchwriting’ and instances of plagiarism. This research demonstrates that the referencing verbs are borrowed from the original in an attempt to construct the new text cohesively when the plagiarism is inadvertent, and that the plagiarist has made an effort to prevent the reader from identifying the text as plagiarism when it is intentional. In some of these cases, the referencing elements prove able to identify direct quotations and thus ‘betray’ and denounce the plagiarism. Finally, we demonstrate that a forensic linguistic analysis of these verbs is critical to allow detection software to identify them as proper paraphrasing and not – mistakenly and simplistically – as plagiarism.
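As an illustration of the paradigmatic-substitution pattern described above, a very small computational sketch could flag word pairs in which a suspect text replaces a source word with a related word from the same semantic field (a synonym, hyponym or hypernym). The snippet below is only a sketch of that idea, not the authors' method; it assumes NLTK with the WordNet corpus installed, and the example sentences are invented.

    # Sketch: flag lexical substitutions drawn from the same semantic field.
    # Assumes NLTK is installed and nltk.download('wordnet') has been run.
    from nltk.corpus import wordnet as wn

    def related_lemmas(word):
        # Lemmas from the word's synsets and their direct hypernyms/hyponyms,
        # i.e. candidate replacements from the same semantic field.
        lemmas = set()
        for syn in wn.synsets(word):
            for s in [syn] + syn.hypernyms() + syn.hyponyms():
                lemmas.update(l.name().lower() for l in s.lemmas())
        return lemmas

    def substitution_pairs(source_tokens, suspect_tokens):
        # Normalise to base forms, then report pairs where the suspect word is
        # a different word belonging to the source word's semantic field.
        pairs = []
        for s in source_tokens:
            s_base = wn.morphy(s) or s
            field = related_lemmas(s_base)
            for t in suspect_tokens:
                t_base = wn.morphy(t) or t
                if s_base != t_base and t_base in field:
                    pairs.append((s_base, t_base))
        return pairs

    source = "the findings indicate that the results were significant".split()
    suspect = "the findings show that the outcomes were significant".split()
    print(substitution_pairs(source, suspect))
    # May print pairs such as ('indicate', 'show') and ('result', 'outcome'),
    # depending on the WordNet version.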
Abstract:
We evaluated inter-individual variability in the optimal current direction for biphasic transcranial magnetic stimulation (TMS) of the motor cortex. The motor threshold for the first dorsal interosseus was detected visually at eight coil orientations in 45° increments. Each participant (n = 13) completed two experimental sessions. One participant with low test–retest correlation (Pearson's r < 0.5) was excluded. In four subjects, visual detection of the motor threshold was compared to EMG detection; motor thresholds were very similar and highly correlated (0.94–0.99). Consistent with previous studies, stimulation in the majority of participants was most effective when the first current pulse flowed in a postero-lateral direction in the brain. However, in four participants the optimal coil orientation deviated from this pattern. A principal component analysis using all eight orientations suggests that in our sample the optimal current direction was normally distributed around the postero-lateral orientation, with a range of 63° (S.D. = 13.70°). Whenever the intensity of stimulation at the target site is calculated as a percentage of the motor threshold, it may be worthwhile, in order to minimize intensity and side effects, to check whether rotating the coil 45° from the traditional postero-lateral orientation decreases the motor threshold.
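To make the orientation analysis concrete, one simple way of estimating an individual's optimal current direction from thresholds measured at eight coil angles is to fit a first-harmonic cosine model and take the angle at which the fitted threshold is minimal. The sketch below illustrates that idea only; the cosine fit and the example numbers are assumptions for illustration, not the principal component analysis reported above.

    # Sketch: estimate the coil angle that minimises motor threshold from
    # measurements at eight orientations (45-degree steps).
    import numpy as np

    def optimal_orientation(angles_deg, thresholds):
        # Fit T(theta) ~ a + b*cos(theta) + c*sin(theta) by least squares and
        # return the angle (degrees) at which the fitted threshold is lowest.
        theta = np.radians(np.asarray(angles_deg, dtype=float))
        design = np.column_stack([np.ones_like(theta), np.cos(theta), np.sin(theta)])
        a, b, c = np.linalg.lstsq(design, np.asarray(thresholds, dtype=float), rcond=None)[0]
        # a + R*cos(theta - phi) is minimal at theta = phi + pi.
        return np.degrees(np.arctan2(-c, -b)) % 360.0

    angles = [0, 45, 90, 135, 180, 225, 270, 315]   # coil orientations (degrees)
    mt = [55, 48, 44, 47, 54, 61, 66, 62]           # hypothetical thresholds (% of stimulator output)
    print(f"estimated optimal orientation: {optimal_orientation(angles, mt):.1f} deg")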
Abstract:
Hydrogen-assisted subcritical cleavage of the ferrite matrix occurs during fatigue of a duplex stainless steel in gaseous hydrogen. The ferrite fails by a cyclic cleavage mechanism, and fatigue crack growth rates are independent of frequency between 0.1 and 5 Hz. Macroscopic crack growth rates are controlled by the fraction of ferrite grains cleaving along the crack front, which can be related to the maximum stress intensity, Kmax. A superposition model is developed to predict simultaneously the effects of the stress intensity range (ΔK) and the K ratio (Kmin/Kmax). The effect of Kmax is rationalised by a local cleavage criterion which requires a critical tensile stress, normal to the {001} cleavage plane, acting over a critical distance within an embrittled zone at the crack tip. © 1991.
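A schematic way to write such a superposition model, with notation assumed here for illustration rather than quoted from the paper, is to weight a ductile fatigue term and a hydrogen-assisted cleavage term by the fraction f of ferrite grains cleaving along the crack front:

    \frac{da}{dN} = \bigl[1 - f(K_{\max})\bigr]\left(\frac{da}{dN}\right)_{\mathrm{fatigue}} + f(K_{\max})\left(\frac{da}{dN}\right)_{\mathrm{cleavage}},

where the fatigue term depends mainly on the stress intensity range ΔK and the cleavage fraction f depends on Kmax, so that ΔK and the K ratio Kmin/Kmax both enter the predicted growth rate.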
Abstract:
PURPOSE. To establish the optimal flash settings for retinal vessel oxygen saturation parameters using dual-wavelength imaging in a multiethnic group. METHODS. Twelve healthy young subjects (mean age 32 years [SD 7]; three Mediterranean, two South Asian, and seven Caucasian individuals) underwent retinal vessel oxygen saturation measurements using dual-wavelength oximetry, noncontact tonometry, and manual sphygmomanometry. In order to evaluate the impact of flash intensity, we obtained three images (fundus camera angle 30°, ONH centered) per flash setting. Flash settings of the fundus camera were increased in steps of 2 (initial setting of 6 and final setting of 22), which correspond to logarithmically increasing intensities from 13.5 to 214 watt-seconds (Ws). RESULTS. Flash settings below 27 Ws were too low to obtain saturation measurements, whereas flash settings of more than 214 Ws resulted in overexposed images. Retinal arteriolar and venular oxygen saturation was comparable at flash settings of 27 to 76 Ws (arterioles' range: 85%-92%; venules' range: 45%-53%). Higher flash settings led to increased saturation measurements in both retinal arterioles (up to 110%) and venules (up to 92%), with a more pronounced increase in venules. CONCLUSIONS. Flash intensity has a significant impact on retinal vessel oxygen saturation measurements made using dual-wavelength retinal oximetry. High flash intensities lead to supranormal oxygen saturation measurements, with a magnified effect in retinal venules compared with arterioles. In addition to even retinal illumination, the correct flash setting is of paramount importance for the clinical acquisition of images in retinal oximetry. We recommend flash settings between 27 and 76 Ws. © 2013 The Association for Research in Vision and Ophthalmology, Inc.
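For context, dual-wavelength oximetry of this kind generally estimates vessel oxygen saturation from the ratio of optical densities at an oxygen-sensitive and an oxygen-insensitive (isosbestic) wavelength. The sketch below shows that calculation in outline only; the intensities and calibration constants are placeholders, not values from this study.

    # Sketch: oxygen saturation from a two-wavelength optical density ratio.
    import math

    def vessel_optical_density(i_vessel, i_background):
        # Optical density of a vessel from its brightness relative to the adjacent fundus.
        return math.log10(i_background / i_vessel)

    def oxygen_saturation(od_sensitive, od_isosbestic, a=1.0, b=-1.0):
        # Saturation is modelled as approximately linear in the optical density
        # ratio ODR; a and b are instrument calibration constants (placeholders here).
        odr = od_sensitive / od_isosbestic
        return a + b * odr

    # Hypothetical pixel intensities at the two wavelengths.
    od_sens = vessel_optical_density(i_vessel=110.0, i_background=150.0)  # O2-sensitive
    od_iso = vessel_optical_density(i_vessel=80.0, i_background=150.0)    # isosbestic
    print(f"estimated SO2: {100 * oxygen_saturation(od_sens, od_iso):.0f}%")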
Abstract:
Many tracking algorithms have difficulties dealing with occlusions and background clutter, and consequently do not converge to an appropriate solution. Tracking based on the mean shift algorithm has shown robust performance in many circumstances, but it still fails, for example, when encountering dramatic intensity or colour changes within a pre-defined neighbourhood. In this paper, we present a robust tracking algorithm that integrates the advantages of mean shift tracking with those of tracking local invariant features. These features are integrated into the mean shift formulation so that tracking is performed based both on mean shift and feature probability distributions, coupled with an expectation maximisation scheme. Experimental results show robust tracking performance on a series of complicated real image sequences. © 2010 IEEE.
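As a rough illustration of the colour-histogram half of such a tracker (not the local-feature integration or expectation maximisation scheme contributed by the paper), a single mean shift iteration re-weights pixels in the candidate window by how strongly the target histogram exceeds the candidate histogram in each pixel's bin, and moves the window to the weighted centroid. The sketch below assumes a greyscale frame and invented helper names.

    # Sketch: one mean shift iteration over a greyscale frame (values 0-255).
    import numpy as np

    def grey_histogram(patch, bins=16):
        # Normalised grey-level histogram of an image patch.
        hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
        return hist / max(hist.sum(), 1)

    def mean_shift_step(frame, centre, size, target_hist, bins=16):
        # Weight each pixel by sqrt(target/candidate) for its bin and move the
        # window centre to the weighted centroid of those weights.
        y, x = centre
        h, w = size
        patch = frame[y - h // 2:y + h // 2, x - w // 2:x + w // 2]
        cand_hist = grey_histogram(patch, bins)
        bin_idx = (patch.astype(int) * bins // 256).clip(0, bins - 1)
        weights = np.sqrt(target_hist[bin_idx] / np.maximum(cand_hist[bin_idx], 1e-6))
        ys, xs = np.mgrid[0:patch.shape[0], 0:patch.shape[1]]
        dy = (weights * (ys - h / 2)).sum() / weights.sum()
        dx = (weights * (xs - w / 2)).sum() / weights.sum()
        return int(round(y + dy)), int(round(x + dx))

In a full tracker this step would be iterated until the shift falls below a threshold; the paper's formulation additionally folds local invariant feature probabilities into the weights.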