Abstract:
Collisions between different road users make a substantial contribution to road trauma. Although evidence suggests that different road users interpret the same road situations differently, it is not clear how road users' situation awareness differs, nor is it clear which differences might lead to conflicts. This article presents the findings from an on-road study conducted to examine driver, motorcyclist and cyclist situation awareness in different road environments. The findings suggest that, in addition to minor differences in the structure of different road users' situation awareness (i.e. amount of information and how it is integrated), the actual content of situation awareness in terms of road user schemata, the resulting interaction with the world and the information underpinning situation awareness is markedly different. Further examination indicates that the differences are likely to be compatible along arterial roads, shopping strips and at roundabouts, but that they may create conflicts between different road users at intersections. Interventions designed to support compatible situation awareness and behaviour between different road users are discussed. Practitioner Summary: Incompatible situation awareness plays a key role in collisions between different road users (e.g. drivers and motorcyclists). This on-road study examined situation awareness in drivers, motorcyclists and cyclists, identifying the key differences and potential conflicts that arise. The findings are used to propose interventions designed to enhance the compatibility of situation awareness between road users.
Abstract:
The detection and correction of defects remains among the most time-consuming and expensive aspects of software development. Extensive automated testing and code inspections may mitigate their effect, but some code fragments are necessarily more likely to be faulty than others, and automated identification of fault-prone modules helps to focus testing and inspections, thus limiting wasted effort and potentially improving detection rates. However, software metrics data is often extremely noisy, with enormous imbalances in the size of the positive and negative classes. In this work, we present a new approach to predictive modelling of fault proneness in software modules, introducing a new feature representation to overcome some of these issues. This rank sum representation offers improved or at worst comparable performance relative to earlier approaches on standard data sets, and readily allows the user to choose an appropriate trade-off between precision and recall to optimise inspection effort for different testing environments. The method is evaluated using the NASA Metrics Data Program (MDP) data sets, and performance is compared with existing studies based on the Support Vector Machine (SVM) and Naïve Bayes (NB) classifiers, and with our own comprehensive evaluation of these methods.
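The abstract does not give the construction, but a rank-sum feature representation of the kind described can be sketched as follows; the function name and the toy metric values are illustrative, not taken from the paper:

```python
import numpy as np

def rank_sum_features(X):
    """Replace each raw metric column with its within-column rank,
    then sum the ranks per module into a single ordinal score.
    Ranks are robust to the outliers and skew typical of the noisy,
    imbalanced metrics data the paper describes.

    X: (n_modules, n_metrics) array of raw software metrics.
    Returns an (n_modules,) vector of rank sums."""
    ranks = X.argsort(axis=0).argsort(axis=0)  # 0-based ranks per column
    return ranks.sum(axis=1)

# Toy data: three modules measured on LOC, cyclomatic complexity, coupling.
X = np.array([
    [10.0,   2.0, 1.0],
    [50.0,   8.0, 3.0],
    [400.0, 40.0, 9.0],
])
print(rank_sum_features(X))  # [0 3 6]: the third module ranks highest on every metric
```

Sorting modules by such a score gives the precision/recall trade-off mentioned above: inspecting only the top-ranked modules favours precision, while extending further down the list favours recall.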
Abstract:
Background: Appropriate disposition of emergency department (ED) patients with chest pain is dependent on clinical evaluation of risk. A number of chest pain risk stratification tools have been proposed. The aim of this study was to compare the predictive performance for major adverse cardiac events (MACE) of risk assessment tools from the National Heart Foundation of Australia (HFA), the Goldman risk score and the Thrombolysis in Myocardial Infarction risk score (TIMI RS). Methods: This prospective observational study evaluated ED patients aged ≥30 years with non-traumatic chest pain for which no definitive non-ischemic cause was found. Data collected included demographic and clinical information, investigation findings and occurrence of MACE by 30 days. The outcome of interest was the comparative predictive performance of the risk tools for MACE at 30 days, as analyzed by receiver operating characteristic (ROC) curves. Results: Two hundred eighty-one patients were studied; the rate of MACE was 14.1%. The area under the curve (AUC) of the HFA, TIMI RS and Goldman tools for the endpoint of MACE was 0.54, 0.71 and 0.67, respectively, with the difference between the tools in predictive ability for MACE being highly significant [χ²(3) = 67.21, N = 276, p < 0.0001]. Conclusion: The TIMI RS and Goldman tools performed better than the HFA in this undifferentiated ED chest pain population, but selection of cutoffs balancing sensitivity and specificity was problematic. There is an urgent need for validated risk stratification tools specific to the ED chest pain population.
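For readers unfamiliar with how AUC figures like those above are obtained: the area under an ROC curve equals the probability that a randomly chosen positive case receives a higher score than a randomly chosen negative one (the Mann-Whitney U formulation). A minimal sketch with invented scores, not the study data:

```python
def auc(labels, scores):
    """AUC as P(score of a random case > score of a random non-case),
    counting ties as half a win (Mann-Whitney U formulation)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical risk scores for 8 patients (1 = MACE by 30 days).
labels = [1, 1, 1, 0, 0, 0, 0, 0]
tool_a = [5, 4, 3, 1, 2, 0, 4, 2]   # discriminates fairly well
tool_b = [2, 2, 2, 2, 2, 2, 2, 2]   # no discrimination at all
print(auc(labels, tool_a))  # 0.9
print(auc(labels, tool_b))  # 0.5 (chance level, cf. the HFA's 0.54)
```

An AUC near 0.5 means the tool ranks cases no better than chance, which is why the HFA's 0.54 compares so unfavourably with the TIMI RS's 0.71.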
Abstract:
Because of their limited number of senior positions and fewer alternative career paths, small businesses have a more difficult time attracting and retaining skilled information systems (IS) staff and are thus dependent upon external expertise. Small businesses are particularly dependent on outside expertise when first computerizing. Because small businesses suffer from severe financial constraints, it is often difficult to justify the cost of custom software. Hence, for many small businesses, engaging a consultant to help identify suitable packaged software and related hardware is their first critical step toward computerization. This study explores the importance of proactive client involvement when engaging a consultant to assist with computer system selection in small businesses. Client involvement throughout consultant engagement is found to be integral to project success and frequently lacking, due to misconceptions of small businesses regarding their role. Small businesses often overestimate the impact of consultant and vendor support in achieving successful computer system selection and implementation. For consultant engagement to be successful, the process must be viewed as being directed toward the achievement of specific organizational results, where the client accepts responsibility for directing the process.
Abstract:
Too often the relationship between client and external consultants is perceived as one of protagonist versus antagonist. Stories of dramatic, failed consultancies abound, as do related anecdotal quips. A contributing factor to many "apparently" failed consultancies is a poor appreciation, by both the client and the consultant, of the client's true goals for the project and of how to assess progress toward these goals. This paper presents and analyses a measurement model for assessing client success when engaging an external consultant. Three main areas of assessment are identified: (1) the consultant's recommendations, (2) client learning, and (3) consultant performance. Engagement success is empirically measured along these dimensions through a series of case studies and a subsequent survey of clients and consultants involved in 85 computer-based information system selection projects. Validation of the model constructs suggests the existence of six distinct and individually important dimensions of engagement success. Both clients and consultants are encouraged to attend to these dimensions in pre-engagement proposal and selection processes, and in post-engagement evaluation of outcomes.
Abstract:
The ability to understand and predict how thermal, hydrological, mechanical and chemical (THMC) processes interact is fundamental to many research initiatives and industrial applications. We (1) present a new Thermal–Hydrological–Mechanical–Chemical (THMC) coupling formulation, based on non-equilibrium thermodynamics; (2) show how THMC feedback is incorporated in the thermodynamic approach; (3) suggest a unifying thermodynamic framework for multi-scaling; and (4) formulate a new rationale for assessing upper and lower bounds of dissipation for THMC processes. The technique is based on deducing time and length scales suitable for separating processes using a macroscopic finite-time thermodynamic approach. We show that if the time and length scales are suitably chosen, the calculation of entropic bounds can be used to describe three different types of material and process uncertainties: geometric uncertainties, stemming from the microstructure; process uncertainty, stemming from the correct derivation of the constitutive behavior; and uncertainties in time evolution, stemming from the path dependence of the time integration of the irreversible entropy production. Although the approach is specifically formulated here for THMC coupling, we suggest that it has a much broader applicability. In a general sense it consists of finding the entropic bounds of the dissipation defined by the product of thermodynamic force times thermodynamic flux, which in material sciences corresponds to generalized stress and generalized strain rates, respectively.
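The closing statement can be written compactly. In generic notation (the symbols are illustrative, not taken from the paper), the dissipation whose entropic bounds are sought is the force-flux product:

```latex
% Dissipation as thermodynamic force times thermodynamic flux; in the
% material-science analogy, generalized stress times generalized strain rate.
\Phi \;=\; \sum_k F_k \, J_k
\;\sim\; \boldsymbol{\sigma} : \dot{\boldsymbol{\varepsilon}}
\;\ge\; 0,
\qquad
\Phi_{\min} \;\le\; \Phi \;\le\; \Phi_{\max}
```

Here the bounds Φ_min and Φ_max are what delimit the geometric, constitutive and path-integration uncertainties described above.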
Abstract:
Purpose. To quantify the molecular lipid composition of patient-matched tear and meibum samples and compare tear and meibum lipid molecular profiles. Methods. Lipids were extracted from tears and meibum by biphasic methods using 10:3 tert-butyl methyl ether:methanol, washed with aqueous ammonium acetate, and analyzed by chip-based nanoelectrospray ionization tandem mass spectrometry. Targeted precursor ion and neutral loss scans identified individual molecular lipids, and quantification was obtained by comparison to internal standards in each lipid class. Results. Two hundred and thirty-six lipid species were identified and quantified from nine lipid classes comprising cholesterol esters, wax esters, (O-acyl)-ω-hydroxy fatty acids, triacylglycerols, phosphatidylcholine, lysophosphatidylcholine, phosphatidylethanolamine, sphingomyelin, and phosphatidylserine. With the exception of phospholipids, lipid molecular profiles were strikingly similar between tears and meibum. Conclusions. Comparisons between tears and meibum indicate that meibum is likely to supply the majority of lipids in the tear film lipid layer. However, the observed higher mole ratio of phospholipid in tears shows that analysis of meibum alone does not provide a complete understanding of the tear film lipid composition.
Abstract:
Background The behaviour of tumour cells depends on factors such as genetics and the tumour microenvironment. The latter plays a crucial role in normal mammary gland development and also in breast cancer initiation and progression. Breast cancer tissues tend to be highly desmoplastic, and dense matrix as a pre-existing condition poses one of the highest risk factors for cancer development. However, matrix influence on tumour cell gene expression and behaviour such as cell migration is not fully elucidated. Results We generated high-density (HD) matrices that mimicked tumour collagen content of 20 mg/cm3 and were ~14-fold stiffer than low-density (LD) matrix of 1 mg/cm3. Live-cell imaging showed breast cancer cells utilizing cytoplasmic streaming and cell body contractility for migration within HD matrix. Cell migration was blocked in the presence of both the ROCK inhibitor, Y-27632, and the MMP inhibitor, GM6001, but not by the drugs individually. This suggests that the roles of ROCK1 and MMPs in cell migration are complicated by compensatory mechanisms. ROCK1 expression and protein activity were significantly upregulated in HD matrix, but these were blocked by treatment with a histone deacetylase (HDAC) inhibitor, MS-275. In HD matrix, the inhibition of ROCK1 by MS-275 was indirect and relied upon protein synthesis and Notch1. Inhibition of Notch1 using pooled siRNA or DAPT abrogated the inhibition of ROCK1 by MS-275. Conclusion Increased matrix density elevates ROCK1 activity, which aids in cell migration via cell contractility. The upregulation of ROCK1 is epigenetically regulated in an indirect manner involving the repression of Notch1. This is demonstrated by inhibition of HDACs with MS-275, which caused an upregulation of Notch1 levels, leading to blockade of ROCK1 expression.
Abstract:
We report the synthesis, structure and properties of [2]rotaxanes prepared by the assembly of benzylic amide macrocycles around a series of amide and sulfide-/sulfoxide-/sulfone-containing threads. The efficacy of rotaxane formation is related to the hydrogen bond accepting properties of the various sulfur-containing functional groups in the thread, with the highest yields (up to 63% with a rigid vinyl spacer in the template site) obtained for sulfoxide rotaxanes. X-ray crystallography of a sulfoxide rotaxane, 5, shows that the macrocycle adopts a highly symmetrical chair-like conformation in the solid state, with short hydrogen bonds between the macrocycle isophthalamide NH-protons and the amide carbonyl and sulfoxide S-O of the thread. In contrast, in the X-ray crystal structures of the analogous sulfide (4) and sulfone (6) rotaxanes the macrocycle adopts boat-like conformations with long intercomponent NH…O=SO and NH…S hydrogen bonds (in addition to several intercomponent amide-amide hydrogen bonds). Taking advantage of the different hydrogen bonding modes of the sulfur-based functional groups, a switchable molecular shuttle was prepared in which the oxidation level of sulfur determines the position of the macrocycle on the thread.
Abstract:
In the TREC Web Diversity track, novelty-biased cumulative gain (α-NDCG) is one of the official measures used to assess the retrieval performance of IR systems. The measure is characterised by a parameter, α, whose effect has not been thoroughly investigated. We find that the common setting of α, i.e. α=0.5, may prevent the measure from behaving as desired when evaluating result diversification. This is because it excessively penalises systems that cover many intents while it rewards those that redundantly cover only a few intents. This issue is crucial since it strongly influences systems at top ranks. We revisit our previously proposed threshold, suggesting that α be set on a per-query basis. The intuitiveness of the measure is then studied by examining actual rankings from TREC 09-10 Web track submissions. By varying α according to our query-based threshold, the discriminative power of α-NDCG is not harmed; in fact, our approach improves α-NDCG's robustness. Experimental results show that the query-based threshold for α can make the measure more intuitive than its common settings.
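The sensitivity to α can be seen in a minimal sketch of the (unnormalised) α-DCG computation; the judgments below are invented for illustration and are not TREC data:

```python
from math import log2

def alpha_dcg(ranking, alpha, depth=10):
    """alpha-DCG: a document earns gain 1 for each intent it covers,
    damped by (1 - alpha) for every higher-ranked document that has
    already covered that intent, then log-discounted by rank position."""
    covered = {}  # intent -> number of earlier covers
    score = 0.0
    for rank, intents in enumerate(ranking[:depth], start=1):
        gain = sum((1 - alpha) ** covered.get(i, 0) for i in intents)
        for i in intents:
            covered[i] = covered.get(i, 0) + 1
        score += gain / log2(rank + 1)
    return score

# Each ranked document is represented by the set of intents it covers.
diverse   = [{1}, {2}, {3}]   # three intents, each covered once
redundant = [{1}, {1}, {1}]   # one intent covered three times

for a in (0.5, 0.9):
    print(a, alpha_dcg(diverse, a), alpha_dcg(redundant, a))
```

At α=0.5 the redundant ranking retains a large share of the diverse ranking's score, whereas a larger α damps repeated covers much more sharply; the diverse ranking's score is unaffected by α because every cover is a first cover. This is the mechanism by which the choice of α can reorder systems at top ranks.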
Abstract:
This paper elaborates on the use of future wireless communication networks for autonomous city vehicles. After addressing the state of technology, the paper explains the autonomous vehicle control system architecture and the Cybercars-2 communication framework; it presents experimental tests of communication-based real-time decision making; and discusses potential applications for communication in order to improve the localization and perception abilities of autonomous vehicles in urban environments.
Abstract:
While organizations strive to leverage the vast information generated daily from social media platforms, and both decision makers and consultants are keen to identify and exploit this information's value, there has been little research into social media in the business context. Social media are diverse, varying in scope and functionality; this diversity entails a complex set of attributes and characteristics, resulting in confusion for both researchers and organizations. Taxonomies are important precursors in emerging fields and are foundational for rigorous theory building. Though aspects of social media have been studied from various discipline perspectives, this work has been largely descriptive. Thus, while the need for a rigorous taxonomy of social media is strong, previous efforts to classify social media suffer limitations, e.g. lack of a systematic taxonomic method, overreliance on intuition, disregard for the users' perspective, and inadequate consideration of purpose. This study was therefore initiated by the overarching question "How can social media in the business context be usefully classified?" To address this gap, the current paper proposes a systematic method for developing a taxonomy appropriate for studying social media in an organizational context, combining Nickerson et al.'s (2012) IS taxonomy-building guidelines and a repertory grid (RepGrid) approach.
Abstract:
For Design Science Research (DSR) to gain wide credence as a research paradigm in Information Systems (IS), it must contribute to theory. "Theory cannot be improved until we improve the theorizing process, and we cannot improve the theorizing process until we describe it more explicitly, operate it more self-consciously, and decouple it from validation more deliberately" (Weick 1989, p. 516). With the aim of improved design science theorizing, we propose a DSR abstraction-layers framework that integrates, interrelates, and harmonizes key methodological notions, the primary ones being: 1) the Design Science (DS), Design Research (DR), and Routine Design (RD) distinction (Winter 2008); 2) Multi Grounding in IS Design Theory (ISDT) (Goldkuhl & Lind 2010); 3) the Idealized Model for Theory Development (IM4TD) (Fischer & Gregor 2011); and 4) the DSR Theorizing Framework (Lee et al. 2011). Though theorizing, or the abstraction process, has been the subject of healthy discussion in DSR, important questions remain. With most attention to date having focused on theorizing for Design Research (DR), a key stimulus of the layered view was the realization that Design Science (DS) produces abstract knowledge at a higher level of generality. The resultant framework includes four abstraction layers: (i) Design Research (DR) 1st Abstraction Layer, (ii) Design Science (DS) 2nd Abstraction Layer, (iii) DSR Incubation 3rd Layer, and (iv) Routine Design 4th Layer. Differentiating and inter-relating these layers will help DSR researchers discover, position, and amplify their DSR contributions. Additionally, consideration of the four layers can trigger creative perspectives that suggest unplanned outputs. The first abstraction layer, including its alternative patterns of activity, is well recognized in the literature. The other layers, however, are less well recognized, and the integrated representation of the layers is novel.
Abstract:
An increasing range of services is now offered via online applications and e-commerce websites. However, problems with online services still occur, even for the best service providers, due to technical failures, informational failures, or lack of required website functionality. The widespread and increasing implementation of web services also means that service failures are both more likely to occur and more likely to have serious consequences. In this paper we first develop a digital service value chain framework based on existing service delivery models adapted for digital services. We then review the current literature on service failure prevention and provide a typology of technologies and approaches that can be used to prevent failures of different types (functional, informational, system) that can occur at different stages of web service delivery. This contributes to theory by relating specific technologies and technological approaches to the point in the value chain framework where they will have the maximum impact. Our typology can also be used to guide the planning, justification and design of robust, reliable web services.