18 results for Architecture and software patterns
in Aston University Research Archive
Abstract:
OBJECTIVES: The objective of this research was to design a clinical decision support system (CDSS) that supports heterogeneous clinical decision problems and runs on multiple computing platforms. Meeting this objective required a novel design to create an extendable and easy-to-maintain CDSS for point-of-care support. The proposed solution was evaluated in a proof-of-concept implementation. METHODS: Based on our earlier research on the design of a mobile CDSS for emergency triage, we used ontology-driven design to represent the essential components of a CDSS. Models of clinical decision problems were derived from the ontology and processed into executable applications at runtime. This allowed scaling applications' functionality to the capabilities of computing platforms. A prototype of the system was implemented using an extended client-server architecture and Web services to distribute the functions of the system and to make it operational in limited-connectivity conditions. RESULTS: The proposed design provided a common framework that facilitated the development of diversified clinical applications running seamlessly on a variety of computing platforms. It was prototyped for two clinical decision problems and settings (triage of acute pain in the emergency department and postoperative management of radical prostatectomy on the hospital ward) and implemented on two computing platforms: desktop and handheld computers. CONCLUSIONS: The requirement of CDSS heterogeneity was satisfied by ontology-driven design. Processing application models described with the help of ontological models allowed a complex system to run on multiple computing platforms with different capabilities. Finally, the separation of models and runtime components contributed to improved extensibility and maintainability of the system.
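The runtime scaling described above can be illustrated with a small sketch. This is not the authors' implementation; the model structure, capability names, and platform profiles below are invented for illustration, assuming a decision-problem model derived from an ontology is reduced at runtime to the components a given platform can support.

```python
from dataclasses import dataclass

@dataclass
class DecisionModel:
    """A clinical decision problem model (hypothetical structure)."""
    name: str
    components: dict  # component name -> set of required platform capabilities

def build_application(model: DecisionModel, platform_caps: set) -> list:
    """Keep only the components the current platform can support."""
    return [name for name, required in model.components.items()
            if required <= platform_caps]

triage = DecisionModel(
    name="acute-pain-triage",
    components={
        "data_entry": {"touch"},
        "guideline_engine": {"touch"},
        "full_imaging_view": {"large_display", "network"},
    },
)

# The same model yields a full application on a desktop and a
# reduced feature set on a handheld device.
desktop = build_application(triage, {"touch", "large_display", "network"})
handheld = build_application(triage, {"touch"})
```

The point of the sketch is the separation the abstract describes: the model is data, and the runtime decides per platform which parts become an executable application.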
Abstract:
Recent technological advances have paved the way for developing and offering advanced services to stakeholders in the agricultural sector. A paradigm shift is underway from proprietary, monolithic tools to Internet-based, cloud-hosted, open systems that will enable more effective collaboration between stakeholders. This new paradigm includes technological support for application developers to create specialized services that interoperate seamlessly, creating a sophisticated and customisable working environment for end users. We present the implementation of an open architecture that instantiates such an approach, based on a set of domain-independent software tools called "generic enablers" that have been developed in the context of the FI-WARE project. The implementation is used to validate a number of innovative concepts for the agricultural sector, such as the notion of a services marketplace and the system's adaptation to network failures. During the design and implementation phase, the system was evaluated by end users, providing valuable feedback. The results of the evaluation process validate the acceptance of such a system and farmers' need for access to sophisticated services at affordable prices. A summary of this evaluation process is also presented in this paper. © 2013 Elsevier B.V.
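The failure-adaptation concept can be sketched as a service registry that falls back to cached results when the network is unavailable. This is a hypothetical miniature, not part of the FI-WARE generic enablers; every name below is ours.

```python
class ServiceMarketplace:
    """Toy registry of services with graceful degradation on failure."""

    def __init__(self):
        self._services = {}   # service name -> callable
        self._cache = {}      # service name -> last good result

    def register(self, name, handler):
        self._services[name] = handler

    def invoke(self, name, *args):
        try:
            result = self._services[name](*args)
            self._cache[name] = result    # remember the last good answer
            return result
        except ConnectionError:
            # Network failure: fall back to the cached value if we have one.
            if name in self._cache:
                return self._cache[name]
            raise

mp = ServiceMarketplace()
calls = {"n": 0}

def weather(region):
    calls["n"] += 1
    if calls["n"] > 1:
        raise ConnectionError("link down")
    return f"forecast for {region}"

mp.register("weather", weather)
first = mp.invoke("weather", "Thessaly")    # live call succeeds
second = mp.invoke("weather", "Thessaly")   # network fails, cache answers
```

A production system would add cache expiry and retry policies, but the sketch captures the idea evaluated in the paper: the client keeps working, at reduced freshness, when connectivity drops.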
Abstract:
Journal rankings are frequently used as a measure of both journal and author research quality. Nonetheless, debates frequently arise because journal rankings do not take into account the underlying diversity of the finance research community. This study examines how factors such as a researcher's geographic origin, research interests, seniority, and journal affiliation influence journal quality perceptions and readership patterns. Based on a worldwide sample of 862 finance academics, we find remarkable consistency in the rankings of top journals. For the remaining journals, perception of journal quality differs depending on the researcher's geographic origin, research interests, seniority, and journal affiliation.
Abstract:
The aim of this research project is to compare published history textbooks written for upper-secondary/tertiary study in the U.S. and Spain using Halliday's (1994) Theme/Rheme construct. The motivation for using the Theme/Rheme construct to analyze professional texts in the two languages is two-fold. First of all, while there exists a multitude of studies at the grammatical and phonological levels between the two languages, very little analysis has been carried out in comparison at the level of text, beyond that of comparing L1/L2 student writing. Secondly, thematic considerations allow the analyst to highlight areas of textual organization in a systematic way for purposes of comparison. The basic hypothesis tested here rests on the premise that similarity in the social function of the texts results in similar Theme choice and thematic patterning across languages, barring certain linguistic constraints. The corpus for this study consists of 20 texts: 10 from various history textbooks published in the U.S. and 10 from various history textbooks published in Spain. The texts chosen represent a variety of authors, in order to control for author style or preference. Three overall areas of analysis were carried out, representing Halliday's (1994) three metafunctions: the ideational, the interpersonal and the textual. The ideational analysis shows similarities across the two corpora in terms of participant roles and circumstances as Theme, with a slight difference in participants involved in material processes, which is shown to reflect a minor difference in the construal of the field of history in the two cultures. The textual analysis shows overall similarities with respect to text organization, and the interpersonal analysis shows overall similarities as regards the downplay of discrepant interpretations of historical events as well as a low frequency of interactive textual features, manifesting the informational focus of the texts. 
At the same time, differences in results amongst texts within each corpus demonstrate a possible effect of subject matter in many cases and of individual author style in others. Overall, the results confirm that similarity in content, but above all in purpose and audience, results in texts with similar textual features, setting aside certain grammatical constraints.
Abstract:
DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT
Abstract:
Research on organizational spaces has not considered the importance of collective memory for the process of investing meaning in corporate architecture. Employing an archival ethnography approach, practices of organizational remembering emerge as a way to shape the meanings associated with architectural designs. While the role of monuments and museums is well established in studies of collective memory, this research extends the concept of spatiality to practices of organizational remembering that focus on a wider selection of corporate architecture. By analyzing the historical shift from colonial to modernist architecture for banks and retailers in Ghana and Nigeria in the 1950s and 1960s on the basis of documents and photographs from three different companies, this article shows how archival sources can be used to untangle the ways in which companies seek to ascribe meaning to their architectural output. Buildings allude to the past and the future in a range of complex ways that can be interpreted more fully by reference to the archival sources and the historical context of their creation. Social remembering has the potential to explain why and how buildings have meaning, while archival ethnography offers a new research approach to investigate changing organizational practices.
Abstract:
This paper examines investors' reactions to dividend reductions or omissions conditional on past earnings and dividend patterns for a sample of eighty-two U.S. firms that incurred an annual loss. We document that the market reaction for firms with long patterns of past earnings and dividend payouts is significantly more negative than for firms with less-established past earnings and dividends records. Our results can be explained by the following line of reasoning. First, consistent with DeAngelo, DeAngelo, and Skinner (1992), a loss following a long stream of earnings and dividend payments represents an unreliable indicator of future earnings. Thus, established firms have lower loss reliability than less-established firms. Second, because current earnings and dividend policy are substitute means of forecasting future earnings, lower loss reliability increases the information content of dividend reductions. Therefore, given the presence of a loss, the longer the stream of prior earnings and dividend payments, (1) the lower the loss reliability and (2) the more reliably dividend cuts are perceived as an indication that earnings difficulties will persist in the future.
Abstract:
Lowering glucose levels, while avoiding hypoglycaemia, can be challenging in insulin-treated patients with diabetes. We evaluated the role of ambulatory glucose profile in optimising glycaemic control in this population. Insulin-treated patients with type 1 and type 2 diabetes were recruited into a prospective, multicentre, 100-day study and randomised to control (n = 28) or intervention (n = 59) groups. The intervention group used ambulatory glucose profile, generated by continuous glucose monitoring, to assess daily glucose levels, whereas the controls relied on capillary glucose testing. Patients were reviewed at days 30 and 45 by the health care professional to adjust insulin therapy. Comparing first and last 2 weeks of the study, ambulatory glucose profile-monitored type 2 diabetes patients (n = 28) showed increased time in euglycaemia (mean ± standard deviation) by 1.4 ± 3.5 h/day (p = 0.0427) associated with reduction in HbA1c from 77 ± 15 to 67 ± 13 mmol/mol (p = 0.0002) without increased hypoglycaemia. Type 1 diabetes patients (n = 25) showed reduction in hypoglycaemia from 1.4 ± 1.7 to 0.8 ± 0.8 h/day (p = 0.0472) associated with a marginal HbA1c decrease from 75 ± 10 to 72 ± 8 mmol/mol (p = 0.0508). Largely similar findings were observed comparing intervention and control groups at end of study. In conclusion, ambulatory glucose profile helps glycaemic management in insulin-treated diabetes patients by increasing time spent in euglycaemia and decreasing HbA1c in type 2 diabetes patients, while reducing hypoglycaemia in type 1 diabetes patients.
Abstract:
This study pursues two objectives: first, to provide evidence on the information content of dividend policy, conditional on past earnings and dividend patterns prior to an annual earnings decline; second, to examine the effect of the magnitude of low earnings realizations on dividend policy when firms have more-or-less established dividend payouts. The information content of dividend policy for firms that incur earnings reductions following long patterns of positive earnings and dividends has been examined (DeAngelo et al., 1992, 1996; Charitou, 2000). No research has examined the association between the informativeness of dividend policy changes in the event of an earnings drop, relative to varying patterns of past earnings and dividends. Our dataset consists of 4,873 U.S. firm-year observations over the period 1986-2005. Our evidence supports the hypotheses that, among earnings-reducing or loss firms, longer patterns of past earnings and dividends: (a) strengthen the information conveyed by dividends regarding future earnings, and (b) enhance the role of the magnitude of low earnings realizations in explaining dividend policy decisions, in that earnings hold more information content that explains the likelihood of dividend cuts the longer the past earnings and dividend patterns. Both results stem from the stylized facts that managers aim to maintain consistency with respect to historic payout policy, being reluctant to proceed with dividend reductions, and that this reluctance is higher the more established is the historic payout policy. © 2010 The Authors. Journal compilation © 2010 Accounting Foundation, The University of Sydney.
Abstract:
The process framework comprises three phases: scope the supply chain/network; identify the options for supply system architecture; and select the supply system architecture. It facilitates a structured approach that analyses the supply chain/network contextual characteristics in order to ensure alignment with the appropriate supply system architecture. The process framework was derived from a comprehensive literature review and archival case study analysis. The review led to the classification of supply system architectures according to their orientation: integrated, partially integrated, co-ordinated or independent. The classification was combined with the characteristics that influence the selection of supply system architecture to encapsulate the conceptual framework. It builds upon existing frameworks and methodologies by focusing on structured procedure, supporting project management, facilitating participation and clarifying the point of entry. The process framework was initially tested in three case study applications from the food, automobile and hand tool industries. A variety of industrial settings was chosen to illustrate transferability. The case study applications indicate that the process framework is a valid approach to the problem; however, further testing is required. In particular, the use of group support system technologies to support the process, and the steps involving the participation of software vendors, need further testing. However, the process framework can be followed due to the clarity of its presentation. It considers the issue of timing by including alternative decision-making techniques, dependent on the constraints. It is useful for ensuring a sound business case is developed, with supporting documentation and analysis that identifies the strategic and functional requirements of supply system architecture.
Abstract:
Cloud computing is a new technological paradigm offering computing infrastructure, software and platforms as a pay-as-you-go, subscription-based service. Many potential customers of cloud services require essential cost assessments to be undertaken before transitioning to the cloud. Current assessment techniques are imprecise as they rely on simplified specifications of resource requirements that fail to account for probabilistic variations in usage. In this paper, we address these problems and propose a new probabilistic pattern modelling (PPM) approach to cloud costing and resource usage verification. Our approach is based on a concise expression of probabilistic resource usage patterns translated to Markov decision processes (MDPs). Key costing and usage queries are identified and expressed in a probabilistic variant of temporal logic and calculated to a high degree of precision using quantitative verification techniques. The PPM cost assessment approach has been implemented as a Java library and validated with a case study and scalability experiments. © 2012 Springer-Verlag Berlin Heidelberg.
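The idea of answering cost queries over a probabilistic usage model can be sketched without the paper's tooling. The following hand-rolled example (state names, prices, and transition probabilities are all illustrative) computes the expected cost of a two-state Markov usage pattern over a finite horizon; the paper itself translates usage patterns to MDPs and applies quantitative verification rather than this direct summation.

```python
# Two usage states with illustrative hourly resource costs.
states = ["idle", "busy"]
cost_per_hour = {"idle": 0.02, "busy": 0.40}

# transition[s][t] = probability of moving from state s to state t each hour.
transition = {
    "idle": {"idle": 0.9, "busy": 0.1},
    "busy": {"idle": 0.3, "busy": 0.7},
}

def expected_cost(start: str, hours: int) -> float:
    """Expected total cost over `hours` steps, starting in state `start`."""
    dist = {s: 1.0 if s == start else 0.0 for s in states}
    total = 0.0
    for _ in range(hours):
        # Accrue the expected cost of the current hour...
        total += sum(dist[s] * cost_per_hour[s] for s in states)
        # ...then advance the state distribution one step.
        dist = {t: sum(dist[s] * transition[s][t] for s in states)
                for t in states}
    return total

cost = expected_cost("idle", 24)
```

Quantitative verification tools answer richer queries than this (e.g. probabilities of exceeding a budget), but the expected-cost computation conveys why a probabilistic model gives sharper estimates than a single fixed resource specification.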
Exploring civil servant resistance to M-government: a story of transition and opportunities in Turkey
Abstract:
The concept of mobility, related to technology in particular, has evolved dramatically over the last two decades, including: (i) hardware ranging from Walkmans to iPods, laptops to netbooks, and PDAs to 3G mobile phones; (ii) software supporting multiple audio and video formats, driven by ubiquitous mobile wireless access, WiMAX, automations such as radio-frequency ID tracking, and location-aware services. Against the background of increasing budget deficits, along with the imperative for efficiency gains, leveraging ICT and the promises of mobility for work-related tasks, in a public administration context in emerging markets, points to multiple possible paths. M-government transition involves both technological changes and adoption to deliver government services differently (e.g. 24/7, error free, anywhere, to the same standards), but also the design of digital strategies including possibly competing m-government models, the re-shaping of cultural practices, the creation of m-policies and legislation, the structuring of m-services architecture, and progress regarding m-governance. While many emerging countries are already offering e-government services and are gearing up for further m-government activities, little is actually known about the resistance that is encountered, as a reflection of civil servants' current standing, before any further macro-strategies are deployed. Drawing on the resistance and mobility literature, this chapter investigates how civil servants, in an emerging country's technological environment, react to and resist the influence of m-government transition through their everyday practice. The findings point to four main types of resistance, namely: (i) functional resistance; (ii) ideological resistance; (iii) market-driven resistance; and (iv) geographical resistance. Policy implications are discussed in the specific context of emerging markets. © 2011, IGI Global.
Abstract:
Half a decade has passed since the objectives and benefits of autonomic computing were stated, yet even the latest system designs and deployments exhibit only limited and isolated elements of autonomic functionality. From an autonomic computing standpoint, all computing systems – old, new or under development – are legacy systems, and will continue to be so for some time to come. In this paper, we propose a generic architecture for developing fully-fledged autonomic systems out of legacy, non-autonomic components, and we investigate how existing technologies can be used to implement this architecture.
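The wrapper idea can be sketched as a monitor-analyse-plan-execute loop around a legacy component, with sensors and effectors as thin adaptors. This is a minimal illustration under our own names, not the paper's architecture.

```python
class LegacyWorker:
    """A non-autonomic component: it processes work but never manages itself."""

    def __init__(self):
        self.queue = 0      # pending work items
        self.threads = 1    # fixed unless something external changes it

    def process(self):
        handled = min(self.queue, self.threads * 10)
        self.queue -= handled
        return handled

class AutonomicManager:
    """Wraps a legacy component and adds self-configuration from outside."""

    def __init__(self, component, max_threads=4):
        self.component = component
        self.max_threads = max_threads

    def mape_loop(self):
        # Monitor: read state through a sensor (here, a plain attribute).
        backlog = self.component.queue
        # Analyse + Plan: scale the thread count to the backlog, within limits.
        wanted = min(self.max_threads, max(1, backlog // 10))
        # Execute: apply the plan through an effector.
        self.component.threads = wanted

worker = LegacyWorker()
worker.queue = 35
manager = AutonomicManager(worker)
manager.mape_loop()   # the manager scales worker.threads up to match the backlog
```

The legacy component is untouched; all self-management lives in the wrapper, which is the essence of retrofitting autonomic behaviour onto non-autonomic parts.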
Abstract:
Software development methodologies are becoming increasingly abstract, progressing from low level assembly and implementation languages such as C and Ada, to component based approaches that can be used to assemble applications using technologies such as JavaBeans and the .NET framework. Meanwhile, model driven approaches emphasise the role of higher level models and notations, and embody a process of automatically deriving lower level representations and concrete software implementations. The relationship between data and software is also evolving. Modern data formats are becoming increasingly standardised, open and empowered in order to support a growing need to share data in both academia and industry. Many contemporary data formats, most notably those based on XML, are self-describing, able to specify valid data structure and content, and can also describe data manipulations and transformations. Furthermore, while applications of the past have made extensive use of data, the runtime behaviour of future applications may be driven by data, as demonstrated by the field of dynamic data driven application systems. The combination of empowered data formats and high level software development methodologies forms the basis of modern game development technologies, which drive software capabilities and runtime behaviour using empowered data formats describing game content. While low level libraries provide optimised runtime execution, content data is used to drive a wide variety of interactive and immersive experiences. This thesis describes the Fluid project, which combines component based software development and game development technologies in order to define novel component technologies for the description of data driven component based applications. The thesis makes explicit contributions to the fields of component based software development and visualisation of spatiotemporal scenes, and also describes potential implications for game development technologies. 
The thesis also proposes a number of developments in dynamic data driven application systems in order to further empower the role of data in this field.
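The notion of empowered, self-describing content data driving a component-based application can be sketched as follows; the XML schema and the component names are invented for illustration, not taken from the Fluid project.

```python
import xml.etree.ElementTree as ET

# A registry of available components (hypothetical names).
COMPONENTS = {
    "renderer": lambda cfg: f"render scene '{cfg}'",
    "physics": lambda cfg: f"simulate with gravity {cfg}",
}

# Content data describing which components to assemble and how to
# configure them: the data, not the code, drives runtime behaviour.
scene_xml = """
<application>
  <component type="renderer" config="forest"/>
  <component type="physics" config="9.81"/>
</application>
"""

def assemble(xml_text):
    """Instantiate the components named by the content data, in document order."""
    root = ET.fromstring(xml_text)
    return [COMPONENTS[c.get("type")](c.get("config"))
            for c in root.findall("component")]

actions = assemble(scene_xml)
```

Changing the XML changes the assembled application without touching any code, which is the inversion the thesis highlights: optimised low-level components execute, while content data decides what runs.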