938 results for: computer networks; communication protocols; protocol hierarchies; Software Defined Networking; Internet
Abstract:
The papers in this issue focus our attention on packaged software, an increasingly important but still relatively poorly understood phenomenon in the information systems research community. The topic is not new: Lucas et al. (1988) wrote a provocative piece focused on the issues with implementing packaged software. A decade later, Carmel (1997) argued that packaged software was both ideally suited to American entrepreneurial activity and rapidly growing. The information systems research community, however, has moved more slowly to engage this change (e.g., Sawyer, 2001). The papers in this special issue represent a significant step toward better engaging the issues of packaged software relative to information systems research, and toward highlighting opportunities for additional relevant research.
Abstract:
Enterprise resource planning (ERP) software is a dominant approach for dealing with legacy information system problems. In order to avoid invalidating maintenance and development support from the ERP vendor, most organizations reengineer their business processes in line with those implicit within the software. Nevertheless, some customization is typically required. This paper presents two case studies of ERP projects in which customizations were performed. The case analysis suggests that while customizations can deliver true organizational benefits, careful consideration is required to determine whether a customization is viable given its potential impact upon future maintenance. Copyright © 2001 John Wiley & Sons, Ltd.
Abstract:
Molecular-level computer simulations of restricted water diffusion can be used to develop models for relating diffusion tensor imaging measurements of anisotropic tissue to microstructural tissue characteristics. The diffusion tensors resulting from these simulations can then be analyzed in terms of their relationship to the structural anisotropy of the model used. As the translational motion of water molecules is essentially random, their dynamics can be effectively simulated using computers. In addition to modeling water dynamics and water-tissue interactions, the simulation software of the present study was developed to automatically generate collagen fiber networks from user-defined parameters. This flexibility provides the opportunity for further investigations of the relationship between the diffusion tensor of water and morphologically different models representing different anisotropic tissues.
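As a hedged illustration of the simulation principle the abstract describes (not the authors' software), the sketch below estimates a diffusion tensor from free random walks in which an axis-dependent step scale stands in for fiber-induced anisotropy; the particle count, step count, and step scales are placeholder assumptions.

# Minimal sketch: estimate a diffusion tensor from simulated random walks.
# Anisotropy is imposed by a per-axis step scale; the study's software
# instead restricts walkers with explicit collagen-fiber geometries.
import numpy as np

rng = np.random.default_rng(0)
n_particles, n_steps, dt = 2000, 500, 1e-6        # placeholder values
sigma = np.array([1.0, 1.0, 3.0]) * 1e-9          # per-axis step scale (m)

steps = rng.normal(0.0, 1.0, (n_particles, n_steps, 3)) * sigma
displacement = steps.sum(axis=1)                  # net displacement per particle

# Einstein relation: D_ij = <x_i x_j> / (2 * t_total)
t_total = n_steps * dt
D = displacement.T @ displacement / (n_particles * 2.0 * t_total)

evals = np.linalg.eigvalsh(D)                     # principal diffusivities
fa = np.sqrt(1.5 * ((evals - evals.mean())**2).sum() / (evals**2).sum())
print("eigenvalues:", evals, "fractional anisotropy:", fa)

The dominant eigenvector recovers the assumed fiber direction, which is the kind of structure-to-tensor relationship the study probes with morphologically different models.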
Abstract:
Modernized GPS and GLONASS, together with the new GNSS systems BeiDou and Galileo, offer code and phase ranging signals on three or more carriers. Traditionally, dual-frequency code and/or phase GPS measurements are linearly combined to eliminate the effects of ionospheric delays in various positioning and analysis tasks. This typical treatment has limitations in processing signals at three or more frequencies from more than one system, and can hardly be adapted to the growing variety of receivers tracking a broad range of signals. In this contribution, a generalized positioning model that is independent of the navigation system and of the number of carriers is proposed, suitable for both single- and multi-site data processing. For the synchronization of different signals, uncalibrated signal delays (USD) are defined in a general way to compensate the signal-specific offsets in code and phase observations, respectively. In addition, ionospheric delays are included in the parameterization with elaborate consideration. Based on an analysis of the algebraic structures, this generalized positioning model is further refined with a set of proper constraints to regularize the datum deficiency of the observation equation system. With this new model, USDs and ionospheric delays are derived for both GPS and BeiDou with a large data set. Numerical results demonstrate that, with a limited number of stations, the uncalibrated code delays (UCD) are determined to a precision of about 0.1 ns for GPS and 0.4 ns for BeiDou signals, while the uncalibrated phase delays (UPD) for L1 and L2 are generated for GPS with 37 stations evenly distributed in China, with a consistency of about 0.3 cycles. Additional experiments on the performance of this novel model in point positioning with mixed frequencies of mixed constellations are analyzed, in which the USD parameters are fixed to our generated values. The results are evaluated in terms of both positioning accuracy and convergence time.
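For orientation, a textbook form of the undifferenced code and phase observation equations that such a generalized model builds on is shown below (generic notation, not necessarily the authors'):

P_{r,j}^{s} = \rho_r^s + c\,(dt_r - dt^s) + T_r^s + \mu_j I_{r,1}^s + d_{r,j} - d_j^s + \varepsilon_{P}

L_{r,j}^{s} = \rho_r^s + c\,(dt_r - dt^s) + T_r^s - \mu_j I_{r,1}^s + \lambda_j (N_{r,j}^s + b_{r,j} - b_j^s) + \varepsilon_{L}

Here \mu_j = f_1^2/f_j^2 scales the first-frequency slant ionospheric delay I_{r,1}^s for receiver r, satellite s, and frequency j, while the receiver- and satellite-side code and phase biases d and b play the role of the uncalibrated signal delays (USD) above; the rank deficiencies among these biases, the clocks, and the ionospheric terms are what the paper's datum constraints must resolve.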
Abstract:
Raman and thermo-Raman spectroscopy have been applied to study the mineral formerly known as manasseite, now renamed hydrotalcite-2H, Mg6Al2(OH)16[CO3]·4H2O. The mineral is a member of the homonymous hydrotalcite supergroup. Hydrogen bond distances calculated using a Libowitzky-type empirical function varied between 2.61 and 3.00 Å. Stronger hydrogen bonds were formed by the water units than by the hydroxyl units. Raman spectroscopy enabled the identification of bands attributable to the hydroxyl units. Two Raman bands at 1059 and 1064 cm⁻¹ are assigned to symmetric stretching modes of the carbonate anion. Thermal treatment shifts these bands to higher wavenumbers, indicating a change in the strength of the carbonate bonding.
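For reference, one widely cited Libowitzky-type correlation between the O-H stretching wavenumber and the O⋯O hydrogen-bond distance d (in Å) takes the form below; the exact coefficients used in this study may differ:

\nu\ (\mathrm{cm}^{-1}) = 3592 - 304 \times 10^{9} \exp(-d/0.1321)

Shorter O⋯O distances (stronger hydrogen bonds) thus correspond to lower stretching wavenumbers, consistent with the stronger bonds attributed to the water units.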
Abstract:
Packaged software is pre-built with the intention of licensing it to users in domestic settings and work organisations. This thesis focuses upon the work organisation, where packaged software has been characterised as one of the latest ‘solutions’ to the problems of information systems. The study investigates the packaged software selection process, which has, to date, largely been viewed as objective and rational. In contrast, this interpretive study is based on a 2½-year field study of organisational experiences with packaged software selection at T.Co, a consultancy organisation based in the United Kingdom. Emerging from the iterative process of case study and action research is an alternative theory of packaged software selection. The research argues that packaged software selection is far from the rationalistic and linear process that previous studies suggest. Instead, the study finds that aspects of the traditional selection process, incorporating the activities of gathering requirements, evaluation, and selection based on ‘best fit’, may or may not take place. Furthermore, even where these aspects occur, they may not have equal weight or the impact upon implementation and usage that might be expected. This is due to the influence of the multiple realities which originate from the organisational and market environments within which packages are created, selected and used, the lack of homogeneity in organisational contexts, and the variously interpreted characteristics of the package in question.
Abstract:
The design of concurrent software systems, in particular process-aware information systems, involves behavioral modeling at various stages. Recently, approaches to the behavioral analysis of such systems have been based on declarative abstractions defined as sets of behavioral relations. However, these relations are typically defined in an ad hoc manner. In this paper, we address the lack of a systematic exploration of the fundamental relations that can be used to capture the behavior of concurrent systems, i.e., co-occurrence, conflict, causality, and concurrency. Besides defining the spectrum of behavioral relations, which we refer to as the 4C spectrum, we show that our relations give rise to implication lattices. We further provide operationalizations of the proposed relations, starting with techniques for computing them in unlabeled systems, which are then lifted to become applicable in the context of labeled systems, i.e., systems in which state transitions have semantic annotations. Finally, we report experimental results on the efficiency of the proposed computations.
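As an illustration only: the paper defines the 4C relations over the states of (un)labeled systems, but naive trace-level analogues of the four relation families can be sketched as follows (the example traces and the simplified definitions are placeholders, not the paper's formal relations).

# Naive trace-based analogues of the 4C relation families.
from itertools import product

traces = [["a", "b", "c"], ["a", "c", "b"], ["a", "d"]]   # placeholder log
events = sorted({e for t in traces for e in t})

def co_occur(x, y):   # every trace containing x also contains y
    return all(y in t for t in traces if x in t)

def conflict(x, y):   # x and y never appear in the same trace
    return not any(x in t and y in t for t in traces)

def causal(x, y):     # whenever both occur, x occurs before y
    pairs = [(t.index(x), t.index(y)) for t in traces if x in t and y in t]
    return bool(pairs) and all(i < j for i, j in pairs)

def concurrent(x, y): # both orders are observed
    both = [t for t in traces if x in t and y in t]
    return (any(t.index(x) < t.index(y) for t in both) and
            any(t.index(y) < t.index(x) for t in both))

for x, y in product(events, repeat=2):
    if x < y:
        held = [n for n, f in [("co-occur", co_occur), ("conflict", conflict),
                               ("causal", causal), ("concurrent", concurrent)]
                if f(x, y)]
        print(x, y, held)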
Abstract:
In this paper we describe cooperative control algorithms for robots and sensor nodes in an underwater environment. Cooperative navigation is defined as the ability of a coupled system of autonomous robots to pool their resources to achieve long-distance navigation and a larger controllability space. Other types of useful cooperation in underwater environments include: exchange of information such as data download and retasking; cooperative localization and tracking; and physical connection (docking) for tasks such as deployment of underwater sensor networks, collection of nodes, and rescue of damaged robots. We present experimental results obtained with an underwater system that consists of two very different robots and a number of sensor network modules, and describe the hardware and software architecture of this system. We then describe the various interactions between the robots and sensor nodes and between the two robots, including cooperative navigation. Finally, we describe our experiments with this underwater system and present the resulting data.
Abstract:
Objective To determine the burden of hospitalised, radiologically confirmed pneumonia (World Health Organization protocol) in Northern Territory Indigenous children. Design, setting and participants Historical, observational study of all hospital admissions, for any diagnosis, of NT resident Indigenous children aged ≥ 29 days and < 5 years, 1 April 1997 to 31 March 2005. Intervention All chest radiographs taken during these admissions, regardless of diagnosis, were assessed for pneumonia in accordance with the WHO protocol. Main outcome measure The primary outcome was endpoint consolidation (dense fluffy consolidation [alveolar infiltrate] of a portion of a lobe or the entire lung) present on a chest radiograph within 3 days of hospitalisation. Results We analysed data on 24 115 hospitalised episodes of care for 9492 children and 13 683 chest radiographs. The average annual cumulative incidence of endpoint consolidation was 26.6 per 1000 population per year (95% CI, 25.3-27.9): 57.5 per 1000 per year in infants aged 1-11 months, 38.3 per 1000 per year in those aged 12-23 months, and 13.3 per 1000 per year in those aged 24-59 months. In all age groups, rates of endpoint consolidation in children in the arid southern region of the NT were about twice those of children in the tropical northern region. Conclusion The rates of severe pneumonia in hospitalised NT Indigenous children are among the highest reported in the world. Reducing this unacceptable burden of disease should be a national health priority.
Abstract:
Background A reliable standardized diagnosis of pneumonia in children has long been difficult to achieve. Clinical and radiological criteria have been developed by the World Health Organization (WHO); however, their generalizability to different populations is uncertain. We evaluated WHO-defined chest radiograph (CXR) confirmed alveolar pneumonia in the clinical context in Central Australian Aboriginal children, a high-risk population, hospitalized with acute lower respiratory illness (ALRI). Methods CXRs of children (aged 1-60 months) hospitalized and treated with intravenous antibiotics for ALRI and enrolled in a randomized controlled trial (RCT) of vitamin A/zinc supplementation were matched with data collected during a population-based study of WHO-defined primary endpoint pneumonia (WHO-EPC). These CXRs were reread by a pediatric pulmonologist (PP) and classified as pneumonia-PP when alveolar changes were present. Sensitivities, specificities, and positive and negative predictive values (PPV, NPV) of clinical presentations were compared between WHO-EPC and pneumonia-PP. Results Of the 147 episodes of hospitalized ALRI, WHO-EPC was diagnosed significantly less commonly (40 episodes, 27.2%) than pneumonia-PP (difference 20.4%, 95% CI 9.6-31.2, P < 0.001). Clinical signs on admission were poor predictors of both pneumonia-PP and WHO-EPC; the sensitivities of clinical signs ranged from a high of 45% for tachypnea down to 5% for fever + tachypnea + chest-indrawing, with corresponding PPVs of 40% and 20%, respectively. Higher PPVs were observed against the pediatric pulmonologist's diagnosis than against WHO-EPC. Conclusions WHO-EPC underestimates alveolar consolidation in a clinical context. Its use in clinical practice, or in research designed to inform clinical management in this population, should be avoided. Pediatr Pulmonol. 2012; 47:386-392. (C) 2011 Wiley Periodicals, Inc.
Abstract:
This paper presents a new framework for distributed intrusion detection based on taint marking. Our system tracks information flows between applications on multiple hosts gathered in groups (i.e., sets of hosts sharing the same distributed information flow policy) by attaching taint labels to system objects such as files, sockets, Inter-Process Communication (IPC) abstractions, and memory mappings. Labels are carried over the network by tainting network packets. A distributed information flow policy is defined for each group at the host level by labeling information and defining how users and applications may legally access, alter, or transfer information towards other trusted or untrusted hosts. As opposed to existing approaches, where information is most often represented by two security levels (low/high, public/private, etc.), our model identifies each piece of information within a distributed system and defines their legal interactions in a fine-grained manner. Hosts store and exchange security labels in a peer-to-peer fashion, and there is no central monitor. Our IDS is implemented in the Linux kernel as a Linux Security Module (LSM) and runs standard software on commodity hardware with no modification required. The only trusted code is our modified operating system kernel. Finally, we present a scenario of intrusion in a web service running on multiple hosts, and show how our distributed IDS is able to report security violations at each host level.
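A minimal user-space sketch of the taint-propagation idea follows; the real system enforces this inside the kernel as an LSM, and the label encoding and policy shape here are placeholder assumptions.

# Toy taint propagation: labels flow into a subject on read and must be
# admitted by the group policy before flowing out on write.
class Obj:
    def __init__(self, name, labels=()):
        self.name, self.labels = name, frozenset(labels)

def read(subject, obj):
    subject.labels |= obj.labels            # flow obj -> subject

def write(subject, obj, policy):
    if not subject.labels <= policy.get(obj.name, frozenset()):
        raise PermissionError(f"illegal flow {set(subject.labels)} -> {obj.name}")
    obj.labels |= subject.labels            # flow subject -> obj

policy = {"socket:untrusted_host": frozenset({"public"})}
proc = Obj("pid:1234")
secret = Obj("/etc/shadow", {"secret"})
sock = Obj("socket:untrusted_host", {"public"})

read(proc, secret)                          # proc is now tainted {"secret"}
try:
    write(proc, sock, policy)               # "secret" may not leave the group
except PermissionError as e:
    print("violation reported:", e)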
Abstract:
Software as a Service (SaaS) is anticipated to provide significant benefits to small and medium enterprises (SMEs) due to ease of access to high-end applications, 24/7 availability, utility pricing, etc. However, underlying SaaS is the assumption that SMEs will interact directly with the SaaS vendor and use a self-service model. In practice, we see the rise of SaaS intermediaries who support SMEs in using SaaS. This paper reports on an empirical study of the role of intermediaries in terms of how they support SMEs in sourcing and leveraging SaaS for their business. The knowledge contributions of this paper are: (1) the identification and description of the role of SaaS intermediaries; and (2) the specification of different roles of SaaS intermediaries, in particular a more basic role with a technology orientation and operational alignment perspective, and a more value-adding role with a customer orientation and strategic alignment perspective.
Abstract:
Multiple reaction monitoring (MRM) mass spectrometry coupled with stable isotope dilution (SID) and liquid chromatography (LC) is increasingly used in biological and clinical studies for precise and reproducible quantification of peptides and proteins in complex sample matrices. Robust LC-SID-MRM-MS-based assays that can be replicated across laboratories, and ultimately in clinical laboratory settings, require standardized protocols to demonstrate that the analysis platforms are performing adequately. We developed a system suitability protocol (SSP), which employs a predigested mixture of six proteins, to facilitate performance evaluation of LC-SID-MRM-MS instrument platforms configured with nanoflow-LC systems interfaced to triple quadrupole mass spectrometers. The SSP was designed for use with low-multiplex analyses as well as high-multiplex approaches in which software-driven scheduling of data acquisition is required. Performance was assessed by monitoring a range of chromatographic and mass spectrometric metrics, including peak width, chromatographic resolution, peak capacity, and the variability in peak area and analyte retention time (RT) stability. The SSP, which was evaluated in 11 laboratories on a total of 15 different instruments, enabled early diagnosis of LC and MS anomalies that indicated suboptimal LC-MRM-MS performance. The observed range in variation of each of the metrics scrutinized serves to define the criteria for optimized LC-SID-MRM-MS platforms for routine use, with pass/fail criteria for system suitability performance measures defined as peak area coefficient of variation <0.15, peak width coefficient of variation <0.15, standard deviation of RT <0.15 min (9 s), and RT drift <0.5 min (30 s). The deleterious effect of a marginally performing LC-SID-MRM-MS system on the limit of quantification (LOQ) in targeted quantitative assays illustrates the use of, and need for, an SSP to establish robust and reliable system performance. Use of an SSP helps to ensure that analyte quantification measurements can be replicated with good precision within and across multiple laboratories, and should facilitate more widespread use of MRM-MS technology by the basic biomedical and clinical laboratory research communities.
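As a small illustration of applying the stated pass/fail criteria, the sketch below checks placeholder replicate measurements of a single analyte against three of the thresholds (the peak width CV would be checked the same way).

# Apply the paper's system suitability thresholds to replicate data.
import statistics as st

peak_areas = [1.02e6, 0.98e6, 1.05e6, 1.01e6]   # placeholder replicates
rts_min = [24.31, 24.35, 24.28, 24.40]          # retention times (min)

def cv(xs):
    return st.stdev(xs) / st.mean(xs)

checks = {
    "peak area CV < 0.15":  cv(peak_areas) < 0.15,
    "RT stdev < 0.15 min":  st.stdev(rts_min) < 0.15,
    "RT drift < 0.5 min":   max(rts_min) - min(rts_min) < 0.5,
}
for criterion, passed in checks.items():
    print(criterion, "PASS" if passed else "FAIL")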
Abstract:
Fire incidents in buildings are common, so the fire safety design of framed structures is imperative, especially for unprotected or only partly protected bare steel frames. However, software for structural fire analysis is not widely available. As a result, performance-based structural fire design is urged on the basis of user-friendly, conventional nonlinear computer analysis programs, so that engineers need not acquire new structural analysis software for structural fire analysis and design. The tool is desired to have the capacity to simulate different fire scenarios and the associated detrimental effects efficiently, including second-order P-Δ and P-δ effects and material yielding. Moreover, the nonlinear behaviour of large-scale structures becomes complicated under fire, and its simulation therefore relies on an efficient and effective numerical analysis to cope with the intricate nonlinear effects due to fire. To this end, the present fire study utilizes the second-order elastic/plastic analysis software NIDA to predict the structural behaviour of bare steel framed structures at elevated temperatures. The study considers thermal expansion and material degradation due to heating. Degradation of material strength with increasing temperature is included by a set of temperature-stress-strain curves, mainly according to BS5950 Part 8, which implicitly allows for creep deformation. The finite element stiffness formulation of beam-column elements is derived from the fifth-order PEP element, which facilitates computer modeling with one member per element. The Newton-Raphson method is used in the nonlinear solution procedure in order to trace the nonlinear equilibrium path at specified elevated temperatures. Several numerical and experimental verifications of framed structures are presented and compared against solutions in the literature. The proposed method permits engineers to adopt performance-based structural fire analysis and design using typical second-order nonlinear structural analysis software.
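To illustrate only the solution strategy named above, here is a one-degree-of-freedom Newton-Raphson iteration with a placeholder temperature-degraded modulus and a toy nonlinear resisting force; the actual analysis uses fifth-order PEP beam-column elements and the BS5950 Part 8 material curves.

# Toy Newton-Raphson equilibrium search at a fixed elevated temperature.
def degraded_modulus(e0, temp_c):
    # Placeholder linear reduction; BS5950 Part 8 instead tabulates
    # temperature-stress-strain curves.
    return e0 * max(0.0, 1.0 - max(temp_c - 100.0, 0.0) / 1000.0)

def solve_equilibrium(load, e0, temp_c, tol=1e-8, max_iter=50):
    e = degraded_modulus(e0, temp_c)
    u = 0.0                                   # displacement unknown
    for _ in range(max_iter):
        resisting = e * u + 0.1 * e * u**3    # toy geometric nonlinearity
        residual = load - resisting
        if abs(residual) < tol:
            return u
        tangent = e + 0.3 * e * u**2          # d(resisting)/du
        u += residual / tangent
    raise RuntimeError("equilibrium iteration did not converge")

print(solve_equilibrium(load=50.0, e0=200.0, temp_c=600.0))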
Abstract:
The detection and correction of defects remain among the most time-consuming and expensive aspects of software development. Extensive automated testing and code inspections may mitigate their effect, but some code fragments are necessarily more likely to be faulty than others, and automated identification of fault-prone modules helps to focus testing and inspections, thus limiting wasted effort and potentially improving detection rates. However, software metrics data is often extremely noisy, with enormous imbalances between the sizes of the positive and negative classes. In this work, we present a new approach to predictive modelling of fault proneness in software modules, introducing a new feature representation to overcome some of these issues. This rank sum representation offers improved, or at worst comparable, performance relative to earlier approaches on standard data sets, and readily allows the user to choose an appropriate trade-off between precision and recall to optimise inspection effort for different testing environments. The method is evaluated using the NASA Metrics Data Program (MDP) data sets, and performance is compared with existing studies based on the Support Vector Machine (SVM) and Naïve Bayes (NB) classifiers, and with our own comprehensive evaluation of these methods.
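A hedged sketch of the kind of baseline comparison described, using synthetic imbalanced data in place of the NASA MDP sets and raw features in place of the proposed rank sum representation, to show how a decision threshold trades precision against recall.

# Compare SVM and Naive Bayes on imbalanced data; sweep the threshold.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.metrics import precision_recall_curve

X, y = make_classification(n_samples=2000, n_features=20, weights=[0.9],
                           random_state=0)    # ~10% "faulty" modules
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

for name, clf in [("SVM", SVC(probability=True)), ("NB", GaussianNB())]:
    scores = clf.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    prec, rec, _ = precision_recall_curve(y_te, scores)
    # Best recall achievable while keeping precision at or above 0.5.
    ok = prec[:-1] >= 0.5
    best = rec[:-1][ok].max() if ok.any() else float("nan")
    print(f"{name}: recall at precision >= 0.5 -> {best:.2f}")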