986 results for Validated Interval Software


Relevance:

20.00%

Publisher:

Abstract:

Software as a Service (SaaS) is a promising approach for Small and Medium Enterprise (SME) firms, in particular those focused on growing fast and leveraging new technology, due to the potential benefits arising from its inherent scalability, reduced total cost of ownership and ease of access to global innovations. This paper proposes a dynamic perspective on IS capabilities to understand and explain how SMEs source and leverage SaaS. The model is derived by combining the IS capabilities of Feeny and Willcocks (1998) with the dynamic capabilities of Teece (2007) and contextualizing them for SMEs and SaaS. We conclude that SMEs sourcing and leveraging SaaS require leadership, business systems thinking and informed buying for sensing and seizing SaaS opportunities, and require leadership and vendor development for transforming, in terms of aligning and realigning specific tangible and intangible assets.

Relevance:

20.00%

Publisher:

Abstract:

Objective Although several validated nutritional screening tools have been developed to “triage” inpatients for malnutrition diagnosis and intervention, there continues to be debate in the literature as to which tool or tools clinicians should use in practice. This study compared the accuracy of seven validated screening tools in older medical inpatients against two validated nutritional assessment methods. Methods This was a prospective cohort study of medical inpatients at least 65 y old. Malnutrition screening was conducted using seven tools recommended in evidence-based guidelines. Nutritional status was assessed by an accredited practicing dietitian using the Subjective Global Assessment (SGA) and the Mini-Nutritional Assessment (MNA). Energy intake was observed on a single day during the first week of hospitalization. Results In this sample of 134 participants (80 ± 8 y old, 50% women), there was fair agreement between the SGA and MNA (κ = 0.53), with the MNA identifying more “at-risk” patients and the SGA better identifying existing malnutrition. Most tools were accurate in identifying patients with malnutrition as determined by the SGA, in particular the Malnutrition Screening Tool and the Nutritional Risk Screening 2002. The MNA Short Form was most accurate at identifying nutritional risk according to the MNA. No tool accurately predicted patients with inadequate energy intake in the hospital. Conclusion Because all tools generally performed well, clinicians should consider choosing a screening tool that best aligns with their chosen nutritional assessment and is easiest to implement in practice. This study confirmed the importance of rescreening and monitoring food intake to allow the early identification and prevention of nutritional decline in patients with a poor intake during hospitalization.
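The agreement statistic quoted above (κ = 0.53) is Cohen's kappa, which compares the observed agreement between two assessments against the agreement expected by chance. As a minimal illustration only, with made-up ratings rather than the study's patient data, a sketch in Python:

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters classifying the same subjects."""
    n = len(ratings_a)
    # Observed agreement: share of subjects on which both raters agree.
    p_obs = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement from the two raters' marginal frequencies.
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    p_exp = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical SGA vs. MNA classifications for ten patients.
sga = ["malnourished", "well", "well", "malnourished", "well",
       "well", "malnourished", "well", "well", "well"]
mna = ["malnourished", "well", "malnourished", "malnourished", "well",
       "well", "well", "well", "well", "well"]
print(round(cohens_kappa(sga, mna), 2))  # 0.52 for this toy example
```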

Relevance:

20.00%

Publisher:

Abstract:

The increasingly widespread use of large-scale 3D virtual environments has translated into an increasing effort required from designers, developers and testers. While considerable research has been conducted into assisting the design of virtual world content and mechanics, to date, only limited contributions have been made regarding the automatic testing of the underpinning graphics software and hardware. In the work presented in this paper, two novel neural network-based approaches are presented to predict the correct visualization of 3D content. Multilayer perceptrons and self-organizing maps are trained to learn the normal geometric and color appearance of objects from validated frames and then used to detect novel or anomalous renderings in new images. Our approach is general, for the appearance of the object is learned rather than explicitly represented. Experiments were conducted on a game engine to determine the applicability and effectiveness of our algorithms. The results show that the neural network technology can be effectively used to address the problem of automatic and reliable visual testing of 3D virtual environments.
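The paper trains multilayer perceptrons and self-organizing maps on validated frames; as a rough, generic sketch of the underlying idea (learn the normal appearance of correct renderings, then flag frames that deviate from it), the following uses an autoencoder-style multilayer perceptron over per-frame colour histograms. The feature choice, network size and threshold are illustrative assumptions, not the authors' configuration:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def colour_histogram(frame, bins=16):
    """Concatenated per-channel colour histogram of an RGB frame (H x W x 3)."""
    return np.concatenate([
        np.histogram(frame[..., c], bins=bins, range=(0, 255), density=True)[0]
        for c in range(3)])

# Stand-in for features extracted from frames already validated as correct.
rng = np.random.default_rng(0)
validated = rng.integers(0, 256, size=(200, 64, 64, 3), dtype=np.uint8)
X = np.array([colour_histogram(f) for f in validated])

# Train the MLP to reproduce its own input; frames it reconstructs poorly
# are treated as novel or anomalous renderings.
net = MLPRegressor(hidden_layer_sizes=(24,), max_iter=2000, random_state=0)
net.fit(X, X)

def anomaly_score(frame):
    x = colour_histogram(frame).reshape(1, -1)
    return float(np.mean((net.predict(x) - x) ** 2))

threshold = np.percentile([anomaly_score(f) for f in validated], 99)
test_frame = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
print("anomalous rendering?", anomaly_score(test_frame) > threshold)
```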

Relevance:

20.00%

Publisher:

Abstract:

Objectives: To investigate the efficacy of progestin treatment to achieve pathological complete response (pCR) in patients with complex atypical endometrial hyperplasia (CAH) or early endometrial adenocarcinoma (EC). Methods: A systematic search identified 3245 potentially relevant citations. Studies containing fewer than ten eligible CAH or EC patients in either the oral or intrauterine treatment arm were excluded. Only information from patients receiving six or more months of treatment and not receiving other treatments was included. Weighted proportions of patients achieving pCR were calculated using R software. Results: Twelve studies met the selection criteria. Eleven studies reported treatment of patients with oral progestin (219 patients, 117 with CAH, 102 with grade 1 Stage I EC) and one reported treatment of patients with intrauterine progestin (11 patients with grade 1 Stage I EC). Overall, 74% (95% confidence interval [CI] 65-81%) of patients with CAH and 72% (95% CI 62-80%) of patients with grade 1 Stage I EC achieved a pCR to oral progestin. Disease progression while on oral treatment was reported for 6/219 (2.7%) patients, and relapse after initial complete response for 32/159 (20.1%) patients. The weighted mean pCR rate of patients with grade 1 Stage I EC treated with intrauterine progestin, from one prospective pilot study and an unpublished retrospective case series from the Queensland Centre of Gynaecologic Oncology (QCGC), was 68% (95% CI 45-86%). Conclusions: There is a lack of high-quality evidence for the efficacy of progestin in CAH or EC. The available evidence, however, suggests that treatment with oral or intrauterine progestin is similarly effective. The risk of progression during treatment is small, but longer follow-up is required. Evidence from prospective controlled clinical trials is warranted to establish how the efficacy of progestin for the treatment of CAH and EC can be improved further.
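The pooled rates above were obtained with R; purely to illustrate the arithmetic of a weighted (inverse-variance) pooled proportion with a 95% confidence interval, here is a minimal Python sketch using invented study counts rather than the review's data. A published meta-analysis would typically also apply a logit or arcsine transformation and consider a random-effects model:

```python
import math

# Hypothetical per-study data: (patients achieving pCR, patients treated).
studies = [(18, 25), (30, 40), (12, 20), (25, 32)]

weights, props = [], []
for events, n in studies:
    p = events / n
    var = p * (1 - p) / n        # binomial variance of the study proportion
    weights.append(1 / var)      # inverse-variance weight
    props.append(p)

pooled = sum(w * p for w, p in zip(weights, props)) / sum(weights)
se = math.sqrt(1 / sum(weights))
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
print(f"pooled pCR rate {pooled:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```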

Relevance:

20.00%

Publisher:

Abstract:

Wheel–rail interaction is one of the most important research topics in railway engineering. It involves track impact response, track vibration and track safety. Track structure failures caused by wheel–rail impact forces can lead to significant economic loss for track owners through damage to rails and to the sleepers beneath. Wheel–rail impact forces occur because of imperfections in the wheels or rails, such as wheel flats, irregular wheel profiles, rail corrugations and differences in the heights of rails connected at a welded joint. A wheel flat can cause a large dynamic impact force as well as a forced vibration with a high frequency, which can damage the track structure. In the present work, a three-dimensional (3-D) finite element (FE) model for the impact analysis induced by the wheel flat is developed using the finite element analysis (FEA) software package ANSYS and validated against another, previously validated simulation. The effect of wheel flats on impact forces is thoroughly investigated. It is found that the presence of a wheel flat significantly increases the dynamic impact force on both rail and sleeper, and that the impact force increases monotonically with the size of the wheel flat. The relationships between impact force and wheel flat size derived from this finite element analysis are important for helping track engineers improve the design and maintenance of the track system.
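For orientation, the "size" of a wheel flat is usually parametrised by its length and the corresponding depth, and the two are linked by simple circular-segment (sagitta) geometry. This relation is standard wheel-flat geometry rather than something taken from the paper's FE model:

```python
import math

def flat_depth(flat_length_m, wheel_radius_m):
    """Depth of a wheel flat: the sagitta of the chord the flat removes.

    For short flats this reduces to the approximation d ≈ l**2 / (8 * R).
    """
    half_chord = flat_length_m / 2
    return wheel_radius_m - math.sqrt(wheel_radius_m ** 2 - half_chord ** 2)

# Example: a 50 mm flat on a wheel of 460 mm radius (values assumed
# purely for illustration).
print(f"flat depth ≈ {flat_depth(0.05, 0.46) * 1000:.2f} mm")  # ≈ 0.68 mm
```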

Relevance:

20.00%

Publisher:

Abstract:

Reliable approaches for predicting pollutant build-up are essential for accurate urban stormwater quality modelling. Based on an in-depth investigation of metal build-up on residential road surfaces, this paper presents empirical models for predicting metal loads on these surfaces. The study investigated metals commonly present in the urban environment. The analysis found that the build-up processes for metals primarily originating from anthropogenic (copper and zinc) and geogenic (aluminium, calcium, iron and manganese) sources were different. Chromium and nickel were below detection limits. Lead was primarily associated with geogenic sources, but also exhibited a significant relationship with anthropogenic sources. The empirical prediction models developed were validated using an independent data set and found to have relative prediction errors of 12-50%, which is generally acceptable for complex systems such as urban road surfaces. The predicted values were also very close to the observed values and well within the 95% prediction interval.
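For readers unfamiliar with the two validation measures quoted (relative prediction error and the 95% prediction interval), a minimal generic sketch of how such a check can be carried out on an independent data set; the observed and predicted loads below are invented, and the interval construction is an assumption rather than the method used in the paper:

```python
import numpy as np

# Hypothetical independent validation set: observed vs. predicted metal loads.
observed  = np.array([1.8, 2.4, 3.1, 4.0, 5.2, 6.5])   # e.g. mg/m^2
predicted = np.array([1.6, 2.7, 2.9, 4.4, 4.9, 6.9])

# Relative prediction error for each sample.
rel_err = np.abs(predicted - observed) / observed * 100
print(f"relative prediction error: {rel_err.min():.0f}-{rel_err.max():.0f}%")

# A simple 95% prediction interval around the predictions, based on the
# standard deviation of the residuals.
resid_sd = np.std(observed - predicted, ddof=1)
lower, upper = predicted - 1.96 * resid_sd, predicted + 1.96 * resid_sd
coverage = np.mean((observed >= lower) & (observed <= upper)) * 100
print(f"observations inside the 95% prediction interval: {coverage:.0f}%")
```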

Relevance:

20.00%

Publisher:

Abstract:

Whole-body computer control interfaces present new opportunities to engage children with games for learning. Stomp is a suite of educational games that uses such a technology, allowing young children to use their whole body to interact with a digital environment projected on the floor. To maximise the effectiveness of this technology, tenets of self-determination theory (SDT) are applied to the design of Stomp experiences. By meeting user needs for competence, autonomy and relatedness, our aim is to increase children's engagement with the Stomp learning platform. Analysis of Stomp's design suggests that these tenets are met. Observations from a case study of Stomp being used by young children show that they were highly engaged and motivated by Stomp. This analysis demonstrates that continued application of SDT to Stomp will further enhance user engagement. It is also suggested that SDT, when applied more widely to other whole-body multi-user interfaces, could instil similar positive effects.

Relevance:

20.00%

Publisher:

Abstract:

Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the “gold standard” for predicting dose deposition in the patient. In this study, software has been developed that enables the transfer of treatment plan information from the treatment planning system to a Monte Carlo dose calculation engine. A database of commissioned linear accelerator models (Elekta Precise and Varian 2100CD at various energies) has been developed using the EGSnrc/BEAMnrc Monte Carlo suite. Planned beam descriptions and CT images can be exported from the treatment planning system using the DICOM framework. The information in these files is combined with an appropriate linear accelerator model to allow the accurate calculation of the radiation field incident on a modelled patient geometry. The Monte Carlo dose calculation results are combined according to the monitor units specified in the exported plan. The result is a 3D dose distribution that could be used to verify treatment planning system calculations. The software, MCDTK (Monte Carlo Dicom ToolKit), has been developed in the Java programming language and produces BEAMnrc and DOSXYZnrc input files, ready for submission on a high-performance computing cluster. The code has been tested with the Eclipse (Varian Medical Systems), Oncentra MasterPlan (Nucletron B.V.) and Pinnacle3 (Philips Medical Systems) planning systems. In this study the software was validated against measurements in homogeneous and heterogeneous phantoms. Monte Carlo models are commissioned through comparison with quality assurance measurements made using a large square field incident on a homogeneous volume of water. This study aims to provide a valuable confirmation that Monte Carlo calculations match experimental measurements for complex fields and heterogeneous media.
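The combination step described above (per-beam Monte Carlo doses weighted by the planned monitor units) is conceptually simple; a minimal numpy sketch follows. MCDTK itself is written in Java, and the array names, grid size and calibration factor here are assumptions for illustration only:

```python
import numpy as np

def combine_beam_doses(beam_doses, monitor_units, gy_per_mu):
    """Sum per-beam 3D dose grids, scaling each beam by its planned MU.

    beam_doses    : list of 3D arrays of normalised dose per beam
    monitor_units : planned monitor units for each beam (from the DICOM plan)
    gy_per_mu     : calibration factor from the commissioned linac model
    """
    total = np.zeros_like(beam_doses[0])
    for dose, mu in zip(beam_doses, monitor_units):
        total += dose * gy_per_mu * mu
    return total

# Hypothetical three-beam plan on a small dose grid.
rng = np.random.default_rng(1)
beams = [rng.random((40, 40, 30)) for _ in range(3)]   # normalised dose grids
plan_mu = [120.0, 95.0, 110.0]                         # MU from the exported plan
total_dose = combine_beam_doses(beams, plan_mu, gy_per_mu=0.01)
print("maximum combined dose:", round(float(total_dose.max()), 2), "Gy")
```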

Relevance:

20.00%

Publisher:

Abstract:

Carrion-breeding Sarcophagidae (Diptera) can be used to estimate the post-mortem interval (PMI) in forensic cases. Difficulties with accurate morphological identifications at any life stage and a lack of documented thermobiological profiles have limited the current usefulness of these flies. The molecular-based approach of DNA barcoding, which utilises a 648-bp fragment of the mitochondrial cytochrome oxidase subunit I gene, was previously evaluated in a pilot study for the discrimination between 16 Australian sarcophagids. The current study comprehensively evaluated DNA barcoding on a larger taxon set of 588 adult Australian sarcophagids. A total of 39 of the 84 known Australian species were represented by 580 specimens, which includes 92% of the potentially forensically important species. A further eight specimens could not be reliably identified, but were included as six unidentifiable taxa. A neighbour-joining phylogenetic tree was generated and nucleotide sequence divergences were calculated using the Kimura two-parameter distance model. All species except Sarcophaga (Fergusonimyia) bancroftorum, known for high morphological variability, were resolved as reciprocally monophyletic (99.2% of cases), with most having bootstrap support of 100. Excluding S. bancroftorum, the mean intraspecific and interspecific variation ranged from 0.00-1.12% and 2.81-11.23%, respectively, allowing for species discrimination. DNA barcoding was therefore validated as a suitable method for the molecular identification of the Australian Sarcophagidae, which will aid the use of this fauna in forensic entomology.
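The Kimura two-parameter distance used for the divergence values above has a closed form in terms of the observed proportions of transition (P) and transversion (Q) differences between two aligned sequences: d = -0.5 ln((1 - 2P - Q) sqrt(1 - 2Q)). A minimal sketch, with toy sequences rather than COI barcode data:

```python
import math

PURINES, PYRIMIDINES = {"A", "G"}, {"C", "T"}

def k2p_distance(seq1, seq2):
    """Kimura two-parameter distance between two aligned DNA sequences."""
    transitions = transversions = 0
    for a, b in zip(seq1.upper(), seq2.upper()):
        if a == b:
            continue
        if {a, b} <= PURINES or {a, b} <= PYRIMIDINES:
            transitions += 1      # A<->G or C<->T
        else:
            transversions += 1    # purine <-> pyrimidine
    p, q = transitions / len(seq1), transversions / len(seq1)
    return -0.5 * math.log((1 - 2 * p - q) * math.sqrt(1 - 2 * q))

# Toy aligned fragments (not real barcode sequences).
print(round(k2p_distance("ACGTACGTACGTACGT", "ACGTACGCACGTACAT"), 4))  # 0.1438
```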

Relevance:

20.00%

Publisher:

Abstract:

Smartphones are steadily gaining popularity, creating new application areas as their capabilities increase in terms of computational power, sensors and communication. These emerging features of mobile devices also create opportunities for new threats. Android is one of the newer operating systems targeting smartphones. While based on a Linux kernel, Android has unique properties and specific limitations due to its mobile nature. This makes it harder to detect and react upon malware attacks using conventional techniques. In this paper, we propose an Android Application Sandbox (AASandbox) which is able to perform both static and dynamic analysis on Android programs to automatically detect suspicious applications. Static analysis scans the software for malicious patterns without installing it. Dynamic analysis executes the application in a fully isolated environment, i.e. a sandbox, which intervenes and logs low-level interactions with the system for further analysis. Both the sandbox and the detection algorithms can be deployed in the cloud, providing fast and distributed detection of suspicious software in a mobile software store akin to Google's Android Market. Additionally, AASandbox might be used to improve the efficiency of classical anti-virus applications available for the Android operating system.
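The static-analysis step described above (scanning an application for suspicious patterns without installing it) can be pictured with a sketch like the following; the pattern list and file handling are invented for illustration and do not reproduce AASandbox's actual detection rules:

```python
import re
import zipfile

# Hypothetical indicators of potentially suspicious behaviour in an APK.
SUSPICIOUS_PATTERNS = {
    r"Runtime\.getRuntime\(\)\.exec": "executes shell commands",
    r"sendTextMessage": "sends SMS programmatically",
    r"DexClassLoader": "loads code dynamically at runtime",
    r"getDeviceId": "reads device identifiers",
}

def static_scan(apk_path):
    """Search the readable contents of an APK (a ZIP archive) for known patterns."""
    findings = []
    with zipfile.ZipFile(apk_path) as apk:
        for name in apk.namelist():
            text = apk.read(name).decode("utf-8", errors="ignore")
            for pattern, description in SUSPICIOUS_PATTERNS.items():
                if re.search(pattern, text):
                    findings.append((name, description))
    return findings

# Usage (path is hypothetical):
# for file_name, reason in static_scan("example_app.apk"):
#     print(f"{file_name}: {reason}")
```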

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a methodology for real-time estimation of exit movement-specific average travel time on urban routes by integrating real-time cumulative plots, probe vehicles, and historic cumulative plots. Two approaches, component based and extreme based, are discussed for route travel time estimation. The methodology is tested with simulation and is validated with real data from Lucerne, Switzerland, that demonstrate its potential for accurate estimation. Both approaches provide similar results. The component-based approach is more reliable, with a greater chance of obtaining a probe vehicle in each interval, although additional data from each component is required. The extreme-based approach is simple and requires only data from upstream and downstream of the route, but the chances of obtaining a probe that traverses the entire route might be low. The performance of the methodology is also compared with a probe-only method. The proposed methodology requires only a few probes for accurate estimation; the probe-only method requires significantly more probes.
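The central idea behind using cumulative plots is that, once the upstream and downstream counts have been anchored to one another (which is where the probe vehicles and historic plots come in), the travel time at a given cumulative count is the horizontal gap between the two curves. A simplified sketch of that gap calculation, which omits the probe-based correction described in the paper:

```python
import numpy as np

def average_travel_time(times, upstream_counts, downstream_counts, levels=200):
    """Mean horizontal gap between upstream and downstream cumulative curves."""
    counts = np.linspace(max(upstream_counts[0], downstream_counts[0]),
                         min(upstream_counts[-1], downstream_counts[-1]), levels)
    # Time at which each cumulative-count level is reached at each detector.
    t_up = np.interp(counts, upstream_counts, times)
    t_down = np.interp(counts, downstream_counts, times)
    return float(np.mean(t_down - t_up))

# Hypothetical detector data over a five-minute interval (30 s aggregation).
t = np.arange(0, 300, 30.0)
n_up = np.array([0, 12, 25, 40, 52, 63, 77, 90, 101, 115], dtype=float)
n_down = n_up - 8.0   # downstream lags by the vehicles still on the route
print(f"estimated average travel time ≈ {average_travel_time(t, n_up, n_down):.0f} s")
```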

Relevance:

20.00%

Publisher:

Abstract:

In recent years, software development outsourcing has become even more complex. Outsourcing partners have begun ‘re-outsourcing’ components of their projects to other outsourcing companies to minimize cost and gain efficiencies, creating a multi-level hierarchy of outsourcing. This research-in-progress paper presents preliminary findings of a study designed to understand knowledge transfer effectiveness in multi-level software development outsourcing projects. We conceptualize the SD-outsourcing entities using Agency Theory. This study conceptualizes, operationalizes and validates the concept of Knowledge Transfer as a three-phase multidimensional formative index of 1) Domain knowledge, 2) Communication behaviors, and 3) Clarity of requirements. Data analysis identified substantial, significant differences between the Principal and the Agent on two of the three constructs. Using Agency Theory, supported by the preliminary findings, the paper also provides prescriptive guidelines for reducing the friction between the Principal and the Agent in multi-level software outsourcing.

Relevance:

20.00%

Publisher:

Abstract:

Management of groundwater systems requires realistic conceptual hydrogeological models as a framework for numerical simulation modelling, but also for system understanding and for communicating this understanding to stakeholders and the broader community. To help overcome these challenges we developed GVS (Groundwater Visualisation System), a stand-alone desktop software package that uses interactive 3D visualisation and animation techniques. The goal was a user-friendly groundwater management tool that could support a range of existing real-world and pre-processed data, both surface and subsurface, including geology and various types of temporal hydrological information. GVS allows these data to be integrated into a single conceptual hydrogeological model. In addition, 3D geological models produced externally using other software packages can readily be imported into GVS models, as can outputs of simulations (e.g. piezometric surfaces) produced by software such as MODFLOW or FEFLOW. Boreholes can be integrated, showing any down-hole data and properties, including screen information, intersected geology, water level data and water chemistry. Animation is used to display spatial and temporal changes, with time-series data such as rainfall, standing water levels and electrical conductivity displaying dynamic processes. Time and space variations can be presented using a range of contouring and colour mapping techniques, in addition to interactive plots of time-series parameters. Other types of data, for example demographics and cultural information, can also be readily incorporated. The GVS software can execute on a standard Windows or Linux-based PC with a minimum of 2 GB RAM, and the model output is easy and inexpensive to distribute, by download or via USB/DVD/CD. Example models are described here for three groundwater systems in Queensland, northeastern Australia: two unconfined alluvial groundwater systems with intensive irrigation, the Lockyer Valley and the upper Condamine Valley, and the Surat Basin, a large sedimentary basin of confined artesian aquifers. This latter example required more detail in the hydrostratigraphy, correlation of formations with drillholes and visualisation of simulated piezometric surfaces. Both alluvial system GVS models were developed during drought conditions to support government strategies to implement groundwater management. The Surat Basin model was industry-sponsored research for coal seam gas groundwater management and community information and consultation. The “virtual” groundwater systems in these 3D GVS models can be interactively interrogated using standard functions, plus production of 2D cross-sections, data selection from the 3D scene, the back-end database and plot displays. A unique feature is that GVS allows investigation of time-series data across different display modes, both 2D and 3D. GVS has been used successfully as a tool to enhance community and stakeholder understanding and knowledge of groundwater systems and is of value for training and educational purposes. Completed projects confirm that GVS provides powerful support for management and decision making, and serves as a tool for interpreting groundwater system hydrological processes. A highly effective visualisation output is the production of short videos (e.g. 2–5 min) based on sequences of camera ‘fly-throughs’ and screen images. Further work involves developing support for multi-screen displays and touch-screen technologies, distributed rendering, and gestural interaction systems. To highlight the visualisation and animation capability of the GVS software, links to related multimedia hosted on online sites are included in the references.

Relevance:

20.00%

Publisher:

Abstract:

The purpose of the present study was to examine the influence of 3 different high-intensity interval training regimens on the first and second ventilatory thresholds (VT1 and VT2), anaerobic capacity (ANC), and plasma volume (PV) in well-trained endurance cyclists. Before and after 2 and 4 weeks of training, 38 well-trained cyclists (VO2peak = 64.5 ± 5.2 ml·kg⁻¹·min⁻¹) performed (a) a progressive cycle test to measure VO2peak, peak power output (PPO), VT1, and VT2; (b) a time to exhaustion test (Tmax) at their VO2peak power output (Pmax); and (c) a 40-km time-trial (TT40). Subjects were assigned to 1 of 4 training groups (group 1: n = 8, 8 × 60% Tmax at Pmax, 1:2 work-recovery ratio; group 2: n = 9, 8 × 60% Tmax at Pmax, recovery at 65% maximum heart rate; group 3: n = 10, 12 × 30 seconds at 175% PPO, 4.5-minute recovery; control group: n = 11). The TT40 performance, VO2peak, VT1, VT2, and ANC were all significantly increased in groups 1, 2, and 3 (p < 0.05) but not in the control group. However, PV did not change in response to the 4-week training program. Changes in TT40 performance were modestly related to the changes in VO2peak, VT1, VT2, and ANC (r = 0.41, 0.34, 0.42, and 0.40, respectively; all p < 0.05). In conclusion, the improvements in TT40 performance were related to significant increases in VO2peak, VT1, VT2, and ANC but were not accompanied by significant changes in PV. Thus, peripheral adaptations rather than central adaptations are likely responsible for the improved performances witnessed in well-trained endurance athletes following various forms of high-intensity interval training programs.
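As a purely arithmetic illustration of the interval prescriptions above (the Tmax and PPO values are assumed; individual values are not given in the abstract):

```python
# Hypothetical rider: Tmax of 240 s at Pmax, and a PPO of 400 W.
tmax_s, ppo_w = 240, 400

# Groups 1 and 2: 8 work bouts of 60% of Tmax, performed at Pmax.
work_s = 0.60 * tmax_s            # 144 s per bout
recovery_g1_s = 2 * work_s        # group 1 uses a 1:2 work:recovery ratio

# Group 3: 12 x 30-second bouts at 175% of PPO with 4.5-minute recoveries.
g3_power_w = 1.75 * ppo_w         # 700 W
g3_recovery_s = 4.5 * 60          # 270 s

print(f"Groups 1-2: 8 x {work_s:.0f} s at Pmax; group 1 recovers {recovery_g1_s:.0f} s")
print(f"Group 3: 12 x 30 s at {g3_power_w:.0f} W with {g3_recovery_s:.0f} s recovery")
```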